A merged reality experience blending real world objects with virtual world interactions
Technology: Unity, MQTT + Feather, HTC Vive, Leap Motion
Role: Lead Developer
To showcase Second Story’s Merged Reality (MR) capability, a small cross-geo team of developers and designers built ‘Firefly’, an MR experience that combined the HTC Vive headset, a Vive-tracked physical jar, Leap Motion hand tracking, MQTT, and haptics to let users capture virtual fireflies in a physical jar within an ethereal VR world built in Unity.
To easily imbue a physical object with MR capabilities, I leveraged the MQTT messaging protocol and wrote a Unity plugin I called "Mqttify", which worked alongside an Arduino sketch running on a Feather board to create relationships between physical and digital objects.
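The core idea behind that physical-to-digital relationship can be sketched as a pub/sub pattern: the physical object publishes state changes to an MQTT topic, and its virtual counterpart subscribes to that topic and mirrors the state. The sketch below illustrates the pattern in Python with a tiny in-memory bus standing in for the MQTT broker; the class and topic names are hypothetical, not taken from the actual Mqttify plugin.

```python
class Bus:
    """Minimal in-memory stand-in for an MQTT broker: topic -> callbacks."""
    def __init__(self):
        self.subs = {}

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self.subs.get(topic, []):
            cb(payload)


class VirtualLamp:
    """Digital counterpart that mirrors a physical lightbulb's state."""
    def __init__(self, bus, topic):
        self.on = False
        bus.subscribe(topic, self.handle)

    def handle(self, payload):
        # The physical side publishes "on"/"off"; mirror it here.
        self.on = (payload == "on")


bus = Bus()
lamp = VirtualLamp(bus, "room/lightbulb")   # hypothetical topic name
bus.publish("room/lightbulb", "on")         # physical switch flipped
print(lamp.on)  # → True
```

In the real system, a library such as paho-mqtt on the Python side (or an MQTT client in C# and on the Feather) would replace the in-memory `Bus`, with the same subscribe/publish shape.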
Building on that initial small-scale test, I created a room-scale MR demo in Unity in which simple physical objects placed around the room, such as a button, a desk fan, and a lightbulb, had virtual counterparts at the corresponding locations in the virtual world. Interacting with a physical object triggered a reaction from its virtual counterpart, and vice versa.
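Because the sync runs in both directions, each side both publishes its own changes and applies changes from the other side, while ignoring the echo of its own messages. A minimal sketch of that bidirectional mirroring, again with a hypothetical in-memory bus in place of the MQTT broker:

```python
class Bus:
    """Minimal in-memory stand-in for an MQTT broker: topic -> callbacks."""
    def __init__(self):
        self.subs = {}

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self.subs.get(topic, []):
            cb(payload)


class SyncedObject:
    """One side (physical or virtual) of a bidirectionally synced object.

    Publishes its own state changes and applies remote ones, tagging each
    message with its origin so it can ignore its own echoes.
    """
    def __init__(self, bus, topic, side):
        self.bus, self.topic, self.side = bus, topic, side
        self.state = None
        bus.subscribe(topic, self.on_remote)

    def set_state(self, state):
        self.state = state
        self.bus.publish(self.topic, (self.side, state))

    def on_remote(self, msg):
        origin, state = msg
        if origin != self.side:   # echo suppression
            self.state = state


bus = Bus()
phys = SyncedObject(bus, "room/fan", "physical")  # hypothetical topic
virt = SyncedObject(bus, "room/fan", "virtual")

phys.set_state("spinning")   # real fan switched on
print(virt.state)  # → spinning
virt.set_state("stopped")    # virtual fan switched off
print(phys.state)  # → stopped
```

The origin tag is one simple way to break the feedback loop a naive bidirectional mirror would otherwise create; separate command/state topics per direction would work as well.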
Finally, working with a cross-geo team of developers and designers, I used the room-scale prototype as the starting point for Firefly itself, combining a Vive tracker, a Leap Motion, and haptics so users could capture virtual fireflies in a physical jar in an ethereal VR world built in Unity. My role was to build the Unity environment and set up the corresponding physical environment, program the firefly and 3D jar behaviors in Unity, connect the physical jar hardware to the MR world using my Mqttify plugin, set up and support the Atlanta build, and own the master build for all studios.
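At its simplest, the capture behavior comes down to testing whether a firefly's position falls inside a capture volume anchored to the Vive-tracked jar. The snippet below is an assumed, simplified stand-in for that Unity-side check (the function, positions, and radius are illustrative, not the shipped behavior):

```python
import math

def fireflies_in_jar(fireflies, jar_pos, jar_radius):
    """Return the fireflies whose position lies within the jar's capture
    sphere, defined by the tracked jar position and a radius."""
    captured = []
    for f in fireflies:
        if math.dist(f, jar_pos) <= jar_radius:  # Euclidean distance
            captured.append(f)
    return captured


flies = [(0.1, 1.2, 0.0), (2.0, 0.5, 1.0)]     # world-space positions (m)
result = fireflies_in_jar(flies, (0.0, 1.0, 0.0), 0.5)
print(result)  # → [(0.1, 1.2, 0.0)]
```

In Unity this kind of check is typically handled with a trigger collider on the jar rather than an explicit distance test; the distance form just makes the logic easy to see.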