Localized Space Display

Built with Vincent Chen

HEAD TRACKING; UNITY; SPRING '17

Motivation

VR fails to provide an adequate tool for the daily user in the workplace, where people typically sit alone, use the computer several hours a day, and multi-task across multiple applications. To manage tasks and communicate across teams effectively, workplaces need a new 3D paradigm that does not fatigue users, disrupt workflow, or require additional hardware. We have created the Localized Space Display (LSD), an unencumbered, parallax-based head-tracking display that lets users switch quickly between 3D and 2D interfaces. Though the 3D effect (motion parallax) is less pronounced than in an HMD, it is sufficient to benefit designers and engineers working with 3D modeling software. With LSD, any traditional desktop monitor can be transformed into an augmented display, and users can easily switch modes, whether between keyboard input and hand gestures or between 2D and 3D views. For example, an engineer can grab a 3D model off a workbench, inspect, edit, and annotate it, and then upload it into a group chat.

Head Tracking

We used the Kinect SDK for Windows to implement face tracking for our system. In our implementation, we read the coordinates of the user's head in the Kinect's own coordinate space; because the Kinect was mounted at a known position relative to the monitor, we could later calibrate these values into screen-relative coordinates. In addition, we implemented a filtering mechanism to avoid tracking multiple bodies at once: our device only needs input from the closest user, so we find the nearest tracked body and follow only that user.
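As a rough illustration of that filter, the sketch below (assuming the Kinect v2 Unity plugin and its Windows.Kinect namespace, with the body array already read from a BodyFrame) selects the tracked body whose head joint is nearest the sensor; the class and method names are ours, not part of the SDK.

    using Windows.Kinect;

    public static class ClosestBodyFilter
    {
        // Return the head position of the tracked body nearest the sensor,
        // or null if no body is currently tracked.
        public static CameraSpacePoint? GetClosestHead(Body[] bodies)
        {
            Body closest = null;
            float closestZ = float.MaxValue;

            foreach (Body body in bodies)
            {
                if (body == null || !body.IsTracked) continue;

                Joint head = body.Joints[JointType.Head];
                if (head.TrackingState == TrackingState.NotTracked) continue;

                // CameraSpacePoint is in meters, relative to the Kinect sensor.
                if (head.Position.Z < closestZ)
                {
                    closestZ = head.Position.Z;
                    closest = body;
                }
            }

            return closest != null
                ? closest.Joints[JointType.Head].Position
                : (CameraSpacePoint?)null;
        }
    }

The returned position is in the Kinect's camera space (meters); we then offset it by the sensor's measured position relative to the monitor to express the head in screen-centered coordinates.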

Gestural Recognition and Rendering

To implement gestural recognition, we used the Leap Motion SDK, which integrates straightforwardly with the Unity engine. The Leap Motion SDK provides basic hand tracking relative to the position of the physical device, so we imported the provided hand models and manually calibrated their locations in relation to the real-world monitor so that the experience felt genuine. We ran into a few challenges with the Leap Motion's hand tracking. First, the device has limitations, particularly around occlusion: when a user's hand was flipped upside down, the Leap Motion would lose track of the fingers, and when the hands were clasped, it could no longer distinguish individual fingers. In addition, we needed to consider the placement of the Leap Motion carefully, because if the Kinect was placed directly behind it, hand gestures would disrupt the Kinect's head tracking. As a result, we ultimately offset the location of the Kinect so that an unobstructed ray could be traced from the Kinect to the user's face.
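The manual calibration amounted to applying a fixed, hand-measured offset to the transform that parents the Leap hand models. A minimal sketch of that idea is below; the default values are placeholders that would have to be re-measured for any particular physical setup.

    using UnityEngine;

    // Illustrative calibration helper: shifts the Leap Motion hand models by a
    // hand-measured offset so that the virtual hands line up with the user's
    // real hands relative to the physical monitor. The defaults are placeholders,
    // not measurements from our setup.
    public class LeapHandCalibration : MonoBehaviour
    {
        [Tooltip("Offset from the Leap Motion device to the center of the screen, in meters.")]
        public Vector3 deviceToScreenOffset = new Vector3(0f, -0.25f, 0.10f);

        [Tooltip("Parent transform of the Leap Motion hand models.")]
        public Transform handModelParent;

        void Update()
        {
            // Applying the offset every frame lets it be tuned live in the
            // Inspector while comparing the virtual hands against the real monitor.
            handModelParent.localPosition = deviceToScreenOffset;
        }
    }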

Parallax Effect

The parallax effect was essential to the success and functionality of our device. It relies on the user's ability to perceive depth from relative motion: as the viewpoint shifts, objects at different depths move by different amounts on screen. Because we used a 2D monitor, parallax was the only way to create a 3D effect, which was instrumental to the functionality and rationale of our interface. One factor we found to be very important was how much texture and depth the scene itself presents, so we created a grid texture with a special lighting effect to emphasize depth. We also learned that simply moving the scene camera to the tracked location of the user's head did not, by itself, produce a convincing parallax effect; to get there, we implemented an asymmetric view frustum.
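The grid texture itself is easy to reproduce; the sketch below generates one procedurally in Unity and assigns it to a renderer. The resolution, cell count, and colors are arbitrary choices for illustration, not values from the project.

    using UnityEngine;

    // Procedural grid texture used as a depth cue: the regular lines shear
    // visibly as the tracked viewpoint moves, which strengthens the parallax effect.
    public class GridTextureGenerator : MonoBehaviour
    {
        public int size = 512;        // texture resolution in pixels
        public int cells = 16;        // grid cells per side
        public Color lineColor = Color.cyan;
        public Color fillColor = new Color(0.05f, 0.05f, 0.10f);

        void Start()
        {
            int cellSize = size / cells;
            var tex = new Texture2D(size, size);
            for (int y = 0; y < size; y++)
            {
                for (int x = 0; x < size; x++)
                {
                    bool onLine = (x % cellSize == 0) || (y % cellSize == 0);
                    tex.SetPixel(x, y, onLine ? lineColor : fillColor);
                }
            }
            tex.Apply();

            // Tile it across whatever material the shelves and floor use.
            GetComponent<Renderer>().material.mainTexture = tex;
        }
    }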

Asymmetric View Frustum

To create the parallax effect, we dynamically adjust the camera's field of view and projection every frame so that the virtual camera matches the viewer's head position, skewing the frustum into an asymmetric (off-axis) shape whose edges pass through the borders of the physical screen. This constrains the display to a single user and requires manually entering the screen size and the camera's position relative to the center of the screen.
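A minimal sketch of this kind of off-axis projection is below. It assumes the tracked head position is expressed in meters relative to the center of the screen, that the camera's parent transform sits at the screen center with +z pointing into the scene, and that the physical screen dimensions are entered by hand; the field names and defaults are illustrative, and the matrix follows the standard off-center perspective form (as in the Camera.projectionMatrix example in the Unity documentation).

    using UnityEngine;

    // Off-axis ("asymmetric") projection: the monitor is treated as a window
    // into the scene, and the frustum is rebuilt every frame so its edges pass
    // through the corners of the physical screen as seen from the tracked head.
    public class OffAxisProjection : MonoBehaviour
    {
        public Camera cam;                    // camera rendering the scene
        public float screenWidth = 0.52f;     // physical monitor width in meters (placeholder)
        public float screenHeight = 0.32f;    // physical monitor height in meters (placeholder)

        // Head position relative to the screen center, in meters:
        // x to the right, y up, z = distance from the screen plane.
        public Vector3 headPosition = new Vector3(0f, 0f, 0.6f);

        void LateUpdate()
        {
            float near = cam.nearClipPlane;
            float far = cam.farClipPlane;
            float d = Mathf.Max(headPosition.z, 0.01f);   // avoid division by zero

            // Put the virtual camera at the viewer's eye, in front of the screen
            // plane, looking straight into the scene along +z.
            cam.transform.localPosition = new Vector3(headPosition.x, headPosition.y, -d);
            cam.transform.localRotation = Quaternion.identity;

            // Project the screen edges onto the near plane as seen from the eye.
            float halfW = screenWidth * 0.5f;
            float halfH = screenHeight * 0.5f;
            float left   = (-halfW - headPosition.x) * near / d;
            float right  = ( halfW - headPosition.x) * near / d;
            float bottom = (-halfH - headPosition.y) * near / d;
            float top    = ( halfH - headPosition.y) * near / d;

            cam.projectionMatrix = PerspectiveOffCenter(left, right, bottom, top, near, far);
        }

        // Standard off-center perspective matrix.
        static Matrix4x4 PerspectiveOffCenter(float left, float right, float bottom, float top, float near, float far)
        {
            var m = new Matrix4x4();
            m[0, 0] = 2f * near / (right - left);
            m[0, 2] = (right + left) / (right - left);
            m[1, 1] = 2f * near / (top - bottom);
            m[1, 2] = (top + bottom) / (top - bottom);
            m[2, 2] = -(far + near) / (far - near);
            m[2, 3] = -2f * far * near / (far - near);
            m[3, 2] = -1f;
            return m;
        }
    }

Because the frustum is rebuilt from a single tracked head each frame, the illusion only holds for one viewer, which is the single-user constraint noted above; in our system the head position is fed in from the Kinect tracking every frame.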

User Interface Design

Our primary demo features a scenario that highlights the utility of the Localized Space Display. The scene resembles a “work shed”: multiple shelves are presented (in parallax) to the user, and on the shelves sit 3D models that the user can manipulate. Overlaid on the user’s screen is a messenger application, where they can still interact and type as they would in a conventional work setting. The scene has two primary components, the GUI and the 3D interaction model, both implemented from scratch in Unity.

By integrating the parallax effect and gestural control, we implemented an interaction model that allows users to translate, rotate, and scale virtual objects. This model was based on the AbstractHoldDetector class from Leap Motion: the Leap Motion API lets us track “hold strength” based on the configuration of a user’s fingers and hands, and we defined a threshold on hold strength that marks the left or right hand as active (a sketch of this logic appears at the end of this section). With one active hand, users can grab objects; with both hands active, they can scale and rotate objects.

As part of this interaction model, we also let users interact with the 2D GUI in the scene. By dragging a model up and out of the frame, the user triggers a prompt indicating that the object can be “up”-loaded to the messenger interface; upon release, the model appears to be beamed through the messaging platform, not unlike other familiar processes of uploading files. At any point, users can type in the messenger interface. This 2D GUI is fixed to the user’s screen, like any 2D interface on a desktop display, and simulates a popular messaging app of the kind used in work settings (e.g., Slack). It was also built from scratch using Unity GUI elements.
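Below is a sketch of the hold-strength interaction logic. Our implementation built on Leap's AbstractHoldDetector; this version instead reads Hand.GrabStrength from the current Leap frame and applies a single threshold, so the threshold value, field names, and the exact scale/rotate math should be read as illustrative rather than as the shipped code.

    using UnityEngine;
    using Leap;
    using Leap.Unity;

    // One active hand grabs (translates) the target; two active hands scale it
    // with the change in hand separation and rotate it with the change in the
    // hand-to-hand direction.
    public class HoldInteraction : MonoBehaviour
    {
        public LeapProvider provider;                 // e.g. the LeapServiceProvider in the scene
        public Transform target;                      // the 3D model being manipulated
        [Range(0f, 1f)] public float holdThreshold = 0.8f;

        Vector3 _lastAxis;
        float _lastSeparation;

        void Update()
        {
            Frame frame = provider.CurrentFrame;
            Hand left = null, right = null;
            foreach (Hand hand in frame.Hands)
            {
                // A hand counts as "active" once its grab strength passes the threshold.
                if (hand.GrabStrength < holdThreshold) continue;
                if (hand.IsLeft) left = hand; else right = hand;
            }

            if (left != null && right != null)
            {
                Vector3 l = left.PalmPosition.ToVector3();
                Vector3 r = right.PalmPosition.ToVector3();
                float separation = Vector3.Distance(l, r);
                Vector3 axis = r - l;

                if (_lastSeparation > 0f)
                {
                    // Scale with the change in separation, rotate with the
                    // change in the hand-to-hand direction.
                    target.localScale *= separation / _lastSeparation;
                    target.rotation = Quaternion.FromToRotation(_lastAxis, axis) * target.rotation;
                }
                _lastSeparation = separation;
                _lastAxis = axis;
            }
            else
            {
                if (left != null || right != null)
                {
                    // Single active hand: carry the model with the palm.
                    Hand h = left ?? right;
                    target.position = h.PalmPosition.ToVector3();
                }
                _lastSeparation = 0f;
            }
        }
    }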