Tobii Helps Bring Eye Tracking to XR Development

Monday 12/10/18 02:02pm | Posted By Leilani DeLeon

Snapdragon and Qualcomm branded products are products of
Qualcomm Technologies, Inc. and/or its subsidiaries.

Qualcomm Developer Network collaborates with a wide ecosystem of companies to support developers in creating truly immersive mobile experiences with rich development tools. One such company is Tobii, with whom we collaborated to integrate its cutting-edge eye tracking solution into the Snapdragon® 845 VR Development Kit. Developers can use this kit along with Tobii’s EyeCore™ eye tracking algorithms to create content that uses gaze direction for fast interactions and more intuitive interfaces. If you attended GDC earlier this year, you may have seen eye tracking in action in the Tobii Mirrors VR demo.

Recently we had the opportunity to speak with Dr. Ralf Biedert, Project Manager and X-Team Lead at Tobii, to understand the advantages of adding eye tracking capabilities to VR applications.

How did Tobii get started in VR?

We’ve been working on eye tracking and gaze interactions for over 18 years across a variety of platforms, so we knew that integrating eye tracking into XR could deliver a variety of powerful benefits. One area that we explored from the very beginning is foveated rendering. With foveated rendering, developers render just a small area of the display (where a user’s eyes are focused) in high definition, while the peripheral areas are rendered at lower detail to reduce the GPU workload. The cool thing about this technique is that, due to how our eyes work, users perceive the entire display in high detail. This is a very flexible application of the technology that can provide a range of benefits, from allowing developers to create higher resolution experiences than would otherwise be possible, to allowing OEMs to create more efficient headsets.
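
To make the idea concrete, here is a minimal Python sketch of the gaze-driven part of foveated rendering: choosing a shading resolution per screen tile based on its distance from the gaze point. The function name and falloff thresholds are illustrative assumptions, not Tobii’s actual pipeline.

```python
# Minimal sketch of gaze-driven foveated rendering logic (illustrative, not
# Tobii's implementation): pick a shading resolution per screen tile based
# on how far the tile is from the current gaze point.
import math

def shading_scale(tile_center, gaze_point, fovea_radius=0.10, mid_radius=0.25):
    """Return the fraction of full resolution to shade a tile at.

    tile_center, gaze_point: (x, y) in normalized [0, 1] screen coordinates.
    fovea_radius / mid_radius: illustrative falloff thresholds (assumptions).
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    dist = math.hypot(dx, dy)
    if dist < fovea_radius:      # where the user is looking: full detail
        return 1.0
    if dist < mid_radius:        # transition band: half resolution
        return 0.5
    return 0.25                  # periphery: quarter resolution

# Example: gaze near the screen center, so a corner tile is shaded coarsely.
print(shading_scale((0.9, 0.9), gaze_point=(0.5, 0.5)))  # -> 0.25
```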

Another example of the value of eye tracking relates to hand-eye coordination. In today’s VR applications, for example, the experience of throwing something isn’t that great. So Tobii worked on using eye tracking to make actions such as picking up and throwing objects feel more realistic. If you add eye tracking as one more input to help determine aim, you can establish a line of sight, fix a trajectory, and convey a feeling of weight, resulting in more realistic hand-eye coordinated actions. Hand-eye coordination can also be made to feel empowering: when we pick up or throw an object, our gaze precedes our aim and hand movement, and combined with controller input and clever algorithms, this can give users abilities that feel natural and intuitive at the same time.
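
As a rough illustration of the aiming idea (a sketch of the general technique, not Tobii’s throwing module), the following Python snippet blends the controller’s release velocity with the direction toward the gazed-at target, so the throw lands closer to where the player is looking:

```python
# Hedged sketch of gaze-assisted throwing: blend raw hand velocity toward
# the gaze target. The assist factor and names are illustrative assumptions.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def assisted_throw_velocity(release_velocity, hand_pos, gaze_target, assist=0.6):
    """Blend the raw throw direction toward the gaze target.

    assist: 0.0 keeps the raw throw, 1.0 aims fully at the gaze target.
    """
    speed = math.sqrt(sum(c * c for c in release_velocity))
    raw_dir = normalize(release_velocity)
    to_target = normalize(tuple(t - h for t, h in zip(gaze_target, hand_pos)))
    blended = normalize(tuple((1 - assist) * r + assist * t
                              for r, t in zip(raw_dir, to_target)))
    return tuple(speed * c for c in blended)  # keep the player's throw strength

# Example: a slightly-off throw is corrected toward the looked-at target.
print(assisted_throw_velocity((0.0, 3.0, -5.0), (0, 1.5, 0), (1.0, 1.0, -10.0)))
```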

We’ve also spent a good deal of time thinking about social interactions with a focus on players’ eyes. In social applications, players can look at each other with realistic facial movements, allowing avatars to reflect where players are looking and react to each other for a more immersive experience. Using eye tracking in this way can be transformative, helping to make typically flat and emotionless virtual avatars come alive with expressive features and realistic eye contact.

What should developers know about getting started with eye tracking?

It all depends on what type of interaction is desired. Developers can start by reading about eye tracking use cases for social, user interfaces, and hand-eye coordination in the Tobii Developer Zone. Simple actions like picking up an object can be implemented using our API, while more involved actions like throwing an object require some math and an algorithm, along with Tobii’s throwing module. For social interactions, it’s more about designing the look and feel of facial animations for avatars.
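
To show the shape of a gaze-based pick-up, here is a hedged Python sketch of the common pattern: cast a ray along the gaze direction and grab the nearest object it hits. The scene types and function names are hypothetical stand-ins, not the actual Tobii API.

```python
# Illustrative "pick up what you look at" sketch: ray-sphere intersection
# along the gaze direction, then attach the hit object on a grab press.
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    center: tuple   # (x, y, z)
    radius: float   # simple bounding sphere

def gaze_hit(origin, direction, objects):
    """Return the nearest object whose bounding sphere the gaze ray hits.

    direction must be a normalized (x, y, z) gaze vector.
    """
    best, best_t = None, float("inf")
    for obj in objects:
        oc = tuple(c - o for c, o in zip(obj.center, origin))
        t = sum(a * b for a, b in zip(oc, direction))  # projection onto ray
        if t < 0:
            continue  # object is behind the gaze origin
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        dist_sq = sum((c - p) ** 2 for c, p in zip(obj.center, closest))
        if dist_sq <= obj.radius ** 2 and t < best_t:
            best, best_t = obj, t
    return best

objects = [SceneObject("mug", (0, 1, -2), 0.2), SceneObject("book", (1, 1, -3), 0.3)]
target = gaze_hit(origin=(0, 1.6, 0), direction=(0, -0.28, -0.96), objects=objects)
if target:  # on grab-button press, attach `target` to the hand
    print("picked up:", target.name)
```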

Why should developers consider integrating eye tracking?

Adding eye tracking can optimize device performance, and it can improve user experiences. Take, for example, hand-eye coordination. Gaze data tends to be one of the best ways to approximate human attention, so eye tracking in VR helps with several virtual interactions, from picking something up naturally and effortlessly to reliably hitting a target you are aiming at. This use of eye tracking can improve most interactions by providing an opportunity to fix a trajectory to a target before starting the action. Without eye tracking, the usual process is to look, point, and activate an action (three steps). With eye tracking, the process is simplified to two steps: look, then activate. For example, a player can look at something and then shoot a laser at it, indicating the target with their eyes and activating the laser with a button press.
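
A minimal sketch of that two-step “look, then activate” pattern (the handler names are illustrative assumptions): the gaze ray continuously selects a target, and a single button press triggers the action on whatever is currently selected.

```python
# Two-step interaction sketch: gaze selects, a button press activates.
current_gaze_target = None  # updated every frame from the eye tracker

def on_gaze_update(target):
    """Called each frame with the object the gaze ray currently hits."""
    global current_gaze_target
    current_gaze_target = target

def on_trigger_pressed():
    """Fire at the gazed-at object: no separate pointing step needed."""
    if current_gaze_target is not None:
        print(f"laser fired at {current_gaze_target}")

on_gaze_update("enemy_drone")   # step 1: the player looks at a target
on_trigger_pressed()            # step 2: one click activates the action
```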

For social interactions between avatars, facial expressions with realistic muscle movements can be driven by eye tracking, making an avatar seem more alive and making other players more aware of you. Traditionally, avatars have been kind of creepy, with limited graphic features like blocky eyes and a line for a mouth. But when we added eye tracking to drive facial features, it was transformational in increasing realism, even with limited graphic detail.
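
As a rough sketch of how tracked eye data might drive an avatar’s face (the signal names are assumptions, not a specific Tobii API), gaze angles can rotate the avatar’s eyes while eyelid openness drives a blink blendshape:

```python
# Hedged sketch: map eye tracking signals onto an avatar rig.
def update_avatar_eyes(avatar, gaze_yaw_deg, gaze_pitch_deg, eye_openness):
    """Apply tracked eye data to an avatar rig (dict used as a stand-in).

    gaze_yaw_deg / gaze_pitch_deg: gaze direction relative to the head.
    eye_openness: 0.0 (closed) .. 1.0 (fully open) from the eye tracker.
    """
    avatar["left_eye_rotation"] = (gaze_pitch_deg, gaze_yaw_deg)
    avatar["right_eye_rotation"] = (gaze_pitch_deg, gaze_yaw_deg)
    avatar["blink_blendshape"] = 1.0 - eye_openness  # blink follows the user

avatar = {}
update_avatar_eyes(avatar, gaze_yaw_deg=12.0, gaze_pitch_deg=-5.0, eye_openness=0.9)
print(avatar)
```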

Should eye tracking be kept on for the full duration of a VR session?

Once again, there are two primary ways that eye tracking adds benefits in VR: 1) improving the capabilities of the device, and 2) making interactions better. In many cases, developers will choose to keep the interactive benefits on all the time, given that game mechanics call for constant interaction. However, some games may only turn it on when needed. For example, the ability to pick up and throw objects with extreme accuracy might be offered as a power-up.

However, eye tracking benefits related to enhanced device capabilities will likely always be on, to support things like enhanced resolution with foveated rendering and automatic headset configuration based on eye position and interpupillary distance (IPD).
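
For context on that last point, IPD is simply the distance between the two pupils, which a headset with eye tracking can measure and use to set lens and render parameters. A tiny illustrative sketch:

```python
# IPD from tracked pupil positions (illustrative coordinates in millimeters).
import math

def ipd_mm(left_pupil, right_pupil):
    """Euclidean distance between the two pupil positions."""
    return math.dist(left_pupil, right_pupil)

print(ipd_mm((-31.5, 0.0, 12.0), (31.5, 0.0, 12.0)))  # -> 63.0 mm
```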

Are there other use cases outside gaming for eye tracking?

Yes, we see eye tracking being useful in just about any area you can think of, from professional training and safety, to product placement in retail stores, to areas like education, entertainment, and games. For example, when buildings are being designed, VR and eye tracking might be used in a study to determine where emergency signs and lighting should be placed for the best illumination and visibility. Eye tracking can also be used for less serious applications, such as selecting brushes in 3D painting programs or navigating through space with your eyes in a virtual reality simulator.

How have you worked with Qualcomm Technologies, Inc.?

Tobii and QTI have collaborated to bring eye tracking to a standalone VR HMD development kit powered by the Snapdragon® 845 Mobile VR Platform. We’ve also worked together closely to make sure we have an easy-to-use SDK for eye tracking application development, and a wealth of development resources and ideas via the Tobii Developer Zone. I think we’ve built a great, positive working relationship with QTI, and we’re excited about this collaboration in bringing something both foundational and powerful to VR that provides so many important benefits.

Do you have any final advice for developers considering eye tracking?

The integration of eye tracking into VR headsets is an important step for making both HMDs and user experiences better! However, to truly understand the power of eye tracking, you must try it for yourself. To learn more, or to arrange a demo, you can contact us.

You can also learn more about Tobii and how to use eye tracking in VR here.
