Eyes on NVIDIA: VR eye tracking is just the beginning

  • by Anders Lundin
  • 6 min

Future of XR

In this post, David Weinstein, director of virtual and augmented reality at NVIDIA, outlines key trends driving the XR market, as well as possible challenges ahead and strong use cases. This is the fourth post in Tobii’s blog series “The Future of XR”, where Tobii’s business partners share their views on the development of the XR industry.

What are the most significant trends that will impact the XR market in the coming 3-5 years?  

There are three major trends that will come together to have an enormous impact on the XR market: streaming, AI, and heightened immersion. 

XR streaming is currently in its infancy with technologies like NVIDIA CloudXR just beginning to roll out. As 5G becomes more widespread, XR algorithms and experiences will increasingly become split between the client and server to take advantage of the massive power of the cloud and the proximal efficiency of the HMD. With all these elements combined, streaming will become the norm for XR.

As that happens, the power of AI will enhance how we experience XR, with improved streaming, intelligent agents, and natural user interfaces like NVIDIA’s Jarvis for natural language understanding and gesture recognition. 

The third factor that will have a major impact on XR adoption will be new sensor hardware and AI software that heighten our level of immersion. Eye tracking is the beginning – it’s already being used to optimize rendering, as with Variable Rate Shading in NVIDIA’s VRWorks SDK, and will soon make immersive, ray-traced photo-realism possible.
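
To make the foveation idea concrete, here is a minimal, self-contained C++ sketch of how a gaze point reported by eye tracking could drive a per-tile shading-rate map, with full-rate shading near the fovea and coarser shading toward the periphery. It is purely illustrative: the tile size, radii, and every name in it are assumptions, not the VRWorks or Variable Rate Shading API.

// Conceptual sketch (not the VRWorks or VRS API): map a gaze point to a
// per-tile shading-rate map, shading at full rate near the fovea and more
// coarsely toward the periphery. Tile size, radii, and rates are assumptions.
#include <cmath>
#include <cstdio>
#include <vector>

enum class ShadingRate { Full, Half, Quarter };

struct RateMap {
    int tilesX = 0, tilesY = 0;
    std::vector<ShadingRate> tiles;
};

// Build a shading-rate map for a widthPx x heightPx eye buffer, centered on
// the normalized gaze position (0..1 on each axis) reported by eye tracking.
RateMap buildFoveatedRateMap(int widthPx, int heightPx,
                             float gazeX, float gazeY, int tileSize = 16) {
    RateMap map;
    map.tilesX = (widthPx + tileSize - 1) / tileSize;
    map.tilesY = (heightPx + tileSize - 1) / tileSize;
    map.tiles.resize(static_cast<size_t>(map.tilesX) * map.tilesY);

    const float cx = gazeX * widthPx;
    const float cy = gazeY * heightPx;
    const float fovealRadius = 0.15f * heightPx;  // assumed inner region
    const float midRadius    = 0.35f * heightPx;  // assumed transition band

    for (int ty = 0; ty < map.tilesY; ++ty) {
        for (int tx = 0; tx < map.tilesX; ++tx) {
            const float dist = std::hypot((tx + 0.5f) * tileSize - cx,
                                          (ty + 0.5f) * tileSize - cy);
            map.tiles[ty * map.tilesX + tx] =
                dist < fovealRadius ? ShadingRate::Full
              : dist < midRadius    ? ShadingRate::Half
                                    : ShadingRate::Quarter;
        }
    }
    return map;
}

int main() {
    // Example: 2160 x 2160 per-eye buffer, gaze slightly right of center.
    const RateMap map = buildFoveatedRateMap(2160, 2160, 0.6f, 0.5f);
    int full = 0;
    for (ShadingRate r : map.tiles) full += (r == ShadingRate::Full);
    std::printf("%d of %zu tiles shaded at full rate\n", full, map.tiles.size());
}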

And with more biometric sensors arriving soon, AI software will interpret and operate on that data, which will lead to: helping us feel connected with each other through expressive avatars; customizing and accelerating training and learning based on biofeedback; and ultimately, controlling our virtual environments through neural interfaces.

What do you think the biggest challenges are to XR adoption today?

XR adoption will accelerate as we tackle three major challenges: content, comfort, and isolation. Because XR is a new technology, people need a compelling reason to adopt it — the “wow factor” builds interest, excitement, and hype, but it doesn’t necessarily lead to widespread adoption. Encouragingly, we’re starting to see more and more high-quality VR content, from popular VR titles such as Half-Life: Alyx to powerful enterprise visualization applications like Autodesk VRED and ZeroLight. Soon we’ll see ray-traced photo-real VR, which will be a game-changer for design workflows.

The second factor is comfort and convenience. For a new technology to be broadly adopted, it has to be easy to use. As HMDs are getting lighter and easier to set up, and VR image quality is constantly improving, the cost-benefit of jumping into VR is passing the tipping point for more and more users. 

The third challenge is isolation. One of the main promises of VR has always been collaboration. But ironically, today’s main use cases continue to be solitary (e.g. single-player gaming and single-user design visualization). This is starting to change.

We’re seeing strong progress on this front, from high-quality multiplayer VR games and engaging social VR environments, to collaborative enterprise XR platforms like NVIDIA Omniverse. The next hurdle we’ll face on this front will be to bring all of the social cues that we rely on, such as smiling, eye contact, and body language, into VR through natural, expressive avatars.

What will need to change over the next three years to shift or reduce these challenges?

As mentioned, we’re seeing great progress on many fronts. But in some cases, the solutions to one challenge will lead to setbacks in another. For example, while people love the mobile convenience of All-In-One (AIO) headsets, these kinds of headsets don’t have enough graphics horsepower to deliver the stunning visual experiences available from a high-performance graphics card like the NVIDIA RTX A6000. An emerging solution to this problem is split rendering, such as NVIDIA’s CloudXR: render the XR frames on a remote PC with a powerful GPU and stream the rendering to the AIO. 
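
As a rough illustration of that split (and not the CloudXR API), the sketch below stubs out the basic loop: the headset samples its pose and sends it to a remote renderer, which renders and encodes a frame that the headset then decodes and presents. All structures and function names are assumptions.

// Conceptual sketch of split rendering (not the CloudXR API): the headset
// sends its latest pose to a remote renderer with a powerful GPU, which
// returns an encoded frame for the headset to decode and present. Rendering,
// encoding, and networking are stubbed out; all names here are illustrative.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Pose { float position[3]; float orientation[4]; };
struct EncodedFrame { std::vector<uint8_t> bitstream; Pose renderedWith; };

// --- Server side (workstation or cloud GPU) ---
EncodedFrame renderAndEncode(const Pose& pose) {
    // A real server would render both eye buffers on the GPU and feed them
    // to a hardware video encoder before sending them over the network.
    EncodedFrame frame;
    frame.bitstream.assign(1024, 0);  // stand-in for a compressed frame
    frame.renderedWith = pose;
    return frame;
}

// --- Client side (all-in-one headset) ---
Pose sampleHeadsetPose(int frameIndex) {
    // A real client samples the tracker immediately before sending.
    return Pose{{0.0f, 1.6f, 0.01f * frameIndex}, {0.0f, 0.0f, 0.0f, 1.0f}};
}

void decodeAndPresent(const EncodedFrame& frame, const Pose& latestPose) {
    // A real client decodes the frame, then reprojects from
    // frame.renderedWith to latestPose to hide network latency.
    (void)latestPose;
    std::printf("presented a %zu-byte frame\n", frame.bitstream.size());
}

int main() {
    // One simulated round trip per frame; a real system pipelines these steps.
    for (int i = 0; i < 3; ++i) {
        const Pose sent = sampleHeadsetPose(i);       // client: sample and send
        const EncodedFrame f = renderAndEncode(sent); // server: render + encode
        decodeAndPresent(f, sampleHeadsetPose(i));    // client: decode + present
    }
}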

Split-mode frameworks will similarly make it possible to amortize shared computation among multiple participants in collaborative experiences, like calculating the global illumination once for everyone in an architectural design review, or simulating the world’s physics once for all the players in a game. These split-mode frameworks will require fast, ubiquitous networks like 5G, and accelerated computing and rendering at the edge. 
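
The sketch below shows that amortization in miniature: an expensive scene-wide solve, standing in for global illumination, is computed once per scene change and the cached result is reused by every participant. The cache and all of its names are hypothetical and not tied to any NVIDIA SDK.

// Conceptual sketch of amortizing shared work across participants: an
// expensive scene-wide computation (standing in for global illumination)
// runs once per scene change and the result is reused for every connected
// client. The cache and all names are hypothetical, not tied to any SDK.
#include <cstdio>
#include <memory>
#include <vector>

struct LightingSolution {
    int sceneVersion = -1;
    std::vector<float> probes;  // stand-in for the baked lighting data
};

class SharedLightingCache {
public:
    // Returns the cached solution if the scene is unchanged; otherwise
    // recomputes once and serves the same result to every caller.
    const LightingSolution& get(int sceneVersion) {
        if (!solution_ || solution_->sceneVersion != sceneVersion) {
            solution_ = computeGlobalIllumination(sceneVersion);
            std::printf("recomputed lighting for scene version %d\n", sceneVersion);
        }
        return *solution_;
    }

private:
    static std::unique_ptr<LightingSolution> computeGlobalIllumination(int version) {
        // Placeholder for the expensive solve that would run on a server GPU.
        auto s = std::make_unique<LightingSolution>();
        s->sceneVersion = version;
        s->probes.assign(4096, 1.0f);
        return s;
    }

    std::unique_ptr<LightingSolution> solution_;
};

int main() {
    SharedLightingCache cache;
    // Three participants join the same design review: one solve, three reuses.
    const char* users[] = {"alice", "bob", "carol"};
    for (const char* user : users) {
        const LightingSolution& lighting = cache.get(/*sceneVersion=*/1);
        std::printf("%s received %zu lighting probes\n", user, lighting.probes.size());
    }
}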

[Image: NVIDIA CloudXR]

On the enterprise side, what industries will benefit the most from XR and why?

One of the strengths of XR is how broadly applicable it will be across verticals. The earliest adopters of XR have been using it extensively for design and for training, due to the ability to understand composition and scale, and the fidelity with which experiences can be simulated.  The value of XR can’t be overstated for those enterprise use cases. Long-term, XR will be everywhere; it’s a new modality and we’ve only begun to scratch the surface of where and how it will be used.

Many of the biggest winners will be collaborative workflows where participants will include both humans and AIs – those workflows will see enormous improvements in terms of efficiency and outcomes. NVIDIA’s Omniverse platform is a great example of how a collaborative enterprise platform can enable a tremendous leap forward in efficiency and productivity.

[Image: NVIDIA Omniverse]

Can you describe NVIDIA’s role in the XR ecosystem?

NVIDIA has a proud history of leadership in the XR ecosystem. For years, NVIDIA’s GPUs and scalable visualization technologies like NVIDIA Quadro Sync have been driving the display walls in multi-user CAVEs, Air Force flight simulators, and enormous immersive visualization spaces, like those found in interactive museums and planetariums. NVIDIA’s GPUs drove early head-mounted displays, when they were primarily used for flight simulators and cognitive science research. And NVIDIA developed early VRWorks tools like Direct Mode and Context Priority to make it possible to efficiently and effectively drive the first Oculus Rift and HTC Vive HMDs when they came to market in 2016.

Since then, we’ve broadened our XR offerings. We’ve continued to provide new SDKs for VR developers, including tools for VR physics, VR audio, and VR rendering technologies like Multi-Res Shading, Lens-Matched Shading, and Variable Rate Shading, as well as AI rendering technology like NVIDIA DLSS. We’ve also continued to boost our VR performance with each subsequent GPU architecture, adding new hardware support to improve and accelerate VR experiences. And most recently, we created NVIDIA CloudXR, making it possible for the ecosystem to enjoy the highest quality VR and AR experiences on any XR device via optimized, network-adaptive streaming.

We’ve been fortunate to have visionary partners and cutting-edge customers who have been early adopters of our XR technology and have helped us show the world what’s possible. Recently, we highlighted many of those projects and technologies at our GTC 2021 conference. The GTC XR track was packed with product, partner, and customer talks — all highlighting how NVIDIA XR tech is propelling the ecosystem forward. In case you missed GTC, session videos are available through NVIDIA On-Demand.

    Written by

    • Anders Lundin

      Corporate Communications Manager, Tobii

      As the corporate communications manager for Tobii, I get to communicate the benefits of our amazing technology to customers, investors, the media, and users — anyone with an interest in eye tracking and how it can be used to improve quality of life and user experience. Personally, I love exploring new technology and how it impacts us and the world around us.

    In collaboration with

    • David Weinstein

      Director of Virtual and Augmented Reality, NVIDIA

      David Weinstein is the director of virtual and augmented reality at NVIDIA and is responsible for the company’s XR products, projects, and SDKs. Prior to joining NVIDIA, Weinstein was a professor of Computer Science and an active figure on the tech startup scene.
