5 factors shaping the future of XR
In this post, we wrap up our series on the future of XR in 2020, sharing our thoughts and some of our partners’ ideas on what’s cooking and the role of eye tracking as a foundational technology in AR and VR.
There are three major trends that will come together to have an enormous impact on the XR market: streaming, AI, and heightened immersion.
XR streaming is currently in its infancy with technologies like NVIDIA CloudXR just beginning to roll out. As 5G becomes more widespread, XR algorithms and experiences will increasingly become split between the client and server to take advantage of the massive power of the cloud and the proximal efficiency of the HMD. With all these elements combined, streaming will become the norm for XR.
As that happens, the power of AI will enhance how we experience XR, with improved streaming, intelligent agents, and natural user interfaces like NVIDIA’s Jarvis for natural language understanding and gesture recognition.
And with more biometric sensors arriving soon, AI software will interpret and act on that data: helping us feel connected to each other through expressive avatars, customizing and accelerating training and learning based on biofeedback, and ultimately letting us control our virtual environments through neural interfaces.
XR adoption will accelerate as we tackle three major challenges: content, comfort, and isolation. XR is a new technology, and people need a compelling reason to adopt it. The “wow factor” builds interest, excitement, and hype, but it doesn’t necessarily lead to widespread adoption. Encouragingly, we’re starting to see more and more high-quality VR content, from popular titles such as Half-Life: Alyx to powerful enterprise visualization applications like Autodesk VRED and ZeroLight. Soon we’ll see ray-traced, photorealistic VR, which will be a game-changer for design workflows.
The second factor is comfort and convenience. For a new technology to be broadly adopted, it has to be easy to use. As HMDs are getting lighter and easier to set up, and VR image quality is constantly improving, the cost-benefit of jumping into VR is passing the tipping point for more and more users.
The third challenge is isolation. One of the main promises of VR has always been collaboration. But ironically, today’s main use cases continue to be solitary (e.g. single-player gaming and single-user design visualization). This is starting to change.
As mentioned, we’re seeing great progress on many fronts. But in some cases, the solution to one challenge will lead to setbacks in another. For example, while people love the mobile convenience of All-In-One (AIO) headsets, these headsets don’t have enough graphics horsepower to deliver the stunning visual experiences possible with a high-performance graphics card like the NVIDIA RTX A6000. An emerging solution to this problem is split rendering, such as NVIDIA CloudXR: render the XR frames on a remote PC with a powerful GPU and stream the rendered frames to the AIO headset.
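To make the idea concrete, here is a minimal sketch of that split-rendering loop. The names are purely illustrative (this is not the CloudXR API): the headset sends its latest head pose upstream, a remote GPU renders and encodes the frame, and the headset decodes and displays the result.

```python
# Conceptual sketch of a split-rendering client loop (hypothetical names,
# not the CloudXR API): pose goes up, a rendered frame comes back.

from dataclasses import dataclass
from typing import Tuple
import time


@dataclass
class Pose:
    """Simplified 6-DoF head pose reported by the headset each frame."""
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)


def render_on_remote_gpu(pose: Pose) -> bytes:
    """Stand-in for the server side: render and encode a frame for this pose.

    In a real deployment this runs on a GPU-equipped machine across the
    network and returns a compressed video frame; here it returns a
    placeholder payload so the sketch stays self-contained.
    """
    return f"encoded-frame@{pose.position}".encode()


def run_client_frame_loop(num_frames: int = 3) -> None:
    """Per-frame loop on the all-in-one headset."""
    for frame in range(num_frames):
        # 1. Sample the freshest head pose from the on-device tracker.
        pose = Pose(position=(0.0, 1.6, 0.01 * frame),
                    orientation=(0.0, 0.0, 0.0, 1.0))

        # 2. Send the pose upstream and receive the remotely rendered,
        #    encoded frame (modeled here as a direct function call).
        encoded_frame = render_on_remote_gpu(pose)

        # 3. Decode and display locally; reprojecting to the very latest
        #    pose is what typically hides the remaining network latency.
        print(f"frame {frame}: displaying {len(encoded_frame)} bytes")
        time.sleep(1 / 90)  # pace the loop at a 90 Hz refresh target


if __name__ == "__main__":
    run_client_frame_loop()
```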
Split-mode frameworks will similarly make it possible to amortize shared computation among multiple participants in collaborative experiences, like calculating the global illumination once for everyone in an architectural design review, or simulating the world’s physics once for all the players in a game. These split-mode frameworks will require fast, ubiquitous networks like 5G, and accelerated computing and rendering at the edge.
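A rough sketch of that amortization, with hypothetical names rather than any specific NVIDIA SDK: an edge server steps the shared simulation once per tick and fans the result out to every participant, instead of each headset recomputing it locally.

```python
# Conceptual sketch of amortized shared computation in a collaborative
# session: one simulation step per tick, consumed by every participant.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class WorldState:
    """Shared scene state: object transforms, lighting results, physics bodies."""
    tick: int = 0
    objects: Dict[str, float] = field(default_factory=dict)


@dataclass
class Participant:
    """One headset in the session; it only renders its own viewpoint."""
    name: str
    received: List[WorldState] = field(default_factory=list)

    def apply(self, state: WorldState) -> None:
        self.received.append(state)


def simulate_once(state: WorldState) -> WorldState:
    """Expensive shared work (a physics step, a global-illumination update)
    performed a single time per tick, regardless of participant count."""
    return WorldState(tick=state.tick + 1,
                      objects={"crate_height": 0.1 * (state.tick + 1)})


def run_session(participants: List[Participant], ticks: int = 3) -> None:
    state = WorldState()
    for _ in range(ticks):
        state = simulate_once(state)   # computed once on the edge server...
        for p in participants:         # ...consumed by every participant
            p.apply(state)


if __name__ == "__main__":
    people = [Participant("designer"), Participant("client"), Participant("engineer")]
    run_session(people)
    print(f"{people[0].name} received {len(people[0].received)} shared updates")
```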
One of the strengths of XR is how broadly applicable it will be across verticals. The earliest adopters of XR have been using it extensively for design and for training, due to the ability to understand composition and scale, and the fidelity with which experiences can be simulated. The value of XR can’t be overstated for those enterprise use cases. Long-term, XR will be everywhere; it’s a new modality and we’ve only begun to scratch the surface of where and how it will be used.
NVIDIA has a proud history of leadership in the XR ecosystem. For years, NVIDIA GPUs and scalable visualization technologies like NVIDIA Quadro Sync have been driving the display walls in multi-user CAVEs, Air Force flight simulators, and enormous immersive visualization spaces, like those found in interactive museums and planetariums. NVIDIA GPUs drove early head-mounted displays, when they were primarily used for flight simulators and cognitive science research. And NVIDIA developed early VRWorks tools like Direct Mode and Context Priority to make it possible to efficiently and effectively drive the first Oculus Rift and HTC Vive HMDs when they came to market in 2016.
Since then, we’ve broadened our XR offerings. We’ve continued to provide new SDKs for VR developers, including tools for VR physics, VR audio, VR rendering technologies like Multi-Res Shading, Lens-Matched Shading, and Variable Rate Shading, as well as AI rendering technology like NVIDIA DLSS. We’ve also continued to boost our VR performance with each subsequent GPU architecture, adding new hardware support to improve and accelerate VR experiences. And most recently, we created NVIDIA CloudXR, making it possible for the ecosystem to enjoy the highest quality VR and AR experiences on any XR device via optimized, network-adaptive streaming.
We’ve been fortunate to have visionary partners and cutting-edge customers who have been early adopters of our XR technology and have helped us show the world what’s possible. Recently, we highlighted many of those projects and technologies at our GTC 2021 conference. The GTC XR track was packed with product, partner, and customer talks — all highlighting how NVIDIA XR tech is propelling the ecosystem forward. In case you missed GTC, session videos are available through NVIDIA On-Demand.
Several of our other partners also shared their views on current XR market trends and the challenges ahead:
Peter Peterson, Head of XR Software and Solutions, R&D at HP, gives his views on current XR market trends and possible challenges impacting the XR market.
Drew Bamford, Corporate Vice President of HTC Vive Creative Labs, discusses key trends and possible challenges impacting the XR market.
Brian Vogelsang, Senior Director of Product Management for Qualcomm Technologies’ XR business group, discusses trends driving the XR industry forward.
Karen Zu, VP of Marketing at Pico, gives her views on current XR market trends and possible challenges impacting the XR market.