Tobii's top 5 tech trends for 2023
Tobii experts share their top trends to watch out for in 2023, including the use of tech in sports, driver monitoring systems in the EU, generative AI, data transparency, and XR.
There is something special about beginnings and endings. They help us reset, work toward new goals, develop new features, and solve new problems. While 2022 was packed with unforeseen challenges, the pandemic has proven that humans are incredibly resourceful: when faced with instability and uncertainty, creativity tends to flourish.

I recently took on responsibility for Tobii's XR segment, and I am happy to say that from an eye tracking perspective, I couldn't have assumed this role at a better time. We've just announced our first foray into mass-market consumer VR, and our XR platform footprint has expanded to support state-of-the-art optical designs, delivering new data streams for leveling up device capabilities and user experiences. One of my first tasks is to present our 2022 XR wrap-up and highlight the innovations we can look forward to in the coming months. Enjoy!
Tobii was selected by Sony Interactive Entertainment as the eye tracking technology provider for PlayStation VR2, as announced in July 2022.
Collaboration with our customers over the past few years has helped us form a holistic approach to eye tracking integration, and it has enabled us to develop a subsystem that supports various optical designs using best-value components, delivering an eye tracking solution that combines strong compute with attractive aesthetics.
Proof that we are on the right track came while I was on-site with a customer. We were discussing the design and aesthetics of one of their prototype headsets when one of their engineers asked me to point out our eye tracking components; that's how much collaboration with players in the XR industry has shaped the discreetness of our technology.
Committing to OpenXR and collaborating with tech giants, headset designers, and developers has helped us to align our solutions with changing headset and ecosystem needs. We’ve strived to leverage the expertise gained from working with our partners in the design of our next-gen XR platform — optimizing algorithms and component drivers for low latency and minimal power consumption to ensure the best use of resources. But as I alluded to earlier, I am seeing a lot of emphasis on aesthetics, which makes sense because wearables need to be comfortable and attractive. To achieve that, we’ve put some work into our designs so that they run on discreet and lightweight components, placed in such a way that they don’t disrupt the user experience.
For OEMs looking to embrace near-eye (pancake) optics, our next-gen XR platform supports VR headsets and AR glasses. And while we have crafted our platform to run on standard components, I think the essential expertise we bring to the table is our knowledge of optimal component placement, how to avoid invasive architecture changes, and how to ensure that the final device design is fit for mass production.
This year we've continued to invest in device-optimization capabilities like software-based IPD, positioning support, and slippage compensation. Setting IPD in software not only circumvents clumsy measurement techniques involving mirrors and rulers, it also removes the need for a physical IPD adjustment wheel and improves static distortion algorithms through ideal alignment from the get-go. Eye tracking is the perfect way to deliver the data needed for automated IPD adjustment solutions that keep users aligned with the lens sweet spot. We've also worked on slippage compensation and a solution we call the positioning guide, which helps the user place the headset correctly, both vertically and horizontally, again ensuring that users sit in the optical sweet spot and view the display with full clarity.
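To make the core idea concrete, here is a minimal Python sketch of software-based IPD estimation. Everything in it is illustrative: the function names and the (x, y, z) pupil-center signal are my assumptions for the sake of the example, not Tobii's actual API.

```python
import math

def estimate_ipd_mm(left_pupil, right_pupil):
    """Estimate interpupillary distance in mm from 3D pupil centers.

    left_pupil / right_pupil: (x, y, z) tuples in millimeters, in the
    headset's coordinate frame (a hypothetical signal format).
    """
    return math.dist(left_pupil, right_pupil)

def smooth_ipd(samples):
    """Average several per-frame estimates to reject blink/noise outliers
    (naive smoothing; a real system would filter more carefully)."""
    return sum(samples) / len(samples)

# Example: pupil centers roughly 63 mm apart
ipd = estimate_ipd_mm((-31.5, 0.0, 12.0), (31.5, 0.0, 12.0))
```

An estimate like this could drive a motorized lens adjustment or directly set the rendering camera separation, which is why it pairs naturally with the static distortion alignment mentioned above.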
The way dynamic foveated rendering squeezes resources out of a headset was one of the main reasons why eye tracking has become an integral part of modern headsets. Its sister technique — dynamic foveated transport — which helps lower bandwidth requirements, is starting to take off as the ISV community digs deeper into eye tracking and how to leverage it to enable off-device computing.
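As a rough sketch of the principle (not how any particular engine implements it), dynamic foveated rendering boils down to mapping the latest gaze point to a small full-resolution region and shading everything outside it at a reduced rate. The function and its tuning parameter below are hypothetical:

```python
def foveal_region(gaze_x, gaze_y, width, height, radius_frac=0.15):
    """Return the pixel rectangle (left, top, right, bottom) to render at
    full resolution around a normalized gaze point (0..1 on each axis).

    Pixels outside this rectangle can use a coarser shading rate, e.g.
    via variable rate shading. radius_frac is an illustrative knob for
    how large the high-detail region should be.
    """
    rx, ry = int(width * radius_frac), int(height * radius_frac)
    cx, cy = int(gaze_x * width), int(gaze_y * height)
    return (max(0, cx - rx), max(0, cy - ry),
            min(width, cx + rx), min(height, cy + ry))
```

Dynamic foveated transport applies the same mapping on the encoding side: the region around the gaze point is streamed at high quality while the periphery is compressed harder, lowering bandwidth requirements.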
Fuzzy images, ghosting, and glare caused by lens distortion are problematic because they disturb the user and can cause some people to experience nausea. In addition to the ‘sweet spot’ solutions I’ve just talked about, we’ve recently been working on a new idea to enhance XR experiences by ensuring the user’s pupil is always in the sweet spot. The concept works by swapping out static compensation with a dynamic solution that will maintain sharp images and remove unwanted effects as the user moves their eyes around. To enable such a solution, we’ve been working on delivering a new eye tracking signal which delivers 3D pupil position in real time. Watch out for a white paper on this in 2023.
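One way to picture such a dynamic solution (purely my own illustrative model, not the design behind the upcoming white paper) is to calibrate distortion parameters at several pupil positions offline, then interpolate between them at runtime using the real-time 3D pupil signal:

```python
def interpolate_k1(pupil_offset_mm, profiles):
    """Piecewise-linear interpolation of a radial distortion coefficient.

    profiles: list of (lateral_offset_mm, k1) pairs, sorted by offset,
    calibrated offline per lens design. All values here are made up
    for illustration.
    """
    if pupil_offset_mm <= profiles[0][0]:
        return profiles[0][1]
    if pupil_offset_mm >= profiles[-1][0]:
        return profiles[-1][1]
    for (x0, k0), (x1, k1) in zip(profiles, profiles[1:]):
        if x0 <= pupil_offset_mm <= x1:
            t = (pupil_offset_mm - x0) / (x1 - x0)
            return k0 + t * (k1 - k0)

# Hypothetical calibration table: offset of the pupil from the lens
# center (mm) -> radial distortion coefficient to apply
profiles = [(-4.0, 0.20), (0.0, 0.25), (4.0, 0.30)]
```

A shipping system would interpolate a full distortion mesh in three dimensions rather than a single coefficient, but the runtime loop is the same: read the pupil position, update the correction, render.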
As eye tracking becomes an integral feature of XR devices, I think we can expect to see devices with varifocal displays that dynamically adjust focal distance depending on gaze direction. This kind of next-gen display system will support intuitive interaction with virtual objects and help address the vergence-accommodation conflict, reducing eye strain caused by mismatching cues between vergence and accommodation of the eye.
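The geometry behind such a varifocal system can be sketched as follows: find the closest approach of the two gaze rays to estimate the vergence distance, then drive the display's focal plane toward it. This is a textbook closest-point-between-rays computation; the inputs and units are assumptions for illustration:

```python
def vergence_distance(p_left, d_left, p_right, d_right):
    """Estimate vergence distance (meters) as the depth of the closest
    approach between the left and right gaze rays.

    p_*: eye positions (x, y, z); d_*: gaze direction vectors.
    Returns float('inf') when the rays are (near-)parallel, i.e. the
    user is looking at infinity.
    """
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    sub = lambda u, v: tuple(x - y for x, y in zip(u, v))
    w0 = sub(p_left, p_right)
    a, b, c = dot(d_left, d_left), dot(d_left, d_right), dot(d_right, d_right)
    d, e = dot(d_left, w0), dot(d_right, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return float('inf')
    t = (b * e - c * d) / denom  # parameter along the left gaze ray
    q = tuple(p + t * v for p, v in zip(p_left, d_left))  # convergence point
    mid = tuple((pl + pr) / 2 for pl, pr in zip(p_left, p_right))
    return dot(sub(q, mid), sub(q, mid)) ** 0.5
```

With eyes 64 mm apart both aimed at a point one meter ahead, the estimate comes out at one meter; a varifocal display would then set its focal distance to match, so that vergence and accommodation cues agree.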
When it comes to socializing, it's no secret that sensor technologies like eye tracking help bring avatars to life with authentic expressions and eye movements that mimic the user in real time. I am particularly fond of an experiment we did with LIV and Ready Player Me using the Racket: Nx game because it shows how eye tracking makes the player's avatar expressive. One of my colleagues has spent some time enabling our eye tracking in Epic's popular avatar-making tool MetaHuman. If you want to read more about that, I recommend his post on a gaze expression experiment with MetaHuman.
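Under the hood, driving avatar eyes is largely a matter of converting the tracked gaze direction into the rotation angles an eye bone or blend shape expects. A minimal sketch follows; the coordinate convention (x right, y up, z forward) is an assumption, and engines differ:

```python
import math

def gaze_to_eye_angles(direction):
    """Convert a gaze direction vector (x right, y up, z forward) into
    (yaw, pitch) in degrees, suitable for rotating an avatar's eye bones.
    """
    x, y, z = direction
    yaw = math.degrees(math.atan2(x, z))
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))
    return yaw, pitch

# Looking straight ahead -> no eye rotation
straight = gaze_to_eye_angles((0.0, 0.0, 1.0))
```

Feeding these angles into the avatar rig every frame, with a little smoothing and blink handling, is what makes the avatar's eyes track the user's in real time.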
The sporting world has revealed itself to be an ideal application area for eye tracking. Integrated into sports wearables such as hockey helmets and golf caps, it unleashes a unique stream of insights that can be made available to fans, coaches, trainers, and athletes. This kind of solution creates new opportunities for streaming services, for example, to deliver game action from the point of view of the player: not simply what they see, but also how they scan the scene in front of them. And for performance analysis, the data provides coaches and athletes with valuable information about gameplay and what players focus on during strategic moments.
The pilot shortage in the US could reach a deficit of up to 30,000 pilots by 2025. Numbers have declined due to the early-retirement schemes offered to senior staff during the pandemic, fewer military pilots moving to commercial airlines, and because training methods cannot keep pace with demand. To address this last issue, our partner, VTR, has developed a VR-based pilot training solution designed to improve training outcomes and lower costs. Traditional simulator time is an expensive commodity in extremely high demand. VTR's solution replaces the traditional manuals and paper-based models pilots use to develop robust skills, ensuring that time in the simulator is as effective as possible.
Because eye tracking is a digital tool that delivers insights about human beings, it helps to transform traditional care solutions based on manual observation into the digital sphere. And digital transformation using VR brings the added bonus of mobility, often resulting in highly flexible (and cheaper) semi-automated solutions. This year, our collaboration with Olleyes, a leading MedTech company specializing in VR-based eye care solutions, came to fruition. As Olleyes CEO Alberto Gonzalez explains, shifting vision care evaluations to VR increases patient throughput and makes measurements objective and accurate. The most valuable aspect he mentions is how VR reduces the effort involved, both for patients taking the tests and for clinicians administering them. In the long run, they are looking to shift their solutions to the home. I imagine it would be much more fun, not to mention convenient, to sit at home and do eye tests with a VR headset than to make endless visits to the nearest clinic.
The relentless efforts of the entire XR ecosystem have put devices, content, and applications on a fast growth track to maturity, and I have no reason to believe that 2023 will be any different. We will continue to contribute to privacy and safety through our data transparency policy, the OpenXR standard, and the overall building of the metaverse.
With device optimization at the top of the OEM shopping list, it’s my job to make sure Tobii continues to develop solutions that ensure low compute with zero warping/distortion and deliver the best possible graphical experience. As the big players focus on lighter, thinner headsets with superb image quality and great aesthetics, I’d like to leave you with one thought: some of the most desirable XR concepts need gaze-accurate eye tracking to work.