Introducing our new blog series for XR devs
With leaps in hardware and software happening so rapidly, I believe that many devs are keen to understand how eye tracking can help enhance virtual interactions, provide innovative new metaverse experiences, and empower users with ever deeper levels of immersion. And so, I’ve started this new blog series for XR software and game developers.
In this first post, I want to share some of the projects our XR team has been collaborating on, the kind of work that makes me proud, because there’s no better proof point than seeing the technology you’ve been developing make a difference to others. Read on to find out about our eye tracking for avatars, the first XR game to showcase eye tracking as a core functionality, and some fascinating event presentations ranging from next-gen game development to pilot training in the enterprise world.
I’ll be talking more about developing for AR, VR, and MR devices over the coming months, so be sure to subscribe to our XR newsletter on our XR developers page to be the first to hear about new posts.
Avatars in the metaverse and beyond
From World of Warcraft characters to Second Life alter-egos, the digital avatar has long been a way for online communicators, and gamers especially, to manifest themselves online in any way they desire. Avatars are great because they bring freedom of expression to the online world. You can be seen how you wish — be that a replication of you in the physical world, or designing all sorts of creative new looks, from the whimsical to those that express how you feel on the inside.
As we enter the realms of extended and virtual reality, the importance of avatars will reach even greater heights as we come face to face with the virtual avatars of others in next-gen AAA games for VR headsets or in new tools for communication and collaboration among colleagues.
Eye tracking adds an extra layer of realism to avatars and virtual characters because it enables XR to emulate the eye contact and eye movements that play a significant role in how people communicate. By bringing them into XR, we extend these essential aspects of human interaction to the online world. Allowing users to adopt more true-to-life expressions and appear more authentically present adds to the immersion possible in the metaverse.
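To make that concrete, here is a minimal sketch of what driving an avatar’s eyes from a gaze signal can look like. Everything in it, from the GazeSample struct to the clamp limits and smoothing factor, is an illustrative assumption rather than any particular SDK’s API:

```cpp
// Minimal sketch of mapping a head-space gaze direction onto avatar eye
// rotations. GazeSample, EyePose, and all constants are illustrative
// assumptions; a real integration reads gaze from the headset runtime.
#include <algorithm>
#include <cmath>

struct GazeSample {
    float dirX, dirY, dirZ;   // normalized gaze direction in head space
    bool  valid;              // false when the tracker loses the eyes
};

struct EyePose {
    float yawDeg;             // rotation around the vertical axis
    float pitchDeg;           // rotation around the horizontal axis
};

EyePose GazeToEyePose(const GazeSample& g, const EyePose& previous) {
    if (!g.valid) {
        return previous;      // hold the last pose through blinks and dropouts
    }
    const float rad2deg = 57.29578f;
    EyePose pose;
    // Human eyes rotate roughly +/-35 degrees horizontally and +/-25
    // vertically, so clamp to keep the avatar anatomically plausible.
    pose.yawDeg   = std::clamp(std::atan2(g.dirX, g.dirZ) * rad2deg, -35.0f, 35.0f);
    pose.pitchDeg = std::clamp(std::asin(g.dirY) * rad2deg, -25.0f, 25.0f);

    // Light exponential smoothing hides sensor noise without flattening the
    // fast saccades that make eye contact feel alive.
    const float alpha = 0.6f;
    pose.yawDeg   = alpha * pose.yawDeg   + (1.0f - alpha) * previous.yawDeg;
    pose.pitchDeg = alpha * pose.pitchDeg + (1.0f - alpha) * previous.pitchDeg;
    return pose;
}
```

A real integration would also layer in blinks and vergence, but even this basic mapping makes an avatar feel noticeably more present.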
Tobii, LIV, and Ready Player Me
In March this year, Tobii announced an exciting partnership with LIV, the leader in XR game streaming, and Ready Player Me, a platform that enables users to create 3D avatars for use in hundreds of different XR apps and games.
Timmu Tõke, Co-Founder and CEO of Ready Player Me at the time, put it this way:

“Designing avatars is not just about creating beautiful faces, bodies, and peripherals but also about reflecting and communicating real-time attention and emotions of a user with others.”
And I couldn’t agree more.
The first experiment as part of this collaboration features a game called Racket: Nx, demonstrating how real-time eye movement in a streamer’s avatar provides viewers with insights into their gameplay. Check out the video below.
Experimenting with MetaHuman
MetaHuman Creator is a tool for creating hyper-realistic characters in Unreal Engine. To explore how well eye tracking adds to that realism, Ajinkya Waghulde — one of our senior engineering managers — posted about integrating our eye tracking into MetaHuman avatars, with an impressive video example: Tobii MetaHuman avatar demo.
Take a look at how Ajinkya did it in his A gaze expression experiment with MetaHuman blog post, and why not take a deep dive into the Social Use Cases area on our XR devzone?
MetaHuman characters with eye movement enabled by Tobii eye tracking
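As a rough sketch of the general idea (Ajinkya’s post has the real details), facial rigs like MetaHuman’s are typically driven by one-sided “eye look” curve weights rather than signed angles. The struct and names below are illustrative assumptions, not MetaHuman’s actual control names:

```cpp
// Split signed, normalized yaw/pitch (each in [-1, 1]) into the four
// one-sided curve weights a blendshape-style eye rig typically expects.
// EyeLookWeights and its fields are hypothetical names for illustration.
#include <algorithm>

struct EyeLookWeights {
    float lookLeft, lookRight, lookUp, lookDown;  // each in [0, 1]
};

EyeLookWeights PoseToCurves(float yawNorm, float pitchNorm) {
    EyeLookWeights w;
    w.lookRight = std::clamp( yawNorm,   0.0f, 1.0f);
    w.lookLeft  = std::clamp(-yawNorm,   0.0f, 1.0f);
    w.lookUp    = std::clamp( pitchNorm, 0.0f, 1.0f);
    w.lookDown  = std::clamp(-pitchNorm, 0.0f, 1.0f);
    return w;
}
```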
Eye tracking at recent events
Starblazer — one of the first VR games to prominently feature eye tracking
Last week I attended AWE USA in sunny California with our team, meeting partners, devs, and other XR industry leaders. One of the highlights for me was the presentation I did with our good friends at Starcade Arcade, who talked about Starblazer, a game that will leverage our eye tracking in its upcoming release arriving this summer, Starblazer: Factions.
Alexander Clark, Starcade Arcade’s founder, talks through the key features that eye tracking brings to Starblazer’s upcoming refresh — including attentive and reactive UI and intuitive interaction with objects. He also shares the lessons they learned during development, which may give you a head start when developing your games and apps.
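To give a flavor of that attentive UI pattern, here is a hedged sketch of a gaze-dwell activation loop; the types, threshold, and per-frame update are illustrative assumptions, not Starblazer’s actual code:

```cpp
// Per-frame update for a UI element that reacts to gaze and activates
// after a short dwell. GazeTarget and kDwellToActivate are illustrative.
struct GazeTarget {
    bool  hasGaze = false;       // set by a gaze raycast against this element
    float dwellSeconds = 0.0f;   // accumulated time under gaze
    bool  activated = false;
};

void UpdateGazeTarget(GazeTarget& t, bool gazedThisFrame, float dtSeconds) {
    const float kDwellToActivate = 0.8f;   // seconds; tune per interaction
    t.hasGaze = gazedThisFrame;
    if (gazedThisFrame) {
        t.dwellSeconds += dtSeconds;       // e.g. grow a highlight ring with this
        if (!t.activated && t.dwellSeconds >= kDwellToActivate) {
            t.activated = true;            // caller fires the action on this edge
        }
    } else {
        t.dwellSeconds = 0.0f;             // reset as soon as gaze leaves
        t.activated = false;
    }
}
```

In practice you would raycast from the gaze origin each frame to decide gazedThisFrame, and drive the element’s highlight animation from dwellSeconds so users can see the UI responding to their attention.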
And don’t miss Alexander’s answers to the great questions posed by the AWE audience during the Q&A at the end of our presentation: Building next-gen XR games with Tobii eye tracking.
Unity’s GDC presentation on building games for PS VR2
This video isn’t hot off the press, but the entire presentation is worth the watch. At GDC 2022 in March, Unity gave a fascinating presentation about building next-gen games for PlayStation VR2.
You won’t be surprised to know that what caught my attention most was when Jono Forbes — senior engineering manager in the XR Team at Unity — said,
“I’ve saved what is, to me, the most exciting new design space for last, which is eye tracking of course.”
Always great to see we’re not the only ones excited about eye tracking, and Jono does a great job talking about how Unity sees eye tracking as a key part of the coming generation of VR headsets. I’ve embedded the video below to start at the eye tracking section, 31m22s into the clip, but feel free to drag back to the start to catch the full presentation.
Enterprise, eye tracking, and pilot training
Much of this post has focused on the gaming side of XR, but eye tracking has plenty to offer elsewhere, like the enterprise sector.
Another highlight from AWE 2022 was the presentation by Rick Parker, CTO of Visionary Training Resources (VTR). Parker explains how VTR is helping to fill a worldwide shortage of pilots by disrupting the early stages of pilot training, and how our eye tracking has enabled performance tracking and demonstrated ROI, helping VTR make its business case to major airlines.
Watch the full presentation: Building disruptive pilot training with XR and eye tracking: VTR case of empowering major airlines.
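For a sense of the kind of performance tracking that eye tracking enables in training scenarios, here is a minimal sketch that accumulates dwell time per cockpit area of interest (AOI) and reports each instrument’s share of a trainee’s attention. It illustrates the general technique only, not VTR’s implementation:

```cpp
// Accumulate gaze dwell time per area of interest (AOI) and report each
// AOI's share of total session time. All names here are illustrative.
#include <map>
#include <string>

struct AoiStats {
    std::map<std::string, double> dwellSeconds;  // AOI name -> total dwell
    double totalSeconds = 0.0;

    // Record one gaze sample; pass an empty aoi when gaze hits no instrument.
    void AddSample(const std::string& aoi, double dtSeconds) {
        totalSeconds += dtSeconds;
        if (!aoi.empty()) dwellSeconds[aoi] += dtSeconds;
    }

    // Fraction of session time spent on one instrument, e.g. to verify a
    // trainee's scan pattern covers airspeed, altitude, and attitude.
    double Share(const std::string& aoi) const {
        if (totalSeconds <= 0.0) return 0.0;
        const auto it = dwellSeconds.find(aoi);
        return it == dwellSeconds.end() ? 0.0 : it->second / totalSeconds;
    }
};
```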
Our XR devzone
As you may know, we have built a comprehensive devzone dedicated to augmented-, mixed-, and virtual-reality eye tracking integrations that features documentation, guides, demos, and more. Check it out: Tobii XR devzone.
We will shortly be launching a survey to gather feedback and suggestions from devs using or planning to use the XR devzone. If you want to be included, please register for our newsletter on the XR developers page.