ARtillry Interviews is an ongoing series that profiles the biggest innovators in XR. Narratives are based on interviews with subjects, but opinions and analysis are those of ARtillry. See the rest of the series here and our video interview show here.


The past two years of excitement around immersive computing have involved various flavors of AR and VR. But the most commercially successful and widely adopted experiences, at least in AR, have ironically been the most primitive, a la Pokemon Go and Snapchat Lenses.

But AR’s promise is to interact with physical space in more dynamic and dimensionally accurate ways. To compare it to the 2D tech we know from mobile devices, AR’s goal is a sort of responsive design for the physical world, says Timoni West, director of XR research at Unity Labs.

“If I were to make an animation of a character to walk from here to here, if I then move this chair in the way, it could really mess it up,” she told ARtillry while referencing physical points in the room. “So we want to be sure that no matter what’s happening in the world, it will still work.”

With a long tenure in UX and interface design spanning the smartphone era and the early days of the immersive era, West now leads XR research at Unity Labs. The division researches technology that underpins and paves the way for Unity’s core game engine.

Democratizing XR

One underlying goal for Unity Labs extends from a central Unity principle: democratizing authoring tools for advanced graphical interactions. That includes everything from more realistic and scalable facial capture to machine learning-powered character behavior in games.

“Instead of having pre-baked animation loops that have to be done manually, you can run an AI to have a character do things in a way that feels logical and isn’t super repetitive,” she said. “This is democratizing for indie studios that wouldn’t have someone to animate all of that.”

One key output from West’s team, for example, is Project M.A.R.S. The Unity extension lets devs build and test AR/MR experiences using real-world data such as room scans, or geo-data from companies like Mapbox. Altogether, it creates a more natural development and testing environment.

Bringing that back to West’s responsive design analogy, the goal is digital characters or elements that appear to have greater semantic understanding of physical space. Their interactions shouldn’t just be dimensionally accurate — including occlusion, lighting, etc. — but contextually natural.

Digital Twin

The way Unity Labs arrived at M.A.R.S. is an interesting by-product of its core mission. As primarily a game engine company, Unity has developed best-of-breed capabilities for building virtual worlds. And components like its physics engine make for a great simulation tool.

“We’ve been making games that are basically worlds,” said West. “That means we can approximate everything in the world. There’s not really a better tool than one that already comes with physics… So we then just built this sort of world simulation tool.”

Speaking of simulating the real world in graphical environments, one gap (and thus opportunity) in AR commerce is the lack of 3D digital twins of physical items, such as commercial products. Much of the AR experience development described above will require those 3D assets for optimal AR/IoT integrations.

“Whenever anyone asks what they need to do to get ready for the future, I say make digital twins of everything in your inventory,” she said. “A lot of companies don’t have that right now or they have CAD files for 3D printing that are way too big and won’t work for mobile.”

Slow Burn

Beyond building and testing 3D worlds for AR, West is a longtime student of computing’s evolution, including UI and input design. And these are the levels at which spatial computing could be most revolutionary: AR follows a longstanding progression toward more intuitive interfaces.

“The way I want to interact with the computer is limited by the physicality of the hardware,” she said. “So in order to move away from the physical limitations, we need to start rethinking how people should work with computers. And there’s no better way to do that than with AR today.”

But AR’s promise of truly intuitive UI won’t be fulfilled on mobile devices, where the hardware still creates cognitive load. It will more likely arrive in a wearable form factor that’s a few years away. In the meantime, though, there’s exciting innovation happening as developers gain their spatial footing.

“I don’t think mobile AR will be key for consumers, but I’m excited for the future,” said West. “I think it will be a slow burn and we’re going to have to work on limitations of the hardware over the next few years. But as long as we keep that vision going, I think we can get there sooner.”

For more insights from Timoni West, see her segment of the L.E.A.P. Con keynote embedded below, cued to start at the right point.


For deeper XR data and intelligence, join ARtillry PRO and subscribe to the free ARtillry Weekly newsletter. 

Disclosure: ARtillry has no financial stake in the companies mentioned in this post, nor has it received payment for its production. Our disclosure and ethics policy can be seen here.

Header image credit: Unity