XR Talks is a series that features the best presentations and educational videos from the XR universe. It includes embedded video, as well as narrative analysis and top takeaways. Speakers’ opinions are their own.
This week marked the two-year anniversary of an inflection point in AR history: the unveiling of ARKit at the June 2017 WWDC conference. This year’s show didn’t have the same AR gravity, but Apple took incremental steps to move the ball forward for AR. It’s evolution, not revolution.
So what did Apple announce? News outlets have covered the announcements well this week, so we’ll rehash them only briefly and mostly attempt to go one level deeper into strategic takeaways in this week’s featured talk (embedded below). Where does this position Apple, and the rest of us?
People First
First, it rolled out details for ARKit 3, including key features like people tracking and occlusion. These solve a boring-sounding but key technical challenge for AR: detecting people in the frame and keeping them in the right relative position, in front of or behind anchored graphics, a.k.a. occlusion.
“What used to require painstaking compositing by hand can now be done in real time,” said Apple senior VP of software engineering Craig Federighi from the stage. “Now, by knowing where these people are in the scenes, you can layer virtual content in front and behind them.”
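For developers, that capability boils down to a small configuration change. Here’s a minimal sketch of what enabling it might look like, assuming ARKit 3 on a device that supports person segmentation (the function name is our own illustration, not Apple’s sample code):

```swift
import ARKit

// A minimal sketch: enable ARKit 3's people occlusion by adding a frame
// semantic to a standard world-tracking configuration.
func makePeopleOcclusionConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()

    // Depth-aware person segmentation keeps real people correctly in front
    // of or behind virtual content; check support and degrade gracefully.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    return config
}

// Usage, e.g. in a view controller's viewDidLoad:
// arView.session.run(makePeopleOcclusionConfiguration())
```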
ARKit 3 also adds real-time motion capture, so full-body human movements can be processed and rendered into AR interactions as they happen. This will come in handy for AR experiences where animations interact with humans in the frame. Think: AR lenses for superhero movie promos.
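For a rough sense of how that surfaces to developers, a sketch like the one below (again our own illustration, not Apple’s code) listens for tracked bodies via ARKit 3’s body-tracking configuration:

```swift
import ARKit

// A sketch: ARKit 3's motion capture delivers tracked people as ARBodyAnchor
// objects, whose skeleton exposes per-joint transforms every frame.
class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Root (hip) transform; individual joints are available via
            // bodyAnchor.skeleton.modelTransform(for:).
            let rootTransform = bodyAnchor.transform
            print("Tracked body root position: \(rootTransform.columns.3)")
        }
    }
}

// Usage (the delegate must be retained for the session's lifetime):
// let delegate = BodyTrackingDelegate()
// arView.session.delegate = delegate
// arView.session.run(ARBodyTrackingConfiguration())
```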
At a high level, this has parallels to Snap’s recent hand/feet/face tracking, Facebook’s similar play and Google’s augmented faces. These are more about body tracking to enable reliable anchor points for AR graphics, but they likewise push human-centric AR development forward.
Bigger Toolbox
Also on Apple’s list of new AR tools is RealityKit. This is a framework that lets developers blend animations into real-world environments in natural ways. That includes things like automatically scaling objects or giving them real-world physics (think: bouncing ball) to evoke realism.
“Creating 3D environments can require deep knowledge of 3D modeling and sophisticated gaming engines,” said Federighi. “But what about developers who want to incorporate 3D and AR in their apps but don’t have that experience? That’s where RealityKit comes in.”
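To illustrate the kind of heavy lifting RealityKit is meant to absorb, here’s a hedged sketch (our own, with made-up names) that drops a ball with real-world physics onto a detected surface in a handful of lines:

```swift
import ARKit
import RealityKit

// A sketch: place a ball with physics on a detected horizontal plane.
// RealityKit handles rendering, gravity, and collisions for us.
func addBouncingBall(to arView: ARView) {
    // Anchor content to the first horizontal plane ARKit finds.
    let anchor = AnchorEntity(plane: .horizontal)

    // An invisible floor with a static physics body so the ball has
    // something to land on and bounce against.
    let bouncy = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.8)
    let floor = ModelEntity(
        mesh: .generatePlane(width: 1, depth: 1),
        materials: [OcclusionMaterial()]
    )
    floor.generateCollisionShapes(recursive: true)
    floor.physicsBody = PhysicsBodyComponent(
        massProperties: .default, material: bouncy, mode: .static
    )

    // A simple red sphere with a dynamic physics body, so gravity applies.
    let ball = ModelEntity(
        mesh: .generateSphere(radius: 0.05),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    ball.generateCollisionShapes(recursive: true)
    ball.physicsBody = PhysicsBodyComponent(
        massProperties: .default, material: bouncy, mode: .dynamic
    )
    ball.position = [0, 0.5, 0]  // start half a meter up so it drops and bounces

    anchor.addChild(floor)
    anchor.addChild(ball)
    arView.scene.addAnchor(anchor)
}
```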
Meanwhile, Reality Composer builds interactive scenes. It has a library of virtual objects and the ability to import USDZ files (Apple’s endorsed 3D file format). It also lets developers customize interactions with objects (tap, spin, etc.) and it runs on iOS for rapid prototyping on iPhones.
“Reality Composer is a new app featuring a drag & drop interface and a library of high-quality objects and animations,” said Federighi. “It’s integrated with Xcode, but it’s also available on iOS, so you can edit, test, and tune your app right on the device where it will ultimately be delivered.”
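And once an asset exists as a USDZ file, getting it into a live scene is similarly lightweight. Here’s a rough sketch (the toy_robot file name is a placeholder of our own) of loading a bundled USDZ model with RealityKit:

```swift
import RealityKit

// A sketch: load a bundled .usdz asset and anchor it to a horizontal plane.
// "toy_robot" is a hypothetical file name for illustration.
func placeUSDZModel(named name: String, in arView: ARView) {
    do {
        let model = try Entity.loadModel(named: name)  // e.g. toy_robot.usdz in the app bundle
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    } catch {
        print("Failed to load model \(name): \(error)")
    }
}

// Usage:
// placeUSDZModel(named: "toy_robot", in: arView)
```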
Democratizing AR
Federighi referenced game engines, and some have read this as a detriment to the Unitys and Unreals of the world. More likely, we believe, the above tools will bring AR development down market to non-technical creatives and brands. Unity won’t necessarily lose developers over this.
And that down-market accessibility is sort of the point. By democratizing advanced AR creation, Apple is following a market-wide trend to lower AR barriers. Content creation is one half of the chicken & egg challenge that early-stage technologies and marketplaces face.
So Apple, with high stakes in AR, is looking to accelerate the market. That ultimately means consumer adoption, which speeds up when developers can build better content for users to latch onto. Those two variables move up and to the right together in a sort of slow-moving step function.
Speaking of content, some of the new ARKit capabilities were shown off through Mojang’s on-stage demo of its new Minecraft Earth. It included people tracking, the ability to animate game objects through gestures, and even the option to put oneself in the game using the new occlusion capabilities.
AR’s Light of Day
But most notable about the Mojang demo is a subtle point: the game has a release date. One slam against tech giants’ keynotes is that they sometimes lack real-world outcomes or explicit release dates. Last Fall’s AR demo featured a 3D Galaga game that still hasn’t seen the light of day.
Another side note is that Microsoft owns Minecraft. Mojang developed Minecraft Earth (sort of like Niantic developed Pokémon Go), but we rarely see Microsoft come near the Apple stage, though the rivalry isn’t as heated as it once was. Minecraft Earth also uses Microsoft’s Azure Spatial Anchors.
Back to demoing things that are more tangible, Apple is learning lessons, just as the rest of the market is — a mark of early-stage sectors. As a platform, it’s doing what it should at this point: pushing the ball forward with tools to make it easier to create that still-unknown killer app.
“It’s a huge year for AR,” said Federighi.
See the WWDC keynote below, cued up to the start of the AR segment.
For deeper XR data and intelligence, join ARtillery PRO and subscribe to the free AR Insider Weekly newsletter.
Disclosure: AR Insider has no financial stake in the companies mentioned in this post, nor received payment for its production. Disclosure and ethics policy can be seen here.
Header image credit: Apple