The past few months have seen the standard run of summer developer conferences, several of which featured AR platforms, including Apple (ARKit), Google (ARCore), and Niantic (Lightship). So we’re synthesizing their top AR updates for our XR Talks series (video and takeaways below).
Starting with Apple, one of the biggest headlines and focal points is what it didn’t announce at WWDC. Unsurprisingly – though still a matter of contention in the tech press – Apple didn’t announce its rumored AR hardware. That hardware is still likely coming but it could be a while.
Worse, that omission overshadowed the notable AR announcements and updates Apple did make. For one, its RoomPlan API could be impactful in subtler ways. As we examined, it could democratize room scanning to enable all kinds of interior design apps (starting with Shopify).
Can Apple Democratize Room Scanning?
Incremental but Important
Beyond RoomPlan, Apple launched ARKit 6. It includes updates that are incremental but important, pushing ARKit forward in capability and appeal for developers: plane anchors, improved Location Anchors, better motion capture, and more granular camera access.
Starting with plane anchors: these let AR developers track flat surfaces on which to place digital elements. Previously, surfaces like tables and walls were mapped by an iOS device as they were scanned (the reason you have to wave your phone around).
But the issue was that the plane would continue to be remapped and updated as the session went on. This resulted in a shaky anchor point and graphics that weren’t locked to their surface, spoiling the illusion. Now a plane anchor’s rotation stays static even as its shape is refined while the camera moves.
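For context, enabling plane detection in ARKit takes only a few lines. This is a minimal sketch, and the class and method names beyond ARKit’s own API are our own:

```swift
import ARKit

// Minimal sketch: detect horizontal and vertical planes in a session.
final class PlaneDetector: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    // Called as ARKit detects new plane anchors (tables, walls, floors).
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Detected \(plane.alignment) plane at \(plane.transform.columns.3)")
        }
    }
}
```

Digital elements placed relative to a detected `ARPlaneAnchor` inherit its transform, which is why a steadier anchor rotation translates directly into steadier graphics.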
Moving on to Location Anchors, these are geographic points that activate AR experiences, and remain persistent and synchronous across sessions and users. This is the promise of the AR cloud and multi-user AR. Apple now has more cities mapped in countries like Canada and Japan.
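Placing one of these Location Anchors is likewise a short exercise. A sketch, assuming an existing `ARSession` and placeholder coordinates (availability must be checked first, since Apple has only mapped certain cities):

```swift
import ARKit
import CoreLocation

// Sketch: run geo tracking and drop a persistent anchor at coordinates.
func placeGeoAnchor(in session: ARSession) {
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return }
        session.run(ARGeoTrackingConfiguration())
        // Placeholder coordinates; a real app would use surveyed points.
        let coordinate = CLLocationCoordinate2D(latitude: 35.6586, longitude: 139.7454)
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```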
The AR Space Race, Part II: Apple
Bodies in Motion
As for motion capture, ARKit derives anchor points for humans in the video frame. It can now track people and computationally devise a sort of skeleton, which then becomes the basis for positional tracking of a body in motion, to which digital elements can be affixed.
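To make that concrete, here is a hedged sketch of reading per-joint positions from ARKit’s body tracking (the delegate class is our own; joint names come from `ARSkeleton.JointName`):

```swift
import ARKit

// Sketch: track a person's computed skeleton and read joint transforms.
final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Joint transforms are relative to the body anchor's root joint.
            if let head = body.skeleton.modelTransform(for: .head) {
                print("Head position: \(head.columns.3)")
            }
        }
    }
}
```

Digital adornments are then attached to those joint transforms so they follow the body frame to frame.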
One use case that comes to mind for motion capture is AR adornments for dance routines. This has become popular in the exploding world of TikTok, which itself is gearing up its AR efforts now that it too is a platform (Effect House). So we’ll see if Apple can get some of that mojo.
It’s also worth noting that ARKit 6 comes with camera access improvements. These give developers more control over the iOS device’s camera during an AR session. For example, they can decide things like the frame rate and resolution that work best with their AR app.
This is a logical update, given that the iPhone camera is now so powerful and has several modes, such as 4K and 60 frames per second. Depending on the AR experience, it might make sense to use all that horsepower (lots of motion) or to dial it back when all that data is overkill (simple lenses).
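That choice shows up in code as a video-format selection. A sketch of opting into the highest-fidelity format ARKit 6 offers where supported, or inspecting the available formats to pick a lighter one:

```swift
import ARKit

// Sketch: pick a video format suited to the experience.
func configureCamera() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if let fourK = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        // Use the full-resolution feed for high-motion experiences.
        config.videoFormat = fourK
    } else {
        // Otherwise inspect what the device supports and choose accordingly.
        for format in ARWorldTrackingConfiguration.supportedVideoFormats {
            print(format.imageResolution, format.framesPerSecond)
        }
    }
    return config
}
```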
We’ll pause there and cue the full video below. Stay tuned for more platform updates and breakdowns in the coming weeks as we continue this series with Google, Niantic and others…