“Trendline” is AR Insider’s series that examines trends and events in spatial computing, and their strategic implications. For an indexed library of spatial computing insights, data, reports and multimedia, subscribe to ARtillery PRO.
Apple is clearly intent on AR. It’s driven by all of the “follow the money” factors we’ve examined, including both boosting and eventually succeeding an aging iPhone. But rather than the AR revolution we envisioned in the wake of ARKit’s 2017 unveiling, the path has been more of a gradual climb.
Continued steps along that evolutionary path were taken yesterday at Apple’s iPhone 12 event. Beyond standard processing upgrades that support AR, a few announcements directly accelerate it. These mainly involve two future-proofing moves whose impact will play out over the longer term.
The first of those is the iPhone 12’s 5G support. For all of the reasons we’ve examined, 5G will better enable “AR everywhere,” including low-latency graphics rendering and millimeter-precision device localization. Network rollouts still have a ways to go, but Apple is getting ahead of that.
The second future-proofing move is lidar. It unlocks more robust AR through faster and more comprehensive spatial mapping. Only the iPhone 12 Pro and 12 Pro Max are equipped with lidar, but this signals the capability we’ll see across the iPhone line as it trickles down in future cycles.
Seeing in the Dark
Going one level deeper on lidar (light detection and ranging), the technology involves sensors that measure how long emitted light takes to reach an object and bounce back. This “time of flight” approach is the state of the art in depth sensing, and it’s how the computer vision of autonomous vehicles “sees” the road.
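To make that time-of-flight arithmetic concrete, here’s a toy sketch in Swift. It’s purely illustrative (real lidar pipelines fire and filter enormous numbers of pulses), but the core math is just the speed of light times round-trip time, halved:

```swift
// Toy time-of-flight math: distance = (speed of light × round-trip time) / 2.
// Illustrative only; real lidar sensors aggregate and filter many pulse
// measurements rather than trusting a single reading like this.
let speedOfLight = 299_792_458.0  // meters per second

func distance(roundTripSeconds t: Double) -> Double {
    // The pulse travels out to the surface and back, so halve the path.
    speedOfLight * t / 2.0
}

// A return after ~10 nanoseconds implies a surface about 1.5 meters away.
print(distance(roundTripSeconds: 10e-9))  // ≈ 1.499
```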
Apple integrated the technology into the iPad Pro earlier this year, signaling that it would soon arrive at an iPhone near you. This aligns with Apple’s AR master plan, but more immediately it carries photography benefits, a key focal point in the iPhone’s horse race against Samsung and Google flagships.
With smartphones maturing and each innovation cycle getting more rote and “incremental”, the camera has been the sexy component on which Apple has focused innovation and device marketing. That applies to AR and photography, but the latter is a much larger market today.
Meanwhile, smartphone cameras are all about innovating around space constraints and achieving DSLR quality with only millimeters of focal length. Lidar now joins Apple’s multi-camera, software-fueled imaging system to sharpen autofocus and “see in the dark” in low-light scenes.
The Long Game
But what about lidar’s longer-term AR play? As noted, lidar unlocks sharper and more accurately tracked AR experiences. Exceeding the impressive-but-workaround depth-sensing cameras of the iPhone’s last few generations, lidar will enable AR that “just works.”
This will manifest in the mostly unseen computational work that happens before AR graphics are shown, such as spatial mapping. Lidar is better equipped to quickly scan room contours, the first step toward “believable,” dimensionally accurate AR in which virtual objects are convincingly occluded by physical ones.
Many smartphones can do this with depth-sensing cameras, but the experiences are uneven, dependent on lighting conditions, and require waving the phone around to gather enough scan points. Lidar, conversely, is a more capable and purpose-built technology for spatial mapping.
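For a sense of what this looks like at the API level, here’s a minimal ARKit sketch, offered under assumptions: iOS 14+, a lidar-equipped device, and an existing ARSession named session (e.g. from an ARView). It’s a plausible configuration, not Apple’s or any particular app’s actual pipeline:

```swift
import ARKit

// Minimal sketch: opt into lidar-backed scene understanding where available.
// `session` is assumed to be an existing ARSession (e.g. ARView.session).
let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction builds a live triangle mesh of room contours,
// the spatial-mapping step that precedes believable occlusion.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// sceneDepth attaches a lidar depth map to every ARFrame, which renderers
// can use to hide virtual objects behind real ones.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

session.run(configuration)
```

On non-lidar hardware, both capability checks simply fail, which is precisely the uneven-experience gap described above.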
Beyond lidar’s obvious alignment with Apple’s known AR trajectory, the company mentioned AR and lidar in the same breath several times at yesterday’s event. Perhaps more notably, today’s undisputed AR heavyweight, Snapchat, is already on it.
Undisputed Heavyweight
To put some color (literally) on what lidar means for AR, Snapchat revealed its intention to use the technology to bolster its signature AR lenses. A fleeting moment in yesterday’s event featured an upcoming Snapchat lens that demonstrates the added dimension lidar will bring.
As shown in the video above (and this timestamped portion of the event), the lens places AR animations such as flora and wildlife in both the foreground and background, all of which occlude, and are occluded by, the human in the frame. Tracking and rendering appear to happen quickly and realistically.
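Snap’s pipeline isn’t public, but in ARKit terms this kind of people-aware layering corresponds to depth-based person segmentation. The following sketch (again assuming an existing ARSession named session) shows the rough shape of that capability:

```swift
import ARKit

// Sketch of ARKit-style person occlusion, as an analogue to the lens demo.
// With depth-aware segmentation, virtual content can render both in front
// of and behind people in the frame. `session` is an assumed ARSession.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
session.run(configuration)
```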
This lens, and Snap’s overall lidar embrace, aligns with a trend we continue to track in the company’s AR trajectory: the ongoing transition to the rear-facing camera. Once focused on face filters and selfie fodder, Snap now wants to augment the broader canvas of the physical world.
For Apple, Snap, and others in the AR value chain, lidar represents a big evolutionary step in underlying capability. Though AR was a side note amid yesterday’s rapid-fire unveilings, its day will come, and Apple knows it. The technology was just given more tools to get there faster.