One of AR’s biggest opportunities is to broaden beyond its current connotations. It’s not just about cartoon monsters and social lenses; it should include any meaningful augmentation of reality. If it’s a digital technology that alters or improves your perception, it’s AR.
One example of this principle we’ve explored is intelligent audio whispers through pervasive AirPods. Consumer behavior around the hardware has already been conditioned… now it just needs software (apps) to unlock audio AR experiences for everything from geo-local discovery to smart assistant functions.
Other examples of this “loose construction” of AR include everything from Zoom backdrops to the LED walls increasingly used in film production. And there are consumer apps today that augment reality in smart ways, such as the Brickit app for putting all your spare Lego pieces to use.
There are several more of these broadened connotations for AR that deserve their own article. But for now, we’ll note the latest one to cross our desks: immersive art. The idea is to transform a given space through digital experiences that augment and interact with the environment.
The latest example of this concept was brought to our attention by Chicago-based marketing agency Next/Now. It created a 24-foot diagonal LED screen that serves as a digital canvas to enliven the Washington D.C. headquarters of commercial real estate firm Cushman & Wakefield.
The display renders moving digital art that shifts in response to data. This involves a generative visual system whose movement, color, and composition are influenced by environmental data like light and sound. The result is a living visual palette that augments the office experience.
Under the hood, Next/Now tells us that it used a combination of software including TouchDesigner by Derivative, Notch, Unreal Engine, and Cinema 4D. Altogether, this cocktail of software tools enabled custom 3D animation and effects, rendered on the giant LED screen.
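To make the idea concrete, here is a minimal sketch of how a generative visual system might map environmental inputs like ambient light and sound to visual parameters. This is purely illustrative; the function name, parameter ranges, and mappings are our own assumptions, not Next/Now’s actual implementation.

```python
# Hypothetical sketch: mapping normalized environmental readings to
# parameters of a generative visual. All names and mappings are
# illustrative assumptions, not the installation's real pipeline.

def map_environment_to_visual(light_level: float, sound_level: float) -> dict:
    """Map sensor readings (expected range 0.0-1.0) to visual parameters.

    - Brighter ambient light -> warmer hue and higher brightness.
    - Louder ambient sound   -> faster motion and denser composition.
    """
    light = min(max(light_level, 0.0), 1.0)  # clamp to [0, 1]
    sound = min(max(sound_level, 0.0), 1.0)

    return {
        "hue_degrees": 240 - 200 * light,    # cool blue toward warm orange
        "brightness": 0.3 + 0.7 * light,     # never fully dark
        "motion_speed": 0.5 + 2.5 * sound,   # visuals accelerate with noise
        "density": 0.2 + 0.8 * sound,        # busier audio, busier canvas
    }

# A dim, quiet lobby yields slow, cool, sparse visuals;
# a bright, bustling one yields fast, warm, dense visuals.
calm = map_environment_to_visual(0.1, 0.05)
busy = map_environment_to_visual(0.9, 0.8)
```

In practice, tools like TouchDesigner handle this kind of sensor-to-parameter routing natively, feeding live audio and light data into shader and particle parameters in real time.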
“We are laser-focused on reimagining how people interact with spaces, places, and each other,” Next/Now founder Alan Hughes told AR Insider. “Technologies like wall-sized LED displays, reactive surfaces, spatial audio, and AR give us a toolkit to define the future as we invent it.”
Back to the concept of redrawing the boundaries of AR, the technology shouldn’t be narrowly defined as graphics that overlay physical spaces. “Augmentation” should include a more expansive set of ways to fuse the digital and physical, such as visual search.
More examples could be on the horizon as Apple’s XR entrance looms. Though it may enter the market this year with a VR-like device (with AR passthrough), its longer-term smart-glasses play could reinvent AR. This could involve broader ways to augment and improve perception.
This concept represents a natural evolution for AR, as well as a way to expand its potential business models. Just like the lifecycles of past emerging technologies, AR will grow into its own skin and expand into forms of augmentation that we don’t currently associate with it.
As for business models, broadening AR’s definitions means broadening its use cases. And that means broadening its addressable market. This could be a positive step, as it means potentially more revenue for the early and unproven AR sector that’s still in the process of defining itself.