One factor that’s defined the tech world in the past half-decade is privacy reform. There’s been swift backlash to the unchecked era of ad tech, which went into hyperdrive with the advanced targeting capabilities of smartphones, starting around 2007 (read: iPhone).
The resulting privacy reform has been taxing for players in the media and mobile ecosystems (often rightly so), even for those who already followed good practices. And the pain they now feel is a function of the switching costs of pivoting away from existing paradigms and procedures.
This all leads up to the concept of getting it right from the beginning. That paradigm shift wouldn’t be necessary if best practices were standardized at the starting line. This is a key concept for spatial computing and XR (AR and VR) because that’s precisely where they now stand.
These principles drive the mission of the XR Safety Initiative (XRSI). Founder Kavya Pearlman believes that pre-emptive action is needed to establish privacy and security standards. Because without them, we’re bound to repeat the mistakes of the smartphone era.
“It is not going to happen on its own,” said Pearlman. “The status quo will maintain itself, and get worse. We must make it happen […] XRSI exists to build standards, framework, and guidance so that the next generation of the Internet is open, equitable, and free from harm.”
Software & Silicon
According to Pearlman, the challenges in establishing privacy standards in spatial computing stem from its complexity and its novelty. Starting with complexity, the immersive tech stack has several integrated parts. We're talking hardware, software, and silicon, among other things.
Given that value chain, it's easy for companies to shirk responsibility for privacy measures (or the lack thereof). For example, software at the app layer may blame the operating system or abdicate responsibility to it, while the OS in turn points fingers at the hardware manufacturer.
With respect to novelty, all of the above is compounded by an additional layer of ambiguity. Inherently new use cases and integrations leave security standards even less defined, which only amplifies tech companies' ability to wiggle out of responsibility in the ways described above.
As a countermeasure, Pearlman is intent on eliminating all that ambiguity. That takes form in responsibility models that define who’s on the hook for what functions. And that’s anchored in use-case hypotheticals that define myriad permutations in XR user scenarios.
For example, if you're accessing Google Docs through the browser on a headset, where does the responsibility for communicating security risks to the end user lie? This question isn't new to integrated systems… but the stakes rise as XR's data capture becomes more dimensional.
“We are moving towards an era of constant reality capture,” said Pearlman of XR tech’s double-edged sword. “From a cybersecurity perspective, the attack surface has expanded from the nodes, servers, and the network to our living spaces and even to our brain.”
The Carrot and the Stick
The key word above is communication. To offer maximum utility, companies like Microsoft and Magic Leap should enable third-party cloud-based software. But should they also alert users in some cases that their software choice means they’re swimming at their own risk?
To answer such questions about divisions of responsibility, Pearlman has initiated the XRSI Privacy & Safety Framework. Modeled after the shared responsibility approach of enterprise players like AWS and Microsoft, it's all about delineating when you, the user, are under their care… and when you aren't.
But rather than a TOS-like wall of text, these players have learned that the message is more effectively delivered in infographics. As shown in the chart below, a shared responsibility model delineates responsibility in an intuitive way. It’s all about the art of simplicity.
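To make the idea concrete, a shared responsibility model can be thought of as a lookup: each layer of the stack is mapped to the party accountable for a given safeguard, and anything unmapped surfaces as a gap. Below is a minimal illustrative sketch — the layers, safeguards, and party names are hypothetical assumptions for illustration, not XRSI's actual framework.

```python
# Illustrative sketch of a shared responsibility matrix for a
# hypothetical XR stack. Layer names, safeguards, and responsible
# parties are invented for demonstration purposes only.

RESPONSIBILITY_MATRIX = {
    # (layer, safeguard) -> accountable party
    ("hardware", "sensor data at rest"): "device manufacturer",
    ("os", "app permission prompts"): "platform vendor",
    ("app", "cloud data handling"): "app developer",
    ("app", "user-facing privacy notice"): "app developer",
}

def who_is_responsible(layer: str, safeguard: str) -> str:
    """Return the accountable party, or flag an undefined gap."""
    return RESPONSIBILITY_MATRIX.get(
        (layer, safeguard),
        "undefined -- a gap the framework must close",
    )

print(who_is_responsible("os", "app permission prompts"))
print(who_is_responsible("hardware", "eye-tracking telemetry"))
```

The value of the exercise is less the lookup itself than the explicit "undefined" branch: any scenario the matrix doesn't cover is exactly the kind of ambiguity that lets companies point fingers at one another.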
Beyond end-user understanding, a key benefit of this infographic approach is to lower enterprise adoption barriers – a critical part of the equation. Standardization in any industry is only as good as its adoption, so the name of the game is to incentivize companies to get on board.
This involves both the carrot and the stick. The carrot can involve outcomes like gaining user trust and being viewed in a positive PR light for protecting them. The stick is the opposite: getting slammed by the tech press and regulators (and losing users) for data protection missteps.
“Responsible innovation is not only good for building trust and reputation but also cost-effective for the entire product life cycle,” said Pearlman. “The best example is Meta, which continues to suffer from the loss of trust that emerged from its ‘move fast and break things’ motto.”
Business Outcomes
The “stick” may also involve GDPR-like regulation, where it becomes required to display privacy and security policies to users. But this is years away from materializing, given the pace of legislation and regulatory bodies, not to mention XR’s nascent stage and relatively low adoption.
So the near term will be all about private-sector measures and self-regulation. This is never easy to initiate, so it's all about motivating good behavior. That means tying privacy best practices to business outcomes, such as the positive impact on annual recurring revenue (ARR) from instilling user trust.
Whether it's driven by the public sector or the private sector, the desired end result could be similar. For example, the responsibility model could be standardized and adopted in the way that privacy policies are universally practiced today among websites and apps.
In a similar sense, the ultimate motivator may be competitive pressure. And that happens in a few steps. It takes one or a few influential players (like a Magic Leap) to begin communicating a responsibility model to their users. From there it becomes a user expectation.
Other industry players are then compelled – whether they like it or not – to meet that expectation. And the more companies that adopt, the greater the competitive pressure becomes. Eventually, like a privacy policy, it becomes table stakes. You’re not seen as legitimate without it.
And that’s what’s needed to make sure everyone at the table is living up to privacy standards, with no weak links or gaps in the value chain. Meanwhile, XRSI has assumed the burden of kicking off that process, knowing that those who don’t study history are destined to repeat it.
“Web 1.0 was created by idealists and unleashed unprecedented potential,” said Pearlman. “Web 2.0 was created by capitalists, and realized web 1.0’s potential. But it excluded and marginalized several groups, centralized power structures, and exacerbated societal harms. Web 3.0 is an opportunity to become the creators of our tech ecosystems and own our realities.”
For those interested in learning more about XRSI and the issues it’s engaged with, check out the upcoming Metaverse Safety Week.