
The annual HLTH event in Las Vegas convenes the major players and industry leaders driving innovation in healthcare technology. By providing a stage for important conversations and showcasing innovations in areas like digital health, AI, age tech, ambient sensing, and patient-led care, the event offers a snapshot of the health IT industry and the key business trends that will drive this multi-trillion-dollar sector in the near future.
Toto, I don’t think we’re in Kansas anymore. Nathan, I don’t think we’re in Silicon Valley anymore. Yes, we did go to see The Wizard of Oz at The Sphere. It was totally worth it; you should see it. Venues like that need a lot more immersive content, but this is a story about health technology.

In the years that I’ve been going to HLTH, I’ve found it a fantastic way to reframe and recalibrate my assumptions about enterprise innovation, tech trends, and the kinds of problems this gigantic sector, which represents nearly one-fifth of US GDP, actually needs help solving.
HLTH is a leading event that gathers over 12,000 professionals, including 2,750+ CEOs, from global giants like Kaiser Permanente, Microsoft, BD, Samsung, GE, Google, PwC, and CVS to innovative startups, across the entire healthcare technology spectrum.
The tech innovation on display at HLTH offers a reality check on the industry’s direction, momentum, and focus, and gives clues about which opportunities in spatial computing, AI, automation, and IoT are likely to be most successful in the space.
Here are the top trends and insights I took away from HLTH 2025 and what they could mean for enterprise innovation, XR, and, yes, obviously, AI.
- AI, workforce efficiency lead all health tech conversations.
- AI needs a human layer for trust & adoption.
- Top 3 spatial computing use cases on display.
- What ambient and remote sensing in health mean for XR & IoT.

AI leads all conversations. Data clarity and workforce efficiency lead use cases
It’s no surprise that AI was a leading topic, or that nearly every booth in the Expo claimed an AI advantage. By far the most common application of AI was hyper-focused on processing and presenting data more clearly and efficiently.
Whether it was the “ChatGPT for doctors” from OpenEvidence, which trains only on medical literature, or tools like Vital.IO, led by Mint founder Aaron Patzer, which uses AI sentiment analysis to personalize healthcare communication at the reading level that works best for each patient, the message was clear: there is an incredible amount of value in the health data we already collect, and AI applications that surface the right data at the right time are flourishing.
The next AI frontier is unstructured data.
While it’s hard to see how this level of AI innovation can sustain itself once the early data problems get ironed out, several leaders I spoke with pointed to what comes next: “We’ve currently only been working on the 30% of healthcare data that’s structured. The other 70% is unstructured, more complex, and represents a new frontier of potential value.”
Workforce challenges like hospital safety, doctor burnout, and access to quality training are adding urgency to any AI-driven efficiency or focus time that can be given back to doctors and nurses.
If you’re not addressing the workforce strains or the (data) complexity of the care chain, don’t expect to be top of mind for healthcare CIOs or investors.
This point was notably clear in comparison to the “renaissance” of smart glasses and MR headsets we’ve been seeing in the XR world. Smart glasses may have your attention, but the health tech industry isn’t focused on new displays and interactions as much as we might hope.
AI needs a human layer for trust & adoption.
Across all conversations, there was widespread emphasis on “Use AI, but with a human layer as a safety.” For this highly clinical and scientifically oriented audience, a human expert is still required to foster trust in, and adoption of, AI tools and their results.

“The human in the loop,” the doctor, RN, or other healthcare professional, is supposed to guarantee that machines only automate good decisions. Many tools I noticed provided optional links and resources for clinicians to review as part of their workflows, yet not all of the vendors I talked to were measuring click-through rates on those linked documents.
While there may be some outsized optimism in the assumption that people are actually checking the work of the AI, this ‘human in the mix’ standard seems to resonate with enterprise buyers.
XR organizations could learn a lot from this model for building stakeholder trust in a disruptive technology. Can you summarize your safety strategy in a single sentence?
Top 3 XR use cases we saw at HLTH
1. Workforce training is still the most prominent use case.
With workforce strains and shortages hitting nearly every corner of the US healthcare system, training remains one of the easiest-to-understand, highest-ROI applications of virtual reality (VR) in healthcare.
VR’s ability to deliver hands-on skills training and foster empathy through immersive perspectives and storytelling gives XR training unique advantages at a time when health IT leaders are balancing care quality with patient-centricity.

Notable VR training solutions on display at HLTH 2025 included Osso VR’s surgical simulations and a novel experience from the startup Lucid, which builds empathy for patients experiencing post-anesthesia delirium and aims to reduce how often patients are restrained in the recovery room. Talk about improving care outcomes.
2. Mixed reality tablets for surgery, not smart glasses.
Thanks to all the consumer smart glasses hype around Meta Connect and Snap’s Lensfest, I was hoping to see the future of heads-up surgery and smart glasses interfaces at HLTH, but I was quickly reminded how wildly innovation timelines differ by industry.
There were some limited but significant examples of mixed reality (MR) overlays in surgical settings, but these were far bulkier and more ruggedized than a pair of stylish sunglasses.

One device from SKIA used an OR-ready tablet to align medical scans over a patient’s brain as a guide for surgical procedures, an approach they call “medical markerless AR.” While the super-sight it enables was striking, the two-handed device felt too bulky for any long-term use.
Judging from the limited presence on the show floor, current smart glasses and headsets are too cumbersome, too restrictive, or otherwise unfit for operating theaters, despite the added accuracy that ‘in-place spatial data’ offers surgeons.
3. Sales engagement tools using VR headsets.
OK, VR nerds, this might be the big news you were hoping for! Healthcare marketers don’t yet know how to build for 3D, but they are trying nonetheless.
At past HLTH events there has often been a smart vendor of MRI machines or other tremendously large, expensive devices using a virtual reality showroom to give immersive tours of their delicate, hard-to-ship equipment. (We love a good application of the RICE model for VR use cases.)
Outside of these, there were some companies exploring sales engagement tools built on VR headsets, but much of this content was one-to-one VFX that could have run on a flatscreen and didn’t seem well-suited to an expo of 12,000 people.

There were at least three Proto hologram boxes across the expo. AARP’s Age Tech Collective used theirs smartly to spotlight each of the founders in its incubator, ensuring hologram founders were always available and pitch-ready if someone wanted to learn more.
If you’re an experienced designer skilled at telling 3D stories, there are definitely opportunities to help level up the XR content in this space, but interest in VR is still pretty nascent. Consider that one of the big innovations on display was a mobile app that let radiologists see their scans on a phone, on the go, rather than in a specialty viewer or headset.
Remote ambient sensing could be big for XR and IoT.
Simply put, ambient sensors collect all kinds of health, wellness, and situational information from patients without touching them, and often without the patient noticing. This is the one space where health tech starts to feel like spy tech.
The kinds of computer vision used for body tracking and fall prevention are examples XR pros would be familiar with, but others got quite wonky quite quickly: RGB color analysis of your face for blood health, or chat apps that recognize elevated emotions through sentiment analysis before your patients know they’re upset.
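If you’re curious how the face-color trick works under the hood, the usual technique is remote photoplethysmography (rPPG): track the average green-channel intensity of the face over time and look for a dominant frequency in the human heart-rate band. Here’s a minimal sketch in Python, assuming OpenCV and NumPy; it’s a toy illustration of the principle, not any vendor’s actual pipeline, and certainly not a medical device.

```python
# Toy remote photoplethysmography (rPPG) sketch: estimate pulse from subtle
# color changes in the face. Illustrative only -- not a medical device, and
# not any HLTH vendor's actual pipeline. Assumes OpenCV and NumPy.
import cv2
import numpy as np

FPS = 30             # assumed camera frame rate
WINDOW_SECONDS = 10  # how much video to analyze

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def green_channel_trace(frames):
    """Mean green-channel intensity of the detected face, per frame."""
    trace = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue  # skip frames with no face
        x, y, w, h = faces[0]
        roi = frame[y:y + h, x:x + w]
        trace.append(roi[:, :, 1].mean())  # channel 1 = green in BGR
    return np.array(trace)

def estimate_bpm(trace, fps=FPS):
    """Dominant frequency in the 0.7-4.0 Hz band (42-240 bpm)."""
    trace = trace - trace.mean()
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    power = np.abs(np.fft.rfft(trace)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return freqs[band][np.argmax(power[band])] * 60.0

cap = cv2.VideoCapture(0)
frames = []
while len(frames) < FPS * WINDOW_SECONDS:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
cap.release()

print(f"Estimated pulse: {estimate_bpm(green_channel_trace(frames)):.0f} bpm")
```

Real products layer on motion compensation, skin-tone robustness, and clinical validation, but the underlying signal is just this.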
Where past years featured large custom devices, this year we saw full ambient sensor arrays that install as easily as Wi-Fi panels in a ceiling.
Advancements in sensing, through a variety of easy-to-deploy remote sensors, wrapper apps, edge compute, and network protocols for data sharing, indicate a future where smart glasses could offload sensing to IoT devices that feed rich data into spatial computing apps.
The Health IT industry’s rapid developments in the area of ambient sensing could open the door to lighter smart glasses with more personalized, contextual information.
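To make that offloading pattern concrete, here’s a hypothetical sketch: a ceiling-mounted ambient sensor publishes readings over MQTT, a common IoT messaging protocol, and a lightweight subscriber stands in for the spatial computing app. The broker address, topic names, and payload fields are all invented for illustration, and it assumes the paho-mqtt Python client (v2.x).

```python
# Hypothetical sketch of smart glasses offloading sensing to room IoT devices:
# a ceiling-mounted ambient sensor publishes readings over MQTT, and a spatial
# computing app subscribes and reacts. Broker address, topic names, and payload
# fields are invented for illustration. Assumes paho-mqtt >= 2.0.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.local"   # hypothetical hospital edge broker
TOPIC = "ward7/room12/ambient/#"  # hypothetical topic hierarchy

def on_connect(client, userdata, flags, reason_code, properties):
    print(f"Connected: {reason_code}")
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # e.g. {"sensor": "fall_cam_3", "event": "fall_risk", "confidence": 0.87}
    if reading.get("event") == "fall_risk" and reading.get("confidence", 0) > 0.8:
        # A real spatial app would render an overlay anchored to the room
        # on the headset or glasses, instead of printing.
        print(f"Alert from {reading['sensor']}: possible fall risk")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```

The appeal of this split is that the glasses stay light: heavy sensing and inference live in the room, and the headset subscribes only to the distilled events it can anchor in space.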
Advice for innovators working on a healthcare strategy. Invest in partnerships and expect excruciatingly long timelines.
There’s no escaping the hard truth that health IT timelines differ dramatically from what Silicon Valley is used to. Global innovators I talked to on The Tech Glow Up shared that the contracting phase for an enterprise-scale pilot typically takes at least six months.
For newer technologies looking to break into the healthcare space as a path to scale, it’s important to understand Health IT expectations about timelines—and make sure your investors get it, too.
Building research partnerships with larger players and platforms can be a way to speed your entry into bigger health tech contracts, but it won’t change the layers of red tape and review you’ll still need to get acclimated to.
A passing moment of consumer location-based entertainment (LBE) VR in the wild
As I was wandering Vegas after the event, reflecting on the marathon of interviews and conversations with health technology leaders I’d just completed, I ran into VR Adventures, a VR theme park on the Linq Promenade that houses six multiplayer VR attractions in a footprint the size of an ice cream shop.

Most of the four-person ‘rides’ were thematic attractions like hot air balloons or white-water rafts, packed with physical actuators to simulate movement and provide haptic feedback for riders. Furthering the similarity to an ice cream shop, a staff of two could easily manage the whole operation.
These theme-park-inspired installations used older, ruggedized VR headsets and ran about $15 a person or $60 for a full ‘raft.’ For Las Vegas entertainment options, a trip to the VR arcade is actually a pretty accessible price point. Could LBE VR attractions be the answer to all the unused retail spaces dotting malls around the country?
Possibly, but we’ll need more use cases and applications of spatial computing than just video games and thrill rides to fill up the whole mall, hospital, or healthcare system.
Nathan C Bowser is founder & CEO of Awesome Future, and host of the business podcast ‘The Tech Glow Up.’