In all the VR excitement, we often forget about today’s technical challenges and limitations. We’re talking about things like connectivity speed, battery life, and graphics processing: the factors that keep first-generation high-end VR rigs stationary and tethered.

But one of the biggest technical hurdles is file size. For HD resolution in 360 degrees, the sheer weight of bits really starts to stack up. And don’t forget that total is then doubled for VR’s stereoscopic experience (these challenges were outlined in a Google I/O session).
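
To put rough numbers on that, here’s a back-of-envelope sketch in Python. The resolution, frame rate, and color depth are our own illustrative assumptions, not figures from the I/O session:

```python
# Back-of-envelope estimate of the raw (uncompressed) data rate for
# stereoscopic 360-degree video. All figures below are illustrative
# assumptions, not numbers from Visby or Google.

WIDTH, HEIGHT = 3840, 2160   # assumed 4K equirectangular frame per eye
BITS_PER_PIXEL = 24          # 8-bit RGB
FPS = 60                     # assumed frame rate
EYES = 2                     # stereoscopic: one view per eye

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
bits_per_second = bits_per_frame * FPS * EYES

print(f"~{bits_per_second / 1e9:.1f} Gbit/s uncompressed")
# -> roughly 24 Gbit/s before any compression, which is why
#    codec efficiency matters so much for VR delivery.
```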

This is the challenge that Visby is trying to solve. We got the chance to catch up with the stealth-mode company in San Francisco recently. Its core technology is a codec for VR content capture and playback that will achieve meaningful lossless compression.

Where this really comes into play is with lightfields. For those unfamiliar, these are essentially VR experiences you can walk around in. The key is photorealistic 3D object rendering from every possible angle, including things like accurate light reflections. That’s a massive data payload.
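
For a sense of scale, consider a naive capture that simply stores a separate photo for every sampled vantage point. The grid density and resolution below are hypothetical, chosen only to show how quickly the numbers grow:

```python
# Rough illustration of why a brute-force lightfield is so heavy:
# storing a separate photographic view for every sampled vantage point.
# Grid density and resolution are hypothetical, chosen only for scale.

GRID = 17 * 17               # assumed 17x17 grid of captured viewpoints
WIDTH, HEIGHT = 1920, 1080   # assumed resolution per captured view
BYTES_PER_PIXEL = 3          # 8-bit RGB

bytes_per_view = WIDTH * HEIGHT * BYTES_PER_PIXEL
total_bytes = GRID * bytes_per_view

print(f"{total_bytes / 1e9:.1f} GB for a single static lightfield frame")
# -> ~1.8 GB for one frame of one scene, before motion or compression.
```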

Visby achieves this by capturing multiple perspectives in a given lightfield. It then uses that data to extrapolate and simulate the remaining vantage points, thus shifting the load from storage to processing. The result is a data-efficient, yet still dimensionally accurate, lightfield.
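
As a deliberately simplified sketch of that storage-for-processing trade, the toy Python below stores only two captured views and computes an in-between one on demand. The linear blend here is just a stand-in for real view synthesis; Visby’s actual codec is unpublished and surely far more sophisticated:

```python
import numpy as np

def synthesize_view(view_a: np.ndarray, view_b: np.ndarray, t: float) -> np.ndarray:
    """Approximate the view at position t (0..1) between two captured cameras
    with a naive linear blend -- a placeholder for real view interpolation."""
    blended = (1.0 - t) * view_a.astype(np.float32) + t * view_b.astype(np.float32)
    return blended.astype(np.uint8)

# Two captured perspectives (random arrays standing in for real camera frames).
left = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
right = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

# A viewer standing 30% of the way from the left camera toward the right one
# gets a computed view; no extra image was ever stored for that position.
novel = synthesize_view(left, right, 0.3)
print(novel.shape)  # (1080, 1920, 3)
```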

This has been one of the biggest gating factors in reaching VR’s true potential. Visby co-founder Ryan Damm explains that we already have fully immersive graphical VR in games, including positional tracking. And we have photorealistic VR without full immersion and tracking.

But the true promise of lightfields is the best of both worlds: photorealistic, immersive 3D spaces that you can walk around in with positional tracking. And that’s when we start to get to VR’s holy grail, the fabled holodeck (though haptics remains a technical hurdle).

Thinking further into the future, Damm and BD lead Scott Hill aspire to support lightfield applications in computer vision. The idea is to empower things like AI and autonomous vehicles with a more multidimensional sensory input (field of vision).

But in the nearer term, they believe that lightfields will follow VR’s overall path, applying first to entertainment like gaming and cinematic applications. Then they’ll move into all the VR-ripe verticals we continue to examine: everything from enterprise to education and design.

“All of the areas where VR is being discussed will benefit from lightfield technology,” said Damm.


Damm will speak at the VR/AR Association’s SF chapter event about lightfields on 6/14. ARtillry is an event partner and will moderate/emcee. Contact us for a discount code to attend.

Header image credit: Lytro