Next-gen production has met its match in music… not that music was ever an industry to shy away from innovation. Musicians are, after all, some of the most legendary innovators of all time. Experiential art – work that offers deeper, more immersive experiences than traditional formats – has taken artists’ eagerness to push technological boundaries to the next level.

Perhaps no one put experiential music on display better than Justin Bieber with his 2022 virtual concert in the battle royale game Free Fire. Content creation studio ZeroSpace worked with Bieber to create the experience, transforming his likeness into a 3D asset and using that digital double to create the music video for his song “Beautiful Love.”

The first step in this process was to scan Bieber with a photogrammetry rig so an avatar could be built from the data. That scan was then combined with footage of him dancing to create a realistic animated asset. With that, a virtual avatar was born and ready to deploy for the Free Fire concert, which attracted more than 100 million views.

Lessons & Learnings

So what were the biggest lessons, takeaways, and success factors to point to? As is often the case, the success of experiential art in music depends on the technology powering it. Artists have pushed creative studios and tech providers to take big swings with such projects, so the magic often lies in the ability to fulfill the vision. Making this work typically includes some combination of the following:

1. Photogrammetry

Photogrammetry allows producers to create high-fidelity 3D models of real-world props and talent. It typically involves a multi-camera rig that captures image data from many angles, from which 3D geometry is reconstructed. It’s a complex process, but when done right the result is a realistic 3D model that can be deployed across a wide range of 3D endpoints. For example, combined with motion capture technology – as in Justin Bieber’s case – those static models become animated 3D characters.
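For a concrete sense of the geometry involved: production photogrammetry relies on dense multi-view reconstruction across dozens of cameras, but the core operation – recovering a 3D point from its 2D observations in calibrated views – can be sketched in a few lines with OpenCV. The camera matrices and pixel coordinates below are illustrative placeholders, not values from any real rig.

```python
import numpy as np
import cv2

# Illustrative two-camera setup; a real photogrammetry rig uses dozens of
# calibrated cameras. P1 and P2 are 3x4 projection matrices (intrinsics @ [R|t]).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # camera 1 at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])   # camera 2, 0.5 m to the side

# Matched pixel locations of the same surface point in each image; in practice
# these come from feature detection and matching across the photo set.
pts1 = np.array([[612.0], [344.0]])
pts2 = np.array([[584.0], [344.0]])

# Triangulate to homogeneous coordinates, then dehomogenize to get X, Y, Z.
point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
point_3d = (point_h[:3] / point_h[3]).ravel()
print("Reconstructed 3D point:", point_3d)
```

Repeat that over millions of matched features and you get the dense point cloud that is then meshed and textured into the final 3D asset.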

2. Motion Capture

That brings us to motion capture. Production studios require the highest accuracy, flexibility, and performance from motion capture tooling to deliver a realistic digital double of an artist. Motion capture is a critical part of experiential music: on a virtual production stage, an avatar and a human can interact in real time. The talent can see their character and adjust their performance based on what they’re seeing, in an immediate feedback loop. From VR scouting and full-body performance capture to green screen and in-camera VFX, motion capture powers true-to-life visuals through tools that have been rigorously trialed and tested.
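The heart of that feedback loop is a low-latency stream of skeleton data flowing from the capture system into the render engine. Vicon’s tooling has its own SDKs; purely as a generic illustration, the sketch below assumes a made-up UDP packet layout carrying a timestamp plus joint positions, and shows why latency is worth watching during rehearsal.

```python
import socket
import struct
import time

# Hypothetical real-time skeleton stream: each UDP packet carries one frame's
# capture timestamp plus (x, y, z) positions for NUM_JOINTS joints. The port,
# packet layout, and joint count are assumptions, not a real SDK format.
NUM_JOINTS = 21
PACKET_FMT = "<d" + "3f" * NUM_JOINTS  # one double timestamp + 21 * 3 floats
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def apply_to_avatar(joints):
    """Placeholder: hand joint positions to the engine driving the avatar."""
    pass

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9763))  # arbitrary example port

while True:
    data, _ = sock.recvfrom(PACKET_SIZE)
    values = struct.unpack(PACKET_FMT, data)
    frame_time, coords = values[0], values[1:]
    joints = [coords[i:i + 3] for i in range(0, len(coords), 3)]

    apply_to_avatar(joints)

    # The loop only feels "live" if capture-to-display latency stays low
    # (this check assumes sender and receiver share a synchronized clock).
    latency_ms = (time.time() - frame_time) * 1000.0
    if latency_ms > 50.0:
        print(f"warning: end-to-end latency {latency_ms:.1f} ms")
```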

3. Markerless Motion Capture

Taking traditional marker-based motion capture a step further, recent months have shown the power of markerless motion capture come to life. It requires significantly less equipment and less set-up time, which streamlines operations and speeds up production. Capturing motion without markers also raises the quality of the experience for the viewer or participant, making best-in-class virtual body ownership and immersion in VR a reality through accurate tracking and very low latency.
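Studio-grade markerless systems use many synchronized cameras and proprietary solvers, but the principle – inferring a skeleton directly from video, no suits or markers – can be tried on a laptop. As a consumer-grade stand-in (not the pipeline Vicon or ZeroSpace uses), the sketch below runs the open-source MediaPipe Pose model on a webcam feed.

```python
import cv2
import mediapipe as mp

# Single-camera markerless pose estimation: joints are inferred per frame
# straight from video, with no markers or suits.
pose = mp.solutions.pose.Pose(model_complexity=1)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV delivers BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        # 33 body landmarks, each with normalized x/y/z and a visibility score.
        nose = results.pose_landmarks.landmark[0]
        print(f"nose: x={nose.x:.2f} y={nose.y:.2f} vis={nose.visibility:.2f}")

cap.release()
```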

For example, ZeroSpace ran markerless and optical capture technology simultaneously for a Jack Daniel’s virtual concert experience. Vicon optical tracking followed the props, capturing the cymbal movement from the drumming to a high degree of accuracy, while markerless tracking enabled the performance capture of five people in a single day. Concurrent multi-modal capture like this is ushering in a new era for motion capture, with tremendous potential for production studios.
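Running two modalities at once means reconciling streams that arrive at different rates. Production systems synchronize via hardware genlock or timecode; as a simplified illustration with made-up data structures, the sketch below pairs each markerless body frame with the nearest-in-time optical prop sample.

```python
from bisect import bisect_left

def nearest(timestamps, t):
    """Index of the sample whose timestamp is closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def merge_streams(prop_frames, body_frames):
    """Pair each body frame with the closest-in-time prop sample."""
    prop_times = [t for t, _ in prop_frames]
    return [
        (t, body_pose, prop_frames[nearest(prop_times, t)][1])
        for t, body_pose in body_frames
    ]

# Example rates: a prop tracked optically at 240 Hz, bodies captured at 60 Hz.
prop = [(i / 240.0, {"cymbal_xyz": (0.0, 1.0, 0.0)}) for i in range(240)]
body = [(i / 60.0, {"skeleton": "..."}) for i in range(60)]
print(len(merge_streams(prop, body)), "merged frames")  # 60
```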

4. Headsets and Spatial Computing

Live streaming into Apple Vision Pro is perhaps the newest way for brands and artists to leverage headsets as a platform for live performance and entertainment, pushing the boundary of what headsets are capable of in spatial computing. For a raw take on what this looks like, check out this demo on LinkedIn. It’s a telling example of how production studios continue to experiment with what’s possible in experiential art for entertainers.

Democratizing Experiential Art

What’s so powerful about this new era of production in music is that there’s a democratization of the technology taking place. What was once possible only for huge production studios with big teams and even bigger budgets is now within reach of smaller studios and independent artists who are deploying this tech on a smaller scale yet still producing high-quality content.

This levels the creative playing field in a meaningful way, which will encourage producers and artists even more to think outside the box. There’s tremendous potential in what’s to come.

David Edwards is a VFX product manager at Vicon. Elena Piech is a real-time 3D producer at ZeroSpace.

