
The BBC, ITN, and many global news organisations, academic institutions, and technology providers have spent the last year reimagining the news production room. The active ingredients in this transformation were XR and AI, aimed specifically at changing how live broadcasts are produced.
Known as Evolution of the Control Room – Leveraging XR, Voice, AI & HTML-Based Graphics Solutions, the IBC Accelerator Media Innovation Programme project set out to explore how to combine XR, voice command, and AI to help news teams work faster, more flexibly, and anywhere — including from inside an XR headset.
The programme also brought together a broad range of media brands under one unifying goal: simplify news production workflows. That includes making life easier for operators and giving them more room to be creative.
Making Live Production Easier in a New Media Era
All news teams need to do more with less while engaging wider audiences across multiple platforms. Pushing to be more efficient, scalable, and flexible, organisations are finding ways to power distributed workforces. While many remote media production technologies are out there today, few have fully harnessed the potential of XR to transform home offices or remote work locations into truly intuitive, virtual environments that replicate the traditional on-site experience.
“Broadcasters today are no longer just TV studios — they’re media production facilities, catering to a wider audience across multiple devices and platforms,” said project champion Grace Dinan. “At the same time, the number of independent content creators is growing, with some producing professional-quality work using just a laptop or smartphone. To stay relevant, broadcasters must adopt more flexible, scalable, and multi-platform production models.”
Fellow project champion Morag McIntosh added: “The BBC has long been a global frontrunner in developing new technologies for live production control, with a key driver being to provide value for audiences. This project aimed to transform our options for live broadcast production. AI technologies bring huge opportunities for streamlining and simplifying workflows, allowing for more content to be created, and allowing us to achieve higher production values without added cost.”
Focusing on AI assistance for newsroom agility, XR control for production from anywhere, and automated graphics for multi-platform efficiency, the project group developed a virtual XR control room, voice-driven AI workflows, and HTML-based graphics solutions. After an accelerated R&D process over six months, the proofs-of-concept were showcased live at IBC2024 in September.
Intuition and Efficiency with XR and Voice Control
Developed by TRANSMXR in Unity 3D, the virtual control room is a customisable environment for XR head-mounted displays. The goal was to transform any workspace into a full production gallery, supporting remote collaboration, distributed studios, and work-from-home solutions.
Compatibility with virtual screens and external hardware such as keyboards let users navigate hybrid environments with ease, while hand gestures allowed them to interact with studio production elements, including the Cuez Rundown system, which helps news teams organise scripts and run orders in real time. Integrating voice control into production interfaces proved one of the most important and nuanced aspects of the project.
“To be effective, voice control must go beyond simple navigation,” said Dinan. “That’s why we focused on AI-assisted voice interaction, allowing operators to communicate in natural language with the UI, rundown, and automation system. However, voice control isn’t always the best option. Many operators prefer tactile feedback — physically pressing a button provides assurance that an item has gone to air. There’s a trust issue when using voice for critical production tasks.”
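To make that trade-off concrete, here is a minimal TypeScript sketch of how a voice transcript might be mapped to a rundown action, with an explicit confirmation step guarding on-air commands. The command grammar and the `RundownClient` interface are hypothetical illustrations, not the project's or Cuez's actual API.

```typescript
// Hypothetical rundown interface -- not the actual Cuez API.
interface RundownClient {
  cueItem(itemId: string): Promise<void>;
  takeToAir(itemId: string): Promise<void>;
}

type VoiceCommand =
  | { kind: "cue"; itemId: string }   // low-risk: prepares an item
  | { kind: "take"; itemId: string }  // critical: puts the item on air
  | { kind: "unknown"; transcript: string };

// Naive intent parser; a real system would use an LLM or NLU service.
function parseTranscript(transcript: string): VoiceCommand {
  const cue = transcript.match(/^cue (?:item )?(\w+)/i);
  if (cue) return { kind: "cue", itemId: cue[1] };
  const take = transcript.match(/^take (?:item )?(\w+)/i);
  if (take) return { kind: "take", itemId: take[1] };
  return { kind: "unknown", transcript };
}

// On-air actions require an explicit confirmation, reflecting the trust
// issue Dinan describes: voice alone gives no tactile assurance.
async function handleCommand(
  cmd: VoiceCommand,
  rundown: RundownClient,
  confirmAction: (prompt: string) => Promise<boolean>
): Promise<void> {
  switch (cmd.kind) {
    case "cue":
      await rundown.cueItem(cmd.itemId); // safe to execute immediately
      break;
    case "take":
      if (await confirmAction(`Take item ${cmd.itemId} to air?`)) {
        await rundown.takeToAir(cmd.itemId);
      }
      break;
    case "unknown":
      console.warn(`Unrecognised command: "${cmd.transcript}"`);
  }
}
```

The design choice worth noting is the split between low-risk and critical commands: it lets voice handle routine navigation at full speed while keeping a deliberate human checkpoint on anything that goes to air.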
Despite challenges around latency, intuition, and familiarity, McIntosh sees huge potential in voice control to help directors be more creative and responsive.
“I love the idea of bringing voice – the original control protocol – back to the control room,” she said. “Automated technologies have provided huge gains for the industry, but they’ve taken a lot away in terms of user experience. Since multi-camera TV began, we have built our live galleries around the central idea that directors and operators should be looking up and have their heads in the live action of the programme.”
While operators today rely on a range of keyboards, mice, and touchscreens to control their content, McIntosh emphasises the value of core human intuition.
“Bringing voice control back into galleries using AI and natural language feels like a logical return to the simplest, most effective, most instinctive option,” she said.
What’s Next for XR and AI in the Control Room?
Some of the proofs of concept developed across the XR, virtual production, and graphics workstreams have already begun to shape real-world production environments. Dinan highlighted one use case, a real-time graphics system built in XR for an election news broadcast, while McIntosh explained that the BBC has made significant leaps in using flexible HTML graphics workflows to give content creators new tools and options in multi-platform production. The initiative also fuelled development of new XR solutions among other project participants.
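To ground the term “HTML graphics”: in browser-rendered broadcast graphics, an overlay such as a lower third is an ordinary web page whose elements are updated from rundown data. The sketch below assumes a simple DOM structure with illustrative element IDs, not any participant's actual template format.

```typescript
// Minimal browser-rendered lower third: the overlay is plain HTML/CSS,
// updated from rundown data. Element IDs here are illustrative only.
interface LowerThirdData {
  name: string; // e.g. the on-screen speaker
  role: string; // e.g. their title or affiliation
}

function showLowerThird(data: LowerThirdData): void {
  const el = document.getElementById("lower-third");
  if (!el) return;
  el.querySelector(".name")!.textContent = data.name;
  el.querySelector(".role")!.textContent = data.role;
  el.classList.add("on-air"); // a CSS transition animates it in
}

function hideLowerThird(): void {
  document.getElementById("lower-third")?.classList.remove("on-air");
}

// The same template can serve TV, web, and social renditions by swapping
// CSS, which is one reason HTML workflows suit multi-platform production.
showLowerThird({ name: "Jane Doe", role: "Political Editor" });
```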
“nxtedition, one of our participants, has now integrated AI agents into their latest release,” said Dinan. “Their version of the XR Control Room has won multiple awards and is already in the hands of customers.”
“Audience research is clear,” added McIntosh. “Humans want humans to be central to news output. When it comes to news, maintaining transparency and prioritising trust is critical. This year, we’re continuing the work we started on last year’s AI workstream, building out a full agentic ecosystem of live gallery assistants with a focus on latency and reliability. The ambition here is to use AI to meaningfully assist our skilled operators, whilst prioritising value for audiences.”
A new iteration of the project, AI Assistance Agents in Live Production, is once again championed by the BBC, Channel 4, and ITN, along with participants Cuez, Amira Labs, and Highfield-AI. The group aims to develop AI-driven production assistants that can navigate run orders, detect errors, preempt potential issues, and even control production systems through natural language commands.
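As a rough illustration of what such an assistant could check, here is a hedged TypeScript sketch that scans a run order for common problems before they reach air. The `RunOrderItem` shape and the specific checks are assumptions for illustration, not the group's actual design.

```typescript
// Hypothetical run-order item; real systems carry far richer metadata.
interface RunOrderItem {
  id: string;
  title: string;
  durationSec: number;
  mediaReady: boolean; // has the clip or graphic been ingested?
}

interface Issue {
  itemId: string;
  severity: "warning" | "error";
  message: string;
}

// Pre-air checks an AI assistant might run continuously on the rundown.
function checkRunOrder(items: RunOrderItem[], slotSec: number): Issue[] {
  const issues: Issue[] = [];
  for (const item of items) {
    if (!item.mediaReady) {
      issues.push({
        itemId: item.id,
        severity: "error",
        message: `"${item.title}" has no ingested media`,
      });
    }
    if (item.durationSec <= 0) {
      issues.push({
        itemId: item.id,
        severity: "warning",
        message: `"${item.title}" has no timing set`,
      });
    }
  }
  // Flag an overall overrun against the programme slot.
  const total = items.reduce((sum, i) => sum + i.durationSec, 0);
  if (total > slotSec) {
    issues.push({
      itemId: "rundown",
      severity: "warning",
      message: `Rundown overruns its slot by ${total - slotSec}s`,
    });
  }
  return issues;
}
```

An agent built on checks like these would surface issues to the gallery rather than act on them silently, consistent with McIntosh's emphasis on keeping skilled operators central.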
This group’s journey captures exactly what the programme is all about: no-wrong-answers R&D, all-in collaboration, and a genuine commitment to improving real-world creative workflows. Immersive technologies and AI have a huge role to play in shaping the future of media production, especially when the biggest players and most ambitious teams come together to make it happen.
Muki Kulhan is Innovation Lead at the IBC Accelerator Media Innovation Programme. The 2025 Accelerator projects referenced in this article will be showcased at IBC2025.
