Vizrt and Ncam worked together earlier this year on BBC Northern Ireland's coverage of the 2022 elections. By combining the two companies' expertise -- Vizrt's software-defined visual storytelling and real-time graphics, and Ncam's real-time camera tracking -- they enhanced the conventional coverage with augmented reality (AR) graphics.
Held every five years, the programming covers the election results for all 90 seats of the Northern Ireland Assembly, the devolved legislature of Northern Ireland. This year's elections, held on 5 May 2022, were strongly anticipated across Northern Ireland (NI) as well as the wider UK.
As It Happened
Throughout the election coverage, BBC NI worked with the Ncam Mk2 tracking system alongside Vizrt’s graphics compositing engine, Viz Engine. The goal was to create AR content linked to updates and events as they happened within the BBC studio space to enhance the visual quality and produce more engaging content for viewers. The integration between Ncam and Vizrt relies on real-time tracking data being sent from one machine to the other accurately and without interruption.
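To give a feel for what "real-time tracking data" means here, the sketch below parses a hypothetical per-frame tracking packet. The actual Ncam-to-Viz wire format is proprietary and not described in the article; the field names, packet layout and values are illustrative assumptions only, showing the kind of camera pose and lens data a graphics engine consumes each frame.

```python
import struct
from dataclasses import dataclass

# Hypothetical packet layout (NOT the real Ncam/Vizrt protocol):
# eight little-endian 32-bit floats per frame -- position (x, y, z),
# rotation (pan, tilt, roll), plus zoom and focus values.
PACKET_FORMAT = "<8f"

@dataclass
class TrackingFrame:
    x: float; y: float; z: float
    pan: float; tilt: float; roll: float
    zoom: float
    focus: float

def parse_frame(packet: bytes) -> TrackingFrame:
    """Unpack one binary tracking packet into a TrackingFrame."""
    return TrackingFrame(*struct.unpack(PACKET_FORMAT, packet))

# Round-trip one made-up frame to show the shape of the data.
raw = struct.pack(PACKET_FORMAT, 1.0, 1.8, -2.5, 12.0, -3.0, 0.0, 40.0, 2.2)
frame = parse_frame(raw)
```

Any dropped or late packet would leave the AR graphics lagging the camera move, which is why the article stresses uninterrupted delivery between the two machines.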
The content ranged from small AR pop-up statistics on each candidate running in the election to huge picture board walls detailing party members and critical information. Because the graphics were 3D elements augmenting the real studio environment, the newscasters were able to move freely within the studio space to discuss and give insight into what viewers were seeing.
Camera Tracking Bar
Many of these graphics featured live animations, which made their positioning especially critical. They were often linked to live data, represented for example with charts and graphs rising out of the floor as the numerical data dynamically updated through animation.
Setting Up the Studio
The Ncam Mk2 camera tracking bar and Connection Box were mounted to the BBC’s broadcast camera, and the bar was articulated up towards the studio ceiling. BBC used the Ncam 'natural core', which relies solely on natural features like lighting rigs, scaffolding and other real geometry as tracking points.
Tom Evans, Head of Product Management at Ncam, said, “Referencing tracking points (or datum points) from a studio ceiling is very typical. Ncam can track both with infrared using reflective elements, or from fixed natural features using a pair of stereoscopic cameras, or from a combination of the two methods.”
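The stereoscopic camera pair mentioned above is what lets the system recover depth from natural features. As a toy illustration (the values and function are made up, not from Ncam), the depth of a feature follows directly from its pixel disparity between the two views, using the standard stereo relation depth = focal length × baseline / disparity:

```python
# Toy illustration of stereo depth recovery. The focal length,
# baseline and disparity values below are invented examples.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in metres of a feature seen by both stereo cameras."""
    return focal_px * baseline_m / disparity_px

# A feature 35 px apart between the two views of a 0.2 m stereo pair:
d = depth_from_disparity(focal_px=700.0, baseline_m=0.2, disparity_px=35.0)
# 700 * 0.2 / 35 = 4.0 metres to the feature
```

Triangulating many such features is what builds the 3D map of lighting rigs and scaffolding that the tracker locks onto.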
The Mk2 Connection Box wirelessly links the tracking bar to a local or remote Mk2 graphics data server, eliminating the need to attach a server physically to the camera. The server was racked in a separate room on a floor above the studio, alongside the Viz Engines. This is also where the operator would manage the Ncam system and make any adjustments.
Mk2 data server
Key Location Data
“Within Ncam we can set the datum points manually anywhere we need to, save them and then reload them at the click of a button. Once that data has been set within Ncam and then further tweaked within Vizrt, we can consistently reload the exact same point time after time without daily readjustments, saving time, effort and ultimately money,” said Tom.
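The save-and-reload workflow Tom describes can be sketched as simple persistence of named reference points. Ncam handles this internally through its own UI; the JSON format, file name and helper functions below are illustrative assumptions, not the real Ncam storage format.

```python
import json

# Illustrative only: persist named datum points (studio positions in
# metres) so a saved setup can be reloaded without re-measuring.
def save_datum_points(points: dict, path: str) -> None:
    """points maps a name to an (x, y, z) studio position."""
    with open(path, "w") as f:
        json.dump({name: list(p) for name, p in points.items()}, f, indent=2)

def load_datum_points(path: str) -> dict:
    with open(path) as f:
        return {name: tuple(p) for name, p in json.load(f).items()}

# Save once during setup, reload at the click of a button thereafter.
points = {"floor_origin": (0.0, 0.0, 0.0), "desk_corner": (2.1, 0.0, 3.4)}
save_datum_points(points, "studio_datums.json")
reloaded = load_datum_points("studio_datums.json")
```

The payoff is exactly what the quote says: the same points come back identically every day, with no manual readjustment.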
“As with any tracking system, Ncam generates a 'point cloud' saved with all the reference points detailed within the project. As part of the setup, calibration is essential to establish the algorithmic relationship between the angle of the Ncam camera bar and the production camera and lens. If the Ncam camera bar is moved or knocked, this calibration – usually a five-minute process – will need to be repeated.
“Other ongoing adjustments may have to be made to gamma and exposure level settings when natural features are used, depending on the lighting environment. Sometimes the density of the point cloud (number of points referenced) will also be adjusted for optimal performance. There are also delay settings that can be adjusted to allow for the GFX render delay, to ensure that both Ncam and Viz are in sync.”
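The delay compensation idea mentioned at the end of the quote can be sketched as a simple frame buffer: if the graphics engine takes N frames to render, tracking data must be held back by the same N frames so camera motion and AR elements stay aligned. The class below is a conceptual sketch; the real delay settings live inside the Ncam and Viz configuration.

```python
from collections import deque

# Conceptual sketch of render-delay compensation, not Ncam/Viz code.
class DelayLine:
    def __init__(self, delay_frames: int):
        self.buffer = deque()
        self.delay = delay_frames

    def push(self, frame):
        """Queue the newest tracking frame; emit the one from
        `delay` frames ago, or None while the buffer is filling."""
        self.buffer.append(frame)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()
        return None

# With a 3-frame render delay, frame i is emitted when frame i+3 arrives.
line = DelayLine(delay_frames=3)
outputs = [line.push(i) for i in range(6)]
```

Set the delay too short and the graphics lead the camera move; too long and they trail it, so this offset is tuned until both are in lockstep.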
The system’s ability to zoom in and focus on these graphics while maintaining a realistic look was another crucial factor for BBC NI, especially as the host was often in shot, or introduced into the shot, while the graphics were live. The camera work, which typically focusses on the host, could then proceed as usual.
To make the graphics lens match the live action lens, BBC NI used Ncam’s lens calibration to build an accurate map of the lens in use, including any lens distortions. This lens profile was then exported directly from Ncam into a format that Viz Engine can process, enabling users to zoom in and focus on the host or the AR elements as required, without compromising the illusion.
www.vizrt.com
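To illustrate part of what a lens profile encodes: a real profile maps distortion, field of view and centre shift across the full zoom and focus range, but even a single radial (Brown-Conrady-style) distortion term shows why the mapping matters. The coefficients below are assumed example values, not from any real calibration; the point is that CG elements must be warped the same way the physical lens warps the image.

```python
# Simplified sketch of radial lens distortion (k1, k2 are made-up
# example coefficients, not real calibration data).
def apply_radial_distortion(x: float, y: float, k1: float, k2: float):
    """Distort a normalised image point (optical centre at 0, 0)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the frame edge moves noticeably; one at the centre
# does not -- which is why AR elements drift at the edges if the
# graphics lens model does not match the physical lens.
edge = apply_radial_distortion(0.8, 0.6, k1=-0.1, k2=0.01)
centre = apply_radial_distortion(0.0, 0.0, k1=-0.1, k2=0.01)
```

Because distortion changes as the lens zooms and refocuses, the exported profile has to cover the whole lens range, not just one setting.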