Vizrt and Vicon Track Together on Virtual Sets
Motion capture specialist Vicon has formed a strategic partnership with broadcast production tools developer Vizrt to develop a camera tracking system for integration with Viz Virtual Studio. Owing to the precision of its camera lens calibration, the new system, called Broadcast, allows production teams to deploy virtual sets quickly and accurately, based on smooth, low-latency camera tracks. It requires a small set of Vicon’s Vantage optical motion capture cameras and is delivered through a straightforward workflow.
Vicon’s Broadcast system includes new software called Studio, developed especially for broadcast virtual set applications. Vicon Studio calibrates broadcast cameras and lenses in a few minutes and is compatible with Vizrt’s Tracking Hub, the configuration and control system for virtual studios that Virtual Studio refers to for the location data used to position set assets. Vicon Studio also integrates with commonly used broadcast lenses from Fujinon and Canon. The software outputs a synchronised stream of data that includes the camera position along with the focus and zoom information from the lens.
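As a rough illustration of what such a synchronised stream carries per frame, each genlocked sample pairs a timecode with the camera pose and the current lens state. The field names and values below are assumptions made for illustration, not Vicon’s or Vizrt’s actual data format.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One genlocked sample of camera tracking data (illustrative only)."""
    timecode: str      # e.g. "10:24:31:12" at the studio frame rate
    position: tuple    # camera sensor position (x, y, z) in studio space, metres
    orientation: tuple # camera orientation as a quaternion (w, x, y, z)
    focus: float       # normalised focus ring position, 0.0-1.0
    zoom: float        # normalised zoom ring position, 0.0-1.0

# A downstream renderer would read one such sample per video frame and use it
# to place its virtual camera before compositing.
sample = TrackingSample("10:24:31:12", (1.2, 0.8, 1.6), (0.98, 0.0, 0.19, 0.0), 0.35, 0.6)
```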
Complex 3D virtual sets that include interactivity can be created in Viz Virtual Studio’s interface using its 3D graphics tools. It can be integrated with newsrooms and existing software installations and, alongside camera tracking systems, supports other studio equipment within its own studio and production workflow. Virtual Studio’s Defocus Shader emulates the focus effects of optical lenses to create a virtual set that is almost indistinguishable from a physical studio.
As virtual sets become more common and, at the same time, more complex, accurate tracking plays an integral role in creating realism in programming. Vicon’s optical tracking will also make it possible for production teams to scale virtual projects to include multiple studio cameras and props within the same virtual environment.
The process begins with Vicon’s calibration wand, fitted with active markers, which is used to calibrate the optical and video reference cameras positioned on the physical set. Video captured while waving the wand in front of the camera is combined with the tracking data the Vicon cameras collect simultaneously. Warren Lester, hardware product manager at Vicon, said, “Using timecode and genlock, frames from the video are matched with frames of tracking data. This reveals the 3D position and orientation of the wand as measured by the Vicon system, plus the observed position of the wand as seen by the Studio camera.”
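The timecode matching Lester describes can be pictured as pairing each video frame’s wand observation with the tracking sample that carries the same timecode. The sketch below is a minimal illustration of that idea, assuming simple records keyed by timecode; it is not Vicon Studio’s actual API.

```python
def pair_by_timecode(video_frames, tracking_frames):
    """Pair wand observations from the broadcast camera with Vicon tracking
    samples that carry the same timecode (illustrative sketch)."""
    tracking_by_tc = {s["timecode"]: s for s in tracking_frames}
    pairs = []
    for frame in video_frames:
        match = tracking_by_tc.get(frame["timecode"])
        if match is not None:
            # frame["wand_px"]   : 2D wand marker positions seen in the video image
            # match["wand_pose"] : 3D wand position/orientation measured by Vicon
            pairs.append((frame["wand_px"], match["wand_pose"]))
    return pairs
```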
The resulting data is automatically processed to calculate the lens parameters - focal length and distortion - and then to calculate the position of the camera sensor relative to the markers on the camera. In addition, a tracking target of known dimensions is mounted on the camera body; when the camera ‘looks’ at this target, the user can define the exact field of view of the lens.
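The parameters being solved for here can be understood through a standard pinhole camera model with radial distortion: calibration searches for the focal length, distortion terms and sensor-to-marker offset that best reproject the wand’s measured 3D positions onto its observed image positions. The sketch below shows such a generic projection model as a point of reference, not Vicon’s actual solver.

```python
import numpy as np

def project(point_cam, focal_length_px, principal_point, k1, k2):
    """Project a 3D point (camera coordinates, metres) to pixel coordinates
    using a pinhole model with two radial distortion terms."""
    x, y = point_cam[0] / point_cam[2], point_cam[1] / point_cam[2]  # normalised image coords
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2                                 # radial distortion factor
    u = focal_length_px * x * d + principal_point[0]
    v = focal_length_px * y * d + principal_point[1]
    return np.array([u, v])
```

A calibration pass would adjust focal_length_px, k1, k2 and the camera pose until the projected wand positions match the positions observed in the video frames.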
“By waving the wand in front of the camera, rather than manually surveying the lens, you can determine the camera sensor position very accurately. This allows us to output the sensor position, as the camera moves over time, to the Vizrt Tracking Hub and Viz Virtual Studio software,” said Warren.
From there, the Vizrt lens calibration separately determines the correct field of view, nodal point and deformation coefficients for every zoom and focus position of the lens. Once Studio has calibrated the position of the camera sensor, Vicon’s Tracker software streams out how that position changes over time, along with the lens field of view. The Vizrt software maps out those properties of the lens – that is, how the distortion changes with field of view. Consequently, an effective overlay of virtual elements can be obtained at whatever lens zoom and focus setting Tracker is reporting.
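One way to picture the resulting lens map is as a table of calibrated samples that is interpolated at whatever zoom (and, in the real system, focus) position the tracker reports for the current frame. The snippet below is a simplified one-dimensional sketch with made-up values, not the Vizrt calibration format.

```python
import numpy as np

class LensMap:
    """Interpolates field of view and distortion from calibrated zoom samples."""
    def __init__(self, zoom_samples, fov_samples, k1_samples):
        self.zoom = np.asarray(zoom_samples)  # normalised zoom positions, ascending
        self.fov = np.asarray(fov_samples)    # horizontal field of view in degrees
        self.k1 = np.asarray(k1_samples)      # first radial distortion term

    def lookup(self, zoom):
        return (np.interp(zoom, self.zoom, self.fov),
                np.interp(zoom, self.zoom, self.k1))

# Hypothetical calibration samples at wide, mid and tight zoom positions.
lens = LensMap([0.0, 0.5, 1.0], [62.0, 30.0, 8.5], [-0.12, -0.04, -0.01])
fov, k1 = lens.lookup(0.3)  # lens state reported by the tracker for this frame
```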
Vizrt chief engineering officer Gerhard Lang said, “Studio uses the calibration data to send the sensor position to the Tracking Hub, which in turn calculates the studio camera position and sends the final package to Viz Engine, Vizrt’s compositing engine. These positions are used to align the virtual set elements in 3D onto the video from the studio cameras so that, for example, a virtual object on the floor remains stable on the floor as the camera moves around it or zooms in or out.”
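Conceptually, that alignment amounts to rebuilding the renderer’s view transform from the tracked pose on every frame, so a virtual object anchored at a fixed studio-floor position keeps its place in the composite however the camera moves or zooms. The following is a minimal sketch of that transform, assuming the pose arrives as a 3x3 camera-to-world rotation matrix and a position in metres; it is not the Viz Engine API.

```python
import numpy as np

def view_matrix(position, rotation):
    """World-to-camera transform built from the tracked camera pose.
    `rotation` is a 3x3 camera-to-world rotation matrix, `position` is in metres."""
    view = np.eye(4)
    view[:3, :3] = rotation.T
    view[:3, 3] = -rotation.T @ np.asarray(position)
    return view

# A virtual object at a fixed studio-floor position is transformed by the
# current view matrix each frame; the projection changes as the camera moves
# or zooms, but the object's world position does not, so it appears locked
# to the floor in the composited output.
```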
The Vicon cameras measure only the position of the studio camera or cameras; currently no use is made of the other motion capture abilities of the system and the Vicon cameras. However, Vicon and Vizrt plan to exploit those capabilities in the future.
www.vicon.com
www.vizrt.com