Ncam Mk2 Suite Upgrades AR Tracking Accuracy and Flexibility for Users

Ncam Mk2 bar server

The Ncam R&D team continues to develop its AR platform – the Mk2 Camera Bar, first shown as a prototype at IBC2019, the Mk2 Server and the supporting Ncam Reality 2020 software – now redesigned to add more functions for users.

The Mk2 Camera Bar makes use of the Intel RealSense hardware, heavily modified to suit broadcast and film environments. It is now smaller, lighter and able to mount to a wider variety of camera rigs so that DPs can use it with a jib, Steadicam, wire cam or drones.

RealSense is Intel’s approach to Simultaneous Localization and Mapping (SLAM), and is used to equip devices with the ability to see, understand, interact with and learn from their environment. Rather than a purely software approach, it combines instruments and algorithms to solve for the location and orientation of a device – such as a robot, vehicle or Ncam’s camera bar – in 3D space, while simultaneously creating and updating an electronic map of the unknown surrounding environment.
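The two halves of SLAM – tracking the device's own pose and building a map of the surroundings at the same time – can be illustrated with a toy 2D sketch. This is purely illustrative (it is not Ncam's or Intel's algorithm): the device dead-reckons its own motion while converting each range/bearing observation of a landmark into map coordinates.

```python
import math

def move(pose, forward, turn):
    """Localization step: dead-reckon the (x, y, heading) pose from odometry."""
    x, y, th = pose
    th += turn
    return (x + forward * math.cos(th), y + forward * math.sin(th), th)

def observe(pose, landmark_id, rng, bearing, world_map):
    """Mapping step: place a range/bearing observation into the map frame."""
    x, y, th = pose
    lx = x + rng * math.cos(th + bearing)
    ly = y + rng * math.sin(th + bearing)
    world_map[landmark_id] = (lx, ly)

pose = (0.0, 0.0, 0.0)
world_map = {}
pose = move(pose, forward=1.0, turn=0.0)  # drive 1 m forward
observe(pose, "corner_A", rng=2.0, bearing=math.pi / 2, world_map=world_map)
print(pose)       # (1.0, 0.0, 0.0)
print(world_map)  # corner_A at roughly (1.0, 2.0)
```

A real SLAM system would also correct the pose whenever a previously mapped landmark is re-observed; this sketch shows only the forward direction of the loop.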

Ncam mk2 virtual studio1

Image courtesy of Tencent using Reality FX and Real Depth.

RealSense tracks the environment from an inside-out POV to determine the device’s own movement, in contrast to the outside-in POV of motion capture, which tracks objects by surrounding them with stationary cameras. One of the best aspects of SLAM is the ability to track all six degrees of freedom – the X, Y, Z position in 3D space as well as the angular orientations pitch, yaw and roll.
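The six degrees of freedom described above map naturally onto a simple data structure. A minimal sketch (the field names and units here are illustrative, not Ncam's data format; production systems often prefer quaternions over Euler angles to avoid gimbal lock):

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """One camera pose sample: three translational + three rotational axes."""
    x: float      # position, metres (left/right)
    y: float      # position, metres (up/down)
    z: float      # position, metres (forward/back)
    pitch: float  # orientation, degrees (tilt up/down)
    yaw: float    # orientation, degrees (pan left/right)
    roll: float   # orientation, degrees (lean side to side)

# e.g. a camera raised on a jib, tilted down and panned right
jib_shot = Pose6DOF(x=0.5, y=3.2, z=-1.0, pitch=-15.0, yaw=40.0, roll=0.0)
print(jib_shot)
```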

Another refinement of RealSense is that it uses Visual Inertial Odometry (VIO) to track its own position and orientation in 3D space. VIO appears to be the closest electronic equivalent to how a creature senses the world – using CMOS sensors as eyes to see the surrounding environment, an inertial measurement unit (IMU) as the inner ear to sense balance and orientation, and computing as the brain that combines the information into real-time location and mapping.
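The "brain" that blends the two senses can be sketched with a complementary filter, one common way to fuse an IMU with a visual estimate (a simplified illustration of the VIO idea, not Ncam's implementation). The gyro is smooth but drifts over time; the visual fix is absolute but noisier, so each update trusts the integrated gyro for short-term motion and nudges it toward the visual reading:

```python
def fuse(angle, gyro_rate, visual_angle, dt, alpha=0.98):
    """Blend integrated gyro motion with an absolute visually measured angle."""
    predicted = angle + gyro_rate * dt                     # IMU: integrate angular rate
    return alpha * predicted + (1 - alpha) * visual_angle  # vision: correct the drift

# Camera actually holding still at 10 degrees, but the gyro has a
# +0.5 deg/s bias that would drift unbounded if integrated alone.
angle = 10.0
for _ in range(100):  # 1 second at 100 Hz
    angle = fuse(angle, gyro_rate=0.5, visual_angle=10.0, dt=0.01)
print(angle)  # stays close to 10 degrees; the visual term bounds the bias
```

Production VIO uses more capable estimators (typically Kalman-filter or optimization-based), but the division of labour is the same: inertial data for high-rate prediction, visual data for drift-free correction.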

Mk2 Camera Bar – What’s New

Ncam Mk2 camera bar

Ncam Mk2 server connection box

The camera bar’s previous-generation hardware required an Ethernet tether to return tracking data to a separate server running Ncam Reality software. The new software now runs on the Mk2 Server, which can be mounted on the camera or rig itself so that all camera tracking and lens data is computed locally. (See all hardware mounted in the image at the top.) This configuration gives Ncam users completely wireless tracking over a standard RF camera link and opens possibilities for remote production of AR graphics. It also frees up rack space aboard outside broadcast vehicles.

The Ncam Reality 2020 software suite has been redesigned for ease of use, performance and stability, and to add new functionality. Feature extraction is now more efficient and can work from a hybrid set that combines natural features, markers and fiducials. Wireless functionality is included for use with the Mk2 hardware. The AR Suite software, which comes bundled in a Lite version as standard with the hardware, integrates directly into Unreal Engine 4 to form a complete, accurate system for real-time VFX compositing while shooting.

Ncam Mk2 virtual set

Ncam has made other improvements to the Mk2 aimed at faster, more precise set-up and calibration, including a wizard-driven, task-specific UI that reduces the need for specialist operators. Regarding accuracy, the system can track non-natural features without first needing to ‘learn’ a set of marker patterns. Because the lenses are now mounted internally, the camera bar is more rugged. Customers also have the option to purchase outright or license via annual subscription.

CEO Nic Hatch said that Ncam’s goal is to help customers realise their vision without having to worry about equipment – either hardware or software – and that the company regards the updates made to the platform in this release as the foundation of its future development. “The close partnerships we have with Intel, Epic and others will allow us to take advantage of those companies’ future improvements in tracking and rendering, as well as our own developments in spatial environment data capture and its reuse in non-live environments.”