The Ncam tracking system simultaneously records a camera’s position and orientation
in real time. Live composites of the real set with VFX are inserted into the
viewfinder to help visualise shots.


Ncam Camera Tracking Brings Post Production On-Set

Ncam is a camera tracking system that simultaneously records the camera’s position, orientation and focal length, all in real time. With this information it’s possible to insert visual effects elements into shots and view the finished composite with reasonable accuracy. The system locks the digital material to all the movements of the camera in real time, so that panning, tilting, zooming and every other type of camera move can be accurately tracked. Until recently, this kind of work has been done in post production by tracking the plate, which can often take half a day for a single shot.

Ncam has played a role on major film productions such as ‘The Avengers’ and ‘White House Down’, and the forthcoming movies ‘Jupiter Ascending’ and ‘Robot Overlords’. On the recent film ‘Edge of Tomorrow’, production VFX supervisor Nick Davis introduced director Doug Liman to Ncam during early production, resulting in a decision to carry out a demonstration on the movie’s Heathrow set at Leavesden Studios. At the studio, only a part of the set had been built practically.

On Set with Ncam

CG supervisor Martin Chamney from Nvizible visual effects studio, a sister company to Ncam, and his team of two other Ncam technicians showed Doug Liman how the green screens could be replaced with digital set extensions and animated aircraft, and how live composites of the real set and visual effects elements could be created and inserted into the camera’s viewfinder to help visualise shots.

Ncam is focussed on moving post production closer to production to help filmmakers make better informed decisions while all members of the production are still working together – the crew, director, producers, DP, on-set VFX and SFX supervisors, cast and many others. Ncam also aims to provide more accurate, timely data to visual effects teams.

Martin explained that the Ncam system comprises a camera bar containing two small witness cameras and multiple sensors, and a tether which sends data back to an Ncam computer workstation. “The camera bar is mounted to standard rods under the main lens of the film camera. The witness cameras have fixed wide angle lenses and process the visual tracking aspect of the system. Sensor tracking aids this process as well, detecting inertial movements of the camera. Visual tracking and inertial tracking are mixed together in a fusion algorithm, creating a 3D camera track solution in real time. This creates results comparable to post production tracking – but working in real time is the critical difference,” he said.
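
Ncam’s fusion algorithm is proprietary, but the general idea of mixing a drift-free optical track with high-rate inertial data can be illustrated with a simple complementary filter. The sketch below is an illustrative assumption only (Python, position only, fixed blend weight), not Ncam’s actual method.

```python
import numpy as np

class ComplementaryPoseFilter:
    """Toy visual/inertial fusion: integrate accelerometer data at high rate,
    then pull the estimate back toward the absolute position reported by the
    optical (witness camera) tracker. Illustrative only, position only."""

    def __init__(self, blend=0.98):
        self.blend = blend              # weight kept by the inertial prediction
        self.position = np.zeros(3)     # x, y, z in metres
        self.velocity = np.zeros(3)

    def predict(self, accel, dt):
        """High-rate step driven by the inertial sensors on the camera bar."""
        self.velocity += np.asarray(accel) * dt
        self.position += self.velocity * dt

    def correct(self, optical_position):
        """Lower-rate step using an absolute position from the visual tracker."""
        self.position = (self.blend * self.position
                         + (1.0 - self.blend) * np.asarray(optical_position))
        return self.position
```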

Six Degrees of Freedom

On set, a second workstation streams 3D CG content to the Ncam workstation. Ncam software combines the tracked camera with the CG assets, applies lens distortions, keys the green portion of the live image and replaces it with the CG assets, all in real time. The composite of the live plate and VFX is relayed back to the eyepiece of the main camera so the camera operator can see the composition and framing of the shot, and the feed is simultaneously recorded at video village.
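
Ncam’s keyer and compositor are not described in detail here, so the NumPy fragment below only sketches the basic green-replacement step such a real-time pipeline performs; the dominance threshold is an arbitrary illustrative value.

```python
import numpy as np

def green_screen_composite(plate, cg_render, green_dominance=1.25):
    """Naive chroma key: wherever green strongly dominates red and blue in the
    live plate, substitute the pre-distorted CG render. Both arrays are float32
    RGB images of identical shape with values in [0, 1]. Illustrative only; a
    production keyer also handles spill, soft edges and garbage mattes."""
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    green_mask = (g > green_dominance * r) & (g > green_dominance * b)
    composite = plate.copy()
    composite[green_mask] = cg_render[green_mask]
    return composite
```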

Martin said, “The recorded Ncam material is beneficial to editorial because the editor gains a better feeling for the geography, screen direction and narrative. Editing with this material is preferable to cutting together many images full of green screen, and the VFX team gains a blueprint of the director’s vision for the shot.

“Ncam doesn't only give the visual output. An automatic take system saves a digital file to disk every time the main film camera goes into record mode. This FBX file can be used for tracking purposes in post production. The primary data within the file is the camera’s six degrees of freedom – position x, y and z, and rotation x, y and z – plus dynamic focal length and a point cloud generated from detected areas of contrast in the scene, all synced with timecode.”
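
The exported file is standard FBX, but the per-frame content Martin describes can be pictured as a simple record. The field names below are illustrative assumptions, not the actual FBX schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NcamFrame:
    """One frame of the tracking data described above (illustrative field
    names, not the real FBX layout)."""
    timecode: str                          # synced to the main film camera
    position: Tuple[float, float, float]   # x, y, z translation
    rotation: Tuple[float, float, float]   # x, y, z rotation
    focal_length_mm: float                 # dynamic focal length at this frame

@dataclass
class NcamTake:
    """One automatic take, saved whenever the film camera rolls."""
    slate: str
    frames: List[NcamFrame] = field(default_factory=list)
    point_cloud: List[Tuple[float, float, float]] = field(default_factory=list)  # contrast features in the scene
```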

Flooded Set

Later in the production of 'Edge of Tomorrow', Ncam was in use during principal photography of the scenes for the movie’s climactic Paris sequences, when the alien Mimics attack the city from under the Louvre, and the team was able to put in the correct geography of that location as it was to appear in the film. The practical set was a section of the Tuileries garden, which Ncam was able to extend with buildings to be seen in the distance, including the Eiffel Tower in ruins and the pyramid of the Louvre. These were important assets for the story and showed the crew where best to point the camera at any moment.

“The Paris sequences involved a one-week night-time shoot, which I could handle with two technicians. The set was challenging, essentially a massive tank of water,” said Martin. “This meant that everyone but the camera crew and the talent was confined to the perimeter of the set. ‘Edge of Tomorrow’ was shot on film with Arriflex and Panaflex cameras and anamorphic lenses. A Technocrane was used, mounted on a vehicle that could be driven around the large set area. It could remain tethered to the Ncam system throughout.”

One of the technicians was at the centre working with the crew, another was at the edge of the set at a computer controlling the server end with the tracking software, and Martin was at another computer providing the graphics and previs for the composite, monitoring the fit and line-up. The Ncam system was on a long tether attached to the bar mounted on the camera, and Martin and the other team members communicated via radio.

Post-vis Shots at Nvizible

Martin Chamney’s company Nvizible were able to make use of their Ncam data to supply post-vis shots for various sequences in the finale of the film. During the battle, the characters fall through the Louvre building into a completely flooded underground carpark beneath the museum, where the main character Cage must swim down to destroy the Omega, a massive creature concealed inside that controls the Mimic aliens. With the set surrounded by wire-mesh barriers and underwater green screen, following Cage’s progress through the dim space became uncertain.

Cage's companion Rita also has to negotiate her way around a Mimic in a corridor where, again, it was at first hard to tell in which direction she and the creature were moving and where they were relative to each other. After receiving underwater green screen material from editorial, Nvizible were able to create post-vis of the creatures and digital set extensions as required to assist with screen direction issues and the editing process.

Lens Distortion

While Ncam can be used with almost any camera, digital or film, the lenses are critical to the system’s results. Before starting to shoot with Ncam, all lenses that will be used have to be calibrated using Ncam’s proprietary calibration system. The lens distortion is read in and used to create a 3D lookup table that serves as a real-time distortion model. Thus, when Ncam is compositing any virtual assets over a live action plate, it uses that same 3D LUT to distort the CG elements to match the plate. Focus data is produced frame by frame, showing the operator the exact focus location at any moment and preventing guesswork.

Lens distortion is recorded separately from Ncam’s camera tracking data, which is concerned with positional data x, y and z, and rotational data x, y and z. Ncam’s managing director Nic Hatch said, “The ultimate goal is to recreate the camera Ncam is tracking with, and this means accurately capturing the position and rotation translations and the sensor size, as well as the position and nodal points of the lens, which change during a shot as the operator zooms and re-focuses. The recorded values map the entire lens, that is, the full range of the focus, iris and zoom – on all settings.”
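
Ncam’s calibration format and LUT are proprietary; the fragment below only sketches the idea of interpolating a distortion value from measured lens settings and applying it to CG points, using an assumed single-coefficient radial model and made-up sample values.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical calibration table: one radial coefficient (k1) sampled over a
# grid of focus and zoom settings (iris omitted for brevity). Real calibrations
# map the full focus/iris/zoom range with a richer distortion model.
focus_samples = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # focus distance, metres
zoom_samples  = np.array([24.0, 35.0, 50.0, 85.0])     # focal length, mm
k1_table = np.random.uniform(-0.2, 0.0, size=(5, 4))   # stand-in for measured values

k1_lookup = RegularGridInterpolator((focus_samples, zoom_samples), k1_table)

def distort_points(points, focus_m, zoom_mm):
    """Apply radial distortion to normalised image points (N x 2) using the
    coefficient interpolated for the current lens settings, so undistorted CG
    lines up with the distorted live plate."""
    k1 = k1_lookup([[focus_m, zoom_mm]]).item()
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    return points * (1.0 + k1 * r2)
```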

As key information, these lens distortion parameters are captured using rotation encoders, which make the entire range readable, from minimum to maximum. Ncam manufactures its own external encoders and is also capable of interfacing with existing systems, such as Preston. Preston’s encoders are used on film shoots with a follow focus, handled by the focus puller.

Graphics Engine Integration

A typical VFX workflow starts with a VFX facility shooting a lens grid as a way of determining distortion. The grid is then used to un-distort the plate, the CG elements are added in – CG elements are always rendered without distortion – and the whole plate is then distorted back again before cutting it back into the footage.
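
That round trip can be sketched in code; the nearest-neighbour warp and single-coefficient model below are simplifying assumptions made purely to show the order of operations, not any facility’s actual tools.

```python
import numpy as np

def radial_resample(img, k1):
    """Resample an image so each output pixel reads from a radius scaled by
    (1 + k1*r^2). With the measured k1 this flattens (undistorts) a plate;
    with -k1 it approximately re-applies the distortion. Nearest-neighbour
    sampling only, to keep the sketch short."""
    h, w = img.shape[:2]
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    xn, yn = (x - w / 2) / (w / 2), (y - h / 2) / (h / 2)   # normalised coords
    scale = 1.0 + k1 * (xn ** 2 + yn ** 2)
    src_x = np.clip((xn * scale * (w / 2) + w / 2).round().astype(int), 0, w - 1)
    src_y = np.clip((yn * scale * (h / 2) + h / 2).round().astype(int), 0, h - 1)
    return img[src_y, src_x]

def comp_shot(plate, cg_render, matte, k1):
    """Traditional post round trip: undistort the plate, drop in the CG
    (always rendered without distortion), then distort the frame back before
    it is cut into the footage. 'matte' is a boolean mask, in the undistorted
    frame, marking where CG replaces the plate."""
    flat_plate = radial_resample(plate, k1)
    flat_plate[matte] = cg_render[matte]
    return radial_resample(flat_plate, -k1)
```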

The most recent Ncam SDK allows users to integrate Ncam with any graphics engine so that it can read the live data. This data is used to communicate the translation details, timecode, focus/iris/zoom on a per-frame basis, plus real-time lens distortion. "The resulting volume of data is not as overwhelming as you might expect but, with all distortions made in real-time throughout a shoot, it is extremely useful for broadcast, on-set previs or to save for post production,” said Nic.
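
The article doesn’t detail the SDK’s interface, so the snippet below only imagines what an engine-side consumer of such a per-frame stream might look like; every name in it is a hypothetical placeholder, not Ncam’s real API.

```python
from dataclasses import dataclass
from typing import Iterator, Tuple

# Hypothetical stand-in for an engine-side consumer of Ncam's live per-frame
# data; all names are invented for illustration and do not reflect the SDK.

@dataclass
class FrameData:
    timecode: str
    translation: Tuple[float, float, float]
    rotation: Tuple[float, float, float]
    focus: float
    iris: float
    zoom: float
    distortion_k1: float

def fake_stream(n_frames: int = 3) -> Iterator[FrameData]:
    """Stand-in generator producing synthetic frames so the loop below runs."""
    for i in range(n_frames):
        yield FrameData(f"10:00:00:{i:02d}", (0.0, 1.5, float(i)),
                        (0.0, 0.0, 0.0), 3.2, 2.8, 35.0, -0.05)

def update_engine_camera(frame: FrameData) -> None:
    """Where a real integration would drive the graphics engine's virtual camera."""
    print(frame.timecode, frame.translation, frame.zoom)

for frame in fake_stream():
    update_engine_camera(frame)
```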

“Ncam uses a total of nine sensors, not only optical tracking. It also automatically seeks ‘natural’ tracking markers existing in the scene surrounding the camera, for example, spots of high contrast. Manual set-up, apart from lens calibration, is not necessary. The resulting collection of points recreates the environment as a 3D point cloud, as mentioned. Furthermore, the optical track is stereo optical, which is essential for accuracy. An automatic alignment feature measures the difference between the Ncam camera bar and the camera’s nodal point.
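
Why a stereo pair matters for accuracy can be shown with a standard linear triangulation step: two witness cameras with a known baseline recover the depth of a high-contrast feature, which one camera cannot do from a single frame. The camera matrices and pixel values below are invented for illustration, not Ncam measurements.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one feature seen by both witness cameras.
    P1, P2 are 3x4 projection matrices; uv1, uv2 are pixel coordinates."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]            # one 3D point of the reconstructed point cloud

# Invented rig: identical intrinsics, second witness camera 20 cm to the right.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

# A feature about 2 m in front of the rig, projected to made-up pixel positions.
print(triangulate(P1, P2, np.array([400.0, 260.0]), np.array([320.0, 260.0])))
# -> approximately [0.2, 0.05, 2.0]
```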

Consistent Monitoring

“In a corresponding monitoring scenario, the user has a clean feed shot by a camera without the Ncam bar, and the composited Ncam feed. The Ncam system includes its own internal engine, plus an internal keyer for keying out green or blue screen, a compositor and colour grading software. Ncam carries out all of the associated calculations and outputs the result as a composite through an SDI stream to the video village and other required playback, to the viewfinder in the camera, to camera operators’ field monitors, Steadicam monitors, eyepieces on digital cameras – in short, to everyone involved with the shoot, because all proceedings and on-set decisions are going to be based on that live composite.”

The staff required on set to operate Ncam is minimal. In a studio shoot a trained member of the camera crew places the Ncam bar on the camera and re-sets it for different lenses during the shoot, while an Ncam technician works at the Ncam server. Training production members to use Ncam is straightforward. Outdoors or on location, the operator may need a grip to help organize cabling. Also, assets may be fed from the asset engine in Ncam to a VFX artist or previs specialist working on set in Motion Builder on a laptop.

CG Engine

For movies, Ncam becomes the composite and CG engine, outputting FBX files. From an ALEXA camera, for example, as shots are captured the camera records its own raw format images while Ncam records its FBX data plus slate information for the director. When the camera stops, so does Ncam. It then sends the FBX files out to solid state drives. From this point, the data can be read by 3D software. The FBX file contains the essential point cloud data – the primary purpose of Ncam – along with camera rebuilding data, lens information and timecode.

Nic also explained that the next step in Ncam’s development is to embed this information into the images for everyone involved in post to access, without needing to use a separate process to associate the data with the images. The company is working with its industry partners now to achieve this. For example, a three-year development contract with The Foundry is underway.

Ncam addresses multicamera shoots in a special manner as well. A typical example would be several cameras shooting a single robot character’s performance. One Motion Builder workstation running alongside the crew sends the asset, or robot, information out to Ncam’s servers, which composite that robot into the monitoring of all of the cameras involved in real time, preserving camera angles and other qualities. This technique was used on ‘The Avengers’ – and demonstrated at NAB 2014. www.ncam-tech.com