Match TV Virtualises and Networks Remote EVS Workflow at Winter Universiade
Crew at Match TV studio
Match TV is a Russian free-to-air sports television channel that specialises in giving fans and viewers timely, detailed and comprehensive access to major sporting events. Its content includes the Olympic Games, plus many of the World and European Championships.
In March 2019, Match TV served as host broadcaster for the XXIX Winter Universiade 2019, held in Krasnoyarsk, Siberia. Among the largest multi-sport events in the world after the Olympics, the Universiade is organised for student athletes by the International University Sports Federation (FISU).
Fast, Efficient Live Media Network
The Winter Universiade attracts considerable international media attention to the region, creating an opportunity for Match TV to show its ability to deliver high-quality programming at a challenging, multi-sport event. The broadcaster devoted considerable effort to developing an efficient, flexible production workflow. In particular, Match TV was keen to virtualise backend resources, set up remote operations and support collaboration among its production teams.
Match TV studio
As the physical hub of this workflow, Match TV clustered EVS' XT-VIA production servers over XNet-VIA, an ethernet-based network built for fast, efficient sharing of live media between the XT-VIA servers and other live production tools on the EVS VIA platform. This network can be expanded to serve very large events while maintaining the security and underlying simplicity of its design.
The project was ambitious. Because XNet-VIA was still quite a new development, Match TV's deployment would be the first at such scale. Furthermore, the deployment schedule at the event was extremely tight, and Krasnoyarsk is remote, 4,100km east of Moscow, with a harsh winter climate where temperatures drop to -42°C.
Virtualising the Backend
Eight EVS XT-VIA servers were interconnected over the ethernet-based live media sharing network within the event's International Broadcasting Centre (IBC). The bi-directional 10GigE network was configured and deployed around EVS' new XHub-VIA network switch to facilitate the sharing of content between servers. It gives operators far greater bandwidth than EVS' former SDTI-based network supplied, with the XHub-VIA hardware providing 18 ports of up to 25 GbE and four splittable ports of 40 or 100 GbE, plus two 1 GbE management ports.
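To put those port counts in perspective, a back-of-the-envelope calculation of the switch's nominal media capacity follows. The figures come straight from the port list above; real usable throughput depends on the switch fabric, port configuration and protocol overhead, none of which the article specifies.

```python
# Rough aggregate-capacity estimate for the XHub-VIA port layout
# described above. Illustrative only; actual throughput depends on
# fabric, configuration and protocol overhead.

MEDIA_PORTS = 18         # ports of up to 25 GbE each
SPLITTABLE_PORTS = 4     # splittable ports of 40 or 100 GbE each
MGMT_PORTS = 2           # 1 GbE management ports (not media traffic)

media_capacity = MEDIA_PORTS * 25              # GbE at full rate
splittable_capacity = SPLITTABLE_PORTS * 100   # GbE at the 100 GbE option

total = media_capacity + splittable_capacity
print(f"Nominal media capacity: {total} Gb/s")  # 850 Gb/s
# The legacy SDTI-based network this replaces ran at a small
# fraction of this figure, which is the bandwidth gain the
# article refers to.
```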
Krasnoyarsk studio
Apart from the IBC, Match TV operated from 11 OB trucks distributed among the Universiade's various venues. These vehicles were also equipped with EVS gear, including 49 XT3 live production servers, nine XFile3 systems to back up, stream and restore media files in any format during the tournament, nine XTAccess media transfer engines and six IPDirector content management suites.
To increase flexibility and agility, and to limit the need to purchase dedicated hardware for the project, all of EVS' backend resources would be virtualised. The EVS software applications (IPDirector, the live PAM database, IP MOS, IPWeb and XTAccess) ran as virtual machines on available general-purpose hardware platforms. This approach would save a significant amount of rack space, reduce the costs associated with shipping equipment out to Krasnoyarsk, and result in greater redundancy.
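As an illustration of what such a virtualised backend might look like, the sketch below lists the services named above as VM definitions. All resource sizings and the two-replica redundancy scheme are invented for the example; the article gives no figures.

```python
from dataclasses import dataclass

# Hypothetical inventory of the virtualised EVS backend services
# named in the article. VM sizings are invented for illustration.

@dataclass
class ServiceVM:
    name: str
    vcpus: int
    ram_gb: int
    replicas: int  # extra replicas provide the redundancy mentioned above

backend = [
    ServiceVM("IPDirector",     vcpus=8, ram_gb=32, replicas=2),
    ServiceVM("Live PAM DB",    vcpus=8, ram_gb=64, replicas=2),
    ServiceVM("IP MOS gateway", vcpus=4, ram_gb=16, replicas=2),
    ServiceVM("IPWeb",          vcpus=4, ram_gb=16, replicas=2),
    ServiceVM("XTAccess",       vcpus=8, ram_gb=32, replicas=2),
]

# Size the general-purpose host cluster needed to carry the lot.
total_vcpus = sum(vm.vcpus * vm.replicas for vm in backend)
total_ram = sum(vm.ram_gb * vm.replicas for vm in backend)
print(f"Cluster needs ~{total_vcpus} vCPUs and ~{total_ram} GB RAM")
```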
Networked Crews
In terms of human resources, Match TV would have crews situated at each venue, in the IBC and also back in Moscow at the Gazprom-Media broadcast centre and the studio at the city's Ostankino Technical Centre. The goal was highly collaborative production workflows, in an environment that would function much like a news production infrastructure. The various contributors and producers needed to be able to log and browse content remotely. The IPWeb tool, running on the virtualised supporting hardware, established remote content access, and the integration between IPLogger and Match TV's FileCatalyst file transfer system made it possible to search the IPDirector database for content from a distance.
Crews working inside the OB trucks at the venues would be able to push and pull content to and from the IBC using EVS' XFile3 system, which automates file transfer, transcoding and archiving during ingest. As well as generating slow motion replays for the live production, the operators of the EVS LSM (live slow motion) control panels on-site would be able to create and edit clips on the fly for contribution, and then push their clips to the IBC with a single command, as sketched below.
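The push itself is a one-command, automated transfer handled by XFile3. As a rough illustration of the watch-and-push pattern, this sketch copies finished clips from a hypothetical venue export folder to a mounted IBC share; the folder paths and polling interval are invented, and the real system also handles transcoding and archiving along the way.

```python
import shutil
import time
from pathlib import Path

# Minimal watch-folder sketch of the venue-to-IBC push workflow.
# The real transfer is handled by EVS XFile3; this stand-in simply
# copies finished clips to a mounted IBC share.

VENUE_EXPORT = Path("/media/venue_exports")  # hypothetical path
IBC_INGEST = Path("/mnt/ibc_ingest")         # hypothetical path

def push_new_clips(seen: set[Path]) -> None:
    """Copy any clip not yet transferred to the IBC share."""
    for clip in VENUE_EXPORT.glob("*.mxf"):
        if clip not in seen:
            shutil.copy2(clip, IBC_INGEST / clip.name)
            seen.add(clip)
            print(f"Pushed {clip.name} to IBC")

if __name__ == "__main__":
    seen: set[Path] = set()
    while True:  # poll the export folder; stop with Ctrl-C
        push_new_clips(seen)
        time.sleep(5)
```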
Specific to Match TV was the integration of EVS’ systems with its own ingest scheduling tool and with its third-party systems. These included integration with the complete Octopus Newsroom NRCS for AB-Roll playout, and IPLink for Adobe, an existing integration that gives Premiere Pro editors instant, direct access to EVS near line content. IPLink allows detailed logs that are created remotely to appear in the editor’s Premiere timeline while the material is still being ingested. Rough cuts created with EVS tools can be converted to a sequence in one step.
An Opportunity
Almost no time was available to test the equipment ahead of deployment. Even so, XNet-VIA shared clips and recorded media between the eight XT-VIA servers fast enough to ingest a total of 22 feeds simultaneously, all of which were recorded for archive in 1080i using the AVC-Intra codec. EVS' loop recording within the XT-VIA servers meant that operators had immediate access to content captured the previous day and still on the system, which proved extremely useful for Match TV's news and studio production teams.
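For a sense of the volumes involved, here is a rough storage estimate for those 22 archive feeds. The 100 Mb/s figure assumes the AVC-Intra 100 class and the recording window is invented for illustration; the article states neither.

```python
# Back-of-the-envelope storage estimate for 22 simultaneous 1080i
# AVC-Intra feeds. Assumes the AVC-Intra 100 class (~100 Mb/s video)
# and ignores audio and container overhead.

FEEDS = 22
BITRATE_MBPS = 100    # assumed AVC-Intra 100
HOURS_PER_DAY = 12    # assumed recording window, for illustration

bits_per_day = FEEDS * BITRATE_MBPS * 1e6 * HOURS_PER_DAY * 3600
terabytes = bits_per_day / 8 / 1e12
print(f"~{terabytes:.0f} TB of archive per {HOURS_PER_DAY}-hour day")
# Prints: ~12 TB of archive per 12-hour day
```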
EVS IPDirector
Using the new onboard H.264 proxy, which supports the MJPEG and MPEG-DASH standards, they could also record content simultaneously in high and low resolution. Editors were then able to browse and quickly create playlists using IPDirector, and journalists could cover breaking news directly from the Winter Universiade.
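EVS generates the H.264 proxy on board the XT-VIA server, so the exact mechanism is not described here. As a stand-in for the same high/low dual-recording idea, the sketch below uses ffmpeg (assumed to be installed) to derive a low-resolution H.264 browse proxy from a high-resolution file.

```python
import subprocess

# Illustrative stand-in for dual high/low-resolution recording:
# derive a low-res H.264 browse proxy from a hi-res source file.
# Filenames are hypothetical; requires ffmpeg on the PATH.

def make_browse_proxy(hires: str, proxy: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", hires,
            "-vf", "scale=960:540",   # quarter-HD browse resolution
            "-c:v", "libx264",
            "-b:v", "1500k",          # low bitrate for remote browsing
            "-c:a", "aac", "-b:a", "128k",
            proxy,
        ],
        check=True,
    )

make_browse_proxy("feed01_hires.mxf", "feed01_proxy.mp4")
```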
By virtualising all of its EVS backend resources, Match TV made its operations much more flexible. The remote, collaborative workflow designed for the project meant that operators at the venues, the rotating 12-person teams of loggers located in Moscow, crews in Ostankino and FISU teams stationed within the IBC could all access content such as profiles, archive material and super slow motion clips whenever needed.
Because the loggers were able to log content and create rough cuts remotely from Moscow, four time zones to the west, Match TV didn't need as many people working in the event's IBC, saving money on travel. The detailed metadata the loggers added to content gave Match TV's in-house journalists and news producers the means to create their own stories independently, several times faster than usual.
As intended, Match TV used the games as an opportunity to prove its services, delivering comprehensive coverage both to its home audience in Russia and to those watching the global feed it produced for the unilateral broadcasters covering the action. The broadcaster had successfully taken advantage of new production systems, deployed in an untested workflow for an application where failure was not an option, and meanwhile gained practical knowledge of what the new systems were capable of.

evs.com