Sony presented some of its latest products at a recent technology roadshow, including its C3 Portal camera-in-the-cloud service, an entry-level switcher and a new pitch-side LED solution.

Sony pitched up at Twickenham rugby stadium at the end of September to present the first of a series of technology roadshows (themed ‘Live Your Vision’) that will wend their way around Europe, offering face-to-face meetings with customers to present new products for the first time in 18 months.


Cloud ready: A Sony camera linked to the C3 Portal via an Xperia Pro phone

Among the technologies getting their first public airing were: the newly announced C3 Portal camera-in-the-cloud service; a new entry-level production switcher; and a way of delivering multi-region advertising from pitch-side LED screens without the need for virtual overlays.

C3 Portal is the next generation of Sony’s cloud-based service XDCam Air - the name stands for Camera-Connect-Cloud Portal. “The main difference or evolution from the XDCam Air is that we are expanding the range of the supported devices,” moving beyond XDCam “to support everything from Alpha up to Venice”, explained Pavel Skabrout, workflow architect, Sony Professional.

It also adds new functions and features to XDCam Air, including a new application (C3 Portal) and, via new firmware, additional features in cameras such as the PXW-Z280, PXW-Z750 and PXW-X500. These will include “quick and easy connection between the camera and the cloud service”. It also provides an interface for other devices, such as card readers, allowing users to send files through the application to C3 Portal.

The system supports two main workflows. “One is a live streaming workflow which is very simple. You just do basically drag and drop,” he said, demonstrating by connecting a server in Prague to the live stream from the camera in London, and then controlling the camera (including zoom, focus, iris and gain).
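For illustration only, the sketch below shows what such a remote control call could look like, assuming a hypothetical REST-style endpoint; Sony has not published the C3 Portal control API in this detail, so every URL and parameter name here is an assumption.

```python
# Hypothetical sketch of a cloud-to-camera control call; the portal URL, endpoint
# and parameter names are assumptions, not Sony's published C3 Portal API.
import requests

PORTAL = "https://c3portal.example.com/api/v1"        # placeholder URL
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder credentials

def set_camera_param(camera_id: str, param: str, value: float) -> None:
    """Send one remote adjustment (e.g. zoom, focus, iris or gain) to a connected camera."""
    resp = requests.post(
        f"{PORTAL}/cameras/{camera_id}/control",
        headers=HEADERS,
        json={"parameter": param, "value": value},
        timeout=5,
    )
    resp.raise_for_status()

# Example: an operator in Prague nudging iris and gain on a camera in London.
set_camera_param("cam-london-01", "iris", 4.0)
set_camera_param("cam-london-01", "gain", 6.0)
```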


Andy Hotten with the new XVS-G1 live production switcher at Sony’s Live Your Vision roadshow in London

The other is a file-based workflow, so Skabrout, as a remote user, could “look inside the camera; I can see all the clips which have been recorded and I can pick the files one by one and take them to my production remotely, but also you can do it at your end. Going to the camera menu or using that C3 Portal application. We can also now change the metadata and all the relevant information which can be input for the material.” Users with the relevant permissions can change the metadata, add additional information such as a still picture, or contribute an external audio file to the cloud. Users can also change the camera’s settings remotely, such as enabling proxy recording or altering any other variable, and then save those settings as a preset for their whole production.
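As a rough sketch of that file-based side, the example below lists a camera’s clips, updates a clip’s metadata and pushes a settings preset, again assuming hypothetical endpoints and field names rather than Sony’s actual API.

```python
# Hypothetical sketch of the file-based workflow: browsing clips, editing metadata
# and applying a camera preset. Endpoints and field names are assumptions only.
import requests

PORTAL = "https://c3portal.example.com/api/v1"        # placeholder URL
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder credentials

def list_clips(camera_id: str) -> list:
    """Look inside a connected camera and list the clips recorded on it."""
    resp = requests.get(f"{PORTAL}/cameras/{camera_id}/clips", headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()

def update_metadata(clip_id: str, fields: dict) -> None:
    """Add or change metadata on a clip (title, reporter, attached still, etc.)."""
    requests.patch(f"{PORTAL}/clips/{clip_id}/metadata",
                   headers=HEADERS, json=fields, timeout=10).raise_for_status()

def apply_preset(camera_id: str, preset: dict) -> None:
    """Remotely change camera settings, e.g. enable proxy recording, as a saved preset."""
    requests.put(f"{PORTAL}/cameras/{camera_id}/settings",
                 headers=HEADERS, json=preset, timeout=10).raise_for_status()

clips = list_clips("cam-london-01")
update_metadata(clips[0]["id"], {"title": "Stadium arrivals", "reporter": "J. Smith"})
apply_preset("cam-london-01", {"proxy_recording": True})
```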

It also integrates through a MOS gateway with newsroom computer systems such as Octopus, ENPS, Open Media or iNews, so story names can be reflected and transferred into the camera. “We can do that automatically and activate the planning remotely and also that can be done by the journalist directly on site,” either via the camera menu or via a mobile phone, to which the planning metadata for the story is sent and from which it can then be transferred to a camera. It can also be integrated with third-party applications via an API, and this has already been done with some third-party apps, although Sony could not yet say which ones.
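Purely as an illustration of how a newsroom story might become camera planning metadata, the snippet below maps a story record to fields a camera could pick up; the field names are invented and are neither the MOS schema nor Sony’s implementation.

```python
# Illustrative mapping from an NRCS story (as it might arrive via a MOS gateway)
# to camera-side planning metadata. Field names are invented for the example.
def story_to_planning_metadata(story: dict) -> dict:
    """Turn a newsroom story into planning metadata a camera could display and record against."""
    return {
        "storyName": story["slug"],              # story name reflected in the camera menu
        "rundown": story["rundownId"],
        "reporter": story.get("reporter", ""),
        "notes": story.get("notes", ""),
    }

nrcs_story = {"slug": "STADIUM-OPENING-1800", "rundownId": "EVE-NEWS", "reporter": "J. Smith"}
print(story_to_planning_metadata(nrcs_story))
```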

There are also basic cloud editing functions, such as trimming, and then export into different non-linear editors like Avid, Adobe Premiere, Edius or Final Cut. There is also a plug-in for Adobe Premiere where all the assets in the C3 Portal can appear immediately in Premiere.

When streaming, the C3 Portal can also record live as a growing file that can be edited as it continues to record.

When the C3 Portal goes live in December, “the most supported cameras will be the Z280 and PXW-FX9. And then later, through the firmware releases for the cameras, we will support other cameras as well, such as the FX6, Z750, Z450, X400 and others,” said Skabrout.

The system can also be used with non-Sony devices, such as iPhones or Android phones, either via XDCam Pocket, a free dedicated application (available now), or via the forthcoming C3 Portal App. That gives phones the ability to do both live streaming and file contribution. “We also can interface with third-party devices through the card reader option. So whenever a device can present itself as a card reader through that application, we are able to send the files to the C3 Portal and then make that available for any user.”

One unique function is its multi-link operation, “where we can connect a couple of phones together – to boost the bandwidth available,” he explained. “In that case one phone would be used as a forwarder and the other phones will be used as a distributor. But of course the forwarder is also uploading and what is unique is that the local connection between the phones for the file distribution is done via WiFi, but all the phones are uploading the file via LTE connection, so that application is controlling the data flow for, let’s say, the WiFi just for the download and the LTE for upload.”
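The sketch below models that data flow in simplified form: the forwarder phone splits a file into chunks, hands some chunks to its peers over local WiFi, and every phone, forwarder included, uploads its share over its own LTE link. It is a conceptual illustration, not Sony’s implementation.

```python
# Simplified model of the multi-link idea: the forwarder splits the file into chunks,
# distributes chunks to peer phones over local WiFi, and every phone (including the
# forwarder) uploads its share over its own LTE connection. Conceptual sketch only.
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB per chunk (arbitrary choice for the example)

def split_into_chunks(data: bytes, size: int = CHUNK_SIZE) -> list:
    return [data[i:i + size] for i in range(0, len(data), size)]

def send_over_wifi(peer: str, index: int, chunk: bytes) -> None:
    """Local hand-off from the forwarder to a peer phone (WiFi carries only this download)."""
    print(f"WiFi -> {peer}: chunk {index} ({len(chunk)} bytes)")

def upload_over_lte(phone: str, index: int) -> None:
    """Each phone pushes its chunks to the cloud over its own LTE uplink."""
    print(f"LTE  <- {phone}: uploading chunk {index}")

def multi_link_upload(data: bytes, phones: list) -> None:
    chunks = split_into_chunks(data)
    forwarder = phones[0]
    with ThreadPoolExecutor(max_workers=len(phones)) as pool:
        for i, chunk in enumerate(chunks):
            phone = phones[i % len(phones)]                # round-robin across all phones
            if phone != forwarder:
                send_over_wifi(phone, i, chunk)            # distribute via local WiFi
            pool.submit(upload_over_lte, phone, i)         # all uploads go out via LTE

multi_link_upload(b"\x00" * (20 * 1024 * 1024), ["phone-A (forwarder)", "phone-B", "phone-C"])
```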

The system can also include talkback. It has a built-in intercom, but it requires additional hardware, the PWS-110RX1A server, which can take the stream from the camera and convert that to an HD-SDI signal.

The system can also search across all the metadata and even the Portal’s speech-to-text conversion. “So it points you directly to where the word has been used,” making it simple to find and make clips. Speech-to-text is available in a number of major languages, including: French, English, Spanish, Hindi, Chinese, Japanese, German, Italian and Portuguese (Brazilian).
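As a toy example of how such a search can jump straight to the moment a word was spoken, the snippet below scans a transcript with per-word timings; the data structure is invented for illustration, not the C3 Portal schema.

```python
# Toy example: a speech-to-text transcript with word timings lets a keyword search
# point straight to the timecodes where the word was used. Structure is illustrative.
transcript = [
    {"word": "penalty", "start": 12.4, "end": 12.9},
    {"word": "awarded", "start": 13.0, "end": 13.5},
    {"word": "penalty", "start": 94.2, "end": 94.7},
]

def find_word(words: list, query: str) -> list:
    """Return the start times (in seconds) of every occurrence of the query word."""
    return [w["start"] for w in words if w["word"].lower() == query.lower()]

print(find_word(transcript, "penalty"))  # -> [12.4, 94.2], ready to cue up clips
```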

Existing XDCam Air users include the BBC, CH Media Group in Switzerland and Telemadrid, but C3 Portal hasn’t been trialled with any customers yet.

XVS-G1 Live Production Switcher

Sony has introduced a new entry-level live production switcher, the XVS-G1, that joins its XVS line sitting below the XVS-6000. The two-box system “is simpler to install for live events or in fly-aways,” said Andy Hotten, product specialist, Sony. It is aimed at general live use for sports, esports, news or entertainment, and is suitable for small to mid-size studios and outside broadcast vehicles.


Two Parallel Ads feeds side-by-side for USA and Canada from a single camera

There are four different control panels available: 1M/E with either 16 or 24 buttons, or 2M/E (16 or 24). The system uses FPGA (field-programmable gate array) hardware for real-time processing of UHD, wide colour gamut, high dynamic range imaging with very low latency. For ease of installation, a single network cable links the processor and panel.

“We have added an optional GPU in the box for additional effects or future expansion,” said Hotten. The GPU can be used as a clip player, 3D DVE or to provide four additional keyers. The clip player is available for the first time in a Sony live production switcher. It offers four channels in HD or two in UHD, with each channel able to play back clips of up to 60 minutes in duration. It supports AVC files in MOV or MP4 format, and back-up storage of all clips is available on an internal SSD.

The basic unit comes with 44 inputs and 24 outputs (in HD), and up to four M/E busses with 16 keyers (all with resizer and chromakey). Every M/E bus has a dedicated keyer for frame memory clip transitions. It also has an enhanced multi-viewer that can display multiple types of information, such as audio level meters and multi-layer caption overlays.

Internal HDR conversion is optional, with frame synch on each input, format conversion and colour correction. “It has been designed for live event production in the field,” he said.


C3 P Oh: Sony’s new C3 Portal connects cameras to the cloud

It uses a new web-based control app with the same look as Sony’s MVS and XVS series mixers, and an optimised menu structure that reflects the live operation workflow and setup. It can have up to 16 main instances pre-loaded to make it easy to set up for different users.

Sony is due to deliver the first two units this week to customers in France and Norway for events and live production.

Global advertising choice

Probably the most interesting demonstration was of Parallel Ads, based on Dynamic Content Multiplication (DCM) technology by AGS (Appario Global Solutions). This enables rights holders and sponsors to display different LED screen advertising to different territories live, without requiring any virtual ad inserts. Four or even six different feeds (depending on the camera) can be displayed on the pitch-side LED screens and read out by high-speed cameras with global shutters, by synching each image to a different frame.

It allows ads to be easily tailored for different audiences, both in the venue and for different broadcast feeds. It also means broadcasters are not limited to static cameras, while the cameras can still be used for slow motion replays without affecting the advertising. It can be done in real time, without intruding on the production, and is claimed to be very cost effective.

It works with Sony’s high-end live cameras, such as the HDC-5500 (which has a native UHD sensor and global shutter technology). “The global shutter technology [in the camera sensor] enables an incredibly clean image to be captured, and then within the Parallel Ads application they’re able to take the individual phases of video and show the different graphics that are being displayed by the LED board that are invisible to the naked eye, and so the in-stadium experience isn’t altered. But then it can be deployed across any cameras that are in the stadium, assuming they have a global shutter,” explained Rob Thorne, consulting architect, Sony live production. It doesn’t work with a rolling shutter, because of the ‘jello’ effect such shutters deliver that makes rapidly moving subjects look unnaturally distorted.

Multiple different images can be shown because “they are in essence rapidly firing images into the LED boards that are just faster than the human eye can refresh, but which the camera is able to capture.” A camera running at 200 frames per second, such as Sony’s HDC-3500 or the HDC-P50 miniature camera, can deliver four different feeds (four 50P images), but the higher-speed HDC-5500 can capture up to six different feeds.


Advertising goes global - Rob Thorne, consulting architect, Sony live production

There is no extra processing being done in the camera or in the CCU. It’s purely a question of reading particular frames. “It’s more of an application of the image rather than processing of the image,” said Thorne. “Each feed has got a different splice of time. It’s traditionally used for replay angles, so when they’re doing slow motion they’re slowing down each frame and then playing that back at a typical speed, whereas now what you’re doing is you’re taking those feeds and then displaying individual images of that.”

When broadcasters want to use those feeds for slow motion “the system manages that for you. So, it will replace the phase two, three and four with the original graphics, so that way you don’t notice anything unusual if you’re watching the local feed as opposed to the international one.”
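As a rough illustration of the frame-phase idea, the sketch below splits one second of a 200 fps global-shutter capture into four 50p feeds by taking every fourth frame; the frame labels and region names are invented for the example, and the slow-motion substitution of the original graphics described above is only noted in a comment.

```python
# Conceptual demultiplexing of an interleaved high-frame-rate capture into per-phase
# feeds, each seeing a different advert on the LED boards. Not AGS or Sony code.
# (For slow motion on the local feed, the real system substitutes the original
# graphics back into the other phases, as described above; that step is omitted here.)
def demultiplex_phases(frames: list, num_phases: int = 4) -> list:
    """Split an interleaved sequence into one feed per advert phase."""
    return [frames[phase::num_phases] for phase in range(num_phases)]

# One second of a 200 fps capture, each frame tagged with the advert phase it saw.
capture = [f"frame{i:03d}-phase{i % 4}" for i in range(200)]
feeds = demultiplex_phases(capture, num_phases=4)
for region, feed in zip(["UK", "USA", "Canada", "Germany"], feeds):
    print(region, f"{len(feed)} frames/s, first frame:", feed[0])  # 50 frames per feed
```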

Beyond Sony’s 5500, 3500 and P50, any camera that has a global shutter can use this technology; Sony was simply the first to support DCM in its UHD cameras, and the technology is not exclusive to Sony.

It has already been used for several different sports events, such as England’s FA Cup final this summer, and rugby.