Advances in technology mean that broadcast engineers have an increasingly important role to play in aiding the creative process, says John Maxwell Hobbs.
In the first of a new series of regular blogs from a panel of senior industry figures, John Maxwell Hobbs explains the benefits of creative engineering.
Broadcast engineers have long been viewed as boffins whose main job is to hand out the kit at the beginning of a shoot and then patch it up when it’s returned in pieces after wrap.
However, the introduction of digital technology has led to dramatic changes in the entire production process: in particular, the relationships between engineering, operational, and production staff throughout a production.
Digital technology has made the production process more fluid, blurring the boundaries between pre- and post-production and demanding more multi-skilling.
For decades, production technology was electro-mechanical and chemical: gears, spinning wheels, cathode ray tubes, celluloid and plastic, developing fluids and dyes. These have been replaced by sensors, OLED screens, and ones and zeroes.
A new camera may have a recognisable form factor, but it’s no longer a clanking Victorian wind-up box with a lens on it – it’s now a highly complex computer with a lens on it, and requires not simply a good eye for framing a shot, but a deep knowledge of the complex configurations needed to achieve the desired aesthetics.
Case in point: in the early days of HD production in the UK, I was working for a very big broadcasting company.
We had the first generation of digital HD video cameras and they were going to be used on a major location shoot by a renowned DP.
When he came to collect the cameras, the engineering team began to tell him what needed to be done to configure and operate the unit. He responded with a brusque, “I know how to use a camera,” picked up the case and left.
Two weeks later, a stressed-out associate producer came to my team and said, “You’ve got to help us – we’ve got two weeks’ worth of outdoor shots where all the skies are pink.”
One of our acquisition engineers got to work solving the problem, and was hailed as a hero for doing something that could have been easily avoided in the first place.
There are positive stories as well.
At the same broadcaster, a shooter/director had an ambition to be able to capture the precise moment that interview subjects came to a realisation.
Working in collaboration with one of our engineers, they came up with the idea of using an ultra-high speed Phantom camera, normally used for natural history shoots. Not only was it unprecedented to use this type of camera to shoot interviews, but they worked out how it could be hand-held as if it were a traditional shoulder-mounted camera.
By combining their shooting and engineering expertise, they came up with a truly innovative approach to realise a creative ambition.
This kind of productive partnership between production and engineering is the norm in music recording, and always has been.
Recording engineers have always been viewed as part of the creative team, and are the main individuals tasked with helping the musicians to achieve their aesthetic visions.
During the recording of the Beatles’ song “Strawberry Fields Forever”, John Lennon decided that he liked the first half of one take and the second half of another, and wanted to combine the two.
The problem was that the two takes were at different tempos and in different keys. Today, with digital technology, the splice would be completed in a matter of seconds, but in 1967 analogue tape decks had no built-in way to vary their speed.
To accomplish the pitch shifting, Phil McDonald, an engineer at Abbey Road Studios, figured out a way to modify the electrical current flowing to the tape decks so they could be sped up and slowed down.
Producer Nigel Godrich, often referred to as “the sixth member” of the band Radiohead, worked as a recording engineer on their early albums. Thanks to the innovation he brought to the recording process, he was asked to become their full-time producer, as well as a performer in singer Thom Yorke’s side project, Atoms for Peace.
“The possibilities are now limited only by the imagination of the users”
There is precedent for this type of collaboration within broadcasting.
Between 1958 and 1998, the BBC Radiophonic Workshop was a shining example of creative engineering.
Daphne Oram, co-founder of the Workshop, began her broadcasting career as a studio engineer and was an early innovator in electronic music. Through the Radiophonic Workshop, she promoted the use of cutting-edge sound creation technology for sound effects and musical scores in BBC productions.
Delia Derbyshire, also an early member of the Radiophonic Workshop, is most famous for her electronic realisation of composer Ron Grainer’s theme for “Doctor Who”.
Grainer was so impressed with her contributions to his score that he pushed for Derbyshire to receive a co-composer credit for the work, which she finally received in 2013.
Music has been fertile ground for this sort of creative engineering due to the “hackability” of music technology – it can very easily be put to use in ways other than originally intended.
The move to software-based video production and transmission affords something similar.
Resolution, colour gamut, aspect ratio, bit rates, and more can be switched on the fly; green screen allows scenes to be composed one element at a time, in much the same way that compositions are built one track at a time in music production; wide shots in UHD can be re-framed in post-production; virtual camera positions can be achieved by compositing the output of multiple devices. The possibilities are now limited only by the imagination of the users.
In this brave new world, creative collaboration between the dreamers and the builders is imperative.
As well as being a music producer and composer, John Maxwell Hobbs is a course leader at the National Film and Television School, a media consultant, and the former Head of Technology at BBC Scotland.