High Dynamic Range is hugely popular because of the punchy, engaging images it produces, but what challenges does the emerging technology pose for production teams?
Of the new technologies offered to camera people over the past couple of decades, High Dynamic Range (HDR) imaging is possibly the best liked.
It produces punchy, engaging images, but avoids the eyestrain and costs of stereo 3D, and the negative audience reaction to high frame rates in drama.
Concerns abound, though, over how this material is to be originated.
Even monitoring is complicated: HDR consumer TVs do not try to match one another for absolute brightness. There is also alarm at the idea that HDR might require new camera equipment, or entirely new techniques.
Happily, many of these fears are easily dispelled.
Director of photography Markus Förderer, whose recent credits include Independence Day: Resurgence, shot the short film Meridian for Netflix with a Red Weapon 6K and a Sony F65.
“Meridian was a test to show the latest camera technology and the highest image quality, so we used the most advanced 4K cameras,” he says.
Directed by Curtis Clark, an accomplished cinematographer, Meridian was shot at 60 frames per second to stress-test production and delivery systems.
Förderer says the high frame rate was unconnected to HDR.
“Netflix wanted a test, and this was the highest spec that it would have to deal with.”
An HDR finish required even more precise exposures than normal.
“When you master for HDR, you lock in how you’re exposed,” Clark says.
The production used a Sony BVM-X300 monitor, taking advantage of its peak brightness (15 times that of a conventional display) and the inky blacks of OLED technology.
Monitoring was fed via a Fuji IS-mini LUT box to facilitate both technical and creative picture manipulation.
“It was a very simple process,” Clark says. “We used it as a regular monitor and ACES handled all the colour transforms. Then we created a specific LUT that emulated a Kodachrome look, but extended it for HDR, so the highlights were way brighter than on film and the shadows were darker.”
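For readers unfamiliar with LUTs, the remapping a LUT box performs can be sketched in a few lines. The curve below is a toy assumption for illustration, not the production’s Kodachrome-emulation LUT; real creative LUTs are usually 3D, mapping RGB triplets to RGB triplets, but a 1D curve shows the principle:

```python
# Illustrative sketch of the remapping a LUT box performs. The toy LUT is
# an assumption for demonstration, not the production's actual LUT; real
# creative LUTs are usually 3D (RGB in, RGB out).

def apply_lut_1d(value, lut):
    """Map a normalised [0, 1] value through a 1D LUT, interpolating linearly."""
    pos = value * (len(lut) - 1)      # position in LUT index space
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

# A toy contrast curve: crushes shadows, stretches highlights.
toy_lut = [0.0, 0.1, 0.3, 0.6, 1.0]

print(apply_lut_1d(0.5, toy_lut))  # the curve's midpoint entry: 0.3
```

In a pipeline like Meridian’s, ACES handles the technical colour transforms and a LUT of this general kind carries the creative look on top.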
Most of the production was shot on Leica Summilux lenses, which provided a high level of contrast.
However, Förderer notes a need for restraint.
“I love the possibilities – showing high contrast, like a very bright window – but it can be distracting.
“I think it’s beneficial to not use the highest-contrast lenses.
“Softer lenses give you a softer transition. They create a certain halation, which gives you a transition between high-contrast edges, though you can still use very bright highlights and very dark shadows.”
For the opening of the film, Clark used archive Kodachrome material, scanned at 4K, to show a period Los Angeles.
Then, as Förderer says, “they moved to us on a stage and I used old Cooke Speed Panchro lenses to ease the transition”.
He adds: “You have to be more careful with lighting and it takes a bit more time.
“For one shot in which a cop sits at a table in front of a bright window, we planned to blow out the window so we didn’t need blue [screen].
“Usually this isn’t a problem, but here you could still see wrinkles on the fabric from outside. I had to add four times the light to blow it out.”
Meridian was recorded raw, with Förderer avoiding ProRes compression due to the uncertainty over delivery standards.
“It’s helpful in post to be able to debayer in the formats required,” he says.
“If you shoot ProRes, you bake in the log space.
On Resurgence last year, we did an HDR pass working off RedLogFilm [files].
“That worked great, but there was little room to change things.
“I learned on Meridian that you have to be more precise. You have to know what you’re seeing on set, because you’ll not have a lot of room to move it in post.”
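Förderer’s point about baking in the log space can be illustrated with a generic log curve. The curve below is an assumption for demonstration, not RedLogFilm or any vendor’s actual formula, but the principle holds: once a highlight exceeds the curve’s white point at encode time, the original value is gone, which is why there is not a lot of room to move it in post:

```python
import math

# Illustrative generic log curve for a camera signal; an assumed curve
# for demonstration, NOT RedLogFilm or any vendor's actual formula.
WHITE = 8.0  # assumed brightest linear value the curve can represent

def log_encode(linear):
    """Map a linear scene value onto a normalised [0, 1] log signal."""
    clipped = min(max(linear, 0.0), WHITE)   # anything brighter is lost
    return math.log2(1.0 + clipped) / math.log2(1.0 + WHITE)

def log_decode(code):
    """Invert log_encode for values that were inside the curve's range."""
    return (1.0 + WHITE) ** code - 1.0

in_range = log_decode(log_encode(2.0))    # round-trips exactly to 2.0
blown_out = log_decode(log_encode(20.0))  # comes back as 8.0: detail is gone
```

Raw, by contrast, defers the choice of curve to the debayer stage, which is the flexibility Förderer describes.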
Amazon’s Dawn
Cinematographer Owen McPolin shot prehistory drama Dawn, a pilot for Hulu and MGM that was mastered in 4K HDR.
His other credits include Mr Selfridge, Whitechapel, Ripper Street, Da Vinci’s Demons and Vikings.
McPolin recognises the concern over standardisation.
“It is very sketchy because it is so new,” he says. “We had to know in advance that Hulu and MGM wanted a deliverable they could grade for HDR.”
Dawn was, he says, “all outdoors, all landscape, all set in the Neanderthal times – all shot in daylight”.
“We shot it on Arri Alexa Minis because they helped us get to those locations and they had the sensor we required,” he says.
“We shot raw as opposed to Arri Log C because it gave us the full wide gamut.”
One emerging standard that is often conflated with HDR is the ITU Recommendation BT.2020, which defines a colour gamut exceeding that of common Rec 709.
HDR, however, does not necessarily imply Rec 2020.
“They weren’t necessarily [asking for 2020]. The colourspace we used to generate a 3D LUT might have been a derivative of the 709 colourspace.”
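The difference between the two gamuts is easy to quantify from the published chromaticities. Using the primaries given in BT.709 and BT.2020, the BT.2020 triangle covers roughly 1.9 times the area of the BT.709 triangle in the CIE 1931 xy plane. This is a simple sketch; perceptual gamut comparisons are usually made in other colour spaces:

```python
# Chromaticity (x, y) primaries as published in ITU-R BT.709 and BT.2020.
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(primaries):
    """Area of the RGB triangle in the CIE 1931 xy plane (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

ratio = gamut_area(REC2020) / gamut_area(REC709)
print(f"BT.2020 covers about {ratio:.2f}x the xy area of BT.709")
```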
Shooting raw creates more work in data handling, says McPolin:
“We had Codex drives built into the cameras. They only lasted eight minutes. That was a problem.”
McPolin used a pair of Atomos Shogun Flame monitor-recorders, with one displaying a creative LUT and the other using its in-built HDR monitoring features, which proved the more reliable in extreme circumstances.
“We fed one with the raw, the other with our 3D LUT, so I had an idea of the final grade.
“There were times when they were the same, but at night and in bright exteriors, the HDR looked different. I was getting weird artefacts in my LUT that were not displaying in HDR,” he says.
“If you remember the old Sony 790 camera, it looked like you had the clipping set to 110. It looked like a slightly digitalised version of the true blue.”
During prep, he says, it was decided that when shooting in dark environments, McPolin would give the blacks some exposure.
Retaining detail
This need to retain shadow detail is commonly raised in discussions about HDR.
McPolin describes it in photochemical terms:
“It’s like you’re looking at an interpositive, and then the TV you’re watching it on does all kinds of stuff too. You can’t guarantee consistency – it depends on the TV and the monitor. You hope you’re catering for it all as much as you can.”
Despite this, McPolin is happy that modern camera technology can deal with the situation.
“I like it. If I’m in a room and I want to shoot out of a window, people can see inside the room and outside,” he says.
“If you are shooting on any of the systems that shoot raw, Log C or any of the flatter images, it should be simple to do. If someone says this might be broadcast in HDR, you just pay a little more attention to the raw than you would have in the past.
“It’s all there anyway, it’s just the slice of the cake you’re showing to the audience. This slice has got lots more cream and icing.”
The advice, then, is to aim for precision exposures while keeping enough shadow detail to avoid the darkest parts of the frame becoming a super-black void.
Joseph Slomka, principal colour scientist at FotoKem in Los Angeles, addresses common fears about the potential to overdo HDR.
“Once you get out of the tech demos, HDR is used more subtly,” he says. “Narrative is about people. Cinematographers are not going to radically change the way people look.”
HDR remains, however, a rarefied field, with few fixed points of reference and varying ideas as to how it should be approached.
Perhaps with this in mind, Netflix has made Meridian available for download as a high-quality file.
“It’s a huge file – 90GB – and it’s for everyone to play with,” says Förderer.
“Because there’s not really much footage out there.”
HDR: the view from post
HDR requires at least as much work from post as from those directly behind the camera.
Joseph Slomka, principal colour scientist at FotoKem in Los Angeles, says: “Most of the changes exist in post-production because that’s where people are being given the time to deal with it.
“Nobody’s saying: ‘Hey, have an HDR monitor on set.’ They’re saying: ‘You’ve shot some stuff, make it HDR.’”
Slomka says planning is key to HDR, including the issue of overpowering highlights.
“We work with the cinematographer ahead of time as much as we can. We ask: ‘Where are your highlights? What’s your mid-tone ratio?’
“There are advantages to planning your HDR up front, particularly the highlights.”
In contrast to the more cautious approach of the camera department, Slomka is comparatively liberal about the choice of shooting format.
“Some people feel they have to use raw for HDR, but we’ve found that not to be the case. How you treat the material is more important.”
The variable nature of consumer display devices has led to part of the technical burden moving to distributors, who understand their distribution chains: how they handle pictures and what screens those pictures will be displayed on.
This complicates calibration of both on-set and post-production monitoring.
“There’s not a solid standard,” Slomka says. “We find out what the final deliverable’s going to be and we set up to that.
“For instance, some people are using ACES. They set up to a D60 white point. That doesn’t work with Dolby Vision, which has a [bluer] D65 white point.”
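The two white points Slomka mentions differ by a small but visible amount in chromaticity terms. The coordinates below are the standard published values; the lower x of D65 places it further toward blue, which is the shift he describes:

```python
# CIE 1931 xy white points (standard published values).
D60_ACES = (0.32168, 0.33767)  # the white point used by ACES
D65      = (0.31272, 0.32903)  # the white point used by Rec 709 / Dolby Vision

# D65 sits at lower x and y, i.e. shifted toward blue relative to D60.
delta_x = D60_ACES[0] - D65[0]
print(f"D65 is {delta_x:.5f} lower in x than the ACES D60 white")
```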
Sometimes, multiple deliverables are required to suit various distributors.
“We say this is the thousand nit master, or we can be asked for a 600 nit deliverable. We make the best quality master possible. Anything that needs to be done after that is up to the distribution chain.”
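The nit figures in these deliverables are normally carried by the SMPTE ST 2084 “PQ” curve, which maps absolute luminance up to 10,000 cd/m² onto the signal range. A sketch of the forward encode, using the constants published in ST 2084 (the example luminance is our own):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m^2 ("nits")
# to a normalised [0, 1] signal value. Constants are from ST 2084.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode an absolute luminance (0..10,000 nits) as a PQ signal."""
    y = min(max(nits, 0.0), 10000.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

signal_1000 = pq_encode(1000.0)  # a 1,000-nit master's peak white
```

A 1,000-nit peak lands at roughly 0.75 of the signal range, which is why a 1,000-nit master still leaves headroom in the PQ container for brighter displays.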
The situation is akin, Slomka says, to the variable amounts of compression applied by internet-based video on demand.