December 30, 2012

Effective Post Production and Dailies Pipelines

For more information on the S3D Centre’s research activities in HFR and other areas, and to see our current supporting press material please visit HFR Research home.

We advise reading the first HFR blog post here, as well as Effective Pre-Production and Production of Variable HFR Projects (Part 1 of 2) and (Part 2 of 2), before reading on.

Key words for this article:  

  • HFR = High Frame Rates. Traditionally, cinema productions are filmed at 24 frames per second, or for broadcast at 29.97 (30) frames per second. There are numerous advantages to filming in HFR, especially in 3D.
  • VFR = Variable Frame Rates. This is the use of multiple frame rates within the same container/sequence. Does the utilization of HFR as a ‘tool’ in this regard change the immersive experience for the viewer? We suspect that the question of whether or not to utilize HFR in a stereoscopic 3D film within a narrative context depends largely on the creative intent of the film itself.
  • SFR = Standard Frame Rate (24fps in North America, 25fps in Europe)

To re-iterate: We are researching not only the standalone benefits of higher frame rates in S3D, but also the effect of VARIABLE HFR on aesthetic and immersion in the context of a single narrative.
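A quick note on the numbers used throughout this post: the fractional rates (23.98, 29.97, 47.95, 59.94) all come from NTSC's 1000/1001 timing adjustment. A minimal Python sketch of the relationship:

```python
from fractions import Fraction

# NTSC-derived rates run at nominal * 1000/1001, which is why
# 23.98, 29.97, 47.95 and 59.94 appear alongside 24, 48 and 60.
def ntsc_rate(nominal_fps):
    return Fraction(nominal_fps * 1000, 1001)

for nominal in (24, 30, 48, 60):
    print(f"{nominal} fps nominal -> {float(ntsc_rate(nominal)):.3f} fps actual")
```

This is why software must track exact fractional timebases rather than the rounded figures used in conversation.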

Post Production Considerations

We were at this year's SMPTE Symposium, where the 2012 topic was HFR. We have been dealing with many post-production considerations over the past months, and this is an area of concern echoed relentlessly at the Symposium. From:

“Paul Chapman, senior vp of technology at Burbank-based postproduction facility Fotokem, led a discussion of some of the challenges that high frame rates bring to post, during SMPTE’s symposium on high frame rates for digital cinema.
 For starters, he asserted that support from creative editorial is lacking and needed. This was a sentiment echoed by additional speakers, including Disney’s Howard Lukk, who did however single out Adobe’s Premiere Pro as offering HFR support.
Chapman reported that he conducted an informal survey of dailies vendors, and his findings suggest that all are now testing high frame rates. He expects to see quite a lot of development in this area during the next 12 months. He reported that Fotokem’s NEXTlab near-set dailies system is already working on a HFR project.
Nico Recagno of SGO (the maker of the Mistika postproduction system, which is in use at Park Road Post on The Hobbit) said that managing dailies at HFRs is “horrifically difficult,” citing challenges including time code, methods of viewing and QC, and sound sync.
Recagno added that the biggest concern that he is hearing from colorists is getting proper calibration during color grading. He and Chapman both urged the community to create HFR standards in some of the discussed areas, including time code.”
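Recagno's time-code point is easy to make concrete. SMPTE 12M time code only counts frames up to 30fps, so at 48fps one possible workaround (sketched below in Python purely as an illustration; the function name and the pairing scheme are our assumption, not a ratified standard at the time of writing) is to count at half rate and flag which frame of each pair is intended:

```python
# Sketch: frame count at 48 fps to a SMPTE-style time code string.
# SMPTE 12M frame fields stop at 30 fps, so we count at 24 fps and
# append a sub-frame flag (0 or 1) for each frame of the pair.
def hfr_timecode(frame, rate=48):
    base = rate // 2          # 24 fps counting rate
    sub = frame % 2           # which frame of the pair
    f24 = frame // 2          # frame number at the counting rate
    s, ff = divmod(f24, base)
    m, ss = divmod(s, 60)
    h, mm = divmod(m, 60)
    return f"{h:02d}:{mm:02d}:{ss:02d}:{ff:02d}.{sub}"

print(hfr_timecode(96))   # exactly two seconds of 48 fps material
print(hfr_timecode(97))   # the second frame of that pair
```

Whatever scheme is chosen, every tool in the chain has to agree on it, which is precisely the standards gap Recagno and Chapman were pointing at.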

We are also members of the SMPTE 21DC high frame rate study group and can echo that these are very new concerns. Furthermore, there are simply not many software applications, dailies tools, or hardware platforms capable of S3D toolsets, let alone HFR 3D toolsets. SGO Mistika is widely reckoned to be the ultimate HFR 3D toolset, but with its steep price point we have found it to be out of reach for the majority of filmmakers. Denise has been a user (and beta tester) of Avid Media Composer and Symphony for S3D already, so we were able to test them on our HFR 3D material. Avid already has functional S3D toolsets, which simplified the process of attempting HFR 3D playback. Before we go further into this process, we will share some specs on our post-production toolsets:

Macbook Pro for on set dailies and offline edit:

  • 17” running OS X 10.7.3, 2.4 GHz Intel Core i7.
  • 8 GB 1333 MHz DDR3 RAM. AMD Radeon HD 6770M GPU (graphics bus) and integrated Intel HD Graphics 3000 (built-in bus).
  • Footage stored on a Promise Pegasus via Thunderbolt.

We have an additional Magma 3-slot chassis that can be used with this configuration, running Red Rockets or other cards.

MacPro tower:

  • 3.2 GHz dual quad-core with 18 GB RAM.
  • Nvidia Quadro FX 4800. Mac RAID card configured as RAID 0.
  • 1 × Red Rocket. Blackmagic 3D Extreme PCI card for playback.

Cine-Tal Cinemage 2000 monitor for colour grading, and a Hyundai 46” passive polarized 3D LCD (S465D) for 3D TV playback.

Regardless of your software, real-time playback of S3D material is already a problem; unfortunately, HFR and VFR footage only amplifies it. This is where the greatest expense of an HFR/VFR 3D shoot may be incurred: the cost of a reasonably new machine, additional video/PCI cards, and storage fast enough for rapid ingest and playback of footage.
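To put some rough numbers on the storage problem, here is an illustrative back-of-envelope calculation in Python (uncompressed 8-bit RGB is an assumption for simplicity; RED footage is wavelet-compressed, so real figures are much lower, but the ratio between SFR and HFR holds):

```python
# Back-of-envelope playback bandwidth for stereo (two-eye) footage.
# Assumes uncompressed 8-bit RGB (3 bytes/pixel); illustrative only.
def data_rate_mb_per_s(width, height, fps, bytes_per_pixel=3, eyes=2):
    return width * height * bytes_per_pixel * fps * eyes / 1e6

# 4K stereo at 60 fps vs 24 fps:
print(data_rate_mb_per_s(4096, 2160, 60))  # roughly 3.2 GB/s
print(data_rate_mb_per_s(4096, 2160, 24))  # roughly 1.3 GB/s
```

The 2.5× jump from 24fps to 60fps stereo is why fast RAID storage and Red Rocket cards become the real cost centres of an HFR 3D pipeline.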

Software Capable of viewing HFR that we used:
•    Dailies:
1.    RedCine X (limited depth grading/convergence tools, some colour grading tools. Great way to view camera metadata and export new clips. Somewhat confusing to create S3D containers)
2.    DaVinci Resolve 8 (excellent metadata capabilities, easy to create S3D containers but ONLY from EDLs created within Resolve. Excellent colour and depth grading. Easy to switch between left and right image. You can export literally any deliverable in HFR from DaVinci Resolve.)

•    Editing Solution that we used:
1.    Avid Media Composer or Symphony. Can work directly from RED footage using AMA, or you can create proxies either from a dailies software OR inside Avid directly using the transcode tool. Easy to create S3D containers, easy to adjust sync on clips. Excellent depth grading and colour grading. Can export 720p60 media.

We were very encouraged to discover that Avid DS can utilize 23.98, 24, 47.95, 48, 59.94 and 60fps timebases and footage, among many other frame rates. It also has an S3D toolset and is capable of delivering full HD and 2K video. DS uses a different plugin scheme than the AMA found in MC and Symphony, but it has been built to match the AMA experience and settings as closely as possible.


Click to see a larger image with Avid’s comprehensive metadata.

About editing HFR / VFR material

If you are planning on editing with proxies (likely the only way to view S3D HFR material in real time), you can apply a one-light grade and encode to HD within your dailies program. You can then move on to your editing software (in our case, Avid).
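Our dailies step was done in RedCine X and Resolve, but as a generic sketch of what "one-light and encode to HD" amounts to, the proxy step can be expressed as a single transcode command (ffmpeg, the file names, and the codec choice here are our assumptions for illustration, not the tools used in this pipeline):

```python
# Sketch: build an ffmpeg command line for an HD proxy of one eye.
# ffmpeg and ProRes are stand-ins for the dailies tools named above.
def proxy_cmd(src, dst, fps=59.94):
    return [
        "ffmpeg", "-i", src,
        "-vf", "scale=1920:1080",   # downscale to HD for real-time playback
        "-r", str(fps),             # keep the HFR timebase on the proxy
        "-c:v", "prores",           # edit-friendly intraframe codec
        dst,
    ]

print(" ".join(proxy_cmd("left_eye.r3d", "left_eye_proxy.mov")))
```

The key point is that the proxy keeps the original frame rate; only resolution and codec change, so sync and cut decisions conform back to the camera originals.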

Our original clips were 4096×2160 at 59.94, 47.95 and 23.98fps. You can get almost real-time playback of this material in 3D within RedCine X at 1/4 raster, with a Red Rocket in the chassis connected to the MacBook Pro, and at 1/4 raster on the MacPro with an internal Red Rocket. Similarly, within Avid I was able to view real-time playback of AMA 3D HFR material at draft quality.

Depending on your final output, you can use these dailies and editing packages to export a variety of deliverables. Avid Symphony exported any QuickTime or video we wished, and Avid DS was able to export 16-bit TIFF image sequences if you desired a conversion for DCP. RedCine X was able to create new RED clips (called trims), pretty much any video, and some image sequence options. DaVinci Resolve had excellent deliverables support for video, image sequences, and DPX.


To view a larger image of the correct setting in DaVinci Resolve for 60fps OR VFR inside 60fps container WITH real time playback, click the above image.
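For those curious how standard-frame-rate material can live inside a 60fps container at all, here is a small sketch of the classic 2:3 cadence (illustrative only; the dailies software performs this conform internally):

```python
# Sketch: fitting 23.98 fps material into a 59.94 fps container.
# 59.94 / 23.976 = 2.5, so each source frame is repeated in an
# alternating 2:3 cadence (classic 2:3 pulldown), while native
# 59.94 material maps 1:1 -- this is how mixed (variable) frame
# rates can share one container timebase.
def pulldown_pattern(src_frames):
    out = []
    for i, frame in enumerate(src_frames):
        out.extend([frame] * (2 if i % 2 == 0 else 3))
    return out

print(pulldown_pattern(["A", "B", "C", "D"]))
# every 4 source frames become 10 container frames (ratio 2.5)
```

Native 59.94 clips need no repetition, which is exactly why a 60fps container works as the common denominator for a VFR sequence.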

Creatively editing with HFR / VFR

Stepping away from the technical discussion for a bit, I wanted to discuss creative choices for a VFR project. I first want to be completely honest and say that after viewing the HFR proxies in 2D, 3D alongside standard frame rate material, I had some very surprising reactions.

I found it incredibly difficult to go back to watching this material in ONLY 23.98fps after being exposed to the HFR versions. While watching the 23.98fps material I felt as though it was simply missing something. I speculate I was not enjoying the blur of 23.98fps on the majority of footage, and preferred the HFR versions on many occasions. However, I did NOT like the aesthetic of HFR on close-up faces, something echoed by my colleagues. HFR on CU faces seems to be far more intimate than a viewer may like during emotional scenes. We all observed that the background separated and was more defined in the HFR versions than in 23.98fps. This is a great thing in 3D, but be careful with set decoration: any flaws in the background become more pronounced in HFR.

Imperfect movements, whether a shaky dolly or a jerking dance move, became even more pronounced in HFR 3D. I decided on one occasion to cover up a bumpy dolly track with the 23.98fps version of the clip: the blurred frames 'smoothed' the image compared to the HFR versions. Our main subject was a very beautiful dancer, but at times the HFR gave her dancing a kind of 'animated' look. 23.98fps seemed to retain the 'soft, graceful' aesthetic of dance, but on fast movement it also enhanced blur and judder to the point of being unwatchable. Lighting on a human subject appeared more 'hard-edged' and defined, and when viewers of the material suggested it made a person look video-game-like, I speculate that this hard-edged lighting was a common denominator.

The HFR handheld shots were shockingly intimate, and I had a great time playing with these and intercutting them among different moments to create different effects. The film is not 100% complete yet, so that is all I will say on this for now.

In general, stereo 3D shots need to be held longer, and the entire edited sequence slowed down, to allow the viewer time to absorb and appreciate the effect. This is fairly common knowledge, and with HFR I find the shots need to be held a fraction longer still. HFR seems to provide even more information; there is more to look at with the added frames and background depth. So slowing the pace of the edit is even more important now than ever. After recently viewing "Life of Pi" in 3D, I feel this film is an excellent example of editing held longer than is typical in a feature film. The scenes were artfully composed with spectacular scenery everywhere, even in the reflections of water. If this film were HFR, I think shot length would have become even more important due to the increase in detail.

There may be a second part to this post soon, due to the imminent release of "The Hobbit: An Unexpected Journey" and information from that project. I can confirm that the post team used Avid Media Composer and SGO Mistika, and actually edited the film offline in 24fps, only to online in Mistika later. This was due partly to technical limitations in HFR 3D, but also to the editor's familiarity with the 24fps pipeline. More toolsets we may have access to are due out soon, so this blog may be updated sooner rather than later with more information.

Learn more about stereoscopic 3D editing in the 'Working in Stereoscopic' guides.


