How High-Frame-Rate Dual-Projector 3-D Made Its Movie Debut at the World Premiere of The Hobbit
Enabling the first-ever 48-frames-per-second showing of a major motion picture in 3-D required a massive effort involving projection technology, sound and screen equipment, and earthquake and wind protection.
by Terry Schmidt
WHAT was that crackling noise? Here I was, sitting in the back row of a glorious old movie theater (Fig. 1) at the November 2012 world premiere of The Hobbit, with Peter Jackson and James Cameron in the house, and all I could hear, and all I could think about, was the crackling sound. After a month on-site in Wellington, New Zealand, and after years of R&D to bring high-frame-rate (HFR) cinema to major motion pictures, was the big debut for this technology going to be undone by some mysterious audio problem? Then it struck me. The red carpet stretched more than half a kilometer and took more than 2 hours to traverse. The movie-goers were likely hungry. Everyone had found a goody bag on their seat that included a crackly little bag of potato chips – which they were now digging into. It was the potato-chip bags – not an audio malfunction – making the racket!
Fig. 1: The historic Embassy Theatre in Wellington, New Zealand, was the site of the world premiere of The Hobbit: An Unexpected Journey, and also of the first-ever showing of a major motion picture shot in 48-fps 3-D.
My heart rate went back to normal, and I started to enjoy the first-ever true movie-going experience of 3-D HFR cinema. It had been an intriguing technical journey to that moment, which was arguably the night when digital cinema and movie-going entered a new age.
World-First Pressures
As Chief Scientist for Christie Digital Systems, I led a team in Wellington in November 2012 that installed three Christie CP2230 projectors with integrated media blocks into the circa 1924 Embassy Theatre for what was to be the world premiere of The Hobbit: An Unexpected Journey, director Peter Jackson’s much anticipated prequel to the Lord of the Rings movie trilogy.
It was the first mainstream, mega-budget movie to be shot, edited, and shown at 48 frames per second (fps), and in 3-D. The collective eyes of movie-lovers and the motion-picture technical community were squarely fixed on Wellington and this event. The pressure was high for Christie, which was providing the projectors and systems, to deliver a flawless premiere.
The physical properties of the venue, the timing, and the sheer pressure made for some long days and late nights, but in the end, movie executives and film fans started to understand why leading directors like Jackson and Avatar’s James Cameron are such avid proponents of HFR cinema. Much was learned from that premiere about this evolving medium and the technical challenges and possibilities it introduces.
Planning the Parts
Each of the celebrated Lord of the Rings movies had its world premiere at the Embassy in downtown Wellington, the city closest to Jackson’s New Zealand home, as well as to the visual-effects house Weta Studios and the Park Road Post Production facility. The vintage movie house was originally designed for live stage productions and has since been strengthened to withstand earthquakes. For the Hobbit premiere, this grand old theater was to seat 758 VIPs in its ornate auditorium.
To provide a truly rich HFR 3-D picture and sound experience for the premiere, two synchronized Christie CP2230 projectors were to run at 48 frames per second each, showing 3-D images at the extraordinary brightness of 11 ft-L, up from the conventional 4.5-ft-L target for standard digital-cinema 3-D presentations (Fig. 2).
Fig. 2: An adjustable stacking frame is used to accurately align two 3-D Christie projectors for a higher-brightness showing of The Hobbit at the Embassy Theatre in Wellington, New Zealand.
The projectors were fitted with passive RealD XLDP boxes to drive high-efficiency 3-D, and two synchronized Christie Integrated Media Blocks (IMBs) were installed in them. The images would be projected onto a state-of-the-art, high-uniformity silver screen from RealD. To complete this ultimate movie-going experience, Dolby was asked to fit a 35-channel version of its Dolby Atmos digital surround audio system into the ceiling and walls of the movie house.
In all, three projectors were included in the design to ensure continuity and redundancy, even in the face of an unlikely failure. The two main units would use a specially mastered version of The Hobbit that factored in the exceptionally high brightness levels, while a third, backup projector was set up to run a standard sequential-flashing 3-D configuration, with a standard master timed for 4.5-ft-L brightness. The dual synchronized projectors would each output a separate continuous image, one for each eye, presenting the viewer with the ultimate realism in 3-D.
None of this was in place when the Nov. 28 premiere date was set. The planning started weeks ahead and far away from New Zealand. I arrived 3 1/2 weeks early to supervise the project and work with our technical partners.
The Embassy Gets Fitted
Our very modern equipment and mounting frames — first assembled and staged in Christie’s engineering and manufacturing headquarters in Kitchener, Ontario, Canada, and then shipped “down under”— had to somehow fit into a very old building. This proved a challenge: the only way the projectors were getting into the Embassy Theatre booth was through the roof.
We hired a street-level crane (Fig. 3) to lift all of our equipment to the roof. Thankfully, we had sunshine on crane day and were able to unpack outdoors. Some very strong, very eager locals manhandled three projectors, one pedestal, one unassembled stacking frame, and all the accessories through two small doorways and a narrow passageway. With the equipment inside, we had a first good look at the technical challenges in front of us.
Fig. 3: Wellington city council blocked off traffic while a huge crane hoisted three projectors, mounting frames, and pedestals to the roof for unpacking and hand delivery into the projection booth.
The deep cinema stage had been refitted with a 60-ft 3-D silver screen, whose silver particles reflect light with its circular polarization preserved, unlike flat white diffusing screens, which scramble the polarization. There was no screen structure other than a peripheral frame tilted back at a 10° angle to aim directly at the projection booth. This new RealD screen, called “Precision White,” had a half-gain angle of 40° and a gain of 1.5, vs. the 2.4 gain of a standard silver screen. This meant the usual hot spot was not evident, and the high-brightness 3-D was more uniform than usual.
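For readers who like to check the numbers, below is a minimal light-budget sketch of why a dual stack was needed to hit the target. The 60-ft screen width, the 1.5 gain, and the 11-ft-L goal come from the text; the projector output, the aspect ratio, and the 3-D system efficiency are round illustrative assumptions, not measured values from the installation.

```python
# Minimal light-budget sketch. Screen width and gain are from the article;
# everything marked "assumed" is an illustrative round number, not a spec.
lumens_per_projector = 34_000   # assumed output of one high-brightness unit
projectors = 2                  # the synchronized dual stack
efficiency_3d = 0.17            # assumed fraction of light reaching each eye
screen_width_ft = 60.0          # from the article
screen_gain = 1.5               # from the article
aspect_ratio = 2.39             # assumed "scope" picture shape

screen_area_sqft = screen_width_ft * (screen_width_ft / aspect_ratio)
per_eye_ft_lamberts = (projectors * lumens_per_projector * efficiency_3d
                       * screen_gain / screen_area_sqft)
print(f"estimated per-eye brightness: {per_eye_ft_lamberts:.1f} ft-L")
```

Under these assumptions the estimate lands near the 11-ft-L goal; drop to a single projector or a conventional 3-D light path and the figure falls back toward the usual 4.5 ft-L.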
The projection booth needed major upgrades. We required a new, very tall piece of projection port glass for our dual-stack system. Since Wellington is in an active earthquake zone – experiencing on the order of 1000 quakes per year – there are very strict construction laws in place, in this case dictating that a new 12-in. U-channel steel beam be bolted through the wall over a window. This was brought on by having to cut a larger port opening for the dual projectors in a solid concrete wall, a process that removed some internal steel reinforcement.
The sheet of noise-reducing port glass arrived broken in half. With no time to get a replacement, we were forced to make do. Fortunately, the crack was almost horizontal and centered, so by using the adjustable extrusions of the stacking frame we were able to steer the two beams from the top RealD XL system to clear just above the crack, and the lower projector’s beam to pass just below it. It worked!
We also had to upgrade the heat-extraction system; the existing system had been calculated by city engineers to be inadequate for the output of the new Christie projectors. We mounted three huge, low-noise heat extractors on the ceiling and used two louvered window boxes that exhausted through the booth’s outside wall to the roof area – by design, to prevent back drafts from the notoriously high winds of Wellington. The third-party Network Attached Storage (NAS) system – between unplanned sleep modes, rebuild processes, and passwords – took some finagling, but we tamed it in time for the premiere.
Team Effort
The HFR speeds necessitated the installation of the integrated media blocks (IMBs) into the card cage of each projector, which allowed the image data to be fed directly to the projector’s highly parallel internal backplane. Each IMB was connected via a CAT6 Ethernet cable to its respective NAS device, where the content was stored in a RAID 5 configuration of four 1-terabyte drives. A small coax cable connected the two projectors to each other, so specialized dual-projector software could keep the left- and right-eye images in perfect synchronization.
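As a quick sanity check on that storage arrangement, the sketch below compares the usable RAID 5 capacity with the rough size of an HFR 3-D feature; the peak bitrate and running time are assumptions chosen for illustration, not the actual mastering figures.

```python
# Back-of-the-envelope storage check. RAID 5 keeps one drive's worth of
# capacity for parity, so four 1-TB drives leave roughly 3 TB for content.
drives, drive_tb = 4, 1.0
usable_tb = (drives - 1) * drive_tb

assumed_peak_mbit_per_s = 450   # assumed HFR DCP ceiling, for illustration only
assumed_runtime_hours = 3.0     # assumed feature length
content_tb = assumed_peak_mbit_per_s / 8 * 3600 * assumed_runtime_hours / 1e6

print(f"usable RAID 5 capacity: {usable_tb:.1f} TB")
print(f"feature at assumed peak rate: {content_tb:.2f} TB")  # fits comfortably
```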
Every electronic system that could be was also connected to battery-backed UPS power supplies, so that a short power outage would not force a long microprocessor re-boot. This included the signal electronics of all three projectors, three of the six redundant NAS power supplies, and the Dolby Atmos processor. In this way, a small power hiccup would require only a quick lamp re-strike for the show to carry on. Ultimately, this hard work led to a very successful first showing of the film and of this powerful new cinema technology. What follows is a closer look at the progress of HFR.
HFR’s Big Moment
The worldwide conversion of film cinemas to digital has seen many bumps along the road. Initially, hardened film buffs missed the shaky and grainy film look they grew up with, and purists nostalgically yearned for the simplicity of output from the old mechanical film projectors.
With the advances in DLP Cinema, speeds fast enough to support 3-D with a single projector became possible. The frame rate, however, remained at 24 fps to match the look and feel of film. Even home TV exceeded film with its 30 fps, and with the popularity of HD digital TV, most home theaters have now reached 60 progressive frames per second for HDTV broadcasts.
Despite all these advances, digital cinema frame rates seemed stuck at 24 fps. All that began to change 2 years ago, when Jackson, Cameron, and other influential directors started exploring what a faster frame rate could do for cinematic storytelling.
Cameron showed a specially shot series of higher-frame-rate 3-D test clips at CinemaCon 2011 in Las Vegas. These consisted of medieval feast and fighting sequences shot at 24, 48, and 60 fps, presented to the cinema industry’s technical community at the Colosseum at Caesars Palace. The demo was supported by four projectors set up for fast-switching audience evaluation among the three frame rates. This content demonstrated that high panning speeds and fast-action sequences gain clarity at the higher frame rates of 48 or 60 fps versus the standard film rate of 24 fps.
The demonstration was a huge success, as everyone could see that 48 fps delivered significant improvements. Jerky motion artifacts disappeared and action scenes flowed smoothly. There was a discernible further improvement in going to 60 fps, but the difference from 48 fps was relatively small.
A year later, again at CinemaCon, Jackson showed an unfinished 10-minute excerpt from The Hobbit at 48 fps using Christie projectors. The excerpt drew a lot of controversy from film-buff bloggers who objected to the hyper-realism and missed the soft, classic “film look.” Many thought it looked more like actors on a stage because the 3-D effect, increased motion accuracy, and increased clarity paralleled real life. This is sometimes called the ‘proscenium effect’ – the proscenium being the arch that frames a stage.
Some extreme comments likened it to a TV soap-opera look, which “cheapened” the medium. Jackson counter-argued that the short, unfinished clip did not allow enough time to draw viewers into the story to then forget about the projection medium and enjoy the realism of the movie. It did not dissuade him from finishing and releasing the film in HFR.
The experience in some ways mirrored the transition from standard-definition TV to HDTV. As with HDTV, acceptance of HFR’s strong advantages – greater realism and increased detail in both fast motion and scenery – will take some time, but is predicted by many to be inevitable. Cameron has promised that both of the Avatar sequels will be in HFR, although at the time of writing he has not declared whether that will mean 48 or 60 fps.
Having seen the final product, I am happy to report that The Hobbit: An Unexpected Journey has met with mostly encouraging acceptance in its newly introduced HFR format on almost 900 screens worldwide, alongside standard 24-fps 3-D showings. Early reports of nausea from too much HFR realism in the action scenes were replaced with reports of breathtaking scenery and very realistic close-ups that perfectly conveyed the story and characters. According to many, HFR cinema is here to stay. •
Digital cinema and 3-D now enjoy a kind of marriage, with digital’s electronic speeds enabling 3-D that was otherwise cumbersome and expensive on film. In turn, 3-D has hastened the wide adoption of digital cinema in the past few years, accelerating what had been a somewhat stalled conversion.
High-speed DLP imagers from Texas Instruments make 3-D a valuable tool in the cinematographer’s storytelling arsenal, providing a practical means to add realism – literally a new dimension – to movie presentations for large audiences.
3-D digital cinema, more properly called stereoscopic cinema, commonly uses ZScreen liquid-crystal technology, which allows two completely separate eye images to be displayed sequentially, and efficiently, from a single projector. Inexpensive left- and right-hand circularly polarizing filters are typically used in disposable, recyclable glasses to separate the left and right images. When used with the new polarization-preserving high-tech silver screens, crosstalk of less than 1% (better than 100:1) is possible, reducing the so-called bleed-through ghosting from one eye to the other.
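To put that ratio in concrete terms, here is a tiny worked example; the highlight level is simply the premiere’s per-eye brightness target, and the arithmetic assumes the leakage scales linearly with image brightness.

```python
# Illustrative ghosting arithmetic for a 100:1 extinction ratio.
extinction_ratio = 100          # the 100:1 figure quoted above
highlight_ft_lamberts = 11.0    # the premiere's per-eye brightness target
ghost_ft_lamberts = highlight_ft_lamberts / extinction_ratio
print(f"ghost leaking into the wrong eye: {ghost_ft_lamberts:.2f} ft-L")  # ~0.11 ft-L
```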
Although there are several eye-separation techniques used in digital cinemas worldwide today, I like the efficiency of those from RealD. Its RealD XL box cleverly recovers the normally “lost polarization” into a second image, which is recombined with the original at the screen. When this box is slid on its rails in front of the projection lens, one of the planet’s largest wire-grid polarizers splits the light path into two polarized channels.
The box’s two liquid-crystal dual pi-cell structures, the ZScreens, switch the polarization between the left and right eyes at a 144-Hz rate (192 Hz for HFR). This provides amazingly clear and bright full-color 3-D realism.
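Those two switching rates follow directly from the frame rates and the multiple-flash scheme commonly used to suppress flicker in single-projector 3-D – triple flash at 24 fps and double flash at 48 fps:

```python
# Where 144 Hz and 192 Hz come from: frames per second x eyes x flashes per frame.
eyes = 2
print(24 * eyes * 3)   # 24-fps 3-D, triple flash     -> 144 Hz
print(48 * eyes * 2)   # 48-fps HFR 3-D, double flash -> 192 Hz
```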
The original, and still most accurate, way to reproduce a stereo image and see 3-D in a cinema is with two projectors (Fig. 4). One is dedicated continuously to the left eye and one is dedicated continuously to the right eye. In amusement parks, where there are 3-D rides and special theaters, this is still done with film loops in two projectors. Using two DLP Cinema projectors and synchronizing the content doubles the effective brightness and can prevent any flashing artifacts for people who cannot tolerate a 50% black flash period in each eye, even at high frame rates.
Fig. 4: Dual-projector 3-D shows continuous images to each eye, replicating the “real world” more closely than so-called “double flash” techniques used for HFR 3-D single-projector presentations.
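A rough way to see where the brightness doubling comes from: in a single-projector show each eye receives light only about half the time, while in a dual stack each eye has a projector to itself, lit continuously. The comparison below uses normalized numbers and deliberately ignores polarizer and blanking losses, which reduce both cases.

```python
# Normalized per-eye light, ignoring polarizer and blanking losses.
one_projector = 1.0                          # one projector's output, normalized
single_stack_per_eye = one_projector * 0.5   # each eye dark ~50% of the time
dual_stack_per_eye = one_projector * 1.0     # a full projector per eye, no dark time
print(f"single projector, per eye: {single_stack_per_eye:.1f}")
print(f"dual projectors,  per eye: {dual_stack_per_eye:.1f}  (about 2x)")
```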
It turns out that the alignment of two projectors for 3-D presentations does not need to be nearly as accurate as for 2-D presentations; when only one image is in each eye, our eyes and brain align them automatically with low fatigue.
However, for 2-D movies, with no glasses to separate the images, alignment must be virtually perfect all over the screen. This alignment is easy to do in a vertical stacking frame once the concepts of projected-image keystone and lens offset are understood. Keystone distortion results when the aim of the projector is not perpendicular to the screen, and there is almost always some keystone due to the steep down angle in modern stadium-seating theaters.
In dual-projector systems, this keystone must be matched in both the vertical and horizontal directions by observing each image carefully, using a red test pattern designed for the purpose from one projector and a green one from the other. Starting with the projectors as parallel to each other as possible, lens offset is used to align the images at the center, and lens zoom is used to make the images the same size. Using the adjustable feet of the projectors, leveling and aim are adjusted until the images overlay each other as closely as possible. The confusing factor seems to be that the adjustment needed to align the center crosshairs of both images can be made either by aim or by lens offset. The breakthrough concept is that adjusting the aim of the projector increases or decreases keystone, whereas adjusting the lens offset does not.
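The distinction can also be seen in a simple pinhole-projection model (a sketch with made-up distances and angles, not an alignment tool): tilting the projector changes the angle at which the rays meet the screen, which turns the rectangle into a trapezoid, whereas a pure lens offset only translates the ray bundle, so the rectangle stays a rectangle.

```python
import numpy as np

def project(corners_img, R, throw=30.0, focal=1.0, offset=(0.0, 0.0)):
    """Project imager corner points through a pinhole onto the screen plane z = throw.

    corners_img : (N, 2) corner coordinates on the imager (arbitrary units)
    R           : 3x3 rotation describing the projector's aim
    offset      : simplified lens shift, applied in the image plane
    """
    pts = []
    for x, y in corners_img:
        d = np.array([x + offset[0], y + offset[1], focal])  # ray direction
        d = R @ d                                            # re-aim the projector
        t = throw / d[2]                                     # hit the screen plane
        pts.append((t * d[0], t * d[1]))
    return np.array(pts)

corners = np.array([(-1, -0.5), (1, -0.5), (1, 0.5), (-1, 0.5)])  # 2:1 imager
level = np.eye(3)                                                 # square-on aim

tilt = np.radians(8)                       # aim the projector 8 degrees off-axis
pitch = np.array([[1, 0, 0],
                  [0, np.cos(tilt), -np.sin(tilt)],
                  [0, np.sin(tilt),  np.cos(tilt)]])

shifted = project(corners, level, offset=(0.0, -0.3))   # lens offset only
tilted = project(corners, pitch)                        # changed aim only

for name, quad in (("lens offset only", shifted), ("tilted aim", tilted)):
    bottom = quad[1, 0] - quad[0, 0]       # width of the bottom edge
    top = quad[2, 0] - quad[3, 0]          # width of the top edge
    print(f"{name:16s} bottom {bottom:6.2f}   top {top:6.2f}")
```

Equal top and bottom widths mean the image has simply moved; unequal widths are keystone. Lens offset keeps them equal, while any change in aim does not – which is why matching the two images’ keystone comes down to aim, and centering comes down to offset.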
– T.S.