VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of three prestigious Folio Awards for excellence in publishing.



December 17, 2019

Web Exclusive

Revisiting a Crucial WWII Sea Battle in Roland Emmerich’s MIDWAY

By KEVIN H. MARTIN

The Battle of Midway is regarded by historians as the turning point of the War in the Pacific. An against-the-odds naval victory for the United States over Japan, the battle had previously been dramatized in Universal’s Midway, released in 1976. That star-studded epic was enlivened by the low-frequency sound process Sensurround, but its visual magic drew primarily on stock imagery, combining actual war footage with live-action and visual effects clips from earlier movies [including Thirty Seconds Over Tokyo, Away All Boats and Tora! Tora! Tora!].

Filmmaker Roland Emmerich had intended to tackle this subject years earlier, following his successes with Stargate and Independence Day, but studio intransigence delayed progress, allowing Michael Bay to steal WW II naval thunder with 2001’s Pearl Harbor. Now able to take full advantage of modern digital backlot innovations, Emmerich has at last mounted his tribute to The Greatest Generation with a tale spanning the first several months of the war, beginning with the Day of Infamy, covering the Doolittle Raid on Tokyo [which served as the finale of Bay’s film] and climaxing with the titular engagement.

Pixomondo renders an extensive aerial view of Pearl Harbor under air attack. In addition to the various smoke, flame and flak elements, digital doubles were also needed to populate the besieged base. (Image courtesy of Pixomondo and Lionsgate)

While Midway features location shooting in Hawaii at Ford Island for the Pearl Harbor segment, most of the Lionsgate film was shot in Canada. As was the case with Emmerich’s The Day After Tomorrow and White House Down, production was based at Montreal’s MELS Studios [MELS VFX provided previsualization during the shoot], where production designer Kirk Petruccelli (Emmerich’s The Patriot) built a huge section of aircraft carrier flight deck. Backed by bluescreen, the setpiece was populated with newly built mockups of the attack aircraft. Some types of period aircraft no longer exist, and the original planes that do survive have been altered to remain flightworthy, making them unusable.

Visual Effects Supervisor Peter G. Travers joined the project on day one of post-production. Travers had supervised work on a diverse slate of features, including The Matrix Reloaded, Harry Potter and the Sorcerer’s Stone, Stuart Little 2, The Lord of the Rings: The Two Towers and Captain America: The Winter Soldier. He was a CG supervisor on the team that won an Oscar for What Dreams May Come, then shared a Saturn nomination for the effects in the Watchmen feature, and has since supervised VFX on 22 Jump Street, Guardians of the Galaxy and the most recent Ghostbusters. “A lot of previs had been done before shooting, so they had figured out how best to shoot the live-action,” he explains. “Production knew they could get by with minimal and partial sets in some instances. Filling out those environments meant there would be a heavy reliance on CG, because they would shoot their partial carrier environment, and we’d be adding to that, sometimes making it appear as a different vessel.”

An American fighter plane, surrounded by flak bursts, is hit and its wing catches fire. VFX toolsets utilized throughout featured Maya for animation and Nuke for compositing, with rendering handled through proprietary means by the lead houses Pixomondo and Scanline. (Image courtesy of Pixomondo and Lionsgate)

Emmerich’s Director of Photography, Robby Baumgartner, shot digitally with Panavision Millennium DXL and Red Helium 8K cameras, using Panavision’s Primo lenses. “There was 100% freedom of camera during live-action, which is pretty much part of the modern model for CG filmmaking,” says Travers. “I find that actually helps the CG, because the more organic the camera work, the more matching to that is going to make for a realistic illusion. We were going to have a ton of CG shots with people in the foreground – sometimes in a cockpit, sometimes on the ground or aboard a ship – with the background fully digital, so that meant matching contrast and illumination to the [look of] live-action, even if the only real section is a portion of the plane and the actor inside.”

Because of Emmerich’s commitment to realizing the various Midway engagements in a way that did more than just ‘print the legend,’ Travers was assigned history homework. “Very early on in the show, I was charged with reading this massive book called Shattered Sword, which comprehensively covers the battle from the Japanese perspective,” he reports. “Based on Japanese records and, to a lesser degree, American ones, that became our bible. Researching and understanding the subject matter behind what I’m doing is a big part of the job of a VFX supervisor, and in this case, it was very clear the strong historical aspect had to be preserved. The big deal here is that Midway really occurred, which meant we had an answer key for certain important aspects. We knew what a [Japanese] Zero looked like and what other ships looked like, right down to the rivets. Part of our process required us to scour WW II reference footage for planes getting destroyed, tracer fire, flak, any kind of big splashes and other organic events.”

With virtually no period reference depicting dive bombing from the attacker’s perspective, these shots, which included pilot POVs, proved among the most challenging. (Images courtesy of Pixomondo, Scanline and Lionsgate)

“We knew the first Japanese wave attacking the island of Midway utilized 100 aircraft, and had very accurate numbers for the planes used in each sortie. Very rarely did we ever deviate from the actual numbers. There isn’t a single point in the movie where I said, ‘this is not true to history, but let’s do it anyway.’”

—Peter G. Travers, Visual Effects Supervisor

ZERO VFX handled bluescreen cockpit composites. (Image courtesy of Pixomondo and Lionsgate)

Fidelity to historical details extended to shot design, which had to accurately represent fleet and squadron formations. “We knew the first Japanese wave attacking the island of Midway utilized 100 aircraft, and had very accurate numbers for the planes used in each sortie,” states Travers. “Very rarely did we ever deviate from the actual numbers. There isn’t a single point in the movie where I said, ‘this is not true to history, but let’s do it anyway.’ At worst, it was a gray area, something we didn’t know. So the framework for our taking liberties was limited to those aspects for which we lacked visual reference.

“Some of those unknowns involved very important story points,” Travers continues. “Specifically, the dive bombing of the Japanese carriers. We did have POVs from the carriers, footage looking up at the attacking planes, but except for a couple of stills, there wasn’t any POV film from any of those [Douglas] SBD-2 Dauntless planes. So, when we were deciding on looks for the carrier attack, we first had to determine the dive angle. Glide bombing – approaching your target at an angle less than 60 degrees [off horizontal] – was the established approach, but Japanese carriers were so maneuverable they could actually dodge these bombs. Even with great precision, this more horizontal approach was a gamble.”

Since sharing between vendors would be problematic owing to global illumination issues, both Scanline and Pixomondo, which split the primary duties, handled their own shots. (Image courtesy of Pixomondo and Lionsgate)

The successful approach used by U.S. pilots, which Travers had VFX emulate, is one that calls to mind the X- and Y-wing fighters attacking the Death Star. “By choosing to dive bomb rather than glide bomb, the plane comes in almost vertically,” he explains. “The bomb will fall right along your [flight path] if you stay on that course. The danger, obviously, is being able to pull up in time, so in depicting this, we had something both accurate historically and visually exciting.”

In discussing the division of VFX labor, Travers is adamant in his belief that sharing shots between vendors usually creates a nearly impossible dilemma. “It’s folly when you’re trying to make things look photoreal,” he maintains. “We have all this amazing technology at our disposal these days, but even so, you can’t separate global illumination among vendors. In this film, the carrier’s shadow is present on the water, while the water is reflecting on the carrier – these aspects are all connected, and there’s a visual relationship between the carriers and the planes attacking them as well. So for the sake of integrating all these aspects, the work had to be done by a single vendor.”

With aircraft traveling at a high speed, pyro elements had to reflect those dynamic aeronautical conditions with respect to the way winds would shape flame. (Image courtesy of Scanline and Lionsgate)

Scanline and Pixomondo, both of whom worked on Emmerich’s 2012 and Independence Day: Resurgence, were lead houses on the film, supervised by Laurent Taillefer and Derek Spears, respectively. “Scanline has a lot of proprietary tools for dealing [with water],” acknowledges Travers, “and they have been an industry leader in ocean stuff, so they handled some seriously big, heavy-duty water solutions. For example, as the carriers executed evasive maneuvers, they would be generating the most incredible large wakes. Pixomondo also took on their share of that.” A number of the film’s bluescreen cockpit composites were handled by ZERO VFX under the supervision of Troy Moore. A Boston-based unit of ZERO matched the look of their work with the shots produced by the lead houses.

Animation was accomplished in Maya and compositing via Nuke. “Both of our lead companies had their own proprietary renderers,” says Travers. “We absolutely needed partners possessing these toolsets and this level of experience to pull off these elaborate simulations. This all had to look photoreal. At first glance, you might think that would be easier to do, but to really get things looking right, it becomes a lot harder. If you create fantastic streams of fire from some creature breathing magical energies, who knows if you’ve got it right, but here, we know what bombs going off look like, so there isn’t artistic license to coast on when you’re portraying a Japanese Zero crashing into the Pacific Ocean. There’s only one way that can look, and getting the shot to that point is where the discipline overcomes the difficulty.”

A successful dive bombing caught the Japanese with unlaunched aircraft still on the carrier deck. A multitude of interactions between vessels and ocean were necessary to create convincing imagery, including the way waves broke around a carrier and the caustic lighting reflecting on its hull. (Images courtesy of Scanline and Lionsgate)

At Pearl Harbor’s Battleship Row, American vessels were destroyed as part of the attack by waves of invading Japanese aircraft. Closer views included specific modeled detail, such as mangled hatches, while interactive colored light from the fires flickers on the metal superstructure. (Images courtesy of Scanline and Lionsgate)

The previsualization effort carried well into post for many all-CG shots. “All through the most intense part of the battle, it was supposed to look very messy, with all kinds of debris splashing in the water, plus all the smoke and tracer fire,” Travers remarks. “And there are tons of digital doubles, too. Getting into shot production, we discovered there was no way to make the shots too messy. The further we went with those elements, the more realistic the shots became. That is the gist of what became a rather laborious process over the past six months.” Final touches were added during the film’s digital intermediate, which was handled by Efilm.

“Getting to work on movies is cool, and doing visual effects is one of the coolest aspects of doing movies,” Travers concludes. “So the fact I got to work on a movie like this one, with all this historical significance – well, there are certainly plenty of worse jobs in the world than mine.”

