VFX Voice


June 04, 2018 | Web Exclusive

A Generation of STAR TREK Effects on TV

BY IAN FAILES

Dan Curry inspects one of the sets during production. (Image courtesy of Eric Alba)


With the advent of Star Trek: Discovery, a new era of Star Trek on television is well underway. The CBS All Access series certainly takes advantage of the latest in visual effects techniques to help continue the Trek mythos, but the way it and many other effects-heavy television shows get made was paved by the various Star Trek series of the late 1980s through the mid-2000s.

It was during the making of these Star Trek episodes – for The Next Generation, Deep Space Nine, Voyager and Enterprise – that visual effects for television transitioned from the film, optical, motion control and miniature worlds to the use of computer-generated imagery and digital compositing. Along the way, the visual effects teams from that era helped usher in an incredibly efficient production process for making VFX for TV. And, when they had to, they still relied on scores of practical effects – things like cloud tanks and plenty of run-and-gun solutions – to get shots out the door.

Dan Curry was one of several VES and Emmy Award-winning visual effects supervisors and producers working on the different Trek series during that time; key Emmy Award-winning collaborators included Ron Moore, Rob Legato, Gary Hutzel and David Stipes, among many others. Curry shares with VFX Voice just a few of his lasting memories from this important time in visual effects.

Gearing up for a then-new Trek

Before The Next Generation began airing in 1987, Star Trek had already delighted audiences with the original 1960s series and several feature films. Of course, it didn’t take long for the Patrick Stewart-starring TV reboot to catch on – that series alone ran for 176 episodes. Right from the pilot, Curry and his collaborators realized it would be efficient to split up visual effects duties between two teams by alternating episodes. “That way,” says Curry, “one team would be focusing on what was happening on stage, while the other team focused on the post-production issues and supervising compositing.”

Another early and important development made for The Next Generation was one Curry attributes to Paramount. “They made a courageous decision to have a final product that was not a film negative; it was instead a video master. In those days, we were compositing on one-inch analog tape, which was state of the art at the time. But it meant we were also experiencing the same sort of image degeneration that would occur when duplicating generations of film in optical compositing. Later, when D-5 digital tape came into existence, that problem disappeared and we were able to dupe and re-work things without suffering image quality loss.”

Visual Effects Supervisors Rob Legato, ASC (at camera) and Glenn Neufeld with the cloud tank set-up. (Image courtesy of Eric Alba)

Cloud tanks were regularly relied upon for otherworldly cloud and alien formations. Here, Visual Effects Supervisor Rob Legato, ASC is positioned at the camera, while fellow Visual Effects Supervisor Glenn Neufeld injects an ink and paint solution into the tank. (Image courtesy of Eric Alba)


On the stage shooting a miniature against a dayglow orange screen lit with ultraviolet light. (Image courtesy of Eric Alba)

Behind the scenes of a motion-control matte pass of one of Star Trek’s runabout ships. (Image courtesy of Eric Alba)

The dayglow orange and ultraviolet light system was an alternative to shooting models motion control against white cards. (Image courtesy of Eric Alba)


The classic Trek ships seen in The Next Generation were physical miniatures filmed on Image G’s motion-control stage in Hollywood. Traditional oil matte paintings were also a mainstay of visual effects production, provided primarily by Emmy Award-winners Syd Dutton and Bill Taylor, VES, ASC of Illusion Arts. The result was consistently high-quality work – a standard Curry suggests was set by audience expectations after seeing the oft-lauded feature work supervised by artists like Douglas Trumbull, VES, Dennis Muren, VES, ASC and Richard Edlund, VES, ASC.

“We realized we had an obligation to the audience,” says Curry, “because by that time they had seen Star Wars and other visual effects films that had huge budgets. Of course, we didn’t have huge budgets, nor did we have time. But we were pretty much expected to turn out half a feature for every episode, and VFX team members were no strangers to 60- to 80-hour weeks.”

A typical ship fly-by back then might have taken an entire day to film with motion control. If multiple ships were involved, that time could easily extend to a whole week, especially if explosions and special interactive lighting were required. Part of the issue was that the elaborate bluescreen motion-control systems were only the domain of VFX behemoths such as Industrial Light & Magic.

The Trek team had therefore been shooting ships against white cards for matte passes. To contain the silhouette of the model, each foam core card was necessarily set at a different angle to the source of light. That unfortunately resulted in a slightly different luminance on each card and therefore different densities on each matte element as the cards were repositioned to accommodate the camera moves.

This all changed when Gary Hutzel came up with the idea of using dayglow orange screens and cards illuminated by ultraviolet light. Now, whatever angle the cards were placed at, the luminosity stayed the same. “Suddenly it didn’t matter what angle the cloth or card was to the camera as the luminosity was even all along,” recalls Curry. “That was a real godsend for all of us.”

Staying old-school

Developments in CG and digital compositing were making major headway in film and television visual effects, particularly in the early 1990s. But sometimes the best solution for those Star Trek television effects remained practical.

“Liquid nitrogen, for example, was one of the best allies,” states Curry. “We would use that for all sorts of things because it was heavier than air. We could shoot vats of liquid nitrogen to create rifts in the time-space continuum, say. I remember a script calling for a space string that was an interdimensional portal. We took sections of large sonotubes covered in black velvet, placed them parallel to each other, with lights and vacuum cleaners underneath that would pull the roiling liquid nitrogen down into the gap between the tubes. Then, by running the footage in reverse, it looked like streams of energy were coming out of a weird slice in the fabric of space.”

Cloud tanks were another practical effects approach that aided in the kind of alien world imagery Star Trek’s stories required. Realistic computer effects simulations were not yet commonplace and cloud tanks gave fantastic results, as Curry explains.

“Cloud tanks look great because they give a physical, natural effect. You could make the clouds behave in different ways by preparing different layers of water. You could put in a few inches of water at one temperature, then cover it with a plastic sheet. Then put in salt water and cover that with a plastic sheet. Then put in hot water, and so forth. When the layers of different temperature water were ready, you just gently pull all the plastic sheets out, and the paint or heavy cream or whatever material you’re using for the clouds would behave differently as it encountered different temperature levels.”

The move to digital

Eventually, the VFX team did embrace CG. One of the first computer-generated effects was produced for the 1991 ‘Galaxy’s Child’ episode of The Next Generation, in which the crew of the Enterprise encountered a giant space plankton. The adult creature was a physical model built by Tony Meininger, but its baby had to be cute and capable of fluid movement, so they relied on Rhythm & Hues to create and animate a CG model.

Digital ships did not find favor until Deep Space Nine, which began airing in 1993, when scripts started requiring more ships in various scenes than could be shot within the show’s time and budget limitations. “Digital ships were used in the background because we felt they still didn’t look as good as physical models,” outlines Curry. “I remember the 1997 episode ‘Sacrifice of Angels’, working with David Stipes, where there were many, many ships in some scenes. The background ships were CG but the foreground ships were physical models. We did not do a fully CG series until Enterprise in 2001.”

Ultimately, myriad CG effects made many of that era’s Star Trek scenes possible – everything from wormholes and extensive environments to the morphing transformations of the character Odo (portrayed by René Auberjonois).


Models of the Enterprise positioned on the Image G motion-control stage for use in The Next Generation. (Image courtesy of Eric Alba)

The four-foot miniature of the Enterprise. (Image courtesy of Eric Alba)

Three models of the Enterprise at three different scales – six feet, four feet and two feet – used for different filming purposes. (Image courtesy of Eric Alba)

Groundbreaking effects for TV

The Next Generation, Deep Space Nine, Voyager and Enterprise dominated the Emmy Awards for visual effects during their runs, and the shows continue to live on in repeats and in digitally remastered releases. “I was extremely proud of the work and being part of the VFX team,” says Curry, who ended up working for 18 years on the four series.

“One of the things I’d like to stress is that no one person was responsible for the visual effects on Star Trek. VFX were the result of a collaborative effort by a dedicated team of really terrific artists who had great technical skills and creative vision. The team included people who did the pin-registered transfers, phenomenal physical and digital model builders, people who shot motion control, compositors and animators, matte painters, and so many others behind the scenes.”

“The collaboration between the art department, camera department, makeup and costumes, and special effects was also a critical factor in the success of Star Trek,” adds Curry. “It was a large group of artists who worked towards the common goal of creating that universe for Star Trek stories to unfold in.”

Making a Robot… from a Shampoo Bottle

One of Dan Curry’s most memorable Star Trek effects moments relates to the time he was called upon, at the last minute, to deliver a drone robot for The Next Generation episode ‘The Arsenal of Freedom’, which aired in 1988.

“In that episode,” relates Curry, “the crew of the Enterprise goes to a planet where everybody has been killed by a drone robot during an arms demonstration. Production originally hired a sculptor to build a model that looked like a claw, but it was heavy and unwieldy and kept getting snagged in the branches of the jungle set. The producers didn’t like it, and so they came to me and asked, ‘What can we do?’

“So I went home and got a plastic Easter egg, a shampoo bottle, one of those little ribbed tubes that you put cables in and a L’eggs pantyhose container, and made a robot out of them. Instead of shooting motion control, I did the move by hand – having practiced Tai Chi for many years, I had learned how to move slowly. Everybody thought I was nuts, but I did what normally would’ve been a motion-control pass by putting the robot on a stick and just moving it in front of a greenscreen.

Dan Curry’s robot for ‘The Arsenal of Freedom’ constructed from a plastic Easter egg, shampoo bottle, ribbed tubing and a L’eggs pantyhose container. (Image courtesy of Dan Curry)

“Then we needed a scene where Geordi [La Forge, Star Trek Lt. Commander played by LeVar Burton] figures out how to see a larger, invisible version of that drone by luring it into the planet’s atmosphere, where the heat signature from atmospheric friction would give it away. So I decided to take the same model design, cover it with black velveteen flocking and glue little slivers of white plastic bag onto it.

“We put it on the motion-control rig with electric fans blowing in all sorts of directions and kept the shutter open for three seconds per frame. The long exposure time meant the flapping plastic slivers had motion blur that created a gaseous feel. Then we keyed real fire – newspapers burning in a barbecue, shot against black velvet – through an alpha channel derived from the blurred plastic-sliver element. That’s what created the illusion of hot gases emanating from the re-entry of the drone.”

The Enterprise takes out the larger drone above the planet’s surface. (Image © 1988 Paramount Pictures, courtesy of Dan Curry)
