By TREVOR HOGG
Images courtesy of Paul Debevec
At the center of innovation in photogrammetry, HDRI, image-based lighting, digital actors and virtual production has been Paul Debevec, VES, currently the Director of Research, Creative Algorithms and Technology at Netflix and Adjunct Research Professor of Computer Science in the Viterbi School of Engineering at the University of Southern California. “As I accepted the Charles F. Jenkins Lifetime Achievement Award [at the 2022 Technology and Engineering Emmy Awards], one of the things I noted is that they were recognizing all the stuff that I did 20 years ago,” Debevec recalls. “I was a different person back then! I am still trying to do some cool stuff. There was a bit of magic at Berkeley that was nice to ride the wave of.”
For Debevec, motivation comes from trying to figure out how things are made. “Movie visual effects were so astounding and exciting to watch, and there’s a reason they call it movie magic: it looks like one thing but is actually a different thing. It’s a miniature or bluescreen or CGI render, but it looks like a spaceship or a dinosaur, or your lead character careening off a railroad track.” Academia and the visual effects industry are technological partners, Debevec says. “So much of what happens in visual effects is made in this virtual collaboration with academia; they inspire and feed back into each other. It comes together at the SIGGRAPH conference.”
Academia was not an unfamiliar career path for Debevec, as his father earned a Ph.D. in Nuclear Physics from Princeton University and became a professor at the University of Illinois, which had a nuclear accelerator. “My dad being a nuclear physicist and a professor was definitely influential, because I felt that I should have some relationship with academia. His work was technical, and when I visited his nuclear physics laboratory, there would be all sorts of interesting technology. There were PDP-11 and VAX computers. My parents also happened to do some darkroom photography. At some point in elementary school, the computer van came by, and students were allowed to play with the TRS-80 or Apple II.” His fascination with computer programming led to the purchases of a Commodore VIC-20, Commodore 64 and Commodore 128, which are preserved in a display case along with an 8mm Bell & Howell film camera gifted by his grandfather. “It’s like your typical story where you get a movie camera as a kid and make some stop-motion films. You could hear it click and shoot the individual frames. I also had a Canon AE-1, which I used as my high school yearbook photographer.”
Debevec earned degrees in Math and Computer Engineering at the University of Michigan in 1992 and a Ph.D. in Computer Science from the University of California, Berkeley in 1996. “You try to find something that maximizes how many of your talents and interests it leverages, because that’s going to give you the most capability in that particular area. Finding something that leverages an interest in photography and computers, especially using computers to take pictures, and my love of film visual effects – all of that converged. At the same time, since my father was a professor, I don’t think it would have ever occurred to me to directly work at Digital Domain or Industrial Light & Magic. I did apply for an internship at Industrial Light & Magic in the summer of 1994 but never heard back from them. But I was lucky at USC that I could be a research professor publishing at the SIGGRAPH conference while collaborating with the visual effects industry, scanning actors and consulting on the image-based lighting done on films.”
Photography made Debevec aware of the possibilities for lighting. “Being in the darkroom and shooting all that film gave me a sense of the dynamic range of light in the world – seeing those little numbers on the shutter dial that went from 1 to 1,000. The work that I ended up doing in high-dynamic-range imaging was inspired by realizing back then that there’s this huge range of light in the world that is not getting captured in a single exposure of a negative. When digital cameras came out, they had even less dynamic range.”
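The answer Debevec and Jitendra Malik published at SIGGRAPH 1997 was to merge a bracketed stack of exposures into a single radiance map. Here is a minimal sketch of that merging step, assuming linear pixel values for simplicity; the published method also recovers the camera’s nonlinear response curve, and the function name and weighting details here are illustrative rather than the paper’s exact formulation:

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Merge a bracketed exposure stack into one HDR radiance map.

    A minimal sketch of the idea behind Debevec & Malik (SIGGRAPH 1997):
    each pixel's radiance is estimated as (pixel value / exposure time),
    averaged across exposures with a 'hat' weight that trusts mid-range
    values and discounts pixels near the noise floor or saturation.
    Assumes images are linear floats in [0, 1]; the full method also
    recovers the camera's nonlinear response curve first.
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]
    numerator = np.zeros_like(images[0])
    denominator = np.zeros_like(images[0])
    for im, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * im - 1.0)   # hat weight: peaks at 0.5
        numerator += w * (im / t)           # per-exposure radiance estimate
        denominator += w
    return numerator / np.maximum(denominator, 1e-8)
```

The hat weighting discounts pixels that are nearly black or nearly saturated in any one exposure – exactly the values a single negative cannot hold.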
After witnessing the dinosaurs in Jurassic Park during his second year of graduate studies at Berkeley, Debevec was surprised by the methodology Industrial Light & Magic used to integrate the digital creatures into scenes with actors. “They said, ‘We try to write down where the lights were and then try to recreate it. If it doesn’t look right, then we iterate until it finally looks right.’ That sounded laborious and difficult to put into practice. I liked the idea of adding things that were never there as a way to tell a story. When I made my film Fiat Lux, where we put the big monoliths and spheres in St. Peter’s Basilica, I was able to use a process where I captured the real light in the real world using HDR photography panoramically, so you see light from all directions, and then light your CGI objects with this measurement of the light that was actually there. The data worked, and nowadays when ILM needs to put a dinosaur into a shot, they’re likely to use an HDRI map to light it.”
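Lighting a CG object with an HDRI map comes down to treating every pixel of the captured panorama as a distant light source. Below is a minimal sketch of the diffuse case for a latitude-longitude panorama of linear radiance values; the function and its conventions are illustrative, and production renderers importance-sample the map rather than summing every texel:

```python
import numpy as np

def diffuse_irradiance(env_map, normal):
    """Gather diffuse irradiance from a lat-long HDR panorama.

    A minimal sketch of image-based lighting: every texel of the HDRI
    acts as a distant light, contributing L * max(0, n . w) * solid_angle
    to a surface with the given unit normal. env_map is an (H, W, 3)
    float array of linear radiance values.
    """
    h, w, _ = env_map.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth per column
    sin_t = np.sin(theta)
    # unit direction of each texel on the sphere
    dirs = np.stack([
        np.outer(sin_t, np.cos(phi)),
        np.outer(sin_t, np.sin(phi)),
        np.outer(np.cos(theta), np.ones(w)),
    ], axis=-1)
    # texel solid angle shrinks toward the poles
    d_omega = (np.pi / h) * (2.0 * np.pi / w) * sin_t[:, None]
    cos_term = np.clip(dirs @ np.asarray(normal), 0.0, None)
    weights = cos_term * d_omega                      # (H, W)
    return np.einsum('hwc,hw->c', env_map, weights)   # RGB irradiance
```

Because the HDR capture preserves the true intensities of windows, lamps and sky, the same map that looks correct to the eye also produces physically plausible shading on the CG object.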
For Debevec, his best accomplishments occurred when he was able to take a new technical idea and apply it in a creative way that counted for something, such as with his short films The Campanile Movie, Fiat Lux and The Parthenon. “I was lucky that when I went to SIGGRAPH for the first time, my trip was covered by my internship at Interval Research Corporation, and I saw the SIGGRAPH Electronic Theater and was blown away. I did my Ph.D. work on modeling and rendering architecture from photographs, which was published at SIGGRAPH in 1996.
“One of the examples in the paper was 3D modeling Berkeley’s tower from a single photograph, because it has so much symmetry. I had the idea that this would be far more striking if we could model not just one building, but also its environment in 360 degrees, and then fly around the building. I met an architecture professor named Cris Benton who did kite aerial photography. We took a Canon Rebel film camera with a 24mm lens, got it 400 feet in the air and took pictures of the Campanile with that.” The photogrammetry work in the short captured the attention of Visual Effects Supervisor John Gaeta, who was working on the visual effects of The Matrix. “John realized that tech would solve a visual effects problem they were having in recreating CGI versions of the scenes where the bullet-time shots would take place. I helped to advise him before he went to take the photos on top of that building in Australia. John hired one of the grad students I had supervised on The Campanile Movie, and Manex Entertainment got the software I had written for my Ph.D., which was then used to create the virtual backgrounds. Being associated with The Matrix gave me a certain runway to keep doing the stuff that I was hoping to do in the academic world.”
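At the heart of both The Campanile Movie and those Matrix backgrounds is a photogrammetry solve: adjust a parameterized 3D model and the camera parameters until the model’s projected features line up with what was marked in the photographs. A minimal sketch of the error being minimized follows – a hypothetical helper for illustration, not Debevec’s actual thesis software:

```python
import numpy as np

def reprojection_error(model_points, image_points, rotation, translation, focal):
    """RMS distance between projected model vertices and marked photo features.

    A minimal sketch of a photogrammetry solve's inner loop: transform
    candidate 3D model points into camera space, project them through a
    pinhole camera, and measure how far they land from the 2D features
    marked in the photograph. A solver tweaks the model and camera
    parameters to drive this error toward zero.
    """
    cam = model_points @ rotation.T + translation   # world -> camera space
    projected = focal * cam[:, :2] / cam[:, 2:3]    # perspective projection
    return np.sqrt(np.mean(np.sum((projected - image_points) ** 2, axis=1)))
```

Symmetry is what made the single-photo tower reconstruction possible: each constraint (equal widths, repeated arches) removes unknowns from the model, so far fewer marked features are needed to pin it down.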
The first light stage at UC Berkeley was a rather low-tech affair, with the original goal being to surround actors with LEDs displaying images of the set around them. “It was made out of a few hundred dollars of lumber and a single 250W photography kit spotlight that spun around on ropes,” Debevec recalls. “Once we had devices that surrounded people with precisely controlled computer illumination, it became clear that we could do lots of interesting things with this. The polarized spherical gradient illumination was born out of trying to have a process that would get 3D scans of the face as detailed as doing a plaster cast and then laser scanning it. One day, I started playing around with putting polarizers on all the lights and flipping a polarizer in front of the camera, and I was able to isolate the specular shine of the skin. From different lighting conditions, you could figure out the surface orientations – basically, what part of the light stage is reflecting in that pixel of the skin – so you can get a high-resolution surface normal map photographically. If you write an algorithm that does a photogrammetry solve, plus embossing for all of this detail, you get face scans from this normal map that are accurate to a tenth of a millimeter. When [Visual Effects Supervisor] Joe Letteri saw our work through Mark Sagar at Wētā FX, he decided to send the Avatar actors over, and we got to start applying this technique in movies.”
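The trick that makes the spherical gradient patterns work is a per-pixel ratio: under a lighting pattern that ramps linearly along one axis of the stage, how bright a point appears relative to full-on illumination encodes that component of its surface normal. A minimal sketch, assuming calibrated linear images and leaving out the polarization step that separates specular from diffuse reflection (function and variable names are illustrative):

```python
import numpy as np

def normals_from_gradients(grad_x, grad_y, grad_z, full_on):
    """Estimate a surface normal map from spherical gradient illumination.

    A minimal sketch of the idea behind the light stage's gradient
    patterns (Ma et al., 2007): under a linear gradient along axis i,
    a surface point reflects roughly (n_i + 1) / 2 times what it
    reflects under full-on illumination, so the per-pixel ratio
    recovers each normal component. Inputs are (H, W) linear float
    images; the polarized variant isolates the specular reflection
    for sharper normals.
    """
    eps = 1e-6
    nx = 2.0 * grad_x / (full_on + eps) - 1.0
    ny = 2.0 * grad_y / (full_on + eps) - 1.0
    nz = 2.0 * grad_z / (full_on + eps) - 1.0
    n = np.stack([nx, ny, nz], axis=-1)
    return n / (np.linalg.norm(n, axis=-1, keepdims=True) + eps)
```

Four photographs per polarization state thus yield a normal at every pixel, which is the photographic detail that gets “embossed” onto the coarser photogrammetry geometry.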
Neural rendering and AI are completely transforming how computer-generated images are created. “There are the famous stories about how Phil Tippett thought his career was over when computer-generated dinosaurs became a thing,” Debevec observes. “It turns out that his career was not over. Look at Starship Troopers. This is another moment where a lot of us need to take the time to familiarize ourselves with the power of these algorithms. AI-generated art may be still frames, but the research papers are showing that these models can make image sequences. I did an interview for The Hollywood Reporter about how AI would change Hollywood back in May 2020, and I almost cheekily said, ‘A few decades from now you’ll be able to send in the script of your movie and it will output the movie. You just tell it whose style you want to make it in.’ It turns out that’s not decades from now. We will be able to have this PDF-to-MP4 converter within this decade. In fact, you don’t even have to supply your own PDF of the script. You could generate the script with ChatGPT, which doesn’t yet produce things that have tons of substance to them. But if you have a person figure out the key elements, story points and character arcs of a film, then generative text, at least in an augmented way, is going to help you write your script, and then it will output a version of the film. Where the real research is going to be is how to then direct it to fix it up and also get it to be consistent from frame to frame. I firmly believe that, as crazy as it is, we won’t be making movies or visual effects the way we are now within a decade. The most optimistic guess of what’s going to happen is that this will make it so that many more people have access to creating things that look like high-end feature films. It’s about embracing change and finding out how these tools can elevate your craft,” Debevec states.
Certain achievements stand out, Debevec remarks, such as “building the 3D facial scanner that got used on 100 movies with the light stage and getting invited to the White House to scan President Barack Obama in 2014. And getting to work with the survivors of the Holocaust through the USC Shoah Foundation and recording them with holographic imaging light-field arrays we built for that.” There is also a defining character trait: “Whenever I get to a point where I have a vision for something and it doesn’t exist yet, I want to close that gap.”
Private research has bigger budgets than academia. “It’s a constant effort to keep the powers that be convinced that this is research worth investing in. In the context of a film production, you need to do something that’s not going to fail, so you need a safe space to try new things or pursue something just because it’s interesting. A lot of the stuff that I’ve done which had an impact was stuff that I didn’t know was going to work or not. However, I knew it was something that I’d learn from. I’m in a good position at Netflix because I have two-plus decades of research results, not all of which have been applied in the industry yet. We can dip into the archives of cool things that are so much more practical to do today because cameras and LED panels are better. I’m hoping that we’ll get to come up with some good stuff that will help our productions and eventually everybody else.”