That early tool eventually segued into Cine Tracer, a real-time cinematography simulator, but only after Workman had founded a company called Cinematography Database and begun selling Cine Designer, a Cinema 4D previs plug-in. Cine Designer allowed for very accurate technical previs using rigged 3D models of cameras, lights and other film industry equipment, and it caught the eye of Unreal Engine maker Epic Games.
“One day Epic Games asked me if I would develop something similar for Unreal,” relates Workman. “So I started to learn Unreal Engine 4 and made a quick video game prototype. I posted a video of it online and it went viral. Since then I’ve kept adding to what is now called Cine Tracer as I learn Unreal Engine, and it’s turned into its own ecosystem that includes set building and digital humans on top of the cameras and lights from my Maya/Cinema 4D work.”
Having worked with Epic since the beginning of Cine Tracer, Workman has maintained that relationship. He was awarded an Unreal Dev Grant and has been part of the Unreal Engine 4 Live Training stream. When Epic Games became involved in helping productions shoot with real-time imagery on LED walls (the kind popularized by The Mandalorian), the company asked Workman to help demo the tech on a purpose-built stage featuring Lux Machina LED screens. Specifically, he served as ‘director/DP’ on an LED wall shoot during SIGGRAPH 2019.
“I was already going to be presenting twice with Unreal Engine at SIGGRAPH, so I was up for one more project,” says Workman. “I had no idea how massive the LED project would be, and I spent a full month on the stage. My primary role was to be the cinematographer and consult on the camera, lens, lighting, grip equipment and the general approach to shooting the project. I also had to consult on crew and other typical cinematographer responsibilities to ensure a smooth, professional shoot.
“The team on the project had already worked together on The Mandalorian, so I was the one catching up,” adds Workman. “But I contributed ideas on how I wanted to control the CG lighting environment at runtime. I was also responsible for finding shots that best showed off how the LED wall’s reflections interacted with the live-action motorcycle and actor on the stage.”
Since the LED wall experience, Workman has continued experimenting with real-time tools and showcasing the results. For example, earlier this year he built his own virtual production setup at home, a fortuitous step given the COVID-19 crisis that followed. “The main tool I’ve employed here is the HTC Vive Tracker, which allows me to track the position of a real-world camera,” explains Workman. “I bought a Blackmagic URSA Mini Pro 4.6K G2, and using a Blackmagic DeckLink 8K Pro I can bring the live footage into Unreal Engine 4.
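(As an aside for readers wondering how a Vive Tracker pose actually ends up driving a virtual camera: the sketch below shows one possible approach in Unreal Engine 4 C++ using the engine’s SteamVR plugin. This is not Workman’s actual code; the class name, and the assumption that the tracker enumerates as an “Other” SteamVR device type, are illustrative only. The DeckLink video feed would arrive separately, via Unreal’s Blackmagic media plugin, for compositing with the CG scene.)

```cpp
// TrackedCineCamera.h -- a minimal sketch (not Workman's actual setup) of driving a
// UE4 CineCamera from an HTC Vive Tracker via the SteamVR plugin.
// Assumes the SteamVR plugin is enabled and the tracker reports as an "Other" device.

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CineCameraComponent.h"
#include "SteamVRFunctionLibrary.h"
#include "TrackedCineCamera.generated.h"

UCLASS()
class ATrackedCineCamera : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCineCamera()
    {
        PrimaryActorTick.bCanEverTick = true;
        CineCamera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("CineCamera"));
        RootComponent = CineCamera;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Vive Trackers typically enumerate under the "Other" SteamVR device type.
        TArray<int32> TrackerIds;
        USteamVRFunctionLibrary::GetValidTrackedDeviceIds(
            ESteamVRTrackedDeviceType::Other, TrackerIds);

        if (TrackerIds.Num() == 0)
        {
            return; // No tracker visible this frame.
        }

        FVector TrackerPosition;
        FRotator TrackerRotation;
        if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
                TrackerIds[0], TrackerPosition, TrackerRotation))
        {
            // The pose is relative to the SteamVR tracking origin. Placing this actor
            // at that origin (or parenting it under a calibrated offset) makes the
            // virtual CineCamera follow the physical camera the tracker is mounted on.
            CineCamera->SetRelativeLocationAndRotation(TrackerPosition, TrackerRotation);
        }
    }

    UPROPERTY(VisibleAnywhere)
    UCineCameraComponent* CineCamera;
};
```

In a real rig, the raw tracker pose would also need a measured offset for where the puck sits on the camera body, plus lens and timing calibration, before the live footage and the CG render line up convincingly.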