By TREVOR HOGG
Ever since the iPhone debuted on June 29, 2007, a constant focus for Apple and consumers alike has been improving its optical capabilities, to the point that a more accurate name would be the iCamera. The technological advancements have not slowed down as consumer expectations increase with each passing year. The iPhone 12 Pro Max incorporates 5G, LiDAR, Dolby Vision HDR and Apple ProRAW, which will only further cement mobile filmmaking as a viable and visually acceptable form of cinema.
“I have the new iPhone 12 Pro Max that has all of those bells and whistles,” explains Michael Shelton, Visual Effects Supervisor at Pixomondo. “I do shoot a lot with my phone because it’s small, fast and easy to take out. Fifty percent of what I’m capturing is on my phone. I can shoot hundreds and hundreds of pictures. I use applications that mimic the film back of the camera that we’re shooting. I use that a lot of times when I’m scouting a location or I’m with the DP and he is framing up shots. I’ll know if we’re going to be doing an extension, this is what we’re going to see, this is what we might see, and this is what we’re not going to see at all. It’s the best way of doing that because I can’t on my DSLR. It’s a tool that I use on every show every day, but I’d never rely on it. I would need to battle-test something like that in a lot of different scenarios before ever being comfortable leaving our $60,000 LiDAR scanner at the shop. I would love it if it became that compact and easy because that would be a game-changer.”
“It has Dolby Vision certified HDR 4K 60 fps and not only that,” states Carsten Kolve, Facility Digital Supervisor at Image Engine. “I have a virtual camera system, LiDAR scanner and facial capture – all of these things are in my pocket. A whole lot of quite advanced technology has been made available to a whole lot of people, including us, to utilize. We certainly make use of it. On some projects we used iPads and iPhones to capture a camera motion for previs or full CG shots. It’s easy to acquire this kind of stuff without having a big motion capture stage. If professional workflows work more into the consumer area, it will hopefully have an impact on the type of tools that are available to us at their price point. The primary reason they put LiDAR in there was to be able to get a depth approximation of your scene and direct the focus better in the photograph. It’s not intended to be a set acquisition or even a high-end 3D scanning device as of yet. We’ll see how fast it evolves.”
Augmented reality combined with LiDAR on mobile devices provides the opportunity for previsualization that will assist with framing and composition. “If you want to shoot an action where you have a background that’s not there or on location and want to replace the building with something bigger, up until now you couldn’t see that building,” remarks Joseph Kasparian, Visual Effects Supervisor at Hybride. “The iPad and iPhone are capable of loading an asset, using LiDAR to track properly the background, and utilizing AR to visualize that asset to get better framing. I’ve seen tons of applications meant for interior design where you can do this. For sure, I would use these apps to visualize and compose something that is not on set. 5G is supposed to go up to 10 gigs a second for a data transfer. You would be able to go through and stream through edits in real-time. It’s mainly a question of how much companies are going to charge us for data transfers! 5G is going to make the whole concept of working in the cloud a bigger reality.”
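Kasparian's 10-gigabit figure is a best-case headline number, but even rough arithmetic shows why it changes cloud workflows. A back-of-envelope sketch (the 700 Mb/s stream bitrate is an assumption standing in for a high-quality 4K mezzanine codec, not a figure from the article):

```python
# Back-of-envelope: time to move footage over an ideal 10 Gb/s 5G link.
LINK_GBPS = 10.0    # headline 5G figure quoted above (best case, no overhead)
STREAM_MBPS = 700   # assumed bitrate for a high-quality 4K mezzanine codec

# Seconds of transfer time per minute of recorded footage.
transfer_s_per_min = (STREAM_MBPS / 1000) * 60 / LINK_GBPS
# A minute of footage moves in roughly four seconds - fast enough
# that streaming through edits in near-real-time becomes plausible.
```

Real-world throughput and carrier pricing will land well below the ideal, which is exactly the caveat Kasparian raises.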
“The primary reason they put LiDAR in there was to be able to get a depth approximation of your scene and direct the focus better in the photograph. It’s not intended to be a set acquisition or even a high-end 3D scanning device as of yet. We’ll see how fast it evolves.”
—Carsten Kolve, Facility Digital Supervisor, Image Engine
“The idea is that LiDAR helps to find the shape of things, so the iPhone and iPad can do things like defocus that a tiny camera sensor can’t do,” notes Berj Bannayan, Co-founder of Soho VFX. “You get that nice portrait mode where you see how the focus rolls off of objects. AR is becoming an interesting area for consumer electronics. I can scan my living room and upload a piece of IKEA furniture to see how it looks. One of the things that struck me when I saw the iPhone 12 is that I could be constantly doing LiDAR scans of the set while we’re shooting. They’re low resolution, but it certainly can be interesting from an on-set reference standpoint for saying that the camera was approximately 12 feet away from the actor and there was a light approximately 13 feet above his head.”
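Bannayan's on-set reference idea is easy to picture as data: even a low-resolution depth buffer yields usable rough distances. A minimal sketch, assuming a NumPy depth map in metres (the function name and buffer layout are illustrative, not any shipping phone API):

```python
import numpy as np

def approx_subject_distance(depth_map, region):
    """Rough camera-to-subject distance from a phone LiDAR depth map.

    depth_map: 2D array of per-pixel depths in metres; phone LiDAR
    depth buffers are low resolution (on the order of 256x192).
    region: (row_slice, col_slice) covering the subject in frame.
    """
    patch = depth_map[region]
    # Median is robust to dropouts and stray background pixels.
    return float(np.median(patch))

# Toy check: a subject standing about 3.7 m (~12 ft) from the camera.
depth = np.full((192, 256), 3.7)
dist = approx_subject_distance(depth, (slice(80, 130), slice(100, 160)))
```

The same per-frame sampling, logged over a take, is the "camera was approximately 12 feet away" reference Bannayan describes.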
The popularity of content creators on YouTube, Snapchat and TikTok who have mass followings is seen as the democratization of filmmaking through mobile devices. “There is a massive paradigm shift that has happened over the last few years in regards to what passes as acceptable entertainment,” believes Lou Pecora, Visual Effects Supervisor at Zoic Studios. “Most kids watch way more YouTube than television or movies in the theater, even before COVID-19. People put a camera on themselves, talk and do quick cuts less than a syllable sometimes of them standing in their kitchen or doing something stupid, like the ice bucket challenge. These get millions of hits. Talk about the democratization of entertainment, but is that cinema? [Director/screenwriter/visual effects artist] Gareth Edwards made amateur films that looked amazing using consumer technology in a way that most people don’t have the dedication, discipline and drive to get that result. That’s why there aren’t a million Gareth Edwards.”
Mobile software applications have arisen that have enabled Hollywood directors such as Steven Soderbergh to make Unsane and High Flying Bird on the iPhone. “From the outset, FiLMiC Pro was intended to mimic all of the manual controls that you would find on a $10,000 to $50,000 camera, whether it is an ARRI or RED,” explains Neill Barham, Founder and CEO of FiLMiC. “We wanted cinematographers and directors to feel at home on the mobile platform and to also offer an opportunity to people who couldn’t afford those cameras to create their own material, and have it as a learning path to follow their ambitions if they wanted to grow into the Hollywood ecosystem. This coincided with the release of the iPhone 4 which had a 720p HD capability. We knew early on that the technology was only going to get better and that supposition proved to be true.”
Aspect ratios from Super 35 to 1.33:1 to DCI are readily available for mobile filmmakers. “It is a great opportunity for somebody to learn whether the story is best told in 2.40:1 or 1.85:1,” states Barham. “Moondog Labs, which created a 2.40 anamorphic lens that would stretch the image over the sensor, came to us in Beta to create the software that de-squeezed that image so you could look at your full-resolution 2.40:1 aspect ratio in the library and it was done. Now anamorphic lenses far outsell wide and telephoto lenses in the mobile space. It is a great affordable option for $150 when years prior it would have been a multi-thousand-dollar Angénieux lens. Now you have the YouTube and TikTok generation saying, ‘I can do whatever I want.’ People from the ages of 13 to 15 have a skillset that so far exceeds what mine was when I first entered into film school. That is incredible, and it’s starting to reinvent the format and create new and compelling mediums that didn’t exist before.”
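The de-squeeze Barham describes is a simple horizontal stretch: a 1.33x adapter squeezes the scene onto a 16:9 sensor, and the software stretches the width back out, landing near the 2.40:1 scope frame. A minimal nearest-neighbour sketch in NumPy (names and the uncropped ratio are illustrative, not the FiLMiC implementation):

```python
import numpy as np

SQUEEZE = 1.33  # horizontal squeeze factor of the adapter lens

def desqueeze(frame, factor=SQUEEZE):
    """Nearest-neighbour horizontal stretch of an anamorphic capture.

    frame: (height, width[, channels]) array as recorded on the
    sensor; the lens squeezed the scene horizontally by `factor`,
    so we stretch the width back out by the same amount.
    """
    h, w = frame.shape[:2]
    new_w = round(w * factor)
    # Map each output column back to its source column.
    src_cols = np.minimum((np.arange(new_w) / factor).astype(int), w - 1)
    return frame[:, src_cols]

# A 16:9 capture de-squeezed by 1.33x lands close to the scope frame.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
wide = desqueeze(frame)
ratio = wide.shape[1] / wide.shape[0]  # ~2.36, marketed as "2.40:1"
```

A production app would use a filtered resample rather than nearest-neighbour, but the geometry is the same.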
It is important to keep in mind the capability of the CPU and GPU to ensure that the system does not crash. “When an important process is in effect, we divert resources through other things,” explains Christopher Cohen, CTO of FiLMiC. “For example, if we’re refreshing our histograms at 30 fps and there is something that needs more compute in the system, then we may bring that down to 15 fps and interpolate between histogram models. You won’t see a difference as a user, but we’re able to move that compute to where it’s most needed.” Technology is always changing but some things stay the same. Adds Cohen, “Audio solutions have become more important with filmmaking on mobile. Tripods are probably more important now than they’ve ever been as people are trying to up their game. We’re starting to see multiple axis stabilization in phones now as well as quantum dot deposition technology. If you were to do that on a full-frame sensor it would be astonishingly expensive, but on smaller sensors it becomes an economic reality.”
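Cohen's histogram example describes a classic load-shedding pattern: compute the expensive result at half rate and synthesize the skipped frames. A hypothetical Python sketch of the idea (class and function names are invented for illustration, not FiLMiC code):

```python
import numpy as np

class ThrottledHistogram:
    """Load-adaptive histogram refresh: under load, real histograms
    are computed at half rate and the skipped frames are filled in
    by blending the last two real results."""

    def __init__(self, bins=64):
        self.bins = bins
        self.prev = np.zeros(bins)
        self.curr = np.zeros(bins)

    def real_update(self, frame):
        # The expensive path: bin every pixel value of the frame.
        self.prev = self.curr
        self.curr, _ = np.histogram(frame, bins=self.bins, range=(0, 255))
        return self.curr

    def interpolated(self, t=0.5):
        # The cheap path: a linear blend the viewer won't notice.
        return (1 - t) * self.prev + t * self.curr

def refresh_kind(frame_idx, under_load):
    """Full-rate refresh normally; under load, odd frames interpolate."""
    if not under_load or frame_idx % 2 == 0:
        return "real"
    return "interpolated"
```

On "interpolated" frames the UI still redraws at full rate, but the cost of binning an entire frame is skipped, freeing that compute for whatever needs it more.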
One thing that cannot be coded is the portability of smartphones. “You can walk into a protest in a crowded street, pull out your phone and you’re one of hundreds of other people doing the exact same thing,” notes Cohen. “You are the invisible filmmaker, and that is incredible because you can get authentic reactions from people without coaxing. You can record real life in a way that you can’t with a RED or ARRI device.”
“5G is supposed to go up to 10 gigs a second for a data transfer. You would be able to go through and stream through edits in real-time. It’s mainly a question of how much companies are going to charge us for data transfers! 5G is going to make the whole concept of working in the cloud a bigger reality.”
—Joseph Kasparian, Visual Effects Supervisor, Hybride
“We’re starting to see multiple axis stabilization in phones now as well as quantum dot deposition technology. If you were to do that on a full-frame sensor it would be astonishingly expensive, but on smaller sensors it becomes an economic reality.”
—Christopher Cohen, CTO, FiLMiC
Then there is the matter of quick camera setups. “Claude Lelouch told a great story about shooting The Best Years of a Life,” recalls Barham. “He initially intended to shoot a few shots in crowded spaces on the iPhone and do the rest on 35mm film. Early in production, instead of just sitting around not doing anything while waiting for the camera to be set up for the next shot, Claude grabbed the iPhone and asked his actors to take a couple more shots. After doing that for two or three days, he decided to put the big camera away and shot the rest on the iPhone. He is shooting his next film on the iPhone as well. It is a real revolution, not just an affordable convenience which might have been the impetus initially.”
As smartphones have become more sophisticated, certain complications have arisen. “Taking advantage of new APIs used to be clean-cut in that Apple would use the third-party development ecosystem as a way of vetting ideas before incorporating them into their own camera,” states Cohen. “Where things have gotten tricky is that they’re a huge company with a billion users and want to keep those users as happy as possible. This means that their tech is often geared towards a broadly palatable consumer application. That runs counter to our intent to satisfy the highest end of the professional market. A lot of it now is how machine learning and AI can take away the creative choice from a user and basically automate a satisfactory result. We’re specifically interested in the 100 million who have a specific idea and look. We have occasionally butted heads against that broad consumer solution.”
When it comes to funding R&D, very few companies can match Apple and Samsung. “Apple held out on OLED for a long time while the rest of the industry adopted it, and the reason was it couldn’t fully support the P3 color space,” remarks Cohen. “They waited until the color accuracy was there. The only thing I would caution mobile filmmakers about is that many OEMs will do automatic white balance adjustments to the display itself. I would just go ahead and turn that off.”
Being able to shoot in broad daylight is an area that needs improvement. “Neutral density filters are one of the essentials that make the difference as to whether you have great usable footage that doesn’t have blown-out highlights in the sky or dramatically underexposed landscape in the majority of your daylight exterior shots,” states Barham. “Is that something which will come to the native hardware in the next two or three years? I would not bet against it.”