Virtual cameras meld digital effects and live action

Seeing through the camera

What’s more fundamental to the act of making a movie than the camera? The camera defines a movie and separates it from a play, and as camera technology has changed over the years, it has changed the art of movies. The first movie cameras were fixed: giant machines that stayed put and captured long takes. Later they might be moved for a different viewpoint or a close-up, and with that came a more sophisticated approach to editing. Then along came more mobile cameras, inexpensive digital cameras that could be mounted inside vehicles, run on rails, or shot from cannons if that’s what the scene called for, and with each advance the movies changed. Special effects were first created in-camera; then digital techniques enabled effects to be created and edited without a physical camera at all.

It’s all magic, but digital technology has also changed the dynamics of filmmaking, and the filmmaker is sometimes one step removed from creation. For directors and cinematographers, it’s incredibly valuable to look through the camera with the same point of view the audience will have on the screen, as they did before digital technologies pushed so much of the image-making into post.

Perhaps that is why the use of virtual cameras is increasing and techniques are being refined. Game engines are an important part of the equation.

In a recent article in Variety, visual effects innovator Rob Bredow talked about the work ILM did for the Steven Spielberg film A.I., released in 2001. He said they used a game engine to feed images to the green screen to give actors a reference for Rogue City, the movie’s futuristic take on Las Vegas as the land of temptation. ILM’s tracking system let actors see where they were in the city. At that time, they repurposed the engine behind Unreal Tournament; ILM then created its own engine for bringing the digital components into the real world of film production.

It’s not surprising that Spielberg is an early adopter of virtual camera techniques. As a director who relies heavily on special effects, he has suffered mightily from stunts that didn’t quite work and effects that were much harder than expected to match to his vision. The many failures of Bruce, the fake shark in Jaws, are a perfect example: various incarnations of the robot shark exploded, broke, or simply looked ridiculous. Digital effects opened new doors, but the director still usually didn’t know what the finished product would look like until the post process, and sometimes late in the post process. With Jurassic Park, the robots originally designed for the production by effects geniuses Stan Winston and Phil Tippett had to yield the stage to CG animals that could move more realistically (and reliably). Tippett says he immediately realized that physical effects tools like robots and actual models were doomed, and it broke his heart.

He was right: something is lost when actors aren’t interacting live with creatures and effects. Shooting with a virtual camera is a way to right that wrong. Spielberg has continued to help advance the capabilities of digital effects, but he has always wanted as much control and interaction as possible. The work done building a virtual system for Ready Player One added the element of VR to help immerse the production team in an imaginary digital world.

Steven Spielberg behind the wheel of the virtual camera system used for Ready Player One. (Source: Warner Bros. Production)

A look at the tools being developed demonstrates a range of capabilities and accessibility for virtual production.

New development

At FMX 2019, cinematographer Matt Workman revealed the work he has been doing to develop Cine Tracer in Unreal, which lets filmmakers block out scenes, try out lighting, and quickly build rough sets. He has also built plug-ins for Maya and Cinema 4D, but he’s using Cine Tracer to work dynamically on films with directors and DPs. It works like a video game and, in fact, is available through the Steam store as an Early Access game. It includes storyboarding features as well as realtime scouting.

As he develops the program, Workman is busy adding real-world cameras and lighting. It’s important to be as realistic as possible, he says. For instance, Cine Tracer uses real focal lengths because “a DP will think something is bogus if a strange focal length is offered.”
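To make the focal-length point concrete: a virtual camera that exposes real focal lengths can map them to field of view with the same pinhole-camera relation a physical lens obeys. The snippet below is a minimal, hypothetical sketch of that relation, not code from Cine Tracer, and it assumes a full-frame 36 mm sensor by default.

```python
import math

def horizontal_fov_degrees(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view for a pinhole-camera model (full-frame sensor assumed by default)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Familiar cinema focal lengths translate to familiar fields of view,
# which is why a DP's lens intuition carries over to the virtual camera.
for f in (24, 35, 50, 85):
    print(f"{f} mm lens -> {horizontal_fov_degrees(f):.1f} degrees horizontal FOV")
```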

Essentially, Workman has made an application that streamlines pre-viz and production camera work, shortcutting a lot of the DIY effort that went into early attempts to use game engines for pre-viz. Significantly, his tool and others like it are also helping bridge the work of cinematographers, visualization, and VFX.

 

At Sundance 2018, Habib Zargarpour and Wes Potter of Digital Monarch Media (DMM) talk about their virtual camera system and its potential. (Source: Unity)

Realtime synergy is a gift that comes with virtual production, as was evident from the compelling performances captured during the making of The Jungle Book. Unity was adapted for film use so that director Jon Favreau could interact with real-life and digital characters on the set simultaneously. As lead technical director at Digital Domain, Girish Balakrishnan built the virtual production system.

Here’s your kitty: early release shot from Jon Favreau’s The Lion King. Favreau’s adoption of virtual cinematography to blend real-world actors with digital characters was an early use of a game engine in production and a defining moment for filmmakers. (Source: Disney)

Director Favreau has committed once again to virtual production for his current film The Lion King, which will be a live-action version including a very cute baby kitty, and he’s tapping many of the same people for the project, including Girish Balakrishnan as Virtual Production Supervisor for MPC. For the production of The Lion King, they’ve also worked with VR studio Magnopus to take virtual cinematography further. Balakrishnan says the virtual production system used motion capture and VR/AR technologies, enabling the director, cinematographer, production designer, and visual effects supervisor to interact directly with digital and real-life elements on the set.

More building blocks

Unity has been buying up technology that helps bring its game engine into production pipelines.

Unity Creative Director Adam Myhill developed Cinemachine to emulate real-world cameras and lighting within the Unity game engine. Unity acquired Cinemachine and further developed Timeline, which provides interactive creation within the game engine.

Myhill, too, has a cinematographer’s sensibility. His work has been to bring an understanding of the visual language of film to game engines, and that cinematic literacy makes them more useful to the people making movies. It’s a two-way street: cinematic literacy also makes for more compelling game development and better animatics, the filmic sequences used to introduce game storylines and provide transitions.

For film, Myhill says the tools have to move, record, and present content in the same way that actual cameras do. Audiences, he says, have been immersed in 100 years of cinematic language. People can feel it when it’s not right, and that goes for the people on the set as well as in the audience.
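One way virtual camera tools approximate the way real cameras move is to give the camera operator-like damping, so it eases toward its framing rather than snapping there instantly. The sketch below is purely conceptual, not Cinemachine code, and the damping value is a hypothetical parameter standing in for how heavy the rig should feel.

```python
def damped_follow(camera_pos: float, target_pos: float, damping_s: float, dt: float) -> float:
    """Move the camera a fraction of the remaining distance each frame.

    Larger damping_s feels heavier, like a dolly or crane; smaller values feel
    closer to a quick handheld adjustment.
    """
    if damping_s <= 0:
        return target_pos
    alpha = 1.0 - pow(0.01, dt / damping_s)  # close ~99% of the gap over damping_s seconds
    return camera_pos + (target_pos - camera_pos) * alpha

# Simulate two seconds at 24 fps of the camera easing toward a subject at x = 10
pos = 0.0
for _ in range(48):
    pos = damped_follow(pos, 10.0, damping_s=1.0, dt=1 / 24)
print(f"Camera position after 2 seconds: {pos:.2f}")
```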

Last year, Unity bought Digital Monarch Media, a company founded by virtual production developers Habib Zargarpour and Wes Potter. They developed the system used on The Jungle Book, as well as Ready Player One and Blade Runner 2049. The two have also worked on games, including Ryse: Son of Rome and Need for Speed.

Over a period of 10 years, Potter and Zargarpour have been building and refining their virtual production system, which includes Expozure, a virtual cinematography environment built on Unity, and Cyclopz, which enables users to work with mobile devices including iPads and phones. The system has lighting and lens tools that reflect real-world equipment. In addition, sets can be manipulated and adjusted in realtime.
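As a rough illustration of what lens tools that reflect real-world equipment have to model (a hypothetical sketch, not code from Expozure or Cyclopz), depth of field follows from the same quantities a camera assistant works with on set: focal length, f-stop, focus distance, and circle of confusion.

```python
def hyperfocal_mm(focal_mm: float, f_stop: float, coc_mm: float = 0.03) -> float:
    """Hyperfocal distance; 0.03 mm is a common full-frame circle of confusion."""
    return focal_mm ** 2 / (f_stop * coc_mm) + focal_mm

def dof_limits_mm(focal_mm: float, f_stop: float, focus_mm: float, coc_mm: float = 0.03):
    """Near and far limits of acceptable sharpness around the focus distance."""
    h = hyperfocal_mm(focal_mm, f_stop, coc_mm)
    near = (h * focus_mm) / (h + (focus_mm - focal_mm))
    far = float("inf") if focus_mm >= h else (h * focus_mm) / (h - (focus_mm - focal_mm))
    return near, far

# Example: a 50 mm lens at f/2.8 focused at 3 m keeps roughly 2.7 m to 3.3 m in focus
near, far = dof_limits_mm(50, 2.8, 3000)
print(f"In focus from {near / 1000:.2f} m to {far / 1000:.2f} m")
```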

“We don’t see it as a box product,” he said. “Every single film is different.” While working on Ready Player One, Potter says he was able to build a pipeline sitting right next to Steven Spielberg to enable him to do a shot in a different way than previously planned. The power of game engines, says Potter’s partner Zargarpour, is that “you’ll be able to get finished, quality visual effects in realtime.”

Zargarpour and Potter say the people working on the film loved the tools because they made creating special effects more like playing a video game, which suits directors and the production team.

New player up

Glassbox Technologies is a startup founded by people from the Foundry and by Norman Wang, who founded Opaque Media. They’re trying to make virtual production as easy as capturing movies on an iPad but powerful enough for professional production. To do so, they teamed with The Third Floor, a virtual production company that has worked on almost every major movie using virtual production technology. Founded in 2004 by a team at Lucasfilm who had worked on Star Wars: Episode III Revenge of the Sith, The Third Floor is known for its work on James Cameron’s Avatar, which signaled the birth of performance capture.

Glassbox is debuting their first product, DragonFly, which works with Maya, Unity, and Unreal. It’s a low-cost system that will sell for $750, including support and updates for a year. They do see their product as one that works out of the box and enables small teams to use virtual cameras for filmmaking and VR development.

The company is also developing its software BeeHive, which will enable virtual scene syncing, editing, and reviewing. Glassbox will deliver BeeHive soon after DragonFly. The systems will support multiple users and multiple platforms.

The DragonFly system by Glassbox includes the wireless mount with handheld controls. Glassbox says their system is designed for the people working on set and does not require knowledge of host software such as Unreal, Unity, or Autodesk Maya. (Source: Glassbox)

DragonFly is being used by VR Playhouse and partner company Strange Charm, which has been working on several projects, including a digital human piece for Unreal at Siggraph 2019. The system is also in use by the Cameron Pace Group for a Chinese production of Genghis Khan.

Virtual camera systems like Cine Tracer and the tools from Digital Monarch and Glassbox are designed to take quick snapshots that provide instant storyboards. They can block out a scene with camera movements, and increasingly they’re moving into actual production, as we’re seeing in the work Jon Favreau is doing with MPC.

Going for realism

Over the last two years, the realtime rendering capabilities of game engines have improved to the point that filmmakers are no longer working with simplified visualizations of digital sets but with fully rendered versions that are closer to the finished product. Unreal and Nvidia have teamed up to push rendered visualizations deeper into the production pipeline and talk about a time when the final render can be achieved in the game engine environment. That vision may oversimplify all the steps involved, but it points to a real advancement in content creation for film and video, as the production team is reunited with the effects team and more of the work once done in post is done concurrently with production.

 

Epic’s Unreal team makes its case for virtual production. (Source: Epic)

For further information:

Visit Girish Balakrishnan’s website where he provides a wealth of information on the virtual production process.
