At GDC 2017, Epic Games teamed with The Mill to produce a short film, The Human Race, to demonstrate the power of virtual filmmaking. Their technology expands on virtual camera techniques to enable live-action capture.
Over the past year, the message from game engine companies has been that game engines can do much more than make games, and the message from GDC has been that games can go to entirely new places thanks to advances in game engine technology. At the same time, AR and VR are changing the way developers think about games.
Epic Games CTO Kim Libreri and The Mill's chief creative officer Alistair Thompson introduced the collaboration between Epic, Chevrolet, and The Mill FX house for creating and editing automotive content. The Mill created the Blackbird, a reinforced all-terrain vehicle for motion capture. The car can be tuned to emulate any car's engine size, torque, drive characteristics, and so on. It is covered with tracking targets and equipped with lasers, sensors, and cameras so it can both capture its surroundings and be captured for motion tracking; any car body can then be mapped onto it in post. Chevrolet loves this because they don't have to show off their brand-new, super-secret car, which may not even exist yet because it is still in the design or production phase.

The Blackbird has been in use for some time and won an Innovation Lion at the Cannes Lions festival in 2016, but at GDC The Mill showed off its new system, which includes a virtual camera rig called Cyclops that tracks the Blackbird in its environment so content creators can "see" how the car looks in place, including how it picks up reflections. Chevrolet marketing director Sam Russell also demonstrated using a Tango-equipped phone to change points of view and alter the car's colors and lighting on the fly. All of this happens in the Unreal engine.

To demonstrate the possibilities, Chevrolet showed its new ad, The Human Race, depicting a race between a "real" car, the Blackbird wearing the body of a 2017 Camaro ZL1, and a digital prototype of the Chevrolet FNR autonomous concept car. There is also a behind-the-scenes short film on YouTube showing how the cars can be edited in real time.
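The on-the-fly color and lighting changes Russell demonstrated come down to editing material parameters at runtime rather than re-rendering anything. As an illustration only, and not The Mill's actual Cyclops code, a minimal Unreal Engine C++ sketch of that idea might look like the following; the actor class, the "PaintColor" parameter name, and the material slot index are all assumptions for the sketch.

```cpp
// CarPaintActor.h -- illustrative sketch only; class, parameter name,
// and material slot are assumptions, not The Mill's production code.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "CarPaintActor.generated.h"

UCLASS()
class ACarPaintActor : public AActor
{
    GENERATED_BODY()

public:
    ACarPaintActor()
    {
        BodyMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("BodyMesh"));
        RootComponent = BodyMesh;
    }

    // Change the car's paint color at runtime, e.g. from a phone/UI event.
    // Assumes the body material exposes a vector parameter named "PaintColor".
    UFUNCTION(BlueprintCallable, Category = "CarPaint")
    void SetPaintColor(FLinearColor NewColor)
    {
        if (!PaintMID)
        {
            // Swap the body material for a dynamic instance we can edit live.
            PaintMID = UMaterialInstanceDynamic::Create(BodyMesh->GetMaterial(0), this);
            BodyMesh->SetMaterial(0, PaintMID);
        }
        PaintMID->SetVectorParameterValue(TEXT("PaintColor"), NewColor);
    }

private:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* BodyMesh;

    UPROPERTY()
    UMaterialInstanceDynamic* PaintMID = nullptr;
};
```

Creating the dynamic material instance once and then writing parameters to it avoids rebuilding materials on every change, which is what makes live tweaks like these practical during a shoot.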
In another application, Andy Serkis's company The Imaginarium worked with Epic and the Royal Shakespeare Company (RSC) to create effects for a staged performance of The Tempest. Serkis is best known for the mocap performances in Planet of the Apes and The Lord of the Rings that helped redefine ideas about acting and animation. The Imaginarium is an FX house offering sophisticated effects using motion capture and digital sets. The Imaginarium team and the RSC worked with Intel, Epic, Xsens (for mocap), and Vicon (for tracking) to create real-time effects around the character Ariel, played by Mark Quartley. Quartley's costume was actually a motion capture suit, allowing him to control his character even when it was projected overhead, soaring across the sky, or bursting into flames. The effects were coordinated and controlled from within the Unreal engine.

For more information, Computer Graphics World editor Karen Moltenbrey has written a detailed post-mortem of the work. Moltenbrey reveals that director Gregory Doran was looking for a way to create gangbuster effects for his staging of The Tempest on the 400th anniversary of Shakespeare's death. Doran was inspired by Intel's 2014 keynote, which featured a giant leviathan swimming around the auditorium (sort of), which only underscores the fact that nothing goes to waste.
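At a high level, a mocap-driven stage character like Ariel amounts to streaming the performer's pose onto a rendered skeleton every frame. The following is a minimal sketch of that idea in Unreal Engine C++, not the Imaginarium/RSC production pipeline: the mocap receiver that would fill CurrentPose (for instance, from an Xsens suit over the network) is assumed and not shown.

```cpp
// MocapDrivenActor.h -- a minimal sketch under stated assumptions,
// not the Imaginarium/RSC pipeline. External code (not shown) is
// assumed to fill CurrentPose with per-bone rotations each frame.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/PoseableMeshComponent.h"
#include "MocapDrivenActor.generated.h"

UCLASS()
class AMocapDrivenActor : public AActor
{
    GENERATED_BODY()

public:
    AMocapDrivenActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        Mesh = CreateDefaultSubobject<UPoseableMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;
    }

    // Latest streamed pose: bone name -> rotation. Populated elsewhere
    // (the mocap receiver is an assumption for this sketch).
    TMap<FName, FRotator> CurrentPose;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Push each streamed bone rotation onto the skeletal mesh so the
        // rendered character follows the performer in real time.
        for (const TPair<FName, FRotator>& Bone : CurrentPose)
        {
            Mesh->SetBoneRotationByName(Bone.Key, Bone.Value,
                                        EBoneSpaces::WorldSpace);
        }
    }

private:
    UPROPERTY()
    UPoseableMeshComponent* Mesh;
};
```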
In fact, if there was one major message of this year's State of Unreal presentation, it was that using a game engine as a central authoring system enables content to be reused and repurposed, something the movie, TV, and advertising industries have long dreamed of doing. Now it is becoming possible across a broader landscape of industries, including AEC and manufacturing.