GDC wrap-up: Epic moves

This three-part series covers some of the GDC 2018 highlights: mobile gaming, Epic’s moves in game development, and gaming at Unity.

Bingjie Jiang plays Siren, a virtual human. (Source: 3Lateral)

Before GDC came to town and wrecked our attention spans, Epic’s General Manager for the Enterprise, Marc Petit, did the rounds to talk about the tools Epic is developing for the non-game market, a potentially huge market for content creation in manufacturing, architecture, design, interactive experiences, and more.

Among Epic’s largest acquisitions on this front was Datasmith, which streamlines the transfer of data from CAD and 3D modeling tools like Max or Maya into the Unreal Engine. The company has used Datasmith’s technology to build Unreal Studio, which is available now as an open beta. Datasmith enables content to be brought into the Unreal Engine with automatic lightmap and UV creation, scripted workflows to manage repetitive import issues, and easier geometry cleanup.

According to Petit, in just five months Datasmith and Unreal Studio garnered over 14,000 beta registrations, and a survey report documented gains of 113% when using Datasmith to import content.

The features of Unreal Studio include Datasmith for data transfer; learning tools with up-to-date lessons, tutorials, and templates; online support; and content including 100 Substance materials from Allegorithmic.

Earlier this year, Epic also acquired the Scottish company Cloudgine, developer of cloud technology that games use to offload processes such as physics and AI to the cloud. Microsoft has been using it in its Crackdown 3 game to enable the destruction of cities, and the Oculus game Toy Box uses Cloudgine for physics. The Cloudgine acquisition will enable Unreal developers to use cloud-based resources to offload compute-heavy tasks in their games.

State of Unreal

Epic’s centerpiece event, the State of Unreal, which discusses new features and developments in Epic’s game engine, is one of the important anchors of GDC, as was the talk by Unity, which detailed the news coming from the competing side of the game engine landscape. Both companies put a huge focus on rendering. Nvidia teamed with Epic and ILMxLAB to create demonstrations of real-time ray tracing, which Nvidia has dubbed RTX, using an Nvidia DGX-1 (see Jon’s article on real-time ray tracing) to render Star Wars scenes. The demos were, of course, impressive. RTX is also being supported by Microsoft with DXR, DirectX Raytracing.

Overall, the state of Unreal is good, said Tim Sweeney, Epic’s unassuming CEO and founder. There are over 5 million developers, and Sweeney said the community is experiencing 100% employment. Sweeney went on to say that there might once have been a distinction between Unity and Unreal: Unity was being used more for mobile games and Unreal more for desktop. But those distinctions are disappearing as the quality and complexity of mobile games challenge triple-A games. “We’re seeing the best games of the high end coming to mobile,” said Sweeney. He cited the hit game Rocket League by Psyonix, which has versions on every platform. Likewise, the 2018 breakout hit Fortnite is available on multiple platforms. Sweeney says Fortnite is the kind of game that almost has to have multiplatform capabilities, allowing players to smoothly move from platform to platform and pick up where they left off.

The latest version of Unreal Engine 4 has live Record & Replay, which enables players and creators to make replay videos. Fortnite, as it turns out, is generating over 130 million video views daily; people are watching them on Twitch, YouTube, and social media. Gamer star Ali-A introduced a Fortnite video created with the replay capability coming in the next release, Unreal Engine 4.20, due this summer.

The issue of multiplatform dovetails nicely with the reboot of ReBoot, a popular animated show of the nineties about heroes who live in the guts of a computer, fighting evil. In the new series, live-action characters interact with animated characters in their day-to-day high-school life … now that’s multiplatform. The new TV show is being developed in Unreal for fast episodic turnarounds.

And the world comes full circle. The cartoon series ReBoot debuted in 1994 and was created by Gavin Blair, Ian Pearson, Phil Mitchell, and John Grace of Mainframe Entertainment. It was the first CGI TV show and was made on Silicon Graphics machines using Softimage software. In an interview I conducted with Blair in the 90s, he told me he wasn’t so sure they saved much time at all, and, ironically, they were still shipping dailies down to LA for approval.

Epic also announced free access to all the assets, textures, skins, environments, and even dialog from its almost-hit game Paragon. The company announced it would be shutting down Paragon because, it said, it doesn’t have the resources to nurture the game to a large enough audience to sustain it. The success of Fortnite has been blamed for sucking up all the resources; reportedly, there are 45 million players playing Fortnite, sometimes 2 million at a time. So Epic is making lemonade out of the decision by giving the game’s content to its developer community. Paragon will live on in some form, and probably many forms. The company said it had sunk $12 million into Paragon.

Andy Serkis’ performance is mapped to the model of Osiris Black. (Source: 3Lateral)

Epic and partners 3Lateral, Cubic Motion, and Vicon demonstrated advances in virtual humans with the debut of Siren, a beautiful woman, and Osiris Black, a demon. Andy Serkis drove the performance of Osiris, while Siren spoke for herself, or rather, her real-life counterpart, actress Bingjie Jiang, spoke for her. The companies are the same group that put together the virtual-performance demonstration for Hellblade: Senua’s Sacrifice two years ago and last year’s MeetMike demo at Siggraph.

Osiris Black delivering the “Out, out, brief candle” soliloquy from Macbeth was weirdly restrained compared with Serkis’ own bizarre teeth-gnashing performance. Serkis looked as if he were doing the speech as an ape from the Planet of the Apes franchise.

Not-Andy Serkis. (Source: 3Lateral)

3Lateral has developed the Meta Human Framework, which merges volumetric capture, reconstruction, and compression technology and was used to create a digital replica of Serkis. Cubic Motion developed the facial performance-capture rig. It’s a really stunning re-creation, though the eyes are dead, which is just fascinating because they also seemed perfect; you could just tell that no one was home. Weirder still, Osiris seemed in some way more realistic, and no, I’m not making a snarky joke. I think it’s a perfect demonstration of the uncanny valley.

Continued in…

Part 3: Unity in the game

Also see: Part 1: GDC wrap-up: The engines are running strong