GDC wrap-up: Unity in the game

This three-part series covers some of the GDC 2018 highlights: mobile gaming, Epic’s moves in game development, and gaming at Unity.

Unity took advantage of its downtown San Francisco office space to host its own lineup of sessions. As usual, the company unveiled its latest demo, Book of the Dead, and as you might imagine it exhibits all the usual good cheer and whimsical lightheartedness of the demo team’s other titles. (Note: it’s bleak.) Producer Silvia Rasheva of the Unity demo team celebrated the group’s work and said the team does have a distinct creative style, evident through its annual demos: The Blacksmith (2015), Adam (2016), Neon (2017), and Book of the Dead (2018). The focus this year was on image-based shading, photogrammetry support, and new rendering options, and accordingly the demo featured plenty of foliage, rocks, and stone. Textures were created from the team’s own backyard photographs, gathered scans, and textures from Quixel. Quixel scans, including assets used in Adam and Book of the Dead, will be available in the Unity Asset Store.

Unity now has multiple rendering options. In addition to the built-in renderer, the company has added the Scriptable Render Pipeline (SRP), which can be tailored to the developer’s project. The SRP underpins two new pipelines, the High-Definition Render Pipeline (HDRP) and the Lightweight Render Pipeline (LWRP), but a project can use only one at a time; developers must choose. The HDRP is designed for high-end platforms and requires capable GPU hardware. The LWRP is designed for mobile and other performance-constrained applications such as AR/VR.
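To make the distinction concrete, here is a minimal sketch, in Python rather than Unity’s C# API, of what “scriptable” means in practice: the engine no longer hard-codes its renderer but delegates each frame to whichever single pipeline object the project registers. Every class and function name below is invented for illustration and does not reflect Unity’s actual types.

```python
# Conceptual sketch of a scriptable render pipeline (hypothetical names).

class RenderPipeline:
    """Whatever the project registers; the engine only knows this interface."""
    def render(self, cameras):
        raise NotImplementedError

class LightweightPipeline(RenderPipeline):
    def render(self, cameras):                 # single forward pass, mobile-friendly
        for cam in cameras:
            print(f"[LW] forward pass for {cam}")

class HDPipeline(RenderPipeline):
    def render(self, cameras):                 # GPU-heavy passes, high-end only
        for cam in cameras:
            print(f"[HD] deferred + compute passes for {cam}")

# One pipeline at a time: a project picks LW or HD, never both.
active_pipeline: RenderPipeline = LightweightPipeline()

def engine_frame(cameras):
    active_pipeline.render(cameras)            # the engine just delegates

engine_frame(["MainCamera"])
```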

Other new features take advantage of the GPU for the progressive lightmapper, which enables artists to tune lights and bake them. AMD was a partner in this effort.

Otoy’s Jules Urbach announced the latest upgrade of the Octane renderer for Unity (and for Unreal). Octane has been designed to take advantage of GPUs to speed the rendering of VFX. He noted that Octane has been seen on TV in the opening of HBO’s Westworld and also in The Crown and The Alienist. Urbach is using AI to speed rendering by anticipating the finished look and eliminating the noise of intermediate steps. (This is similar to Nvidia’s approach in its OptiX renderer.)
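The principle behind that AI shortcut is simple even if the networks aren’t: a path tracer’s per-pixel noise shrinks only with the square root of the sample count, so a model that predicts the converged image from a cheap, noisy render saves most of the work. Below is a toy numerical sketch of the idea, with a plain Gaussian blur standing in for the learned denoiser; it is illustrative only and reflects neither Otoy’s nor Nvidia’s actual pipelines.

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # crude stand-in for a learned denoiser

rng = np.random.default_rng(0)

# A smooth "converged" render that we treat as ground truth.
x = np.linspace(0, 1, 256)
truth = np.outer(np.sin(3 * x), np.cos(2 * x))

def render(samples_per_pixel):
    """Monte Carlo pixel estimates: noise std falls as 1/sqrt(N)."""
    noise = rng.normal(0.0, 0.5 / np.sqrt(samples_per_pixel), truth.shape)
    return truth + noise

low_spp = render(16)                           # fast, noisy render
denoised = gaussian_filter(low_spp, sigma=2)   # the "AI" step, idealized

for name, img in [("16 spp raw", low_spp),
                  ("16 spp denoised", denoised),
                  ("1024 spp raw", render(1024))]:
    print(f"{name:18s} RMSE = {np.sqrt(np.mean((img - truth) ** 2)):.4f}")
```

Run it and the denoised 16-sample image lands near the error of the 1024-sample render, which is exactly the trade these denoisers are chasing.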

Urbach, by the way, is another developer who tends to grab for brass-ring buzzwords with every go-round. This time he’s got AI in his renderer, but he has also announced plans for blockchain-based rendering. So yes, that is worthy of a slight eye roll, but the point is that blockchain could be used to write contracts across the computers of the world for offline rendering time, enabling a massive collaborative rendering cloud. (I immediately started wondering if I should let my sweet little computer be used for rendering massive shoot-outs or military simulations. Theoretically, blockchain technology would let you define the jobs you would allow your computer to do. There was no mention of how people might be compensated for donating their compute time.)
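Since no protocol details were given, here is a purely hypothetical sketch of that “define the jobs you’d allow” idea: a node publishes a policy, and a contract only matches (and would later settle on-chain) jobs that satisfy it. Every name here is invented; nothing reflects an announced Otoy design.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RenderJob:
    job_id: str
    category: str          # e.g. "animation", "military-sim"
    gpu_hours: float

@dataclass(frozen=True)
class NodePolicy:
    allowed_categories: frozenset[str]
    max_gpu_hours: float

def accepts(policy: NodePolicy, job: RenderJob) -> bool:
    """Would this node's contract agree to run the job?"""
    return (job.category in policy.allowed_categories
            and job.gpu_hours <= policy.max_gpu_hours)

my_policy = NodePolicy(frozenset({"animation", "archviz"}), max_gpu_hours=8.0)
print(accepts(my_policy, RenderJob("j1", "animation", 2.5)))     # True
print(accepts(my_policy, RenderJob("j2", "military-sim", 2.5)))  # False
```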

The Unity–Autodesk connection

Last year, Autodesk decided to back away from its Stingray game engine and announced that it would work more closely with Unity to improve the process of transferring data from Max and Maya into the Unity engine, including improving FBX to make round-tripping smoother. Presenter Lawrence Cymet, an Autodesk product manager, apologized to the audience for past problems in moving content between Autodesk’s tools and Unity and demonstrated how people can better work between the two worlds. He said he hoped that by next year there wouldn’t be a need for such a session because the workflow would be so straightforward.

Autodesk has said that its customers have chosen commercial game engines and really don’t want that functionality from Autodesk. In the last few years, Autodesk amply demonstrated that it didn’t really know what it wanted to do with a game engine, and its customers had wandered off to play with Unity on their own. It seems logical that similar announcements for Unreal could be coming, but at the moment all we know is that Autodesk has an active collaboration with Unity.

Unity has made some additions of its own, bringing ProBuilder’s Gabriel Williams and Karl Henkel onto the Unity team. ProBuilder is a level-building tool that lets developers quickly create a level design in Unity. At GDC, Unity demonstrated the capabilities of ProBuilder along with a new tool, Polybrush, which enables sculpting complex shapes, painting lighting or color, and blending textures across meshes. The new tools, including the FBX Exporter, fit in with Unity’s 2017 features such as Timeline and Cinemachine; developers can quickly create grey-box levels and scenarios.

Unity says it is improving machine learning in games with tools that let systems learn during gameplay. The new features in ML-Agents 0.3 are designed to automate basic machine-learning processes.
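“Learning during gameplay” boils down to the standard reinforcement-learning loop: act, observe a reward, update the policy, repeat. The sketch below shows that loop with tabular Q-learning on a toy corridor; it is a generic illustration, not Unity’s ML-Agents API, which wires the same cycle to live game state through its Python trainers.

```python
import random

N_STATES = 6                      # a corridor; the goal is the last cell
ACTIONS = (-1, +1)                # step left or right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(state):
    """Best-known action, breaking ties randomly so early play still explores."""
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

def env_step(state, action):
    nxt = max(0, min(N_STATES - 1, state + action))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

for episode in range(200):        # each pass is one "play session"
    state = 0
    for _ in range(100):          # cap the session length
        action = random.choice(ACTIONS) if random.random() < EPSILON else greedy(state)
        nxt, reward, done = env_step(state, action)
        target = reward + GAMMA * max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (target - Q[(state, action)])
        state = nxt
        if done:
            break

print({s: greedy(s) for s in range(N_STATES)})  # states 0-4 learn to move right (+1)
```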

Another major push for Unity is performance tuning and performance improvement. The company says the top complaint about games that score poorly is bad performance. Its LiveTune feature, now in beta, lets developers interactively improve game performance on all platforms. Unity says it’s working with its best game developers to help all developers get the most performance out of the engine. In addition, it is working toward “performance by default,” an evolution of the Unity system into a multithreaded architecture that can take advantage of multicore processors, and it has added a back-end compiler technology called Burst that can produce highly optimized code.
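Burst itself compiles C# job code, so there is no way to show it directly here, but the payoff of a back-end compiler is easy to illustrate by analogy. The sketch below runs the same hot loop interpreted and then JIT-compiled with Numba, a real Python compiler used here purely as a stand-in for what Burst does for C#.

```python
import time
import numpy as np
from numba import njit

def dot_py(a, b):
    """Deliberately naive element-by-element loop: slow when interpreted."""
    total = 0.0
    for i in range(a.size):
        total += a[i] * b[i]
    return total

dot_jit = njit(dot_py)          # compile the same function to machine code

a = np.random.rand(2_000_000)
b = np.random.rand(2_000_000)
dot_jit(a, b)                   # warm-up call triggers compilation

for name, fn in [("interpreted", dot_py), ("jit-compiled", dot_jit)]:
    t0 = time.perf_counter()
    fn(a, b)
    print(f"{name}: {time.perf_counter() - t0:.3f}s")
```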

Both Unity and Unreal have announced support for Magic Leap and the new headsets arriving on the scene, including the Oculus Go and the Daydream, which brings Unity back into familiar territory: the company builds tools for constrained environments and excels at 2D content. Now, Unity says it will evolve to support new devices including wearables, IoT, the web, and of course headsets. The company says it’s building a highly modularized architecture with a tiny footprint, and that it can support web-based deployment with a compressed core runtime of 73 KB.

What do we think?

From here, the GDC conference seemed relatively tame. So far, there have been no reports of anyone getting fired for being a jerk online or on the floor. I checked Reddit, and the controversy quotient seems pretty low. Oh lord, I forgot: Nolan Bushnell was supposed to be given the Pioneer Award for his work in gaming, which seems like an obvious choice; why in the world hasn’t he been given it already? Apparently, there are accusations of harassment in Bushnell’s permanent record. Somehow, we are going to have to find a place of balance between a person’s private life and their contributions as the pendulum swings and wrongs are, what? Righted? Recognized?

But none of that is what I want to talk about. Game engines are the heart of the game industry, and there are lots of them. Brave companies are putting their own engines out there for sale all the time, mainly because they’ve built a game of their own that wasn’t a great success and are now hoping to make their money back on the engine. There might be a lesson there, and I think there’s definitely a lesson in Autodesk’s attempt to build a game engine with the Stingray technology. If you build it, they’re not necessarily coming.

Game engines are not a product; they’re a development environment that has to constantly change and keep up with the industry it serves. So the changes engine makers make and the features they add are good indicators of what the market wants.

Some interesting points from what I’ve learned this year:

  1. Ray tracing is big. Real big. Real-time ray-traced rendering is possible; it’s going to happen, and Jon says within six years. What we haven’t talked about is what this means. An animated series like ReBoot could be made, rendered, and published in a week, in real time. Okay, that’s after you’ve created all the characters, sets, textures, everything, but still…
  2. We’re just scratching the surface of what AI can do. I was fascinated that the game development community is thinking about AI as something to slow down games for less adept players. There’s something decidedly 1984-ish about that. But I’m also fascinated by the issue of trust. It’s one thing that a character might lie, but it’s mind-blowing that an NPC might change sides and manipulate players. Heck, they could even gang up on us… it’s like the worst game of Risk ever.
  3. Blockchain has a huge role to play in gaming. Honestly, it’s as if the technologies were made for each other. There have already been multiplayer games with their own currency, and some have even made that currency convertible to cash, but blockchain can make that currency portable across games; really, that’s kind of an obvious capability (a minimal sketch of the idea follows this list). The technology can also keep track of characters’ capabilities and status and make those transferable. There is so much more to come on this front that I’m forced to fall back on the utterly lame canard: we don’t even know what kinds of changes blockchain will ultimately bring.
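Why does a chain of hashes make currency portable? Because any game can verify a player’s balance history without trusting the game that wrote it. Here is a deliberately minimal sketch of that property; the record format and game names are invented, and no real title or chain works this exact way.

```python
import hashlib
import json

def record(prev_hash, payload):
    """Append a transfer, chaining it to the previous record's hash."""
    body = json.dumps({"prev": prev_hash, **payload}, sort_keys=True)
    return {"hash": hashlib.sha256(body.encode()).hexdigest(),
            "prev": prev_hash, **payload}

genesis = record("0" * 64,        {"game": "mint",      "player": "ada", "delta": +100})
spend   = record(genesis["hash"], {"game": "space-sim", "player": "ada", "delta": -30})
carry   = record(spend["hash"],   {"game": "rpg",       "player": "ada", "delta": -10})

# Any game can re-hash the chain to verify history, then trust the balance.
balance = sum(r["delta"] for r in (genesis, spend, carry))
print("portable balance:", balance)   # 60
```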

See also:

Part 1: GDC wrap-up: The engines are running strong

Part 2: Epic moves