Realtime ray tracing on your phone

VR & AR with realistic images coming soon.

People have described ray tracing as the holy grail of computer imaging. Done properly, ray tracing generates a physically accurate representation of an imagined or reconstructed scene. The number of variables is bewildering, from the location and number of light sources, combined with the character of each individual light (point source, diffused, flickering candle, etc.), to the materials of the elements in the scene (leather chairs, glass table tops, a gravel road, a broken wooden-shingled roof, etc.). The materials, which can include multi-layered automobile fenders with different indices of refraction that reflect, and may diffuse, different frequencies of light differently, are the trickiest part of the problem, and the place where most shortcuts are taken.

Regional or hybrid rendering is another trick, where only the foreground, and possibly the foveated part of an image, is treated, or maybe only certain highly reflective elements get ray traced. All those tricks are done to buy time, to give the processors as many cycles as possible to run the algorithm. The ray tracing algorithm itself is relatively straightforward, but characteristically it can be easily interrupted, and branch prediction isn't much help, so a GPU can be tripped up by ray tracing, and the CPU just has to grind on and on until the GPU catches up again.

Because of those characteristics, a few companies over the years have tried making ASICs for ray tracing, and while they've done a decent enough job, they couldn't reach the production volumes needed for the economies of scale that would make them commonly affordable. As peculiar as it may seem, not everyone needs ray tracing. Almost everyone may want it, but few will pay extra for it; we've become that good at generating rendering tricks.
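To see why the algorithm trips up hardware built for coherent, predictable work, consider the classic recursive trace step. The sketch below is our own minimal illustration, not Adshir's code; the scene, materials, and constants are all invented for the example. The data-dependent branches (hit vs. miss, reflective vs. diffuse, recurse vs. stop) depend on where each ray happens to land, so neighboring pixels can take wildly different paths.

```cpp
// Minimal recursive ray tracer sketch (illustration only, not Adshir's code).
// Scene, materials, and constants are invented for this example.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec {
    double x, y, z;
    Vec operator+(const Vec& b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec operator-(const Vec& b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(const Vec& b) const { return x * b.x + y * b.y + z * b.z; }
    Vec norm() const { double l = std::sqrt(dot(*this)); return {x / l, y / l, z / l}; }
};

struct Sphere {
    Vec center; double radius, reflectivity; Vec color;
    // Returns the hit distance along the ray, or -1 on a miss.
    double intersect(const Vec& o, const Vec& d) const {
        Vec oc = o - center;
        double b = oc.dot(d), disc = b * b - (oc.dot(oc) - radius * radius);
        if (disc < 0) return -1;                  // miss: a data-dependent branch
        double t = -b - std::sqrt(disc);
        return t > 1e-4 ? t : -1;
    }
};

const std::vector<Sphere> scene = {
    {{0, 0, -3}, 1.0, 0.8, {1, 0, 0}},            // shiny red sphere
    {{0, -101, -3}, 100.0, 0.0, {0.6, 0.6, 0.6}}, // matte "floor"
};
const Vec lightDir = Vec{1, 1, 0.5}.norm();

Vec trace(const Vec& origin, const Vec& dir, int depth) {
    // Nearest hit: with no acceleration structure, every ray tests every object.
    double tBest = 1e30; const Sphere* hit = nullptr;
    for (const Sphere& s : scene) {
        double t = s.intersect(origin, dir);
        if (t > 0 && t < tBest) { tBest = t; hit = &s; }
    }
    if (!hit) return {0.2, 0.3, 0.5};             // miss: return sky color
    Vec p = origin + dir * tBest;
    Vec n = (p - hit->center).norm();
    Vec local = hit->color * std::max(0.0, n.dot(lightDir));
    // Material-dependent branch: only reflective surfaces spawn secondary rays.
    if (hit->reflectivity > 0 && depth < 3) {
        Vec r = dir - n * (2 * dir.dot(n));
        Vec bounced = trace(p, r.norm(), depth + 1);
        return local * (1 - hit->reflectivity) + bounced * hit->reflectivity;
    }
    return local;
}

int main() {
    // One primary ray per pixel of a tiny 64x64 frame.
    double sum = 0;
    for (int y = 0; y < 64; ++y)
        for (int x = 0; x < 64; ++x)
            sum += trace({0, 0, 0}, Vec{(x - 32) / 32.0, (32 - y) / 32.0, -1}.norm(), 0).x;
    std::printf("mean red channel: %f\n", sum / (64 * 64));
    return 0;
}
```

Even in this toy version, every branch outcome depends on scene geometry the hardware can't predict, which is why wide, lockstep GPU pipelines stall on ray tracing in a way they don't on rasterization.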

So ray tracing is one of the great challenges in computer imaging, maybe even a grand challenge. Those types of challenges attract smart people, and dumb people who didn't know it couldn't be done, so they did it. Dr. Reuven Bakalash, founder and CEO of Adshir, is one of the smart ones, and he's figured out a way to shortcut the projection process of ray tracing. We reviewed the approach in TechWatch (v16, no. 16, 2 Aug 2016, page 6, and v17, no. 20, 27 Aug 2017, page 6).

The company has since adapted their LocalRay technology to smaller handheld devices, applied the capability to AR and VR applications, and brought the result to a hotel in Las Vegas during CES to show people. They used a Microsoft Surface tablet to run it, showing a miniaturized dinosaur walking across surfaces of different materials, each reflecting it accordingly.

Adshir AR ray tracing demo running on a PC.

The demo is significant because it shows the light in the room reflected in realtime as the creature ambled across the table, casting shadows and appearing in reflections on the surrounding objects. It even walked across a phone on the table and left footprints on the black touchscreen.

The dinosaur is a 20k-polygon model, animated in Unity with the PTC Vuforia AR toolset, and rendered in realtime at 60 fps.

Adshir AR ray tracing demo running on a mobile phone.

LocalRay uses proprietary, patented algorithms designed for physically accurate ray tracing in VR and AR applications. Adshir's approach is based on eliminating the acceleration structures (AS) that are a core component of virtually every ray tracing system. That elimination removes the time-consuming traversal step and saves the repeated reconstruction of the AS for every major change in the scene. Both traversals and reconstructions, which are stoppages for realtime, are now a thing of the past, claims Adshir. The net result is that LocalRay requires fewer rays. The AS replacement is called DAS (dynamically aligned structures), and while it is proprietary to Adshir, it runs on a conventional GPU pipeline. LocalRay is also battery-power aware, and Adshir says performance isn't affected by that awareness.
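DAS itself is proprietary, so we can't show it, but we can sketch the conventional side of the claim. Below is a toy bounding-volume-hierarchy (BVH) build, the textbook sort of acceleration structure LocalRay reportedly does away with. In a dynamic scene this tree has to be rebuilt or refitted whenever geometry moves, which is exactly the realtime stoppage Adshir is talking about. Everything here is generic illustration, not Adshir's approach.

```cpp
// Toy median-split BVH build: the conventional acceleration structure that,
// per Adshir's claim, LocalRay eliminates. Generic illustration only.
#include <algorithm>
#include <cstdio>
#include <memory>
#include <vector>

struct AABB { float lo[3], hi[3]; };          // axis-aligned bounding box
struct Primitive { AABB box; int id; };

struct BVHNode {
    AABB box;
    std::unique_ptr<BVHNode> left, right;     // null for leaves
    std::vector<int> prims;                   // primitive ids (leaves only)
};

AABB merge(const AABB& a, const AABB& b) {
    AABB r;
    for (int k = 0; k < 3; ++k) {
        r.lo[k] = std::min(a.lo[k], b.lo[k]);
        r.hi[k] = std::max(a.hi[k], b.hi[k]);
    }
    return r;
}

// O(n log n) construction -- paid again every frame if the scene animates.
std::unique_ptr<BVHNode> build(std::vector<Primitive> prims, int axis = 0) {
    auto node = std::make_unique<BVHNode>();
    node->box = prims[0].box;
    for (const auto& p : prims) node->box = merge(node->box, p.box);
    if (prims.size() <= 2) {                  // leaf: store the primitives
        for (const auto& p : prims) node->prims.push_back(p.id);
        return node;
    }
    std::sort(prims.begin(), prims.end(),
              [axis](const Primitive& a, const Primitive& b) {
                  return a.box.lo[axis] < b.box.lo[axis];
              });
    size_t mid = prims.size() / 2;
    node->left  = build({prims.begin(), prims.begin() + mid}, (axis + 1) % 3);
    node->right = build({prims.begin() + mid, prims.end()},   (axis + 1) % 3);
    return node;
}

int main() {
    // 20,000 primitives, roughly the demo dinosaur's poly count.
    std::vector<Primitive> prims;
    for (int i = 0; i < 20000; ++i) {
        float f = static_cast<float>(i);
        prims.push_back({{{f, 0, 0}, {f + 1, 1, 1}}, i});
    }
    auto root = build(prims);                 // dynamic scene: once per frame
    std::printf("root box spans x = [%f, %f]\n", root->box.lo[0], root->box.hi[0]);
    return 0;
}
```

Skipping both this per-frame rebuild and the per-ray tree traversal is where Adshir claims its realtime headroom comes from.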

Adshir has 10 granted patents and another 11 pending, most of them around LocalRay. The company will soon introduce an SDK for licensing, which will plug into Unity, Unreal, ARKit, ARCore, Vuforia, and more.

What do we think?

Realtime ray tracing is, in and of itself, an amazing thing to contemplate. Running it on a smartphone seems almost like something out of Star Trek. Yes, compromises still have to be made (the model, for instance, was reduced from 40k+ polys to 20k; notice its smoothness in the phone example above), and the resolution has to be dropped a bit, but those are parameters tied to Moore's law, and so will get better over time. And then there's the magic factor: Adshir could just surprise us again, come up with a tweak to the algorithm, and improve everything. And lest we forget, this is a full-screen rendering, not a zonal rendering.