Realtime: Adshir’s LocalRay is 100% organic ray tracing

New tech from rendering startup tested with and against Embree—it’s all native now.

Regular readers will recall that we have been following Adshir’s developments for a few years now. Each time we see them, they have made significant advances. Our last encounter was at Siggraph in LA, and we wrote all about that here. At a private showing, they demonstrated realtime 4K ray tracing on a powerful laptop, realtime 60 fps ray tracing on a battery-powered device, and realtime 45–60 fps ray tracing on a mobile device running an AR application.

Adshir’s strategy is based on two key questions regarding advanced gaming:

  1. How to deliver an immersive consumer experience in the gaming industry?
  2. “How to bring mobile VR/AR into line with experiences enjoyed by consumers using high-end home PCs?” (As asked by Paul Brown, HTC Vive)

For item one, not too surprisingly, Adshir suggests ray tracing, the technique for a physically accurate simulation of light, capable of producing life-like images. The company’s position is that the way to bring mobile VR and AR into line with high-end PCs is to get realtime ray tracing on mobile devices. The backbone of a revolutionary change for immersive graphics will be a combination of realtime ray tracing, the cloud, and 5G connectivity.
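For readers who have never looked under the hood, the core of the technique fits in a few lines: fire a ray from the camera through each pixel, test it against the scene geometry, and shade whatever it hits. The toy sketch below (ours, not Adshir’s) does exactly that for a single sphere with simple Lambertian shading.

```cpp
// Minimal illustrative ray tracer: one primary ray per pixel, one sphere, direct lighting.
// This is the textbook idea only; it is not LocalRay or any Adshir code.
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator-(Vec3 b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { float l = std::sqrt(dot(v, v)); return v * (1.0f / l); }

// Ray-sphere intersection: returns the nearest positive hit distance, or -1 for a miss.
float hitSphere(Vec3 center, float radius, Vec3 origin, Vec3 dir) {
    Vec3 oc = origin - center;
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * c;              // dir is unit length, so a == 1
    if (disc < 0.0f) return -1.0f;
    return (-b - std::sqrt(disc)) * 0.5f;
}

int main() {
    const int W = 32, H = 32;
    Vec3 sphereCenter = {0, 0, -3};
    float radius = 1.0f;
    Vec3 light = normalize({1, 1, 1});
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Generate a primary ray through the pixel from a pinhole camera at the origin.
            Vec3 dir = normalize({(x + 0.5f) / W - 0.5f, 0.5f - (y + 0.5f) / H, -1.0f});
            float t = hitSphere(sphereCenter, radius, {0, 0, 0}, dir);
            float shade = 0.0f;
            if (t > 0.0f) {
                Vec3 n = normalize((Vec3{0, 0, 0} + dir * t) - sphereCenter);
                shade = std::fmax(0.0f, dot(n, light));   // simple Lambertian shading
            }
            std::putchar(shade > 0.5f ? '#' : (shade > 0.0f ? '+' : '.'));
        }
        std::putchar('\n');
    }
}
```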

In the company’s previous incarnation, some three years ago, they used Embree for performance comparisons, but not as part of their code; Adshir performed 5× better than native Embree. When the company switched to AR, they changed their algorithms and stopped benchmarking against Embree. When they compare today’s still images to V-Ray, Adshir performs some 200,000× better, and probably about the same versus Embree. Adshir is a total software solution currently interfaced with Unity, but the company says it is neither limited nor contractually obliged to use Unity.
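Adshir has not published its benchmark harness, so treat the following only as context for what the Embree side of such a comparison involves: build a static acceleration structure once, then time a batch of ray queries against it. The sketch uses the public Embree 3 API; the one-triangle scene and the ray count are placeholders.

```cpp
// Minimal Embree 3 timing sketch: build a BVH over one triangle, then time ray casts.
// This only shows the public Embree API; it is not Adshir's benchmark harness.
#include <embree3/rtcore.h>
#include <chrono>
#include <cstdio>

int main() {
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene scene = rtcNewScene(device);

    // One placeholder triangle; a real comparison would load a full game scene.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* verts = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    float v[] = {0, 0, -1,  1, 0, -1,  0, 1, -1};
    for (int i = 0; i < 9; ++i) verts[i] = v[i];
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);   // builds the static acceleration structure once

    // Time a batch of identical primary rays through the committed BVH.
    const int N = 1000000;
    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    auto t0 = std::chrono::high_resolution_clock::now();
    for (int i = 0; i < N; ++i) {
        RTCRayHit rh = {};
        rh.ray.org_x = 0.1f; rh.ray.org_y = 0.1f; rh.ray.org_z = 0.0f;
        rh.ray.dir_x = 0.0f; rh.ray.dir_y = 0.0f; rh.ray.dir_z = -1.0f;
        rh.ray.tnear = 0.0f; rh.ray.tfar = 1e30f;
        rh.ray.mask = 0xFFFFFFFFu;
        rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;
        rtcIntersect1(scene, &ctx, &rh);
    }
    auto t1 = std::chrono::high_resolution_clock::now();
    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::printf("%d rays in %.1f ms (%.1f Mrays/s)\n", N, ms, N / (ms * 1000.0));

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
}
```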

But how will mobile ray tracing match the cloud? Well, for one thing, the client has to have, or get, realtime animation and strong self-sufficient capabilities at low power. And the server has to be low cost, support a large number of concurrent online streams, be power efficient, and offer fast throughput and low lag. No big deal, right?

Adshir points to some ray tracing milestones to make their point about the desirability, and feasibility, of bringing ray tracing to the masses.

  • Film production: Off-line full ray tracing on cloud servers

2009—James Cameron’s Avatar opens to a record-breaking weekend of over $77 million
2013—Pixar uses ray tracing for the first time in Monsters University

  • Gaming: On-line hybrid ray tracing

2018—Nvidia announces special purpose hardware for realtime ray tracing on high-end platforms
2018—Adshir demonstrates realtime ray-tracing software for battery-powered devices at Siggraph 2018

Nonetheless, ray tracing comes with some major challenges even for still images: heavy traversal of static acceleration structures, which results in high complexity, high power consumption, and reduced performance; lack of coherence in secondary rays, which limits the use of SIMD mechanisms; and noise, which reduces quality or performance. Noise is a function of the number of samples per pixel (SPP).

Sample rate and noise in a ray-traced image.
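That SPP relationship is worth spelling out, because it is what makes brute-force ray tracing so expensive: a Monte Carlo pixel estimate converges roughly as 1/√SPP, so halving the visible noise costs about four times as many rays. The toy program below (not renderer code, just the statistics) demonstrates the scaling on a stand-in integrand.

```cpp
// Why SPP matters: a Monte Carlo pixel estimate converges roughly as 1/sqrt(samples),
// so each halving of the noise costs about 4x the rays. Toy demo, not renderer code.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    // Stand-in "pixel integrand": integrate f(x) = x^2 over [0,1]; true value is 1/3.
    auto f = [](double x) { return x * x; };
    const double truth = 1.0 / 3.0;

    const int spps[] = {1, 4, 16, 64, 256, 1024};
    for (int spp : spps) {
        // Average the squared error over many independent "pixels".
        const int trials = 2000;
        double mse = 0.0;
        for (int t = 0; t < trials; ++t) {
            double sum = 0.0;
            for (int s = 0; s < spp; ++s) sum += f(uni(rng));
            double estimate = sum / spp;
            mse += (estimate - truth) * (estimate - truth);
        }
        double rms = std::sqrt(mse / trials);
        std::printf("%5d spp -> RMS noise %.4f (expect ~1/sqrt(spp) scaling)\n", spp, rms);
    }
}
```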

The challenges are even worse for animation: acceleration structures must be rebuilt frame after frame, which effectively rules out skinned animation, and the noisy “film-grain” again results in low quality.
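To make the animation problem concrete: in a conventional ray tracer such as Embree, a deforming (e.g., skinned) mesh forces the vertex buffers to be re-uploaded and the acceleration structure refit or rebuilt every single frame before any rays can be traced. The sketch below shows that per-frame cost using the public Embree 3 API; it is illustrative only and has nothing to do with LocalRay’s internals.

```cpp
// Per-frame cost of dynamic geometry in a conventional ray tracer (public Embree 3 API).
// Each frame the deformed vertices must be pushed and the BVH refit or rebuilt before a
// single ray can be traced. This illustrates the cost LocalRay claims to avoid; it is
// not Adshir's code.
#include <embree3/rtcore.h>
#include <cmath>
#include <vector>

void animateFrame(RTCScene scene, RTCGeometry geom, std::vector<float>& skinnedVerts,
                  float time) {
    // 1. Skinning/deformation happens elsewhere; here we just wiggle the vertices.
    for (std::size_t i = 0; i < skinnedVerts.size(); i += 3)
        skinnedVerts[i + 1] += 0.01f * std::sin(time + skinnedVerts[i]);

    // 2. Copy the deformed vertices into the geometry's vertex buffer.
    float* dst = (float*)rtcGetGeometryBufferData(geom, RTC_BUFFER_TYPE_VERTEX, 0);
    for (std::size_t i = 0; i < skinnedVerts.size(); ++i) dst[i] = skinnedVerts[i];
    rtcUpdateGeometryBuffer(geom, RTC_BUFFER_TYPE_VERTEX, 0);

    // 3. Recommit: the acceleration structure is refit or rebuilt for this frame.
    //    RTC_BUILD_QUALITY_REFIT (set at geometry creation) makes this cheaper but
    //    degrades BVH quality; a full rebuild is slower but traces faster.
    rtcCommitGeometry(geom);
    rtcCommitScene(scene);

    // 4. Only now can rays be cast against the updated scene.
}
```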

So Adshir has come up with a way to sidestep most of these issues with a novel technique, covered by some 30 patents, that they call LocalRay. And the company has the audacity to suggest it is the first and only realtime ray tracing (RTRT) solution for battery-powered devices. Using proprietary dynamic data structures, the company reduces computational complexity. The net result, says Adshir, is a software solution that needs no hardware accelerators, is low cost and low power, and is compatible with all game development platforms (Unity, Unreal, and other game engines).

Adshir accomplished this breakthrough because their approach turns the whole laborious tree-intersection branching process upside-down, literally. The net result is that they don’t have to do the rebuilding, branching, and testing, and can therefore produce an image without requiring a denoising stage. They can support ray-traced skinned animation with no penalty … and can run in realtime using a conventional raster graphics pipeline.
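Adshir has not disclosed how LocalRay’s dynamic data structures actually work, but the claim that it runs “using a conventional raster graphics pipeline” lines up with the familiar hybrid pattern: rasterize primary visibility, then spend rays only on the secondary effects that need them. The generic sketch below illustrates that pattern, and only that pattern; the function names are hypothetical stand-ins, not Adshir’s API.

```cpp
// Generic hybrid rendering loop: rasterize primary visibility, then spend rays only on
// secondary effects. Purely illustrative; this is the common hybrid pattern, not a
// description of LocalRay's internals, which Adshir has not published.
#include <cstddef>
#include <vector>

struct GBufferPixel { float nx, ny, nz, depth; };   // output of the raster pass, per pixel
using GBuffer = std::vector<GBufferPixel>;
using Image   = std::vector<float>;

// Hypothetical stand-ins for engine calls, stubbed so the sketch compiles.
GBuffer rasterizePrimaryVisibility(std::size_t n) { return GBuffer(n, GBufferPixel{0, 0, 1, 1}); }
float traceShadowRay(const GBufferPixel&)     { return 1.0f; }  // one ray toward the light
float traceReflectionRay(const GBufferPixel&) { return 0.2f; }  // one ray along the mirror direction

Image renderFrame(std::size_t pixelCount) {
    // Primary visibility comes from the conventional raster pipeline, not from rays,
    // so no per-pixel traversal of a static acceleration structure is needed here.
    GBuffer gbuf = rasterizePrimaryVisibility(pixelCount);

    // Rays are spent only where rasterization falls short: shadows, reflections,
    // and similar secondary effects.
    Image out(pixelCount);
    for (std::size_t p = 0; p < pixelCount; ++p) {
        float shadow     = traceShadowRay(gbuf[p]);
        float reflection = traceReflectionRay(gbuf[p]);
        out[p] = shadow * 0.8f + reflection * 0.2f;   // trivial composite
    }
    return out;
}

int main() { renderFrame(1920 * 1080); }
```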

Audaciously, the company showed a comparison of static rendering versus dynamic rendering.

So, Adshir thinks they’ve met all the requirements and claims that LocalRay has connected with the cloud. They say it’s low cost because it can use off-the-shelf GPUs without accelerators, can use existing infrastructures, and can run a large number of concurrent online streams. That, the company says, provides strong self-sufficiency for clients as well as power efficiency, and results in low computational complexity, less data to move, low lag, and fast throughput using 5G connectivity. In the company’s demonstration, the Radeon 5700 XT GPU ran at 60 fps, and the mobile tablet with an Iris Pro GPU ran at 15 fps (modified for mobile).
