Frameless Rendering for VR?

I'm currently doing a fair bit of thinking about rendering (nobody pays me to think about hilarious cat pics yet, but one day!), and in particular low-latency VR/AR rendering.

One idea that keeps popping into my head is decoupling shading rate from display rate. In a conventional render path we shade lighting etc. at the same rate as the display, but VR has started to change this with asynchronous time warp.

Time warps take the previous frame and warp it by the current VR pose to give the feeling of a faster refresh rate than the renderer actually outputs. This works because the difference is only displayed for a fraction of a second (a 90th or 120th of a second), and the amount of change in that time is fairly restricted.
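
To make that concrete, here's a rough C++ sketch of the rotation-only reprojection matrix a simple time warp uses. This is my own illustration, not any particular runtime's code: the function names are made up and I'm assuming the GLM maths library for the matrix/quaternion types.

```cpp
// Hypothetical sketch: re-project a frame shaded at renderOrientation to the
// head orientation sampled just before scan-out (displayOrientation).
// Rotation-only, which is why time warp hides rotational latency but not
// translation or animation.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

glm::mat4 timeWarpMatrix(const glm::mat4& proj,
                         const glm::quat& renderOrientation,
                         const glm::quat& displayOrientation)
{
    // Rotation-only view matrices built from the head orientations.
    glm::mat4 renderView  = glm::mat4_cast(glm::inverse(renderOrientation));
    glm::mat4 displayView = glm::mat4_cast(glm::inverse(displayOrientation));

    // clip space (as rendered) -> view direction -> clip space (as displayed)
    return proj * displayView * glm::inverse(renderView) * glm::inverse(proj);
}
```

In practice this matrix would be applied per pixel (or per warp-mesh vertex) in a shader just before present, to look up where each displayed pixel should sample the already-shaded frame.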

I've started to think about taking that to its logical extreme: if the display rate is fast enough (90+ FPS), do we actually need every object to be completely up to date? All that matters is that over a few frames every object gets updated. This idea isn't new, and it's closer to how our eyes work than the 'shutter' based systems built around frames.
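
A minimal sketch of what I mean, with made-up names and an arbitrary "shade budget" per display tick (so take it as an assumption-laden illustration, not a design): only the stalest few objects get their shading refreshed each tick, while everything is still drawn every tick from whatever shading it last received.

```cpp
// Hypothetical sketch: amortise expensive shading over several display ticks
// by only refreshing the stalest objects each tick.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Object {
    uint64_t lastShadedTick = 0;   // when this object's shading was last refreshed
    // ... cached shading results (texture-space shading, probes, etc.)
};

void tick(std::vector<Object>& objects, uint64_t displayTick, std::size_t shadeBudget)
{
    // Sort just enough to find the 'shadeBudget' objects with the oldest shading.
    std::vector<Object*> stale;
    stale.reserve(objects.size());
    for (auto& o : objects) stale.push_back(&o);

    const std::size_t count = std::min(shadeBudget, stale.size());
    std::partial_sort(stale.begin(), stale.begin() + count, stale.end(),
                      [](const Object* a, const Object* b) {
                          return a->lastShadedTick < b->lastShadedTick;
                      });

    for (std::size_t i = 0; i < count; ++i) {
        // shade(*stale[i]);  // the expensive lighting work, spread across ticks
        stale[i]->lastShadedTick = displayTick;
    }

    // Every object is still rasterised and time-warped every display tick,
    // just using whatever shading it last received.
}
```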

There's been work on this idea for ray tracers, Adaptive Frameless Rendering being one example, but I'm starting to think it makes sense for our modern rasteriser/compute hybrid VR engines. Frame rate is guaranteed with a time resolution down to the pre-emption quantum of the GPU: as there is never a 'finished' frame, it's simply grabbed at whatever state it's currently in.
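
The structure I have in mind looks roughly like the C++ sketch below (again, purely illustrative; the names, the double-buffering choice, and using a sleep to stand in for vsync are all my own assumptions): shading runs continuously and publishes whatever it has, while the display loop, on its own fixed cadence, grabs the most recently published state, warps it, and presents.

```cpp
// Hypothetical sketch: shading and display fully decoupled. The display loop
// never waits for a "finished" frame; it presents whatever was last published.
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

struct SharedState {
    std::atomic<int>  latestReady{0};   // index of the most recently published buffer
    std::atomic<bool> running{true};
    // FrameBuffer buffers[2];          // double-buffered shading results (omitted)
};

void shadingLoop(SharedState& s)
{
    int write = 1;
    while (s.running.load(std::memory_order_relaxed)) {
        // ... shade into buffers[write] for as long as the GPU allows,
        //     pre-empted at the GPU's quantum rather than at frame boundaries
        s.latestReady.store(write, std::memory_order_release);
        write ^= 1;
    }
}

void displayLoop(SharedState& s, std::chrono::microseconds vsyncPeriod)
{
    while (s.running.load(std::memory_order_relaxed)) {
        int read = s.latestReady.load(std::memory_order_acquire);
        // ... time-warp buffers[read] with the freshest head pose and present
        (void)read;
        std::this_thread::sleep_for(vsyncPeriod);   // stand-in for waiting on vsync
    }
}

int main()
{
    SharedState s;
    std::thread shade(shadingLoop, std::ref(s));
    std::thread display(displayLoop, std::ref(s), std::chrono::microseconds(11111)); // ~90 Hz
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    s.running = false;
    shade.join();
    display.join();
}
```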

There could be many artefacts, but I'm guessing that at high frame rates and in stereo you won't notice them, as they will be corrected within a short amount of time.

Needs exploring...

Deano out
