
Showing posts from March, 2016

Skinning Roger Rabbit AKA Are we there yet?

I was lucky enough to grow up in the era when home computers and consoles were new, and a movie using CG was something worth talking about if anybody actually managed to use it (I'm looking at you, Max Headroom and Tron, both designed to look like CG but mostly made with old-fashioned effects because computers weren't fast enough). One of my favourite movies didn't (AFAIK) use CG, but would use a lot of it today: Who Framed Roger Rabbit. Roger Rabbit is a clever comedy/whodunit about a murder, with the main suspect being the eponymous Roger Rabbit. The twist is that Roger Rabbit is a rabbit, a cartoon rabbit. The world is set up so that Hollywood has a district called Toon Town, where cartoons actually exist: Bugs Bunny and Mickey Mouse are real, and act in TV shows and movies just like other actors. The star (apart from Roger) is Eddie Valiant, played by Bob Hoskins, a classic '20s/'30s grizzled private eye who hates Toons after one kills his partner. The part

Multi-frequency Shading and VR.

Our peripheral vision is low resolution but sensitive to high temporal frequencies; our central vision is high resolution but updates at lower temporal frequencies. This is one of the reasons 60 Hz isn't good enough for VR: at the edges of the display most people can still consciously see flicker. On top of that, VR is stereoscopic, so it requires two views of everything and a low-latency response; you can simply run everything at 90 Hz or even 144 Hz, but that is expensive in both performance and power. Multi-frequency shading tackles this by sharing some of the calculation, where possible, over time and across views (viewports). Of course, for this to work we need to break rendering down into parts that are the same (or close enough) to be computed once and shared. Perhaps the oldest such split is diffuse versus specular: diffuse lighting depends only on the light position and the surface being lit, so camera changes can be ignored. This has been exploited in lightmaps for a long time. For VR this means that diffuse li
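
As a concrete illustration of that diffuse/specular split, here is a minimal C++ sketch (my own, not code from the post): the view-independent Lambert term is computed once and shared by both eyes, while the view-dependent Blinn-Phong term is evaluated per eye, per frame. All names, types and the choice of Blinn-Phong are illustrative assumptions rather than any particular renderer.

    // Sketch: view-independent diffuse shared across eyes, view-dependent specular per eye.
    #include <array>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 normalize(Vec3 v) { float l = std::sqrt(dot(v, v)); return v * (1.0f / l); }

    struct SurfacePoint { Vec3 position, normal, albedo; };
    struct Light        { Vec3 position, colour; };

    // View-independent: Lambert term, depends only on the light and the surface.
    static Vec3 shadeDiffuse(const SurfacePoint& p, const Light& l) {
        Vec3 L = normalize(l.position - p.position);
        float nDotL = std::fmax(dot(p.normal, L), 0.0f);
        return {p.albedo.x * l.colour.x * nDotL,
                p.albedo.y * l.colour.y * nDotL,
                p.albedo.z * l.colour.z * nDotL};
    }

    // View-dependent: Blinn-Phong highlight, needs the per-eye camera position.
    static Vec3 shadeSpecular(const SurfacePoint& p, const Light& l, Vec3 eyePos) {
        Vec3 L = normalize(l.position - p.position);
        Vec3 V = normalize(eyePos - p.position);
        Vec3 H = normalize(L + V);
        float s = std::pow(std::fmax(dot(p.normal, H), 0.0f), 32.0f);
        return l.colour * s;
    }

    int main() {
        SurfacePoint p{{0, 0, 0}, {0, 1, 0}, {0.8f, 0.2f, 0.2f}};
        Light        l{{1, 2, 0}, {1, 1, 1}};
        std::array<Vec3, 2> eyes{{{-0.032f, 1.7f, 1.0f}, {0.032f, 1.7f, 1.0f}}};

        // Low-frequency term: computed once, shared by both eyes (and reusable
        // across frames while lights and geometry stay static).
        Vec3 diffuse = shadeDiffuse(p, l);

        // High-frequency term: evaluated per eye, every frame.
        for (int eye = 0; eye < 2; ++eye) {
            Vec3 c = diffuse + shadeSpecular(p, l, eyes[eye]);
            std::printf("eye %d: %.3f %.3f %.3f\n", eye, c.x, c.y, c.z);
        }
    }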

Frameless rendering for VR?

I'm currently doing a fair bit of thinking about rendering (nobody pays me to think about hilarious cat pics yet, but one day!) and in particular about low-latency VR/AR rendering. One idea that keeps popping into my head is decoupling the shading rate from the display rate. In a conventional render path we shade lighting etc. at the same rate as the display, but VR has started to change this with asynchronous time warps. A time warp takes the previous frame and warps it by the current VR pose, giving the feeling of a faster refresh rate than the renderer actually outputs. This works because the difference is only displayed for a fraction of a second (a 90th or 120th of a second), and the amount of change in that time is fairly restricted. I've started to think about taking that to its logical extreme: if the display rate is fast enough (90+ FPS), do we actually need every object to be completely up to date? All that matters is that over a few frames every object g
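
To make that decoupling concrete, here is a rough C++ sketch (my own illustration, using simplified hypothetical types such as Pose and Frame, not code from any VR SDK): a render thread shades frames at its own pace (~45 Hz here), while a display loop runs at the headset refresh rate and reprojects the newest finished frame using the newest head pose, which is the essence of an asynchronous time warp.

    // Sketch: shading rate decoupled from display rate via pose reprojection.
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <memory>
    #include <mutex>
    #include <thread>

    struct Pose  { float yaw = 0.0f; };               // stand-in for a full head pose
    struct Frame { Pose renderPose; int id = 0; };    // image data omitted for brevity

    std::mutex frameMutex;
    std::shared_ptr<Frame> latestFrame;               // last frame the renderer finished
    std::atomic<bool> running{true};

    Pose samplePose() { return Pose{}; }               // would read from head tracking

    // Slow path: full shading, runs at whatever rate the scene allows (~45 Hz here).
    void renderThread() {
        int id = 0;
        while (running) {
            auto f = std::make_shared<Frame>();
            f->renderPose = samplePose();
            f->id = ++id;
            std::this_thread::sleep_for(std::chrono::milliseconds(22)); // pretend shading cost
            std::lock_guard<std::mutex> lock(frameMutex);
            latestFrame = f;
        }
    }

    // Fast path: runs at display rate (~90 Hz), warps the latest frame by the
    // difference between its render pose and the pose sampled right now.
    void displayThread() {
        for (int vsync = 0; vsync < 45 && running; ++vsync) {
            Pose now = samplePose();
            std::shared_ptr<Frame> f;
            { std::lock_guard<std::mutex> lock(frameMutex); f = latestFrame; }
            if (f)
                std::printf("vsync %d: present frame %d warped by pose delta %.3f\n",
                            vsync, f->id, now.yaw - f->renderPose.yaw);
            std::this_thread::sleep_for(std::chrono::milliseconds(11)); // ~90 Hz
        }
        running = false;
    }

    int main() {
        std::thread render(renderThread);
        displayThread();
        render.join();
    }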