Brainwarp 1.0 (featuring "Smart Smoothing" and "fixed foveated rendering")

Ah, this explains why so little apparent progress was being made on PiTool improvements (like adding a brightness option). Inventing / implementing an algorithm similar to ASW is quite difficult. Basically, it involves extrapolating a new image using your most recent head position and a warp (stretch) from the previous frame and its associated Z-buffer.
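To make the idea concrete, here is a toy sketch of that kind of depth-based reprojection in Python/NumPy. This is my own illustration, not Pimax's or Oculus's actual code; the pinhole camera model, the function name, and the forward-splatting approach are all assumptions made for brevity:

```python
# Toy depth-based reprojection: take the last rendered frame plus its
# Z-buffer, unproject each pixel into world space using the old head pose,
# then re-project it with the newest head pose.
import numpy as np

def reproject(color, depth, fx, fy, cx, cy, old_to_world, world_to_new):
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Unproject: pixel coordinates + depth -> 3D point in the old camera's space.
    z = depth
    x = (xs - cx) * z / fx
    y = (ys - cy) * z / fy
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    # Old camera space -> world -> new camera space (4x4 homogeneous matrices).
    pts = pts @ old_to_world.T @ world_to_new.T
    # Project with the new pose.
    u = (pts[:, 0] / pts[:, 2] * fx + cx).round().astype(int)
    v = (pts[:, 1] / pts[:, 2] * fy + cy).round().astype(int)
    out = np.zeros_like(color)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (pts[:, 2] > 0)
    # Forward-splat the old colours into the new frame.
    out[v[ok], u[ok]] = color.reshape(-1, color.shape[-1])[ok]
    return out
```

Note that forward splatting like this leaves holes wherever previously occluded geometry is revealed, which is exactly the class of artefact these techniques then have to mask.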

Unfortunately, I couldn’t find the excellent article I read about ASW a few months ago, so instead, I will share an over-abundance of links I found, covering differing aspects of this technique…

. https://www.roadtovr.com/steamvr-motion-smoothing-asw-alex-vlachos/
. Asynchronous Timewarp Examined
. Asynchronous Spacewarp 2.0 Will Help VR Run Dramatically Better on Rift | Oculus VR News
. Asynchronous TimeWarp (ATW) | Oculus Developers
. https://www.vrheads.com/what-asynchronous-spacewarp-and-why-should-you-care
. Asynchronous Spacewarp
. NVIDIA's Asynchronous Warp Made Me Love Virtual Reality - SlashGear

8 Likes

I guess I cannot resist dropping in a post just to sullenly reiterate that the best thing remains, of course, not to need any artefact-ridden synthesised frames in the first place.

I really think it’s naive to assume that no developers will treat fire-and-forget reprojection techniques in the VR runtimes as grounds for not optimising their games.

Sorry, rant over. :stuck_out_tongue:

2 Likes

Tell that to those with GPUs weaker than a 1070. If not for motion smoothing, my friend with his laptop could only dream of getting a Vive without also buying a new laptop, since his has no support for an external GPU.

Sure, it is not something that should be targeted, but given that the Pimax is much more demanding than the norm (Oculus and Vive), it only makes sense that a great deal of focus goes into good methods of lowering the GPU requirements.

5 Likes

The proper way to accommodate weaker GPUs is for games to be built in such a way, and to offer such settings, that they run at the target frame rate.

When reprojection becomes not something that catches an unintended hitch every minute or so, but a crutch that carries the game throughout, with the tacit understanding of the developer, then that developer has failed.

It’s like all the games that come out running at “cinematic” framerates, or lower, because fancy screenshots sell, so the developers crank the graphical fidelity up to eleventyeleven, sacrificing playability for more bling. :7

1 Like

I think this MUST be how brainwarp works and would also explain why v1.0 of ‘brainwarp’ only does ASW. I think ASW is indeed the stepping stone needed to do brainwarp. Exactly like you said, I think brainwarp only renders 1 eye (left/right alternating) and then uses ASW to render the other eye. I’m pretty sure that’s it.

3 Likes

Synthesising the image in one eye from the one in the other would require correcting for the interpupillary offset, though, which should be a more difficult task than rendering both at a lower rate.

1 Like

But they could just render one eye based on the previous render for that eye. Exactly like ASW works, but instead of both eyes, just one eye each time.
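The alternating schedule being speculated about here can be modelled in a few lines. This is pure guesswork about how Brainwarp might work (matching the speculation in this thread, not anything Pimax has published); the names and the per-frame bookkeeping are my own:

```python
# Toy model of an alternating-eye schedule: each refresh renders one eye
# natively and fills the other eye with a frame warped from that eye's own
# most recent native render -- so the GPU only renders one full eye per refresh.
def brainwarp_schedule(n_frames):
    last_native = {"left": None, "right": None}
    timeline = []
    for frame in range(n_frames):
        native = "left" if frame % 2 == 0 else "right"
        other = "right" if native == "left" else "left"
        last_native[native] = frame
        timeline.append({
            "frame": frame,
            native: ("rendered", frame),
            # The other eye reuses/warps its most recent native render
            # (None on the very first frame, before any render exists).
            other: ("warped_from", last_native[other]),
        })
    return timeline
```

The appeal of this scheme is that the native render load is halved while each eye still gets a fresh image every refresh, half of them synthesised.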

3 Likes

Exactly. :slight_smile:

2 Likes

This is good news indeed; hardware is nothing without the software to run it. The Pimax 5k+ with motion smoothing or similar will take this already excellent product and place it firmly at the top of the consumer VR HMD pile :wink:
I already love it so much. Elite Dangerous is absolutely mind-blowingly glorious, and to think I will be able to add more detail is already blowing my mind hahahaha Pimax I love you !!!

6 Likes

Foveated rendering does not seem to be at the driver level either, which would mean titles would have to support it.

With video you know all the frames already; they are already available. With real-time rendering, you only know one of the frames. You also have to worry about the Z axis, whereas in 2D video you don’t.

On PCs the developer cannot code for every possible system configuration. They can code for minimum specs, but what if the HMD has a wide FOV, or there is some background process that’s eating GPU/CPU? Or specific hardware that could be used to improve anti-aliasing? Also, if you have a game that was not built from the ground up for VR, it can be hard to optimize.
This is where these techniques can help the end user.

2 Likes

Optimisation is more than just checking for- and utilising system resources, if available.

E.g.: Do I need 1000 individual, highly detailed physics objects’ worth of clutter in this room, or could this row of books, in this here shelf, be simplified to a static cube with a normal map? Have I set up occluders, to properly cull unseen geometry? This computer seems to be lagging/running out of VRAM – can I kick that distant building down a LOD level? (EDIT2: …or use a simpler shader… maybe bake shadows into the diffuse texture…) (EDIT3: These are all ancient, simple tricks.)
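The "kick it down a LOD level" trick above can be sketched in a few lines. This is a toy illustration only; the distance thresholds, the VRAM budget, and the per-object cost table are all hypothetical numbers, not anything from a real engine:

```python
# Toy LOD management: pick a LOD level per object from its distance, then
# demote objects (farthest first) until a hypothetical VRAM budget is met.
def pick_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    for level, limit in enumerate(thresholds):
        if distance < limit:
            return level  # 0 = finest detail
    return len(thresholds)  # beyond all thresholds: coarsest level

def apply_budget(objects, vram_budget_mb):
    # objects: list of dicts with "distance" and a per-LOD "cost_mb" list.
    lods = [pick_lod(o["distance"]) for o in objects]
    def cost():
        return sum(o["cost_mb"][min(l, len(o["cost_mb"]) - 1)]
                   for o, l in zip(objects, lods))
    while cost() > vram_budget_mb and any(l < 3 for l in lods):
        # Demote the farthest object that still has a coarser LOD available.
        i = max((i for i, l in enumerate(lods) if l < 3),
                key=lambda i: objects[i]["distance"])
        lods[i] += 1
    return lods
```

The point is that this is cheap, decades-old logic the application itself has to implement; no VR runtime can do it on the game’s behalf.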

There are a ton of VR-specific things that could be done, but almost nobody does, such as lens-matched shading and the MVR that was mentioned earlier in this thread. These are things that fall on the shoulders of application developers, not the VR runtime.

(EDIT: Already the robot repair demo in The Lab kicks shading in the periphery down to checkerboard, at the lowest level of its dynamically adjusting rendering quality, without needing the latest and greatest GPUs.)

EDIT4: (Might as well…

Not really the point. With video you interpolate between two known frames – one past and one future; with ASW you extrapolate from two known old ones – the motion-analysing algorithm is the same, and the effect is a 2D one that does not care about movement of the device along any axis: motion into the virtual scene will produce a radial optical flow, and that map of offsets is all the algorithm cares about.

Oculus’ ASW2.0 will be a somewhat different matter, but that is not public yet.

1 Like

Post the motion smoothing update please. We all need this pretty bad. No matter what your hardware is.

2 Likes

Yeah, but these guys have some shady geniuses working for them. How they managed to launch Oculus software without Revive is pretty crazy.

And if you watch Tyriel Wood’s Pimax video, he mentioned that reinstalling the Pimax software broke his Oculus setup.

“Sketchy Genius indeed”

They probably hacked Oculus’ ASW or Steam’s motion smoothing somehow.

2 Likes

raises hand :raised_hand: I’ll be a beta tester

3 Likes

They were launching Oculus software back on the P4K in 1.1.92 & previous versions, when they had various modes. In later versions you needed Revive, as they went to 2 modes.

ReactOS was making a binary-compatible clone of Windows XP (NT) & had decent progress, but it never left alpha.

Actually, this type of “correction” is called “multiview rendering” :wink:

Anyway, it seems the only developer who now cares about optimizing the whole rendering chain for VR is Oculus with their mobile line, as they have: a) no other way, b) John Carmack.

Everyone else (read “engine developers”) seems to just hope for a brute-force approach, i.e. that raw GPU power will automagically progress to the point where optimization is no longer necessary. As we can see, it has worked rather the opposite way so far; HMD hardware requirements are raising the bar faster than GPU hardware can catch up.

4 Likes

I kind of wish they didn’t say anything about Brainwarp until it was out, because now I’m here waiting for it.

6 Likes

Yeah, how long do we have to wait for this? @deletedpimaxrep1 @anon74848233

4 Likes