Brainwarp 2 x 45 Hz

Does anyone know if targeting 45 Hz per eye with Brainwarp could work? This would give an effective 90 Hz apparent framerate while significantly reducing the GPU workload. Perhaps that would make a GTX 1070 viable?
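A rough back-of-the-envelope sketch of that arithmetic (the numbers are just the ones from the question, nothing Pimax has published):

```python
# Rough arithmetic for the "2 x 45 Hz" idea in the question above.
# Assumption: Brainwarp alternates eyes, so each eye is rendered at 45 Hz
# while the headset receives some new image (left or right) every 1/90 s.

per_eye_hz = 45
apparent_hz = 2 * per_eye_hz         # alternating eyes -> 90 updates per second

budget_90hz_ms = 1000 / 90           # per-frame budget if both eyes ran at 90 Hz
budget_45hz_ms = 1000 / per_eye_hz   # budget per single-eye frame at 45 Hz

print(f"apparent update rate: {apparent_hz} Hz")
print(f"frame budget at 90 Hz: {budget_90hz_ms:.1f} ms")
print(f"frame budget at 45 Hz: {budget_45hz_ms:.1f} ms")
```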

[quote=“colgate1974, post:1, topic:9298”]Does anyone know if… Brainwarp could work?
[/quote]

Corrected it for you.

7 Likes

I mean, have you ever tried 3D shutter glasses? They work at 120 Hz, 60 Hz per eye, and they give me a headache after about an hour… I'm going to go out on a limb and say no.

At first, Brainwarp was presented as basically just doubling the perceived pictures per second.
Now Pimax refers to Brainwarp as a set of features, including reprojection.

Fact is: to this day nobody (except Pimax) knows how the "perceived doubling" is supposed to work.
And, again: I doubt that this frame doubling/delaying/whatever works as promoted.

For it to make sense as you suggest (one frame for the left eye, then one frame for the right eye), the game needs to support calculating only one view per eye: render the left view, run physics and game logic again, and only then render the picture for the right eye. If you have no support from the game engine, you only end up wasting time (rendering both eyes, then displaying the left eye and, after a delay, displaying the right eye). In VR that will certainly make you sick. Pimax could somehow include the new head position for the right-eye picture, but then the right eye would always see a fake warped picture, which is certainly not a great experience either…
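To illustrate the difference in hypothetical pseudocode: every function below is a made-up placeholder, not any real engine or Pimax API, and this is only a sketch of the two cases described above.

```python
# Hypothetical, simplified render loops; every function here is a stub,
# not any real engine or Pimax API.
import time

def update_game_and_physics(): return time.monotonic()   # stand-in for a fresh game state
def render_view(state, eye):   return (eye, state)       # stand-in for a rendered eye view
def submit(view):              print("display", view)    # stand-in for sending to the HMD

def loop_with_engine_support(frames=2):
    """Alternate eyes: update the simulation before *each* eye's view."""
    for _ in range(frames):
        submit(render_view(update_game_and_physics(), "left"))
        submit(render_view(update_game_and_physics(), "right"))  # fresh state again

def loop_without_engine_support(frames=2):
    """What a driver alone can do: render both eyes from one state,
    then merely delay the display of the right eye."""
    for _ in range(frames):
        state = update_game_and_physics()
        left, right = render_view(state, "left"), render_view(state, "right")
        submit(left)
        time.sleep(1 / 180)   # ~half a 90 Hz frame later: right eye is shown late,
        submit(right)         # but was computed from the same, now stale, state

loop_with_engine_support()
loop_without_engine_support()
```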

@deletedpimaxrep1 @PimaxVR any update on progress with Brainwarp?

4 Likes

Timewarp is done by the driver, and in Brainwarp mode you apply timewarp to one eye first and send it to the headset; only then do you apply timewarp to the second eye and send that one to the headset too. Brainwarp's original goal is to reduce motion-to-photon latency (MTP), and I don't see how it has changed from the first day they presented the idea.
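A minimal sketch of that ordering, assuming it works as described in this post; the function names are placeholders, not the actual Pimax compositor:

```python
# Sketch of the per-eye timewarp ordering described above; the functions
# are hypothetical placeholders for driver/compositor steps.

def latest_head_pose():   return "pose@now"                   # stand-in for a tracker read-out
def timewarp(view, pose): return f"{view} warped to {pose}"   # stand-in for the warp pass
def send_to_hmd(image):   print("scan out:", image)           # stand-in for the cable transfer

def classic_compositor(left_view, right_view):
    """Warp both eyes with one pose sample and send them together."""
    pose = latest_head_pose()
    send_to_hmd(timewarp(left_view, pose))
    send_to_hmd(timewarp(right_view, pose))    # second eye reuses the old pose sample

def brainwarp_style_compositor(left_view, right_view):
    """Warp and send one eye, then re-sample the pose for the other eye."""
    send_to_hmd(timewarp(left_view, latest_head_pose()))
    send_to_hmd(timewarp(right_view, latest_head_pose()))   # fresher pose for eye #2

classic_compositor("L", "R")
brainwarp_style_compositor("L", "R")
```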

I remember reading about Brainwarp for the Pimax 4K, which runs at 60 Hz if I am not mistaken.

Apparently there were artifacts at that speed, which indicates going lower might be a bad idea…

1 Like

I had similar problems with 3D Vision at the beginning, but after using it more regularly I have no problems even in longer sessions. I suppose it is just a matter of getting used to it (like with VR), and perhaps not everyone can. Also, depth and convergence settings are important; not many people adjust them, but if they are wrong they can cause issues (including headache). In games the biggest problem is that scenes can change (distant/close-up) and the same depth/convergence does not work well for both.

1 Like

Did you even try to grasp what I am talking about?

Driver-level delaying and warping of one eye is certainly not a good idea.
Check the Kickstarter page about Brainwarp, then you will understand the problem.

And in one of the videos by MRTV or VoodooDE they state that the ATW (reprojection) they use is part of the Brainwarp feature.
But if you know how Brainwarp works in detail, please enlighten us.

I think that is the problem… Nobody knows how it works, and @PimaxVR are evasive about answering any questions about it or about progress.

I think they had an idea (frame doubling) and soon found out that in VR it is simply not that good.
They would need game-engine support to do it right.
So they started keeping a low profile and now call their ATW, and the hopefully soon-to-come ASW, part of the Brainwarp IP.

Edit: how long have they been talking about that mystical Brainwarp? 3 or 4 years? Wasn't it a feature announced for the 4K?

I suspect that you are correct! I wish they were upfront about it. Most of us are grown up enough to accept that it may never be possible to do what they intended.

1 Like

I am just trying to help. The problem Brainwarp tries to solve is that, at this resolution, the time it takes to transfer the video from your GPU to the headset is so long that by the time the second eye is displayed in your headset the head position is completely out of date. So instead of sending both the right and left image at the same time through the cable, they delay the second eye's timewarp (which is the reprojection, for people who don't know) so it only happens just before sending it.

So think about traditional timewarp vs. Brainwarp; the fact that they work asynchronously should be a given. Now, how does that feel? I agree we don't know, but I would disagree that Pimax ever promoted anything different; I doubt recomputing a whole new eye view was ever an option.

2 Likes

@PimaxVR @deletedpimaxrep1 please comment on this and help us clear up the confusion. Thank you!

Your idea about the graphics pipeline and timing is clearly flawed. It takes negligible time for the rendered views to reach the headset. What takes the most time is the preparation of the views (CPU frame time) together with the rendering of the views (GPU frame time). Then the compositor warps both views as needed, or drops the frames and applies ATW/ASW to the old views, and sends the result to the headset.
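For scale, a rough frame-budget calculation; the millisecond figures are illustrative assumptions, not measurements of any particular headset:

```python
# Rough frame-budget arithmetic for the pipeline described above.
# All millisecond figures are assumptions for illustration only.

refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz        # ~11.1 ms per displayed frame

cpu_frametime_ms = 4.0    # assumed: preparing both views (game/render thread)
gpu_frametime_ms = 6.0    # assumed: rendering both views
compositor_ms    = 1.0    # assumed: distortion + (A)TW warp pass

used_ms = cpu_frametime_ms + gpu_frametime_ms + compositor_ms
print(f"budget {frame_budget_ms:.1f} ms, used {used_ms:.1f} ms, "
      f"headroom {frame_budget_ms - used_ms:.1f} ms")
# If CPU + GPU miss the budget, the compositor reprojects (ATW/ASW) the
# previous views instead of presenting a new frame.
```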

I think most of us try to help. Please try not to spread misinformation. Thank you.

1 Like

Interesting, how much time do you think it takes?

The signal in the wire propagates at roughly c/3. I recall a Carmack presentation about ATW where he shows a chart with the scanout timed at around 1 ms.
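To put the propagation part in numbers (the cable length is an assumption):

```python
# Propagation delay vs. scanout time, using the figures mentioned above.
C = 299_792_458              # speed of light in m/s
signal_speed = C / 3         # propagation speed quoted above
cable_length_m = 5           # assumed headset cable length

propagation_ns = cable_length_m / signal_speed * 1e9
print(f"signal travel time over the cable: ~{propagation_ns:.0f} ns")  # ~50 ns
print("scanout time quoted from the ATW talk: ~1 ms")
# Raw travel time in the cable is several orders of magnitude below the scanout time.
```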

You should read https://developer.oculus.com/blog/asynchronous-spacewarp/
and the embedded links there for a better understanding of the subject.

1 Like

Cables are bandwidth limited, so when you see a cable spec that says max bandwidth = 4K@100 Hz, it means it takes 10 ms to transfer one 4K image through that cable. This is not a big issue for the Rift, since we are well below the max bandwidth, and it's not even an issue for mobile. When Carmack talks about the scan-out time, that is about how long it takes for the panel to refresh all its pixels; it has nothing to do with the cable.
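The serialization arithmetic behind that, assuming an uncompressed 3840×2160 frame at 24 bits per pixel and ignoring link overhead:

```python
# Serialization time for one frame at a cable's maximum rated bandwidth.
# Assumes an uncompressed 3840x2160 frame at 24 bits per pixel, no overhead.

width, height, bpp = 3840, 2160, 24
bits_per_frame = width * height * bpp              # ~199 Mbit per frame

max_refresh_hz = 100                               # the "4K@100Hz" cable spec
link_bits_per_s = bits_per_frame * max_refresh_hz  # ~19.9 Gbit/s at that limit

frame_on_wire_ms = bits_per_frame / link_bits_per_s * 1000
print(f"one frame occupies the cable for {frame_on_wire_ms:.0f} ms")   # 10 ms
```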

Back to the subject: the Pimax 8K's bandwidth requirement far exceeds what their cable can handle, and that is the main reason why they use an upscaler. It's also the reason why the 8KX will have to use two cables (unless they decide to use a better cable, as a few backers suggested), and it's also why the industry keeps coming up with better cables to support next-gen VR.
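Roughly, assuming the two native 3840×2160 panels at 24 bpp and the originally advertised 90 Hz, compared against a single DisplayPort 1.4 style link and ignoring blanking/overhead:

```python
# Why a native-resolution feed would exceed a single link (rough, uncompressed math).
# Assumptions: two 3840x2160 panels, 24 bpp, the advertised 90 Hz, no blanking/overhead.

panels, width, height, bpp, hz = 2, 3840, 2160, 24, 90
required_gbps = panels * width * height * bpp * hz / 1e9
dp14_payload_gbps = 25.92        # approx. effective payload of DisplayPort 1.4 (HBR3)

print(f"native feed needs ~{required_gbps:.1f} Gbit/s "
      f"vs ~{dp14_payload_gbps} Gbit/s on one DP 1.4 link")
# Hence upscaling a lower-resolution input inside the headset, or splitting a
# native-input signal across two cables.
```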

1 Like

You are right. Cables are bandwidth limited.

Back to the core of my argument: do you now understand why Brainwarp cannot work as you suggest?

Say it takes 10 ms for the two-eye image to go from the GPU to the headset. That means by the time it's shown to your eyes, the position will have a 10 ms lag. Now if you use Brainwarp like I said, each eye will have a position lag of only 5 ms.

Your argument is that applying timewarp to the second eye later will create a fake warped image, but what I'm trying to explain to you is that if you don't do this, you end up with a far more sickening image with 10 ms of position lag.
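The arithmetic behind those numbers, taking the 10 ms transfer figure at face value:

```python
# Position-lag comparison using the 10 ms transfer figure from the post above.

transfer_both_eyes_ms = 10.0      # time to push both eye images over the cable

# Both eyes sent together: the pose is sampled once and ages for the full transfer.
lag_classic_ms = transfer_both_eyes_ms

# Per-eye ("Brainwarp-style"): each eye is warped right before its own half of
# the transfer, so its pose only ages for roughly half that time.
lag_per_eye_ms = transfer_both_eyes_ms / 2

print(f"both eyes sent together: up to {lag_classic_ms:.0f} ms of pose lag")
print(f"per-eye warp-then-send:  ~{lag_per_eye_ms:.0f} ms of pose lag per eye")
```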

1 Like