Version 3 prototype review

I think this is very important!

I also wouldn’t mind if the refresh rate could be set anywhere between 75 Hz and 90 Hz, maybe in increments of 5 Hz.

This is how brainwarp works:

You take your money out of your bank, place it in an envelope and send it to me. Now you do this without knowing… that, my friend, is brainwarp!


It’s gotta be at least 90.


Gary (replying to Sofian):

“I asked the CEO what refresh rate it was, and at first he said 90 Hz, but when we pushed him and asked whether this demo here was actually running at 90, he then said no.

Sounds like a lie to me.

All he had to say was ‘no, but 90 Hz is coming’ and I would have been fine with that.”

Exactly! :+1:

I totally agree… I was about to pledge, but then I decided not to!
Something is fishy, or they are not clear about what to show.
I’m not saying it’s a cheap trick, but it’s not quality either!
Only the 200° FOV madness.

I hope they get the thing right… don’t rush it… and be transparent.
$900 (8K full) is not cheap… but is it premium?
That’s the real question.

They will demo Brainwarp sometime after the hardware/software is ready for it. At the moment they need to get this debugged before adding and testing that feature, since adding it earlier can complicate identifying the root cause of other issues.

When it is demoed, it would also need to be made clear that it is using Brainwarp.

Well, the 8K Kickstarter campaign is effectively over, as it has sold out save for the multiple-HMD packages. This may have two reasons: firstly, they might have been concerned about how many HMDs they would have to ship, at the cost involved, under the current deal with all the add-ons we reached; and in addition, they may have felt an expectation to provide some kind of further add-on on top of them all for the 3.5 or 4.0 million mark, which is now no longer required as it will not be reached.

Now let them concentrate on fixing the open issues and getting us a stable, quality HMD, be it in January or March. Hopefully all the add-ons will be pushed to later so they don’t get distracted from getting the core system right. As some have already pointed out, I don’t need a well-performing eye-tracking or wireless module if I cannot get the HMD to work properly. I for my part will be 99% satisfied if the HMD performs like the V2 prototype I tried, with a corrected IPD feature, and works plug & play with SteamVR - and ReVive, please, for the Oculus part.

So I suppose we will see a bit of calm after the storm in the coming weeks, hopefully capped with some Tested etc. reviews of the V5 which confirm they have nailed it and can start to gear up production.

The misinformation, be it deliberate or due to the language barrier, or a mix of the two, was not ideal, true, but here we are, and I assume that most of us will stick to backing them come November 3rd. So let us now let them work in peace and not try to keep them engaged in communication with the community all the time - we would only be adding unnecessary distractions where such communication is not aimed at responding to queries they have set out, but rather at re-affirming the status of development every second day.


At present in PiPlay/PiHome we have a render multiplier for the 4K model, like Steam has in its settings; we also have access to a few brightness levels and can set the input resolution to 1080p for lower-spec machines, where the default setting is 1440p.

Now of course with the 5K/8K the default input would be dual 1440p. If I have read correctly we may also get a dual 1080p option (not positive on this one though), so “may” is the key word.

You do know the Rift/Vive are somewhat variable on refresh. Hence async reprojection is an adaptive flavor of adaptive sync. Nvidia’s adaptive sync is called G-Sync and AMD’s is called FreeSync.

And with higher resolution, the refresh rate doesn’t necessarily need to be as high. Case in point: the French fellow who published his review after he and a friend drove from Montreal to get a demo in Toronto. He is sensitive to VR sickness and didn’t feel sick with the V2, unlike with the Rift/Vive. His main concern was the brightness, as he needs brightness more on par with the Vive due to low-light vision issues, and he asked for a bump in contrast.


Huh, I would be curious to see what upscaled 1080p would look like beside upscaled 1440p - curious which would look better: a game at upscaled 1440p on medium settings, or upscaled 1080p maxed out.

Okay, I will give this a try.

Do you remember when 3D shutter glasses first came out? They were low refresh. If memory serves, it was something like 30 Hz per eye, alternating on/off in sequence, so 60 Hz total. The problem, of course, was that this was too low and resulted in discomfort/motion sickness from the 3D. It was determined they needed a 120 Hz minimum for 60 Hz per eye, flickering in sequence.

Brainwarp works similarly.

So now, my understanding:

1) The game renders the image: basic left & right, or one complete image (no SBS).

2) The VR render driver re-renders the image with distortion and geometry calculations, etc.

3) It is passed to the headset, which sends it to the left eye, then after a slight delay transmits it to the right eye.

For that brief moment, the brain uses info from the left eye to fill in a static image for the right eye, and vice versa. Each eye receives 75 Hz, but sequentially instead of simultaneously.

If you had an old View-Master, the reels often had a scene, say with the sun in it, creating a 3D effect. The sun, for example, is only displayed to one eye; the brain makes the sun seen by both.
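
If that reading is right, the timing would look something like the minimal sketch below. The 75 Hz per-eye base rate is taken from the description above; the half-frame offset between eyes is my assumption, not anything Pimax has confirmed.

```python
# Sketch of the alternating-eye presentation described above (my reading,
# not confirmed by Pimax). Each eye refreshes at the base rate, but the
# right eye is offset by half a frame, so the display as a whole updates
# twice per frame period -- the "perceived" doubled rate.

BASE_HZ = 75                      # per-eye refresh rate (from the post above)
FRAME_MS = 1000.0 / BASE_HZ       # ~13.3 ms per frame
OFFSET_MS = FRAME_MS / 2          # right eye lags by half a frame (assumption)

def presentation_schedule(n_frames=3):
    """Return (time_ms, eye) events for the first few frames."""
    events = []
    for i in range(n_frames):
        t = i * FRAME_MS
        events.append((round(t, 1), "left"))
        events.append((round(t + OFFSET_MS, 1), "right"))
    return sorted(events)

if __name__ == "__main__":
    for t, eye in presentation_schedule():
        print(f"{t:6.1f} ms -> update {eye} eye")
    # Updates arrive every ~6.7 ms, i.e. 150 per second in total,
    # even though each eye individually only refreshes 75 times per second.
```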

But what is the advantage of this, if everything in the end is 75 Hz and not double, i.e. 150 Hz?
I had understood that with this technology (Brainwarp) a more modest GPU could achieve higher refresh rates. But apparently 120-150 Hz is limited by the processing power of your GPU, so in the end we will need at least a 1080 Ti (if not better) to get a higher refresh rate, and any GPU below that will get less than 90 Hz. I see no advantage in this technology in that case.

@Matthew.Xu

As I understand it, Brainwarp essentially doubles the perceived refresh rate depending on what your PC is capable of outputting. It makes sense that if your PC was only able to handle 60 Hz, then Brainwarp would increase it to 120; if your PC can only do 75 Hz, then with Brainwarp that would be 150 Hz; and if your PC can handle 90 Hz, then with Brainwarp enabled you would perceive 180 Hz.

Can anyone confirm?


I would also like this confirmation. But what they wrote in the Pimax support post does not imply that.

“If your GPU could support”

I’m pretty sure that says exactly what I just explained lol.


Why not just go with two DisplayPorts?

It’s a matter of making it compatible with lower end graphics cards, not a bandwidth issue.

Brainwarp will also be extremely nice for games that already run at a lower frame rate simply because they are more demanding.

If I understand you right, both eyes would be rendered at the same time and one of the two would only be shown with a slight delay? Wouldn’t that be even more confusing to the brain, as new information pops up (a new image asking for attention) that only brings information matching an even older head position?
Example without this interpretation of Brainwarp:
x) head is at position 1
x + 1000/90 ≈ 11 ms) left+right image for head position 1 rendered and shown, head is at position 2
x + 22 ms) left+right image for head position 2 rendered and shown, head is at position 3
etc. => 11 ms latency between action and showing the corresponding image

Example with this interpretation of Brainwarp:
x) head is at position 1
x + 11 ms) left+right image for position 1 rendered, left image for position 1 shown, head is at position 2
x + 11 + 11/2 ≈ 16 ms) right image for position 1 shown
x + 22 ms) left+right image for position 2 rendered, left image for position 2 shown, head is at position 3
x + 27 ms) right image for position 2 shown
etc.
Thus the latency for the left eye would still be 11 ms, while the latency for the right eye would increase to about 16 ms - which is worse than if the images for both eyes had been shown right away.

To get the actual latency down you would have to do:

x) head position 1
x + 1000/180 ≈ 6 ms) image for the left eye for position 1 rendered and shown, head position 2
x + 12 ms) image for the right eye for position 2 rendered and shown, head position 3
etc.
But then you couldn’t render the images for both eyes together, as the two eyes would not only show different (fixed-distance) perspectives but also newer content 6 ms later - so not much could be shared between the two calculations.

That’s why I am confused. I don’t understand the benefit of variant 1, and I have some doubts about how realistic approach 2 would be with a 1070 (or even 1080) GPU, looking at benchmark results for even low-demand non-VR games.
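
For what it’s worth, here is a small sketch of that latency arithmetic, just restating the example numbers above: the plain simultaneous case, the delayed-right-eye reading of Brainwarp, and true alternating rendering at 180 Hz. None of this is confirmed Pimax behaviour.

```python
# Rough latency arithmetic for the three variants discussed above.
# These are just the example numbers from the post, not measurements.

def plain(hz=90):
    """Both eyes rendered and shown together."""
    frame = 1000.0 / hz
    return {"left_ms": frame, "right_ms": frame}

def delayed_right_eye(hz=90):
    """Both eyes rendered together, right eye shown half a frame later."""
    frame = 1000.0 / hz
    return {"left_ms": frame, "right_ms": frame + frame / 2}

def alternate_render(hz_total=180):
    """Each eye rendered from fresh head-tracking data, alternating."""
    frame = 1000.0 / hz_total
    return {"left_ms": frame, "right_ms": frame}

for name, result in [("plain 90 Hz", plain()),
                     ("delayed right eye", delayed_right_eye()),
                     ("alternate render @180", alternate_render())]:
    print(f"{name:24s} left {result['left_ms']:.1f} ms, "
          f"right {result['right_ms']:.1f} ms")
# plain 90 Hz          : ~11.1 ms for both eyes
# delayed right eye    : ~11.1 ms left, ~16.7 ms right (worse for one eye)
# alternate render @180: ~5.6 ms for both eyes, but each eye needs its own
#                        render pass, so little work can be shared
```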


It creates a higher perceived framerate - higher than what most GPUs can produce on a single output.

For example, looking at the DP 1.2 spec, a 980 Ti can only muster around 75 Hz for dual 1440p output (more or less 4K equivalent) - the real maximum bandwidth limit of the DP 1.2 spec.

The 1070, I believe, has DP 1.4? Then it would have the extra bandwidth to possibly achieve 90 Hz.
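
For reference, a rough back-of-the-envelope check of that 75 Hz figure, assuming 24-bit colour and about 20% blanking overhead (the overhead figure is a guess; the DP 1.2 payload of ~17.28 Gbit/s comes from 4 lanes of HBR2 with 8b/10b encoding):

```python
# Back-of-the-envelope check: dual 1440p over a single DisplayPort 1.2 link.
# Assumptions: 24 bits per pixel and ~20% blanking overhead (reduced-blanking
# timings would need less); DP 1.2 payload = 4 lanes * 5.4 Gbit/s * 8/10.

DP12_GBPS = 4 * 5.4 * 0.8          # ~17.28 Gbit/s usable
PIXELS = 2 * 2560 * 1440           # dual 1440p panels
BPP = 24                           # bits per pixel
BLANKING = 1.20                    # assumed timing overhead

def needed_gbps(refresh_hz):
    return PIXELS * BPP * BLANKING * refresh_hz / 1e9

for hz in (60, 75, 90):
    need = needed_gbps(hz)
    verdict = "fits" if need <= DP12_GBPS else "exceeds DP 1.2"
    print(f"{hz} Hz: ~{need:.1f} Gbit/s ({verdict})")
# 60 Hz: ~12.7 Gbit/s (fits)
# 75 Hz: ~15.9 Gbit/s (fits)
# 90 Hz: ~19.1 Gbit/s (exceeds DP 1.2)
```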

Nvidia cards in Optimus-based laptops are restricted to 60 Hz output.

Hmmm… an interesting idea for reducing GPU stress would be to have the option of choosing the base refresh rate for Brainwarp, and that might be the idea behind it.

So here’s a thought on how it could be used to reduce GPU stress.

PiPlay runs a test on your PC and determines you can run the full 90 Hz. This means that with no additional GPU power you can run Brainwarp at the full perceived 180 Hz.

Now you want to try reducing the pressure/load on the GPU.

Since you qualify for the full 90 Hz, there are three settings:

  1. 90 Hz (180 Hz perceived) - full load
  2. 75 Hz (150 Hz perceived)
  3. 60 Hz (120 Hz perceived)

Options 2 and 3 put less stress on your GPU. If you’re a power user you might want to try option 1 to see whether 90 Hz alternating “looks better” than simultaneous mode at 90 Hz.
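
If that’s the idea, the setting would boil down to something like the sketch below, where the doubling of the perceived rate and the linear load scaling are my assumptions about how Brainwarp behaves, not confirmed behaviour.

```python
# Hypothetical Brainwarp base-refresh setting, as suggested above.
# Assumes the perceived rate is simply double the per-eye base rate and
# GPU load scales roughly linearly with the frames rendered per second.

OPTIONS = (90, 75, 60)   # selectable per-eye base refresh rates (Hz)

def describe(base_hz, reference_hz=90):
    perceived = 2 * base_hz                  # assumption: Brainwarp doubles it
    relative_load = base_hz / reference_hz   # rough, assumes linear scaling
    return (f"{base_hz} Hz base -> {perceived} Hz perceived, "
            f"~{relative_load:.0%} of full GPU load")

for hz in OPTIONS:
    print(describe(hz))
# 90 Hz base -> 180 Hz perceived, ~100% of full GPU load
# 75 Hz base -> 150 Hz perceived, ~83% of full GPU load
# 60 Hz base -> 120 Hz perceived, ~67% of full GPU load
```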

This, I submit, would also benefit the 8K-X greatly.

@PimaxVR, am I on the right track on how Brainwarp would be used to reduce GPU stress?
