Brainwarp Software Progress?

I just can’t believe that Tested noticed frame drops on Fruit Ninja. It's funny that Tested noticed it wasn’t running at 90Hz, and then weeks later it comes out that it wasn’t. Anyway, I'm getting off topic now, so I’ll drop this argument.

I think it is perfectly on topic, because the fact that they could see frame drops means that no Brainwarp was being used.

The reason for frame drops on Fruit Ninja could have been the state of the compositor/rendering software, the tracking implementation, or simply running on a laptop (thermal throttling of the CPU or GPU, throttling because of wrong energy-saving settings, etc.).
Please do not forget that it is a massive amount of pixels that has to be rendered, even at the native 2x 2560x1440. I use a GTX 1080 Ti with a Pimax 4K, and I can easily bring my FPS down with higher SS (supersampling) settings. And since high SS improves the picture massively (less jittering, etc.), we could argue that they ran their demos with (probably too) high SS settings because they wanted to impress with the visuals…
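Just to put rough numbers on how fast supersampling blows up the pixel count, here is a quick back-of-the-envelope sketch. The SS values and the per-axis scaling are only example assumptions on my part, not anything Pimax has stated:

```python
# Rough pixel-count sketch for 2x 2560x1440 panels at 90 Hz.
# SS is treated here as a per-axis scale factor (example values only).
width, height, eyes, hz = 2560, 1440, 2, 90

for ss in (1.0, 1.5, 2.0):
    pixels_per_frame = int(width * ss) * int(height * ss) * eyes
    pixels_per_second = pixels_per_frame * hz
    print(f"SS {ss:.1f}: {pixels_per_frame / 1e6:6.1f} MPix/frame, "
          f"{pixels_per_second / 1e9:5.2f} GPix/s")
```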

So it is time for Pimax to learn that being open and honest is the only way to be successful in the long run…

It is funny to read statements that they use reprojection etc., while we do not have any facts about Brainwarp except this one picture of how it is supposed to work.

Again, I think it is time for @PimaxVR to shed some light on this topic! Please. Thank you.
No response for so long is, well, a confession that it is nothing more than a brainfart?

Exactly. No response in a thread that has been this active recently is highly suspect.

It’ll be grand.

Whether it renders one eye at a time, or both at once and performs ASW/ATW on the second, it's all good. It just has to work. Your money is spent now; demanding explanations and having it running now won't change whether it will be sorted for release or not. Just relax and wait until January. All that obsessing over this now can do is stress out all parties.

Not at all upset. I mentioned a while back in this thread that I am pumped and have no regrets about the purchase. I just noticed that after we discussed it for a bit, the conversation was being ignored (and it is).

I’m not at all trying to be negative in that way. But when somebody is feeding me bullshit, whether friend or stranger, I’m sure to let them know that I recognize it.

This.

Plus I find it telling that Pimax tried to excuse the 90Hz problem with the “fact” that we will all get Brainwarp anyway, and so we will all be happy. That is clearly not honest, because so far there simply is no Brainwarp. Nowhere.

But yes, I am happy, even if Brainwarp is not going to happen. I will enjoy the 8K and the 8K X for sure, especially when doing iRacing on my motion simulator :wink:

If you read the original apology issued by Pimax, you will notice that they actually confirmed the V2 prototype “was not stable” too. Whether that means they had to run it at 85Hz or at 90Hz with occasional drops, they did not explain. Maybe the latter. And for the sake of understanding the situation at Tested, it does not matter either way. Tested spotted it. The CEO's response can be understood in this context.

Besides, if you watch the video from Tested, you will notice they had some trouble communicating due to the language barrier.

You may interpret it as being lied to, or simply as a misunderstanding, or as being misinformed. Anyway, it does not help either party to pursue this point now, when V3 is already out, which, as we know, runs at a lower refresh rate, and there is no longer any dispute about it.

The reason I brought it up was to put some context around Brainwarp.

IIRC they said that for V2 it sometimes took them an hour of resetting to get it to work at 90Hz.

I assume that means that once they did manage to get it working at 90Hz, it stayed at 90Hz until they disconnected it or turned off the PC, at which point they would have to try again to get it working.

@Matthew.Xu Please clarify the status of this feature for us users/backers.

1 Dec update (matthew.xu) http://community.openmr.ai/t//4568 :
Software:
a. Optimized optical design
b. Debugging with Brainwarp :clap:
c. Improving ASW algorithm
d. Developing the production testing software and tools.

How Brainwarp was envisioned (my interpretation):

GPU renders: 5120x1440 @ 80 FPS

Brainwarp splits that into 2x 2560x1440 @ 80, doubles it to 160 FPS via an ASW-like technique, and sends 2560x1440 @ 160Hz per eye to the HMD.

Brainwarp per-eye simulation (left eye, for example):
4K_R (right half of the 4K screen)
4K_L (left half of the 4K screen)
ASW - a warped image that simulates a new image from the game world, not an actual rendered frame from the GPU, so it uses math only, no game engine. Reprojection (just repeating the frame) is used if no head movement is detected (from the sensors).

T0 (0ms-6.25ms):
GPU image: from T0 (actual game-world data), available non-warped
4K_L displays the original half of the image from the GPU (only lens corrected)
4K_R displays no image

T1 (6.25ms-12.5ms):
GPU image: still from T0
4K_R: if there was head movement, show a synthetic ASW image (warped + lens corrected); if not, only the lens-corrected image from the GPU
4K_L displays no image (my guess is that a shutter is used for this)

T2 (12.5ms-18.75ms):
GPU image: from T2
4K_L shows the image from the GPU (T2)
4K_R blank

T3 (18.75ms-25ms):
GPU image: from T2
4K_R: if movement, ASW + lens correction; else show the image from GPU_T2 (lens corrected)
4K_L blank
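Here is how I picture that schedule in code. It is only a sketch of my interpretation above, not anything confirmed by Pimax; the 6.25 ms slots come from assuming 160Hz alternation with the GPU rendering at 80 FPS:

```python
# Sketch of the interleaved half-screen schedule described above
# (my interpretation only). The GPU delivers a new frame every other
# 6.25 ms slot; the other half shows either that frame or an ASW warp of it.
SLOT_MS = 1000 / 160          # 6.25 ms per display slot (160 Hz alternation)
GPU_INTERVAL_SLOTS = 2        # GPU renders at 80 FPS -> new frame every 2 slots

def schedule(num_slots=4, head_moving=True):
    for slot in range(num_slots):
        t = slot * SLOT_MS
        gpu_frame = (slot // GPU_INTERVAL_SLOTS) * GPU_INTERVAL_SLOTS  # T0, T0, T2, T2
        if slot % 2 == 0:
            left = f"GPU frame T{gpu_frame} (lens corrected)"
            right = "blank (shutter closed)"
        else:
            left = "blank (shutter closed)"
            right = (f"ASW warp of T{gpu_frame}" if head_moving
                     else f"GPU frame T{gpu_frame} (lens corrected)")
        print(f"{t:5.2f}-{t + SLOT_MS:5.2f} ms  4K_L: {left:34s} 4K_R: {right}")

schedule()
```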

As for the actual data needed: since we refresh 1/2 of the 4K screen at 160Hz, we need 1280x1440 @ 160 per eye transferred from the PC to the HMD, so 2560x1440 @ 160 for the whole HMD.
The challenge is whether one ANX7530 can handle this signal and split it into 1280x1440 @ 80 per half-eye.
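A quick calculation of what that link would have to carry, using my numbers from above and assuming plain 24-bit RGB with blanking and overhead ignored (I don't know the ANX7530's actual limits, so this is only a ballpark):

```python
# Rough link-bandwidth check for 2560x1440 @ 160 Hz total
# (i.e. 1280x1440 @ 160 per eye), assuming 24 bits per pixel
# and ignoring blanking/compression -- my assumption only.
width, height, hz, bpp = 2560, 1440, 160, 24
gbps = width * height * hz * bpp / 1e9
print(f"~{gbps:.1f} Gbit/s of raw pixel data")   # ~14.2 Gbit/s
# For comparison, DisplayPort 1.4 HBR3 carries roughly 25.9 Gbit/s of payload.
```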
Nvidia also has an idea to use ASW/ATW to boost the refresh rate up to 16000Hz; Gen 3 anyone? :slight_smile:
But that requires more processing on the HMD, so we must wait for Moore’s law…

Brainwarp does not double the speed of the screens; what it does is fool the brain by drawing only half of the images needed and making it fill in the missing part.
The apparent gain in speed comes from the two screens not being updated at exactly the same time.

It is not easy to understand: you actually get a synthetic frame after 1/160 s. ASW (on the Rift) does this from 45 to 90; this is the same technique, but used for 80 to 160. And to make the screen appear like 160Hz, I think Pimax uses a shutter, so it hides half of the screen after 1/160 s (6.25 ms), and the brain perceives 160. The GPU renders at 80, and the ASW algorithm creates the missing frames based on sensor data. Understand now? And it is not two screens, it is two halves of the same screen :slight_smile: . So you update 1/2 of one eye, not the whole eye (screen). The 4K already has this solution… They call it async 90Hz, I think.
It uses math to generate a “fake frame”, which will reduce MTP (motion-to-photon latency).

At least, only this makes sense to me… If it does not work like this, Brainwarp will not reduce the perceived MTP time.
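To illustrate what I mean by “math, not the game engine”: a very crude sketch of generating a synthetic in-between frame by just shifting the last rendered image according to the yaw change reported by the sensors. Real ASW/ATW does a proper per-pixel reprojection with depth and motion vectors; the function name and the 90-degree FOV here are placeholders of mine, only meant to show that no game-engine render is involved:

```python
import numpy as np

# Crude illustration of a "synthetic frame": shift the last rendered image
# horizontally by the yaw change since it was rendered. Not Pimax's actual
# algorithm -- just the basic idea of warping with math instead of rendering.
def synthetic_frame(last_frame: np.ndarray, yaw_delta_deg: float,
                    fov_deg: float = 90.0) -> np.ndarray:
    width = last_frame.shape[1]
    shift_px = int(round(yaw_delta_deg / fov_deg * width))
    return np.roll(last_frame, -shift_px, axis=1)  # cheap horizontal shift

last_frame = np.zeros((1440, 1280, 3), dtype=np.uint8)   # last real GPU frame
warped = synthetic_frame(last_frame, yaw_delta_deg=0.5)  # head turned 0.5 deg
```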

No, they render one eye with the corresponding part of a frame and erase the image of the other eye; shortly after, they erase the image of the first eye and render the other eye with its part of the next frame.

One eye sees a complete image (its corresponding half of a frame) while the other sees nothing.

I do not know if they also use some kind of ASW; I think that alternation is what they are referring to when they say they get more frames.

[quote=“SpecsReader, post:94, topic:4130, full:true”]
And it is not two screens, it is two halves of the same screen :slight_smile: .[/quote]

It's not; there are 2 screens.

@Cdaked
That was my first assumption, but I can see how Brainwarp works on the 4K (async 90Hz, etc.). And Pimax mentions ASW a lot, so I guess it can be used to reduce perceived MTP.

@industria
I wrote a per-eye simulation. The 4K uses one screen and Brainwarp… Based on that I made my interpretation. Read carefully. You are also confusing ASW with frame doubling: ASW uses math (an algorithm) to warp the image based on sensor data, so it is not a doubled frame but a new synthetic frame, made by math rather than by the game engine. That is why it can be faster than the GPU can render real frames (from the game engine).

Maybe for the Pimax 4K, but remember that it has only one screen. It's not the same with the 8K version, which has two.

My guess, again, is that they refresh 1/2 of one eye at 160Hz; the panel supports this function (on input, async). I know how many panels the HMD uses, lol xD

Nope. They can’t refresh all or part of one screen faster than 82Hz.
They would also have to erase the other half of the image as well.

I forgot that you can’t just show 1/2 of an eye's image, so a shutter is used to close the whole eye for the time when the two halves don’t show the same frame, since they refresh asynchronously.
Just a reminder: the Pimax 4K had 90Hz async. The problem with that was an added delay of around 1/120 s between the game data and what is displayed, since the shutter ran at 120Hz and games at 60fps. So it was not an optimal solution.

In the Pimax 8K, we can actually send new game data (since DP allows 1440p @ 160Hz input), but we won’t run the game at 160 FPS; we will use ASW for a perceived doubled frame rate. I think that was the idea; if not, my bad :slight_smile: . The shutter would run at 160Hz, which I think is a safe range to avoid flicker.
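Putting numbers on that delay, based purely on my interpretation above (up to one shutter period of lag when the shutter runs at twice the game frame rate):

```python
# Worst-case extra delay between game data and display when the shutter
# runs at twice the game frame rate -- based on my interpretation above.
def worst_case_delay_ms(shutter_hz: float) -> float:
    return 1000.0 / shutter_hz   # up to one shutter period of added latency

print(f"Pimax 4K (120 Hz shutter, 60 fps game): up to {worst_case_delay_ms(120):.2f} ms")
print(f"Pimax 8K (160 Hz shutter, 80 fps game): up to {worst_case_delay_ms(160):.2f} ms")
```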