Pimax brainwarp technology

Pimax said:

“Brainwarp: render and display image in a sequence. i.e. For each time, only one eye can see a 4K image. Pimax 8K renders a single 4K image at 150/180 times per second, but users perceive a complete 8K at 150/180 Hz with high frame rate.”

This is how I interpret that:
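Roughly like this sketch, where each eye's 4K panel runs at 90 Hz but the two panels are offset by half a refresh period. Note the 90 Hz panel rate and half-period offset are my own assumptions, reverse-engineered from their 180 Hz claim:

```python
# One way to read the Brainwarp quote: each eye's 4K panel refreshes at
# 90 Hz, but the two panels are phase-shifted by half a refresh period,
# so a new 4K image lands on *some* eye 180 times per second.

PANEL_HZ = 90
period_ms = 1000 / PANEL_HZ        # ~11.1 ms between frames on one panel
offset_ms = period_ms / 2          # ~5.6 ms shift between left and right

events = []
for i in range(4):
    events.append((i * period_ms, f"L{i}"))              # left-eye frames
    events.append((i * period_ms + offset_ms, f"R{i}"))  # right-eye frames

for t, frame in sorted(events):
    print(f"{t:6.1f} ms  display {frame} (4K)")
# -> L0, R0, L1, R1, ... alternating: 180 single-eye updates per second
```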

But this is obviously different from their own picture. In fact, in their own picture they are NOT displaying R1 at 4K. This makes no sense to me at all.

So I think I'm giving up; all of this just does not make any sense.

Ouch! My brain hurts. Let me start warping back to my original brain!!

I think Brainwarp is one more "genius and patented" solution, like the focus auto-adaptation.
But really it's just about the 60 Hz LCD matrix and reducing flicker: the refresh phase of the first and second screen (and their LCD shutters) is simply shifted by 8.3 ms, i.e. half of a 60 Hz refresh period (1000/60/2 ≈ 8.3 ms). The "blank" half-frames are just the LCD shutter closing.
Just read the old UploadVR article. 150/180 real fps at 4K??? It's just madness.

I would say it's much like asynchronous timewarp: controlling dual displays to create a pseudo frame-rate increase that one panel alone could not achieve, by putting the display timing off between the two panels so their refreshes are phase-shifted.

How else do you exceed a display panel's max refresh rate?

The same idea is in effect with the shutter glass in the 4K model.

Here's an idea: why not let both panels run at 90 Hz synchronously, let the game render at 45 Hz for both the left and right eye, and let SteamVR interpolate the other 45 missing frames, exactly like it's doing now? Wouldn't this look at least as good as this whole Pimax 'brainwarp' idea? I think so.

I wish Pimax would stop mentioning brainwarp altogether; it seems like false, misleading marketing to me. They don't need it, they have an extremely awesome product already. No need to mention this brainfart … err, brainwarp thing :) Rather, focus on the REAL reasons why you can already enjoy the 8K with, let's say, a GTX 1070 (upsampling from 2x1080p to 2x4K for example, eye tracking could make a difference in distortion rendering, slightly lower game quality settings might already do it, etc).
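To put rough numbers on that upsampling point, here's a quick sketch. The resolutions are the ones mentioned above; the percentage is just raw pixel counts, not an actual GPU benchmark:

```python
# Rough pixel arithmetic for the upsampling argument (illustrative only):
# rendering at 2x1080p and upscaling to the 8K headset's 2x4K panels
# means pushing only a quarter of the native pixel count per frame.

render = 2 * 1920 * 1080      # two eyes rendered at 1080p
native = 2 * 3840 * 2160      # two native 4K panels
print(f"rendered pixels/frame: {render:,}")   # 4,147,200
print(f"native pixels/frame:   {native:,}")   # 16,588,800
print(f"GPU renders only {render / native:.0%} of native")  # 25%
```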

Agreed, just try it on. In the end it doesn't really matter how it does it, simply that it gives awe to those who behold it.

Been trying to explain black frame insertion to these guys for a while now, lol. Also, if brainwarp can interpolate frames like Asynchronous Spacewarp (not Timewarp) does on the Rift, then it may work.

"However I'm pretty sure this is nonsense. No way the brain is going to feel it's looking at a 90 Hz game when you forward 2 images that belong together, created at 45 Hz, async. I don't buy that at all."

This is exactly what Asynchronous Spacewarp on the Oculus Rift does. Spacewarp is different from Asynchronous Timewarp. ASW runs the game at 45 fps and then creates synthetic frames to interpolate up to 90 Hz; it's why Oculus was able to lower its minimum spec after Oculus Connect 3.
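A minimal sketch of that scheduling idea, assuming a 90 Hz display fed by a 45 fps game. This is not Oculus's actual implementation; the motion-extrapolation step is reduced to a stub:

```python
# Hypothetical sketch of ASW-style frame pacing: the game submits real
# frames at 45 fps, and the runtime fills every other 90 Hz display slot
# with a synthetic frame derived from the last two real frames.

def extrapolate(prev_frame, curr_frame):
    # Stand-in for the real work: ASW estimates per-pixel motion between
    # the two frames and warps curr_frame half a frame forward in time.
    return f"synthetic({prev_frame}->{curr_frame})"

real_frames = ["game0", "game1", "game2", "game3"]   # arrives at 45 fps
display = []                                          # consumed at 90 Hz

for prev, curr in zip(real_frames, real_frames[1:]):
    display.append(curr)                      # real frame in one slot
    display.append(extrapolate(prev, curr))   # synthetic frame in the next

print(display)
# ['game1', 'synthetic(game0->game1)', 'game2', 'synthetic(game1->game2)', ...]
```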


I recently found out that Virtual Desktop added multi-GPU and 8K video support, so I'm no longer super worried about brainwarp. As long as it fixes the ghosting issues present in the 4K and doesn't cause any flickering, I couldn't care less.

No it's not. From Oculus's ASW documentation:

"ASW applies animation detection, camera translation, and head translation to previous frames in order to predict the next frame. As a result, motion is smoothed and applications can run on lower performance hardware.

The Rift operates at 90Hz. When an application fails to submit frames at 90Hz, the Rift runtime drops the application down to 45Hz with ASW providing each intermediate frame.

By default, ASW is enabled for all supported Rifts.

ASW tends to predict linear motion better than non-linear motion. If your application is dropping frames, you can either adjust the resolution or simply allow ASW to take over."

That's exactly what I mentioned in my last post. I think this makes much more sense than Pimax's brainfart. ASW just runs both eyes synchronously and interpolates the missing frames. Pimax's brainwarp runs asynchronously, and that seems to be all it actually does: insert half a phase of delay between the left and right eye.

Of course they could combine it with interpolation, but then that would effectively mean that 100% of one eye's images are interpolated. It just doesn't make sense. Oculus ASW makes much more sense.

But even then, there's no way 45 Hz ASW is going to yield the exact same experience as true rendered 90 Hz. It's better than nothing of course, but interpolation can never be on par with the real thing.


Simple truth: we won't know how well it works or not till we try it.

You know I'm all for honesty. I understand that for Pimax the biggest challenge is that people need expensive hardware, which severely limits the size of their market. To me it SEEMS they're now trying to lure people in with this 'brainwarp' excuse: "No, of course you don't need a 1080 (Ti), because we've got BRAINWARP." To me it just really sounds like false advertising. But that's why we need good reviews. If they get some industry peers to review it (before opening the Kickstarter) and run some in-depth tests on different hardware with brainwarp enabled/disabled, then everybody will know exactly how good it is. But somehow I have this feeling that's not going to happen. I really hope I'm wrong though.

I see you are easily getting to the point I'm making.
So there is no 180 Hz at any moment, only 90 Hz.
The point with VR in general is that it needs twice the performance of a usual single display, because the picture has to be processed for each eye separately.
The CV1 has dual displays, and the Pimax 4K has a single one with a split image: the native resolution you actually see is 1280x1440 per eye, but the processing resolution is 2560x1440. Same with the 8K: the native resolution will be 4K per eye, and that is what you see, but because each eye's picture is processed separately, it gets processed like 8K.
And here is the catch: if we reduce it to 45 Hz (fps), we cut that 8K resource drain in half, which is the same load as 4K at 90 Hz (fps). Brainwarp then runs this asynchronously, which fools the brain: because of how the human eye and brain perceive frequency, it is seen as one 4K picture at 90 Hz even though it is actually processed as 8K at 45 Hz (2x4K at 45 Hz).
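As a quick sanity check on that equivalence, here is the raw pixel-rate arithmetic, using 3840x2160 per eye. Note this counts pixels only; it ignores per-frame CPU and geometry costs, so the real GPU load won't halve exactly:

```python
# Pixel-throughput check for the "2x4K at 45 Hz == 1x4K at 90 Hz" claim.
pixels_4k = 3840 * 2160                 # one 4K panel (per eye)

both_eyes_45 = 2 * pixels_4k * 45       # the "8K" workload at 45 Hz
one_image_90 = pixels_4k * 90           # a single 4K stream at 90 Hz

print(f"2 x 4K @ 45 Hz: {both_eyes_45:,} pixels/s")
print(f"1 x 4K @ 90 Hz: {one_image_90:,} pixels/s")
assert both_eyes_45 == one_image_90     # identical raw pixel rate
```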

G-Sync and FreeSync are just an example of the same technology under different branding, and they have nothing to do with Brainwarp or VR at all.
The same technology under different branding as Pimax's Brainwarp is ATW/ASW on the CV1, or Asynchronous Reprojection on the Vive; that is more a copyright issue than a technology one, but in essence it's the same stuff.

Ah, I am still confident that my R9 390 will be able to get reasonably good results.

Hell, my 2x7950 CrossFire setup does amazingly well for its age on the 4K.

We know optical foolery works. Just look at those pics you can view on a monitor that appear very 3D without a 3D display.

Not even a handful of VR games support multi-GPU setups at the moment, so that's really not a good VR setup at all. My GTX 1080 does fairly well on the Pimax 4K, but that's only running at 1440p on a single screen, and even then I can easily bring it to its knees if I crank up details in some games. But some games will scale well, like @Headcool pointed out, so with a GTX 1080 you can probably already enjoy 2x1440p on the Pimax 8K with good detail in some games. But I'm really not counting on enjoying 2x4K native resolution with a GTX 1080 at good game render detail. I really think we need Volta for that.

I'm really not sure what you're going to do with that R9 390 … Sure, if you run some indie games and/or set game details to the lowest, you can probably run some things at 2x1080p… But it can hardly be an option for the serious VR gamer with the Pimax 8K.

Oh yeah, especially with eye tracking, which will allow this …

If the game supports it. I can guarantee you that by Q3 2018 there still won't be even a handful of big titles supporting it. Most likely even zero. We need the big guys to jump in first. If Oculus or HTC bring out a headset with this feature, THEN things will start to roll. Pimax is just way too small, I'm afraid, to move the industry. Sure, some other smaller companies are now using eye tracking too, like Fove and Deepoon (if I understand correctly, my E3 has it too). But I still highly doubt this will move developers.

Don’t get me wrong, it’s awesome that Pimax will support it ! But you make it seem like this is going to get us huge advantages right away. The only possible advantage for now is in the HMD rendering (and that’s just a minor GPU gain compared to game rendering). And seeing how slow Pimax development is … I doubt we’ll even have that at launch.

Oh, and something else: while eye tracking sounds extremely potent in theory, it has yet to be seen how well it works in practice, depending on the specific implementation. Latency is a huge thing when it comes to eye tracking; you will want the next frame to be 'in focus'. Accuracy is also very important. Just like we now have all these problems with drift, I can already see similar problems with eye tracking, where the game renders the spot you're actually looking at out of focus. If that happens too often, you won't use the feature anymore…

Sorry to be the party pooper here, LOL, but people's expectations need to be realistic, otherwise it's going to be one big show of disappointed people after the Kickstarter launch.

BTW, I doubt SteamVR even supports eye tracking at the moment. I've been googling but can't find anything about it at all. In fact, I've just spent 15 minutes trying to find a single VR app that supports eye tracking (to test with my Deepoon E3) and haven't found anything at all … Pretty sure SteamVR doesn't support eye tracking right now. So I've had my E3 for several months already and can't use its eye tracking at ALL. I'm sure it will be the same with the Pimax 8K. We'll have to hope that HTC supports it in their Vive 2; THEN SteamVR will support it, and then developers will start writing for it.


I’ve moved the mgpu discussion to the right topic btw.


I've been looking at some benchmarks. You're right about Ashes of the Singularity; it's quite amazing how well it scales. It indeed only takes about 20%-25% more GPU to go from 1440p to 4K (2.25x the pixels). Amazing indeed. According to Anandtech:

"this is the game making the best use of DirectX 12's various features, from asynchronous compute to multi-threaded work submission and high batch counts. What we see can't be extrapolated to all DirectX 12 games, but it gives us a very interesting look at what we might expect in the future."

So that's hopeful! Then again, other games like Crysis 3 scale really horribly: FPS drops in half when going from 1440p to 4K. So it really depends on the game; if it's programmed right, you only need around 20% more GPU power to push more than twice the pixels.
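For reference, the raw pixel math behind that 1440p-to-4K comparison (my arithmetic, not Anandtech's numbers):

```python
# Pixel counts behind the 1440p -> 4K scaling comparison above.
qhd = 2560 * 1440
uhd = 3840 * 2160

print(f"1440p: {qhd:,} pixels")       # 3,686,400
print(f"4K:    {uhd:,} pixels")       # 8,294,400
print(f"ratio: {uhd / qhd:.2f}x")     # 2.25x, not just 2x

# Ashes of the Singularity: ~20-25% more GPU time for 2.25x the pixels.
# Crysis 3: frame time roughly doubles (FPS halves) for the same step.
```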