Pimax brainwarp technology

Pimax, please kindly explain a bit better how the new brainwarp technology works with regard to frame rate. On my TV screen I usually run games with the resolution set to 4K; at present, with a Gigabyte GTX 980 Ti G1, the average FPS is 30 to 60. What would happen with the Pimax 8K, what would the frame rate be? I see there is confusion in the community, and I personally think the same as user “skindred” posted a few days ago:

Like I explained before, you need to keep in mind that there are 2 render processes going on:

  1. Game renders left and right eye image, like you’d see them on your monitor
  2. SteamVR or the HMD driver takes these images and processes them to a format that’s suitable for viewing through a lens and forwards the images over HDMI. This is called ‘render distortion’.

So this implies that in all of the above T1, T1’, T2, T2’, the game will render both a left eye and a right eye image. The Pimax HMD driver could choose to only do the render distortion for 1 eye and forward only that one eye. This will save a bit of GPU power (that would be needed for the render distortion of the 2nd eye), but really it’s the game render that takes by far the most time, so the saving will be very small.
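To make the relative costs concrete, here’s a minimal Python sketch of that two-step process; the millisecond figures are made-up placeholders, just to illustrate why skipping the distortion pass for one eye barely helps:

```python
# Rough cost model of one VR frame, to show why skipping the lens-distortion
# pass for one eye saves very little. Millisecond figures are made-up
# placeholders, not measurements.

GAME_RENDER_PER_EYE_MS = 4.5   # geometry + shading for one eye (assumed)
DISTORTION_PER_EYE_MS  = 0.4   # lens-distortion pass for one eye (assumed)

def frame_time_ms(distorted_eyes: int) -> float:
    """The game always renders both eyes; only the distortion count varies."""
    return 2 * GAME_RENDER_PER_EYE_MS + distorted_eyes * DISTORTION_PER_EYE_MS

both = frame_time_ms(2)
one = frame_time_ms(1)
print(f"distort both eyes: {both:.1f} ms, distort one eye: {one:.1f} ms")
print(f"saving: {100 * (both - one) / both:.1f} %")   # only a few percent
```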

The other (WAY better) option would be if the game only rendered 1 eye per T1, T1’, T2, T2’ etc., like shown in your 120 Hz table. However, this will only work if the game knows that it has to do it and knows how to do it. This means that a game has to be programmed specifically with this in mind (just like a game would need to be programmed specifically to take advantage of a multi-GPU setup). Since Pimax seems to be the only one using ‘brainwarp’ (it seems to be their idea), I highly doubt any game at all supports this.

Long story short: don’t count too much on ‘brainwarp’, it seems more like a marketing gimmick than a real viable strategy.


But are you sure that the Pimax HMD driver cannot do the render distortion for both eyes and use them separately? I would like to hear from the Pimax experts; it sounds strange that they would be cheating…

@deletedpimaxrep1 said above she’d double check with her engineers and come back to us. But what I just outlined IS how game rendering works, so I’d be highly surprised if I’m wrong in my assumption that a game needs to support it.

But I’m hoping too that @deletedpimaxrep1 will update us with the reply from her engineers.

Well, the second option is not really better than the first option. Of course it would be more fluid, but it would also need much more processing power.
With the first option, the game needs to render 2*4K@90fps. With the second option, it needs to render 4K@180fps.
Doubling the fps means you need hardware that is more than twice as fast. Doubling the resolution reduces the framerate to about 65 - 85% of the original framerate (heavily game dependent). This is because only the last parts of the rendering pipeline like fragment shaders need to be executed more often. This is also the reason why the framerate of games running in 4K is not 25% of the framerate of games running in FHD, but much higher (about 50%).
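A rough back-of-the-envelope in Python, using the 65-85% figure quoted above (75% assumed as a middle value), to show why option 2 costs more GPU than option 1:

```python
# Rough comparison of the two options, using the scaling figures quoted
# above (fps drops to roughly 65-85% when the pixel count doubles; 75% is
# used as a middle value). These numbers are assumptions, not benchmarks.

SCALE_PER_PIXEL_DOUBLING = 0.75

# Option 1: 2*4K @ 90 fps -> one pixel-count doubling over a 4K@90 baseline
gpu_factor_option1 = 1.0 / SCALE_PER_PIXEL_DOUBLING

# Option 2: 4K @ 180 fps -> double the frame rate, so every stage of the
# pipeline (CPU, geometry, shading) has to run twice as often
gpu_factor_option2 = 2.0

print(f"option 1 (2*4K@90): ~{gpu_factor_option1:.2f}x a 4K@90 GPU")
print(f"option 2 (4K@180):  ~{gpu_factor_option2:.2f}x (or more) a 4K@90 GPU")
```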

However, since the Pimax 8K uses a 90 Hz display, I don’t think the second option is necessary at all. Resolution and FOV are the critical points with the current VR generation. 90 Hz is not perfect, but it is definitely ok.

I also don’t think that performance is a big problem. Most VR games are indie games without modern graphics, which don’t need that much performance at all.
Also, you can reduce graphical effects in more demanding games from Ultra to High settings. This is usually barely noticeable but gives a significant boost in fps. If that is not enough, it is always possible to go down to 2xQHD or 2xFHD.
That is still many more pixels than with the Vive or Oculus, and still no screen-door effect because of the upscaling.
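For reference, a quick pixel-count comparison (Python) of those fall-back resolutions against the 1080x1200-per-eye panels of the Vive / Rift CV1:

```python
# Pixel counts of the fall-back resolutions mentioned above, compared with
# the 1080x1200-per-eye panels of the original Vive / Rift CV1.

def total_pixels(width: int, height: int, eyes: int = 2) -> int:
    return width * height * eyes

targets = {
    "2 * 4K (UHD)": total_pixels(3840, 2160),
    "2 * QHD":      total_pixels(2560, 1440),
    "2 * FHD":      total_pixels(1920, 1080),
    "Vive / CV1":   total_pixels(1080, 1200),
}

for name, pixels in targets.items():
    ratio = pixels / targets["Vive / CV1"]
    print(f"{name:12s}: {pixels / 1e6:5.1f} Mpixels ({ratio:.1f}x Vive/CV1)")
```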

Also, there are applications where performance does not matter at all, like watching movies, or using VR as a huge virtual desktop without needing to purchase a ton of monitors.


Yes, you’re right. Brainwarp won’t bring any GPU saving beyond the render distortion. If you ask me, Pimax should stop touting it as such.

It’s just that the first idea, where the game renders both eyes and Pimax processes/distorts/uses only one eye, seems highly inefficient and a waste of GPU power. That’s why the other idea seems better to me. But I think you’re right; what you’re saying about game rendering regarding fps vs resolution makes sense.


@Headcool so to sum up:

  1. Game renders 2*4K (or higher/lower of course) at 90 Hz. This would indeed be the best in terms of GPU optimization and it IS how it normally works anyway.
  2. When using brainwarp, Pimax distorts only one eye per timeslot and sends these at 4K (or lower/higher) at 180 Hz to the HMD (see the sketch below). So it actually does use both eyes, but just needs 2 timeslots to send them to the HMD, at twice the framerate the game is running.
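To picture point 2, here’s a rough Python illustration of the alternating transmit slots; the timing is illustrative only, not something Pimax has confirmed:

```python
# Rough illustration of the alternating transmit slots described in point 2.
# The game still renders both eyes at 90 Hz; the driver then forwards them
# one eye per 180 Hz slot. Timing values are illustrative only.

GAME_HZ = 90
SLOT_MS = 1000 / (2 * GAME_HZ)   # two transmit slots per rendered game frame

for game_frame in range(2):      # show the first two game frames
    t0 = game_frame * 2 * SLOT_MS
    # both slots carry data from the SAME game render, just split per eye
    print(f"t={t0:5.2f} ms  send LEFT  eye of game frame {game_frame}")
    print(f"t={t0 + SLOT_MS:5.2f} ms  send RIGHT eye of game frame {game_frame}")
```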

But would this have any advantage to the user over just sending 2*4K at 90 Hz to the HMD? Thinking about all this, what’s the user advantage of brainwarp anyway? It seems more like a DP 1.4 protocol requirement than something actually beneficial to the user?

Well, I think you got the assumption right. Both screens run at 90 Hz but async. Eye A gets the update as soon as the game has rendered it; eye B gets the update half a frame later. So I think it is an artificial delay, which might have a positive psychological effect: since there is movement somewhere, the brain has to process this new information and is therefore less aware that nothing has happened in the other eye.

I don’t know if this is necessary for DP 1.4, since it can handle 8K@60Hz, which is equal to 2*4K@120Hz. The Pimax 8K needs 2*4K@90Hz. However, that concerns throughput and not latency.
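For what it’s worth, the raw pixel-rate arithmetic behind that comparison (ignoring blanking intervals, bit depth and compression) looks like this:

```python
# Raw pixel-rate comparison behind the DP 1.4 remark (ignores blanking
# intervals, bit depth and DSC compression).

def pixel_rate(width: int, height: int, hz: int, panels: int = 1) -> int:
    return width * height * hz * panels

dp14_reference = pixel_rate(7680, 4320, 60)       # 8K @ 60 Hz
dual_4k_120    = pixel_rate(3840, 2160, 120, 2)   # equal pixel rate
pimax_8k_need  = pixel_rate(3840, 2160, 90, 2)    # what the Pimax 8K needs

print(f"8K @ 60 Hz    : {dp14_reference / 1e9:.2f} Gpixel/s")
print(f"2*4K @ 120 Hz : {dual_4k_120 / 1e9:.2f} Gpixel/s (same rate)")
print(f"2*4K @ 90 Hz  : {pimax_8k_need / 1e9:.2f} Gpixel/s (25% less)")
```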


This might actually be true. Honestly I don’t know if it works like this, but it indeed might have a positive psychological effect on the brain, who knows. Yet Pimax touts “Brainwarp” as THE solution for people with lower system specs. I don’t see at all how this could be true.

Anyway, I really love this realization. You’re totally right: 2x4K is NOT going to take double the GPU power of 1x4K. More likely indeed about 70% more. And like you said, lowering in-game quality settings from Ultra to High/Medium will make a huge improvement. You also made a very fair point that most VR games currently available are indie games that don’t use advanced graphics at all. So even with a GTX 1070 you might already enjoy some basic indie games on the Pimax 8K.

The GTX 1080 offers about 70% improvement over the GTX 980. So if Nvidia repeats that again with the GTX 2080 then we’ll be good anyway, even for games with advanced graphics.

You might be interested in this article. The VR runtime does a lot of work.

Here’s the article

From what I understand, the game renders the complete non-VR basic frame, then the VR runtimes work together, i.e. Unreal and, say, Oculus.

Either way, a good article, but it still goes along with the idea of frame-rate-based motion sickness.

So as you can see timewarp & brainwarp do not need to be supported directly by a program.

Timewarp is something completely different from brainwarp. Timewarp is NOT a game rendering process but a process done by the HMD renderer or by SteamVR: when SteamVR (or the HMD renderer) doesn’t receive game frames fast enough, it interpolates them itself.

Brainwarp, on the other hand, is something invented by Pimax (?) which renders only 1 eye per timeslot. Either the game renders both eyes and they are sent async, or the game renders only one eye per timeslot and that is sent on its own. In that last scenario the game would need to support it. However, like @headcool pointed out, this wouldn’t make much sense, since it takes much more GPU power for a game to run at double the fps with half the resolution than at double the resolution with half the fps.

So you’re right, most likely the game won’t need to support it. However, it also won’t bring any GPU saving at all, while Pimax seems to claim it somehow does. And it really remains to be seen whether it has ANY benefit to the user at all. I doubt it.

Something to keep in mind.

During brainwarp it doesn’t necessarily need to double the frames. By delivering a blank to one eye for half a frame at 90 Hz, the result is that each eye gets 90 Hz, but with the timing delay the eyes combined will see 180 Hz. Even though we don’t consciously see it, the left & right eye combine the full eye frame with the blank.

I.e., combined frames:
Left pic, right blank = 1 frame
Left blank, right pic = 1 frame

With the delay, the eyes perceive more frames than are rendered, due to the blanks.
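If I follow the idea, the per-panel schedule would look roughly like this sketch (my own illustration of the hypothesis, not anything confirmed by Pimax):

```python
# Sketch of the blank-frame hypothesis: each panel refreshes at 90 Hz, but
# picture and blank alternate between the eyes with a half-frame offset, so
# "something changes" 180 times per second across both eyes combined.
# This is the poster's idea, not a confirmed Pimax mechanism.

PANEL_HZ = 90
HALF_FRAME_MS = 1000 / (2 * PANEL_HZ)

for slot in range(4):                          # first four half-frame slots
    t = slot * HALF_FRAME_MS
    left = "PIC  " if slot % 2 == 0 else "BLANK"
    right = "BLANK" if slot % 2 == 0 else "PIC  "
    print(f"t={t:5.2f} ms  left={left}  right={right}")
```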

The article I posted from Intel shows better how the game interacts with the game VR runtime & the headset runtime.

Steam, for example, does the initial handshake, then passes it off to the headset runtime (Steam/Oculus/OSVR/Pimax/etc.).


So you’re saying:
T1: Right blank. Left shows pic.
T2: Left blank. Right shows pic

Etc. That’s indeed the idea. But what frequency is this then? I’m not sure I’m following what you’re trying to say.

Remember that Pimax themselves stated:

“Pimax 8K renders a single 4K image at 150/180 times per second, but users perceive a complete 8K at 150/180 Hz with high frame rate.”

So they render one eye but at double the framerate. This is NOT going to save GPU power at all.
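Quick arithmetic on why that claim saves nothing on the GPU side: one 4K eye at 180 fps is exactly the same pixel throughput as two 4K eyes at 90 fps.

```python
# One 4K eye at 180 fps vs two 4K eyes at 90 fps: identical pixel throughput,
# so there is no rendering saving to be had from the quoted claim.

pixels_4k = 3840 * 2160

brainwarp_claim = pixels_4k * 180      # a single 4K image, 180 times per second
normal_stereo   = pixels_4k * 2 * 90   # both 4K eyes, 90 times per second

print(brainwarp_claim == normal_stereo)   # True
```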


Even 70% more is quite a pessimistic assumption. I have looked at some game benchmarks and how the frame rate drops from FHD to 4K on a 1080 Ti across a handful of games. The framerate dropped to 40-90% of the original value. Some games like Ashes of Singularity only dropped 10%. The average was a framerate of ~55% of the original value. With the assumption of a linear (optimistic) relationship between frame rate decrease and pixel count increase, that leads to the following result: every time the pixel count is doubled, the frame rate is reduced to 74%. If it is doubled two times, like from FHD to 4K, the end result is 1 * 0.74 * 0.74 = 0.55 = 55%.
Since 0.74 * 1.35 ≈ 1, this means you only need 35% more performance to render 2x the pixel count. However, the relationship between frame rate decrease and pixel count increase is not linear, so 35% is not enough. It should be a bit more (maybe 50%+).
From a stock 1080 to a stock 1080 Ti is a performance gain of about 30%. Overclocking the 1080 Ti by about 20% leads to a performance increase of 56% (1.3 * 1.2 = 1.56) over a stock 1080. That means if a game runs at x fps on a stock 1080 at 4K, it should run at x fps on an overclocked 1080 Ti at 2*4K.
However, keep in mind that this is the average. Some games scale well, some don’t. In the end you will have to cut back the effects or the resolution in some games. I don’t expect games like Ark Survival, which barely run at 4K at 60 fps, to run at 2*4K at 90 fps without adaptions. But going from 2*4K to 2*QHD and from Ultra settings to High settings should result in 70+ fps, which should definitely be playable. Going to Medium settings should result in 90+ fps. And that is a highly demanding game.
The brainwarp technology might come in handy here, when dealing with framerates below 90. That is, if there is a positive psychological effect. But since people have played games on the Pimax 4K @ 60 Hz without great problems, it might really help. Adaptation is also important. People might experience motion sickness with VR the first time, but after tens of hours spent in VR this effect diminishes, since the brain “learns” to process contradictory information. At least that is what other people have described.
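Just to double-check the scaling arithmetic above (0.74 per pixel-count doubling is the assumed average factor taken from those benchmarks):

```python
# Sanity check of the scaling chain above. 0.74 is the assumed average
# framerate factor per pixel-count doubling taken from the benchmarks.

per_doubling = 0.74

print(f"FHD -> 4K (two doublings): {per_doubling ** 2:.2f}  (~55%)")
print(f"GPU needed for 2x pixels : {1 / per_doubling:.2f}x  (~35% more)")
print(f"OC 1080 Ti vs stock 1080 : {1.3 * 1.2:.2f}x")
```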


Hard to say. But I am thinking the headset internals are likely bumping up the frequency, much like the PSVR box does to get 120 Hz.

But with the idea I presented, the blank is merged with the pic & counts as a completed frame.

Much like I suggested, the shutter-glass frequency could trick the mind into perceiving more frames, as the blanking of the shutter glasses, while not a rendered frame, could emulate the idea of a frame increase.


BTW, I’m 99.9% sure I understand now perfectly what Pimax claims and how brainwarp works. This is what they said:

“Pimax 8K renders a single 4K image at 180 times per second, but users perceive a complete 8K at 180 Hz with high frame rate.”

So it’s actually quite simple indeed. In this particular case the game renders 2*4K at 90 Hz. However, Pimax claims that if they send the 2 eyes async at 180 Hz, the brain actually perceives it as 180 Hz, just like @Headcool understood it too. That indeed MUST be Pimax’s idea of brainwarp, I’m 99.9% sure.

So IF this were true, then a game could just render 2*4K at 45 Hz. Then Pimax takes the left eye render, forwards it, then takes the right eye render and forwards it, so at 90 Hz, and it would feel exactly like 90 Hz. However, I’m pretty sure this is nonsense. No way is the brain going to feel it’s looking at a 90 Hz game when you forward 2 images that belong together, created at 45 Hz, async; I don’t buy that at all. There’s just half of the game data missing. I even doubt sending 2 images that belong together async makes ANY difference at all to the brain. Maybe there’s some slight effect, where the brain is tricked into thinking the refresh rate is slightly higher than 45 Hz, but NO WAY is it going to look as good as 90 Hz.

I think it would already look a bit better if the game actually rendered (alternating) one eye at 90 Hz, so that with each update the HMD would actually receive new game data (representing a new situation in the game). However, again, in this case the game would need to support it, and secondly, it’s highly inefficient, like @headcool outlined before. But I’m sure that in that case it would at least look better and closer to 90 Hz.


I’m not sure you are right. Brainwarp is not a unique technology and it is already in use on the CV1.
Brainwarp is like G-Sync and FreeSync: the same principle with different providers and a different name.
Also, if you think about it logically, where is the saving of resources if you are increasing from 90 to 180 Hz?
The point is that Hz = fps, and Hz is the maximum fps the display can process. Everything above the display frequency is a waste of resources; that is the role of Vsync, to prevent that. So let’s go back to brainwarp. 90 Hz is 90 Hz, but for this 180 it is more correct to say fps, not Hz, because in essence it is 2x90 fps.
So how does it work? The fps drops to 45 per display, asynchronously, but your brain sees this as one 4K picture at 90 Hz/fps, so basically you are using resources as if for one 4K display, not two, which is like the CV1 with supersampling at 1.8.

The biggest performance saving I see is with the eye tracking module. Of course, all this must have software support, but I believe it will get it, because the Vive will have the same module as well, the same as the CV1 with brainwarp, so support for it will not be limited by the VR provider.

I’m pretty sure they’re something totally different. G-Sync & FreeSync are technologies that have the monitor display a frame once it’s available, so as to align the monitor’s refresh rate to the game’s refresh rate. I’m pretty sure brainwarp has nothing to do with that.

I’m not sure what you’re trying to say here. But the idea is that the 2 panels in the Pimax 8K (can) run at half a phase difference, so you can actually process 180 updates per second. At least that’s how I understand it.

Sorry but with sentences like this you really lose me … I’ve tried but can’t decipher what your point is.

Actually I must admit, I’m now starting to doubt my own conclusion again. The thing is that they’ve posted this image:

And then state:

“Brainwarp: render and display image in a sequence. i.e. For each time, only one eye can see a 4K image. Pimax 8K renders a single 4K image at 150/180 times per second, but users perceive a complete 8K at 150/180 Hz with high frame rate.”

But this text does not seem to correspond to the image because in the image the refresh rate doesn’t change.