Specsreader's BW speculations

Because they run with 80Hz input without Brainwarp on, you don’t need per-eye refresh (“simulated 90”, “per eye 90”, “async 90”, call it what you want) when you already have low enough MTP just showing the whole image to both eyes at 80Hz/HMD. Here is my theory of what the claim that Brainwarp “reduces perceived MTP, lowers the GPU load” actually means. It reduces MTP compared to 60Hz/HMD: in theory, by showing the brain the next frame from a different perspective after 5.5ms (check the T2 to T3 transition), it fools the brain into perceiving 90FPS; whether it really does, we have yet to see (no demo). And it reduces GPU load compared to what? Instead of rendering 90FPS for a 90Hz/HMD, the GPU renders 60FPS (for both eyes) shown at 90Hz/eye, so it’s less demanding for the GPU. Makes sense now?

This is your understanding; mine is that the brain gets the same brightness on both eyes: 90Hz light to the left eye, 90Hz light to the right eye. They think this will be perceived by the brain as motion, that it will perceive no loss of brightness (there is none) but warp its perception to guess that T2 to T3 is 90FPS motion. [quote=“risa2000, post:26, topic:4562”]
Why would anyone do that, if he could simply display 60Hz content at 60Hz refresh and stress the display much less? (Running the display at 180Hz is definitely more challenging than running it at 60Hz)
[/quote]
Why not display 60fps on 60Hz, you are kidding me? That is the same MTP as GearVR; for 6DOF 200° FOV you can’t have such MTP. You need an offset of 1/3 of the interval (they call it 180Hz) when the eyes are not in sync, and how do you calculate that offset if not as 3x60Hz/2, which Pimax calls “90Hz/eye”? So you exploit this offset by sending different data to each eye to trick the brain. The bridge chip doesn’t do any calculation; it can just turn the panel backlight off at some rate. That is the whole 180/150Hz thing, not refresh rate/fps.

About the 180Hz refresh of the display, again a misunderstanding: this is the backlight being turned off on the panel. The panel refresh rate is 90Hz per eye/panel, so there is no more stress on the panel than a plain 90Hz refresh.

I just use the data I have so far: the 4K had 90Hz async with 60Hz input; the 8K has 90Hz/eye. I interpret that to mean you also have 60Hz input (per HMD) with Brainwarp tricking the brain into perceiving a lower MTP than the nominal value: (11ms + constants) instead of (16.6ms + other constants), a time-warp-like effect, but done by the brain, not by an algorithm (as in ASW/ATW). As for how it tricks the brain, that is hard to explain, since it’s not pure tech (neuroscience is involved).
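To make the numbers concrete, here is a tiny sketch (my own arithmetic, nothing from Pimax) showing where the 16.6ms and 11ms come from: they are just the frame periods of 60Hz and 90Hz, before adding the "other constants" (sensor latency, render time, scanout):

```python
# Frame period (ms) for a given refresh rate; the "other constants"
# (sensor latency, render time, scanout) are deliberately left out.
def frame_period_ms(hz):
    return 1000.0 / hz

print(round(frame_period_ms(60), 1))  # 16.7 ms -> the "16.6ms + constants" case
print(round(frame_period_ms(90), 1))  # 11.1 ms -> the "11ms + constants" case
```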

Why not. When both images (for left and right eye) are available (as in your theory), it is definitely better to show them both right away and not withhold one for later.

Brainwarp can reduce perceived MTP if the interleaved images were rendered at different time points in the game. This is, however, highly impractical from the game renderer perspective, because then one cannot use stereoscopic rendering, as each frame (for the left and right eye independently) must be rendered anew.

Your example uses images GPU_0_L and GPU_3_R. GPU_0_L has already been displayed to the left eye for the whole T0-T2 period (i.e. 16 ms), so the perceived framerate remains 60Hz.
It is true that you may achieve faster scanout of the frame if the panel is running at 3x the frequency of the input, but then it would be easier to simply render both images three times in a row during one 60Hz period, for example render both GPU_0_L and GPU_0_R three times during T0-T2. This will also solve the brightness problem. There is no reason to interleave them.

MTP means motion-to-photon. In the old days we called it lag, and the history of competitive gaming is the history of reducing this lag. When I move my mouse, how long does it take until my character on the screen moves? The lower the lag, the better the responsiveness. VR just made this much more important.

But if in your theory the image is calculated only at a 60Hz rate, then MTP cannot be any better, no matter how many times or how fast the rendered frames are redisplayed.

This is not “my understanding”; this is how PWM regulation of brightness works.[quote=“SpecsReader, post:27, topic:4562”]
Why not display 60fps on 60Hz, you are kidding me? That is same GearVR MTP, for 6DOF 200FOV you can’t have such MTP. You need offset of 1/3 interval(they call is 180Hz) when eyes are not in sync, how to calculate that offset if not use 3x60Hz/2 - Pimax call this “90Hz/eye” .
[/quote]

I have already explained why it is called 90Hz “per eye” in my previous post. You can ignore “per eye” if it bothers you. Writing that the refresh rate of the panels is 90Hz would be equally correct (even though at the moment it should rather be 80Hz, as indicated by Pimax). It has nothing to do with any of your theories.

You are also confusing MTP vs refresh rate vs rendering rate.

Refresh rate = rate at which the panel displays the information. By speeding up the panel (as you suggest in your theory) you can increase the perceived refresh rate (even when rendering the same content).

Rendering rate = rate at which the system (gfx card + game engine) is able to render the game state. Note that there are two factors here, the card and the engine, and the result is determined by the minimum of the two.

MTP = time it takes for some change in game state caused by user input to reach the display. This is limited by sensor speed (i.e. controller sensors, headset sensors) and by both of the above: refresh rate and rendering rate.
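The three definitions above can be sketched as a toy model. This is my own illustrative arithmetic (all numbers are assumptions, not measurements): a crude lower bound on MTP as the sum of sensor latency, one rendering interval, and one refresh interval, with the rendering rate capped by the slower of card and engine:

```python
# Toy model: crude lower bound on MTP as the sum of the three stages
# described above. All numbers are illustrative assumptions.
def mtp_lower_bound_ms(sensor_ms, render_hz, refresh_hz):
    render_ms = 1000.0 / render_hz    # one rendering interval
    scanout_ms = 1000.0 / refresh_hz  # one refresh interval
    return sensor_ms + render_ms + scanout_ms

# Effective rendering rate is bounded by the slower of card and engine:
def rendering_rate(card_hz, engine_hz):
    return min(card_hz, engine_hz)

rate = rendering_rate(120, 60)  # engine caps the pipeline at 60
print(rate)                     # 60
print(round(mtp_lower_bound_ms(2.0, rate, 90), 1))  # 2 + 16.7 + 11.1 -> 29.8
```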

Ok, I understand you are also a tech guy. But let’s not use our knowledge, just logic, for a moment.
Say I have an HMD with an 80Hz refresh rate. I don’t want to market it as 80Hz, because the standard for 6DOF VR in 2017 is 90Hz. How do I hide that I can’t run the HMD at a real 90Hz, which would give better MTP? I push this idea of using Brainwarp, and numbers like 75Hz (possible per HMD), then put out 90Hz followed by “per eye”, then explain what “per eye” means by referring to Brainwarp, and put out even higher numbers like 150/180. The 4K HMD had Brainwarp (marketed as 90Hz async); the downside was loss of brightness (shutters were used), plus there was no perceived MTP reduction compared to 60Hz.

If we follow the same logic, let’s guess the input Hz for the Brainwarp modes:
180Hz - 3x60Hz - 3x60/2 = 90Hz/eye, as marketed. 60Hz input per HMD (2x1440p@60).
150Hz - 3x50Hz - 3x50/2 = 75Hz/eye, as marketed. 50Hz input per HMD (2x1440p@50).
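Here is a quick sketch of how I picture the interleaving (my own assumption about the scheme, not confirmed by Pimax): each panel refreshes at 90Hz, with the right eye shifted by half a period (1/180 s), so the update events alternate between eyes at a combined 180Hz cadence, even though the input is only 60Hz per HMD:

```python
# Assumed interleaving: both panels refresh at 90Hz, with the right eye
# shifted by half a period (1/180 s). Update events then alternate
# between eyes every 1/180 s, i.e. a combined 180Hz cadence.
PANEL_HZ = 90
period = 1.0 / PANEL_HZ   # 1/90 s per panel
offset = period / 2.0     # 1/180 s shift between eyes

left = [n * period for n in range(6)]
right = [n * period + offset for n in range(6)]
events = sorted(left + right)

gaps = [round(b - a, 6) for a, b in zip(events, events[1:])]
print(gaps)                  # every gap is 1/180 s ~ 0.005556 s
print(round(1.0 / gaps[0]))  # 180
```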

Now let’s use knowledge :slight_smile:
It can work in theory. ASW (on the Rift) takes 45FPS from the game world (rendered in the engine, I know what MTP/lag is) but uses math to compensate for motion, making 45FPS appear as smooth as 90FPS. Pimax’s idea is to use the brain to compensate from 60 to 90, or from 50 to 75. That is as close a tech analogy as I can get. If you disagree, ok, we wait for Pimax to (dis)prove my theory…
The risk to my theory is that Pimax says they are working on ASW; I am not sure what they mean by that. Maybe they will use ASW for 80Hz/HMD, and Brainwarp for 180Hz/“90 per eye”.

If we put enough trust in Pimax, we can even believe that they were actually running the first prototypes at 90Hz refresh rate (though not stable) over 2x HDMI ports. They just got themselves into trouble by switching to DP without doing any analysis beforehand. Not very professional, but far from the conspiracy theories you suggest.

I believe they did not want to “hide” anything. They believed they had panels which run at 90Hz, and they thought they could run them in an interleaved manner at twice the speed (as I explained in my post about brain warp). Even now, if they can only do 80Hz safely, they can still implement brain warp on top of that at twice the refresh, i.e. 160Hz.

The idea of doing the interleaving is sound from a pure hardware perspective, but it also has some drawbacks. For example, as I noted above, stereoscopic vs monoscopic rendering, or, as already demonstrated by @LoneTech in another post, higher bandwidth requirements on the DP port interface, which may actually be out of spec for the hardware Pimax is using.

So it is not clear whether it will actually help or not, and that is also the reason why I am not really bothered whether they get brain warp working right at launch or anytime later.

I tend not to trust companies, especially ones with marketing gymnastics like 4K (1440p input), 8K (4K/eye), 90Hz async (60Hz input), 90Hz/eye (half the time showing half the input), 160Hz/180Hz Brainwarp (a neuroscience assumption). My bad :slight_smile:

I found a post from PimaxSWD where he clearly states that 2xDP is needed for DSC on MIPI. Why point that out? 2xHDMI could use DSC on MIPI to the panel IC, so it could actually get 90Hz on the panel, the real deal, not the 90/eye mumbo jumbo, if the HDMI bridge had a DSC encoder. After the switch to 1xDP, the DSC encoder is not working, hence the drop to 80Hz. The limit with V1-V2 is the HDMI 1.4 cable; with V3 it is just MIPI bandwidth: without DSC to reduce it, 2xMIPI per panel can’t feed the panels at 90Hz. My understanding. Read the BW MIPI analysis again. :slight_smile: Maybe wrong, maybe not. Pimax is silent about it so far… Too silent, I think. 30 days and no actual reason (for the lower refresh rate) found, yeah right xD.
I tend to use logic a lot; I think the 8K X is made to allow DSC to work, but is more expensive to make. So you have the 8K and the 8K X, two products from the same initial idea.

Why is DSC so important for the future of VR? The limit of cable bandwidth is a real hurdle when you try to send a 2x4K@90 signal, hence the need for compression/foveated rendering etc. in gen 2 HMDs. Or use brute force: 2xDP 1.4 :slight_smile:

I have a really great conspiracy theory. This will be fun; it may not be true, so I’ll say it up front: "This is for entertainment purposes only, conspiracy theory lovers, enjoy."

I spent some time bashing my head over why they used 2xHDMI in the demo when the plan is 1xDP. So here is a fun theory:

The laptop outputs DP 1.4 with DSC, a DP to 2xHDMI adapter is used, the HMD accepts a 1440p@90Hz (DSC) per-eye signal and passes it to the driver IC, and there we go: we have answers about the 90Hz issues with the 1xDP input.

From the BW requirements to send 1440p@90, this is sent as RAW, so it can be DSC in my understanding. Or at least it needs to be possible for the conspiracy theory to work, or to be fun at least.

Limits? Why bother with DP on the HMD input? HDMI 1.4b, when transferring 7.4Gbit/s, is limited to 2/3.5m cable length, as shown in the demos. No room scale, no fun? :slight_smile:
Pimax acknowledged we can get more distance by using a breakout box, so we use DP 1.4 into the breakout and 2xHDMI 3.5m out of it. That can still be too short, which is probably why they decided on DP or nothing.

This can also explain the whole DP 1.4 marketing from the start. We know the ANX7530 is not real DP 1.4, only a later one would be. The original idea, 2xHDMI, also needs DP 1.4, because DSC is part of DP 1.4, so yes, HDMI and DSC, clever for sure; too bad the cable is short. I feel bad for the demo staff defending the laptop from a person playing Fruit Ninja on a 2m cable, I hope they are paid well (can’t find the video of it on YT, it is really fun). And why use a laptop and not a desktop? Well, the laptop is placed near the user, like with a breakout box, same effect; the only problem is that some user may damage a ~2000$ laptop, compared to a ~50$ box :slight_smile:
I already suggested to Pimax, back when we thought the limit was HBR2 speed, to use a b/o box, since Analogix say they support HBR2.5, to take advantage of that. They already had the same idea, I bet; I am no smarter than them… Or is this also just a theory, and I actually am? :smiley:
My idea was: make the breakout box accept DP 1.3 and output DP 1.25 (HBR2.5). Pimax’s idea was: output DP 1.4 (DSC) and send it through 2xHDMI so the HMD displays a 90Hz DSC image.

Had some fun reading? Am I only good at reading specs, or also at making (conspiracy) theories? :wink:

Now for the tech savvy: can the claim “input is 4K upscaled to 8K” from the FAQ be true? Yes, yes it can:
3840x2160 is the real input resolution, as they put in the FAQ: “4K (DSC 2:1) upscaled to 8K (4K per eye)”.

3840x1080@90 DSC 2:1 through HDMI 1 (left eye), 8.34 Gbit/s; cable limit is 10.2 Gbit/s (RAW)
3840x1080@90 DSC 2:1 through HDMI 2 (right eye)

Output from the GPU is 3840x2160@90 DSC 2:1. I made a wrong assumption that you need 3:1 DSC, my bad :slight_smile:

EDIT: Hm, actually it was for sure compressed 4K, chroma subsampling, not DSC, but that is going from conspiracy to real stuff. Needs a new post. I am not sure if DSC 1.1 is different from 4:2:0; I understand it also halves the vertical res and compresses color data. Anyway, they probably used the GPU DSC encoder for this, hence the marketed “DP 1.4”, even if 2xHDMI 1.4b was used…

3840x1080@90 (4:2:0) through HDMI 1 (left eye); now we calculate with 18 bit/pixel instead of 24 bit, 6.26 Gbit/s, so 3.5m can work, I think.
3840x1080@90 (4:2:0) through HDMI 2 (right eye)
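For anyone checking these numbers, here is my own back-of-the-envelope arithmetic for the active pixel data only. Note this ignores blanking intervals and link encoding overhead, which is presumably why the figures quoted above (8.34/6.26 Gbit/s) come out differently:

```python
# Back-of-the-envelope link-rate arithmetic (active pixels only; real
# HDMI timings add blanking and encoding overhead, so actual figures
# on the wire differ from these).
def active_gbit_per_s(w, h, hz, bits_per_pixel, compression=1.0):
    return w * h * hz * bits_per_pixel / compression / 1e9

full = active_gbit_per_s(3840, 1080, 90, 24)       # uncompressed RGB
dsc2 = active_gbit_per_s(3840, 1080, 90, 24, 2.0)  # with DSC 2:1
sub  = active_gbit_per_s(3840, 1080, 90, 18)       # the 18 bit/px case above

print(round(full, 2))  # 8.96 -> near the 10.2 Gbit/s HDMI 1.4 ceiling
print(round(dsc2, 2))  # 4.48
print(round(sub, 2))   # 6.72
```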

He doesn’t seem to understand how Brainwarp works. If he reads up on Nvidia’s cascaded displays he might have a better understanding, as it’s a similar idea (sort of); combining BW & cascaded displays would be very interesting.

Give us the diagram with actual times and what goes to which eye from the GPU then, enlighten us xD
I am not sure even Pimax understands how Brainwarp works; it’s a neuroscience assumption, not pure tech. Not my field. I wrote what I think is done with the tech. Prove me wrong with data about Brainwarp, not Nvidia research.
I have the Nvidia research data, “Brainwarp 16000Hz”:

Does the brain warp technology already work? :smile: :grin: lots of giggles… youtu.be/G3AtCeTuO4k?t=25m28s


Simple: one eye is blind while the other sees the image. Same idea as active shutters for active 3D.

There are more than enough posts that explain this already.

Without Brainwarp, the displays show a picture to both eyes at the same time.

With Brainwarp, the displays show a picture to one eye at a time. This makes the brain perceive double the frames.

Nope, that is not reducing perceived MTP… The animation can appear smoother, but when you move your head, you must wait for the frame to be rendered and displayed at the same rate as the original input Hz/FPS.
What you are describing is Brainwarp on the 4K, the 90Hz async mode.

On the 8K, since we have a panel per eye, we can use it to give one eye a different frame for half the time, as in my diagram. Please, if you don’t have tech knowledge on the subject, don’t comment. I have a SW background; you admit you are not a tech guy. This topic is [for tech savvy].

Nope. It uses a similar idea to Nvidia’s cascaded displays. I can’t explain something to a person who has no interest in learning.

As was said, it’s been explained enough times; you just seem to be stuck.

Perhaps you should have chosen “specbreeder”, as you mainly promote speculations.

Ok, that is off-topic anyway. Brainwarp has its own topic; this one is about the 90Hz problems :slight_smile:
I wrote how I see (understand) Brainwarp; if I get proven wrong by the Pimax demo of it, no problem for me.

Simple: Brainwarp doubles the perceived refresh. Thus your idea that BW is used to get 90 out of 60 is completely wrong, since 60 + BW = 120Hz perceived, not 90. But you’re right, you did introduce BW in the wrong topic.

So for someone who claims to be technical, you’re having mathematical difficulties. :stuck_out_tongue_winking_eye:

Here is what Pimax has to say about how Brainwarp works:
Pimax rep (SW background, comment on BW video).
I wrote how it could work to reduce MTP. If for your brain Brainwarp doubles FPS, and makes it not FPS but Hz, that speaks to your tech knowledge on the subject. You are just bashing me for sharing some info Pimax wanted to hide. Simple truth.
You are mixing FPS and Hz and telling me I don’t know math; I passed math (4 exams, hard math) at a faculty of Computer Science/Engineering, bro…

It’s not hidden; it is described accurately, save that it should say perceived. Your mind will perceive double the frames.

How well it works will depend a lot on the individual. Much like TVs that use frame multipliers to go from 60Hz to 120Hz TruMotion. Of course 120Hz TruMotion will not look as good as a monitor that supports native 120Hz input.

Do you have MTP with TVs? You are just guessing, even worse than I do, about how something works that has not even been shown to work…
For VR, a higher refresh rate is of no use if it doesn’t benefit perceived MTP. Oculus uses ASW and 45FPS rendered (in the game engine) to give 90FPS/Hz perceived MTP. Whether BW does this by using the brain instead of math (the ASW algorithm), we will see. I guess it does (but from 60 to 90, to be clear, or 50 to 75).
I will not comment on Brainwarp anymore until I see a demo.

Could someone help with Unity to get a stable framerate?
Right now it’s simple gif files with transition variations. (myPhotoshop > myUnity)
BrainWarp demo =) Cross-eye 3D 3D without glasses, Cross-Eye HD - YouTube

**Epilepsy seizure warning!**

Pimax 3D BrainWarp demo - Album on Imgur


Another nice pic to train on (1560x1080 :japanese_ogre:).

Please don’t do this; you can’t simulate 180Hz (90Hz/eye) with monitors. You will hurt someone. I tried it for some time and got problems, and I am not diagnosed with epilepsy. Just wait for Pimax to demo it…
In the meantime, read my latest posts; they are interesting. :slight_smile:

you can’t simulate 180Hz(90Hz/eye) with monitors

I have a 144Hz BenQ gaming monitor; is 72/eye possible? Let’s have fun :sweat_smile: