Pimax brainwarp technology

Some drivers for 3D shutter glasses on CRT monitors used the same mode as Brainwarp.
I tried it and it worked, and it made more sense there, because the display was already alternating between the left and right eye anyway.

I tried Day of Defeat with it, so they had found a way to hack the game engines with a driver, similar to what Nvidia 3D Vision does.

But there were issues.
The first issue was a loss of sync and an inversion of the eyes, as the CRT or the glasses couldn't know which frame was left or right.
The second issue was frame dropping: when you drop a frame you can either display a black frame, or display the previous one, which would mean going back in time, as each eye is always expecting the frame for T+1.

So you are right in saying it's a gimmick, because it also means the engine would need to render at 180 fps; even if the GPU doesn't compute more frames, that's a lot of strain.
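To make that concrete, here's a rough sketch of the scheduling problem (plain Python, every name invented by me for illustration, nothing to do with Pimax's or Nvidia's actual code): the display alternates eyes every refresh, so a dropped frame means either a black flash or re-showing that eye's older frame while the other eye has already moved on.

```python
from dataclasses import dataclass

@dataclass
class EyeFrame:
    eye: str        # "L" or "R"
    sim_time: int   # which simulation step this frame was rendered for; -1 = black

def present_alternating(rendered, total_slots):
    """Fill total_slots display refreshes, alternating L/R each refresh.

    rendered maps (eye, slot) -> EyeFrame for frames the engine finished
    in time; missing entries are dropped frames.
    """
    shown = []
    last = {"L": None, "R": None}
    for slot in range(total_slots):
        eye = "L" if slot % 2 == 0 else "R"
        frame = rendered.get((eye, slot))
        if frame is None:
            # Dropped frame: either flash black or repeat that eye's previous
            # frame, which is older than what the other eye just displayed --
            # the "going back in time" problem.
            frame = last[eye] or EyeFrame(eye, -1)
        shown.append(frame)
        last[eye] = frame
    return shown

if __name__ == "__main__":
    # 90 Hz per eye, alternating, means 180 eye-frames per second overall.
    slots = 10
    # Pretend the engine missed the right-eye frame for slot 5.
    rendered = {}
    for s in range(slots):
        eye = "L" if s % 2 == 0 else "R"
        if s != 5:
            rendered[(eye, s)] = EyeFrame(eye, s)
    for slot, f in enumerate(present_alternating(rendered, slots)):
        print(slot, f.eye, f.sim_time)
```

In the printed output the dropped right-eye slot repeats the frame from step 3 right after the left eye showed step 4, which is exactly the "going back in time" artifact described above.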

2 Likes

Alternate frame rendering SLI mode yielded roughly 30 additional fps in every title in my library, with no game-specific support needed. For SLI in general to work this way for VR, though, the link between the cards would need several times more bandwidth.

1 Like

By the way, and take this with a grain of salt, but a guy from Pimax at one point told me they have an ASW equivalent. I made a comment about the distinction between STW and ASW, and he replied that he understood the distinction.

2 Likes

Check this video out, it's nuts! 3D video with the naked eye

1 Like

All this talk and I can only say one thing about brain warp, I can’t wrap my brain around it…

GTX 1080 user here, and I can confirm that some games run rather well, but I'm definitely saving up for an RTX 2080 Ti, and I reckon I need to replace my i7 7700 as well to avoid a performance bottleneck.

First time trying the Large option in PiTool, I ran Beat Saber; my advice is don't do that unless you want to feel like a bottle of Jack Daniel's has been thrown at your head.

Fellow 1080 users should stick to Small or Normal for optimal gaming.

Now to the serious issue at hand. With HMDs such as the Pimax being developed by a relatively small company whose name is still unrecognised by most tech companies, and dismissed by third-party accessory producers as having a low number of adopters, chances are small that Nvidia will take VR seriously for their next outing. So for the foreseeable future, I reckon the RTX 2080 Ti remains our go-to GPU for Pimax.

2 Likes

Yes, and even that is not enough.

2 Likes

As I said, chances are small that the problem we are having with Pimax products will ever be addressed by the likes of Nvidia. There needs to be a huge number of adopters for them to take notice, and a revolutionary innovation to bump up GPUs' resolution output with current technology.

Apart from that, the way game engines render also needs to change. Currently the scene is rendered twice, once for each eye, since the engine uses a stereo camera. So meshes are loaded into memory once but instanced and rendered twice, once per eye. That still takes up a large chunk of memory and reduces performance.

I reckon the better way of doing it is to go the route some of you here have proposed, which is to use the same methodology as 3D Vision or 3D movies when producing stereo images: one stream that is then split in two, rather than the conventional way of using two concurrent streams. But what do I know…
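For what it's worth, here is a toy sketch of that difference (purely conceptual Python of my own, not how any engine or driver actually exposes it): the conventional way walks the scene once per eye, while a single-pass approach walks it once and expands each draw to both eyes.

```python
def draw(mesh, view):
    # Stand-in for an expensive draw call / full render of one mesh.
    return f"{mesh} rendered from {view} eye"

def two_pass_stereo(scene, views=("left", "right")):
    # Conventional stereo: traverse and submit the whole scene twice,
    # once per eye camera.
    return {v: [draw(m, v) for m in scene] for v in views}

def single_pass_stereo(scene, views=("left", "right")):
    # Single-pass / instanced-style stereo: traverse the scene once and
    # expand each submission to both eye views, so culling, animation and
    # state changes are paid for only once.
    out = {v: [] for v in views}
    for m in scene:              # one walk over the scene
        for v in views:
            out[v].append(draw(m, v))
    return out

if __name__ == "__main__":
    scene = ["terrain", "player", "tree"]
    print(two_pass_stereo(scene))
    print(single_pass_stereo(scene))
```

Both produce the same two images; in the real thing the saving comes from paying the CPU-side scene work once instead of twice.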

Oh, and by the way, having watched Sweviver's videos, do you think the 1080 Ti is more reasonable than the 2080 Ti, considering the latter's performance bump isn't all that much compared to the 1080 Ti?

What’s your take on this?

Yongkykun

Both Nvidia & AMD are working with Pimax. For example, on the Nvidia front, the last 3 PiTool releases required the latest Nvidia driver, & PiTool builds are being tested by both of the two main GPU manufacturers.

The Kickstarter page, under Partners, shows how many are banking on Pimax.

There are quite a few listed there; Disney, Valve, Unreal & Unity, AMD & Nvidia, to highlight just a few of them.

@Heliosurge but that is for R&D purposes and may just reflect their ongoing research for mainstream users, which takes up more human resources and power. I saw the same thing for other HMDs, but I remain skeptical that any of these small companies (Pimax mainly, in this instance) will dictate the direction of the companies you list as a whole for the foreseeable future.

2 Likes

I understand your point, but nVidia should be doing everything they can to help Pimax, since large FOV needs more GPU power than even a 4K flat screen monitor. That boosts sales of high-end GPUs.

3 Likes

Exactly. If enough resources are given to proper R&D in regard to a higher-than-4K resolution bump, it will only benefit the company in the long run. Though I am skeptical since, as I stated earlier, there are only so few of us Pimax backers, and mainstream gamers outnumber VR gamers on any scale, so we can't fault Nvidia as a company for taking Apex or Fortnite players more seriously than us.

2 Likes

Ah, then you missed that that was the reason for the delayed release of PiTool 109 (106 was originally scheduled to release around Feb 18). Nvidia & AMD are testing the headsets with their drivers. AMD is still working with Pimax to improve their drivers. Nvidia is in part why we have FFR support on the 20-series cards.

With StarVR folding, only Pimax is pushing VR tech atm. Pimax also seems to have a deal to offer Pimax HMD + RTX combo bundles.

2 Likes

Still not gonna change the company's R&D though. What you are describing is actually Nvidia supporting only the driver and software side. Imagine if they didn't: if the performance of their graphics cards were poor, that would be egg on their faces and could potentially turn into a PR nightmare.

So no, I still don't see them giving full-on support and R&D resources to VR.

1 Like

Well the results will speak for themselves over time. But we do know the 3 are cross testing.

Will the i7-7700K bottleneck an RTX 2080 Ti?
Are you sure about that?

1 Like

At stock (100%) clocks, the CPU/GPU bottleneck calculator says yes.

27.x%, and anything over 10% counts as a bottleneck.
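I don't know what formula that calculator actually uses, but a naive back-of-the-envelope version of the idea looks like this (Python, numbers invented purely to show the arithmetic, not measurements):

```python
def bottleneck_pct(cpu_capped_fps, gpu_capable_fps):
    """Naive estimate: the fraction of the time the GPU would sit idle
    because the CPU can't feed it frames fast enough."""
    if cpu_capped_fps >= gpu_capable_fps:
        return 0.0          # GPU-bound: no CPU bottleneck by this measure
    return (gpu_capable_fps - cpu_capped_fps) / gpu_capable_fps * 100

# Invented example: CPU tops out around 130 fps, GPU could do ~180 fps.
print(f"{bottleneck_pct(130, 180):.1f}% of GPU capacity left on the table")
```

By a measure like that, anything over ~10% means the CPU is noticeably holding the card back.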

3 Likes

Hello there,

I still have a Z270-A Pro motherboard with an i7-7700K and an RTX 2080 Ti.
Is there a CPU for my motherboard that doesn't bottleneck the RTX 2080 Ti?
The socket is LGA 1151.

1 Like

8700K for price
9900K for performance

They both use socket LGA 1151

1 Like

You have to check your motherboard manual; I had a Z170 socket 1151 board, but the limit was the 7700K.

2 Likes