Ok. Now that the KS has officially been funded and we are on our way…
Can we get more discussion going about how Brainwarp is going to help us? Perhaps the Pimax team can put together a demo of sorts and talk about their progress. Brainwarp looks to be crucial to our play experience moving forward. I have seen the concept detailed out, but I would love to hear them talk about progress on the software side.
From my understanding, it displays info by alternating back and forth, one eye at a time, at the full frame rate. The brain then interprets this as twice the actual frame rate.
So a 90/180 Brainwarp setup would actually still be running 90 frames per second per screen, but the screens would alternate, giving you the 180 illusion.
Anybody that can cosign this gibberish I’m spitting out?
This is accurate. Pimax have also spoken about having their own version of reprojection, but that is not the same as Brainwarp.
In my opinion, Brainwarp is not just a perceived effect, because it uses more motion data than the Vive or Rift, which take two snapshots per motion event - the Pimax takes one, at twice the rate, and sends it to the eyes alternately.
I understood it like this; correct me if I am wrong. The GPU renders at, let's say, 180 fps, while each display supports 90 Hz (I hope they manage to get there).
So Hz = 1/s means 90 Hz = 90 images per second, i.e. an image can be displayed roughly every 1/90 ≈ 0.0111 s. Half of that is about 0.0056 s.
GPU renders first image -> sends it to the left-eye display -> t = 0 s
GPU renders second image -> sends it to the right-eye display -> t = 0.0056 s
GPU renders third image -> sends it to the left-eye display -> t = 0.0111 s
GPU renders fourth image -> sends it to the right-eye display -> t = 0.0167 s
So each display refreshes at 90 Hz, but the GPU renders double the FPS, and the brain “sees” something in between 90 Hz and 180 Hz. I am sure true 180 Hz would be better, but this seems to me like a smart trick to get a better experience out of the same hardware. It wasn’t possible before with the Vive/Rift/etc. because those HMDs refresh both eyes simultaneously.
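Here is a minimal sketch of that schedule as I picture it (my own illustration, not Pimax code): the GPU produces frames at 180 fps and hands them to the two 90 Hz panels alternately, half a panel period apart.

```cpp
#include <cstdio>

int main() {
    const double panelHz = 90.0;                      // refresh rate of each display
    const double framePeriod = 1.0 / (2.0 * panelHz); // GPU frame time at 180 fps

    // Frames alternate between the eyes, so each panel still only
    // receives a new image every 1/90 s, but the pair together shows
    // a new image every 1/180 s.
    for (int frame = 0; frame < 6; ++frame) {
        const char* eye = (frame % 2 == 0) ? "left " : "right";
        std::printf("GPU renders image %d -> %s eye -> t = %.4f s\n",
                    frame + 1, eye, frame * framePeriod);
    }
    return 0;
}
```

This prints the same timeline as above: 0 s, 0.0056 s, 0.0111 s, 0.0167 s, and so on.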
This is how I understand it as well. I just want to see something a little more concrete as all we have seen so far is theory. Do they have a working prototype?
That’s pretty simple. Each screen runs at a half-phase offset from the other.
One of my most anticipated features is playing games in vorpX at a lower cost per frame, since only one single frame has to be rendered and sent at a time instead of two. What I’m hoping for is setting a flat game in vorpX to run at 180 in-game fps, having native support to display ALL 180 frames spread between both eyes, and also using vorpX’s “geometry 3D”, which is basically true 3D for everything in that game’s world. Essentially vorpX uses a VR HMD to turn your flat games into incredible 3D viewing. The ability to drive those games at an ultra-high frame rate and have every frame sent to your eyes without rejecting any would be a huge Pimax 8K seller.
Rocket League, if you haven’t tried it in vorpX, runs perfectly and would look incredible at 180 fps on a huge 3D screen. Excited now?
You need to articulate this as much as possible to make sure Pimax gets what we’re asking for and how important this feature is.
So far Brainwarp is nothing more than a marketing buzzword.
It sounds simple, but the proof is in the pudding…
We just know that it sends the images for each eye to the panels with a half-frame offset, and the panels display them in sequence instead of simultaneously.
The problem with that: if the right eye always sees a delayed image, then the current head position (while moving the head) does not fit the picture. This is why Oculus developed ASW, which warps an old frame to accommodate the head movement. If Pimax is doing this, then they need to develop their own timewarp, but the problem that always the same eye sees a warped image remains. This is very likely the way they will go - at least it makes the most sense to me.
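For anyone curious what such a warp boils down to, here is a minimal rotation-only reprojection sketch in the spirit of ATW (purely illustrative; the intrinsics and the rotation are made-up numbers, and Pimax has confirmed nothing like this):

```cpp
#include <cmath>
#include <cstdio>

struct Mat3 { double m[3][3]; };

// Multiply two 3x3 matrices.
Mat3 mul(const Mat3& a, const Mat3& b) {
    Mat3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                r.m[i][j] += a.m[i][k] * b.m[k][j];
    return r;
}

int main() {
    // Made-up pinhole intrinsics: focal length 600 px, centre (640, 640).
    const double f = 600.0, cx = 640.0, cy = 640.0;
    Mat3 K   {{{f, 0, cx}, {0, f, cy}, {0, 0, 1}}};
    Mat3 Kinv{{{1/f, 0, -cx/f}, {0, 1/f, -cy/f}, {0, 0, 1}}};

    // Assume the head has turned 1 degree around the vertical axis
    // since the stale frame was rendered.
    const double pi = 3.14159265358979323846;
    const double a = 1.0 * pi / 180.0;
    Mat3 R{{{ std::cos(a), 0.0, std::sin(a)},
            { 0.0,         1.0, 0.0        },
            {-std::sin(a), 0.0, std::cos(a)}}};

    // H = K * R * K^-1 maps a pixel of the fresh view back to the matching
    // pixel of the stale frame, so the compositor can resample there.
    Mat3 H = mul(mul(K, R), Kinv);

    // Where should the centre pixel of the fresh view sample the old frame?
    const double x = cx, y = cy;
    const double u = H.m[0][0]*x + H.m[0][1]*y + H.m[0][2];
    const double v = H.m[1][0]*x + H.m[1][1]*y + H.m[1][2];
    const double w = H.m[2][0]*x + H.m[2][1]*y + H.m[2][2];
    std::printf("sample old frame at (%.1f, %.1f)\n", u / w, v / w);
    return 0;
}
```

ASW goes further than this (it also extrapolates animation and positional movement), but the core trick - re-aiming an old image at a newer head pose - really is this cheap.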
All the other ideas of how Brainwarp could work have the problem of needing support from the game engines (e.g. rendering only one viewport, but at 180 Hz, sending each eye a correct image), which will not happen (except with the help of Ralf and his vorpX), and/or are more demanding on the graphics subsystem (like the dreams of real 180 Hz above)…
Brainwarp is supposed to help us when we cannot reach a steady 90 FPS. If I can render at 180 Hz, nobody needs Brainwarp…
Brainwarp probably won’t be able to use warping like Oculus ASW. The 90 Hz limit is mostly due to bandwidth limitations between the graphics card and the display. As I understand it, ASW creates a new warped frame and sends it to the headset display. The only way I can see warping working in conjunction with Brainwarp is if the ASW-style warp happens in the headset’s scaler unit. Normally, that’s a simple stretch/resize, not a complex warp.
That’s why I originally started this thread. I simply have not seen any hard proof of this software’s existence. Again, I’ve already backed the campaign and want an 8K regardless but this is something that can really aid performance. The fact that we have yet to see it in action makes me wonder if they can pull it off (on the software side).
Do you get lost in your own imagination and theories?
Do you know what 3D monitors do?
Same concept: alternating between each eye. And last time I checked, I can look wherever I want and the image is the same. There is no delay. You can’t even spot that speed with your eyes; the only thing that can spot it is your brain… and last time I checked, you’re not looking at the screen with your brain but with your eyes…
I do too, with shooters. So far (a year in) there have been zero issues with Rocket League and vorpX; knock on wood, they seem to be cool with it so far. It is beautiful, and vorpX made me still want to play RL after VR came out.
After seeing a very informative video from Valve at GDC (posted by @VRGIMP27), I have a new theory of how that Brainwarp stuff might work: if they predict the head position for the point in time at which the left or right eye is displayed, then both eyes could still be rendered at the same time, each from its own predicted pose.
Not sure, but perhaps this might even work without patching the game, as long as the game requests the transformation matrices for the eyes from the engine each frame (with GetProjectionMatrix) - then all this prediction could happen at the driver level. (I’m not sure what OpenVR delegates to the low-level driver, and thus whether providing individual eye matrices is in the control of Pimax or Valve.)
Depending on that, this might or might not work without game (or at least OpenVR) modifications. A rough sketch of the idea follows below.
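To make that concrete, here is how such per-eye prediction could look with the public OpenVR API (the calls are real, but the half-frame schedule and the numbers are purely my assumption):

```cpp
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Scene);
    if (err != vr::VRInitError_None) return 1;

    // Assumed Brainwarp-style schedule: the right eye hits its panel
    // about half a 90 Hz frame (~5.6 ms) after the left eye.
    const float leftPhotons  = 0.011f;              // s until left-eye photons
    const float rightPhotons = leftPhotons + 0.0056f;

    vr::TrackedDevicePose_t leftPoses[vr::k_unMaxTrackedDeviceCount];
    vr::TrackedDevicePose_t rightPoses[vr::k_unMaxTrackedDeviceCount];

    // One predicted head pose per eye, each for its own display time.
    sys->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding,
        leftPhotons, leftPoses, vr::k_unMaxTrackedDeviceCount);
    sys->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding,
        rightPhotons, rightPoses, vr::k_unMaxTrackedDeviceCount);

    // Both eyes could now be rendered in the same pass, combining each
    // predicted head pose with the per-eye offset and projection.
    vr::HmdMatrix34_t headL = leftPoses[vr::k_unTrackedDeviceIndex_Hmd].mDeviceToAbsoluteTracking;
    vr::HmdMatrix34_t headR = rightPoses[vr::k_unTrackedDeviceIndex_Hmd].mDeviceToAbsoluteTracking;
    vr::HmdMatrix34_t eyeL  = sys->GetEyeToHeadTransform(vr::Eye_Left);
    vr::HmdMatrix44_t projL = sys->GetProjectionMatrix(vr::Eye_Left, 0.1f, 100.0f);
    (void)headL; (void)headR; (void)eyeL; (void)projL;

    std::printf("fetched a predicted head pose for each eye's display time\n");
    vr::VR_Shutdown();
    return 0;
}
```

Whether a driver could actually feed two different predicted poses into one application frame is exactly the open question of what OpenVR delegates to Pimax.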
But it’s all guesswork; we won’t know until they release more info…
Pimax do not have the resources to research things like ATW/ASW (remember how long it took Oculus to get ATW and then ASW right?). They have not even released an updated PiPlay version with the features of the old version (no debug tool, etc.). I hope they will now put five times the human resources into software. Their software really needs it!
I really want Brainwarp to be the killer feature of the 8K, but so far there has not been one showcase. Not even on the 4K. Just this picture of alternating frames sent to the displays, which doesn’t take into account that the game engines do not support that.
For the eyes to see an image that is in sync with the change of head position is essential for not emptying your stomach. Oculus found out that 84 Hz is the minimum for nausea-free VR for most people. They found that out after bringing us the DK2 with 75 Hz.
So if Pimax wants us to enjoy VR in the 8K, we should remember these scientific results… or they should finally enlighten us about their magical Brainwarp feature. Please @PimaxVR! What is the status quo regarding Brainwarp?