Hi guys, I currently have a pledge on the 8K X edition. I appreciate there are checks to carry out regarding reduced or scaled input options for those with current-generation hardware (I have a GTX 1080), but can you kindly confirm when you expect this to be confirmed? I.e., that it can accept an input resolution of 1440p instead of the full 4K, to help with scaling.
It would be very useful to know this before the Kickstarter ends…
Also looking forward to an answer to this question. (I doubt that even the Volta chips will be able to drive two 4K displays at 90 Hz for anything but perhaps virtual desktop/movie applications, so an 8K X without an upscaler would probably not be usable for games for years.)
With an upscaler that can produce (almost) arbitrary in-between resolutions up to full 4K per panel, it would be worth the $200 extra for me, though, since I could use it like the 8K for games and at higher resolution for movies.
So the (imho) interesting questions to find out (ideally before the end of the Kickstarter) would be:
1) Will the 8K X support the upscaler?
1.1) When using the upscaler (with 2560x1440 per eye), will the 8K X achieve framerates similar to the 8K with the same GPU? (I.e., does it make a difference whether one or two DP cables are used?)
1.2) Are all applications (especially SteamVR applications) compatible with the two-DP-cable approach?
2) Which GPU do we minimally need to watch movies at 2x4K@90 Hz (or, if possible, also @75 Hz) native resolution?
If all of the 1s are a yes and 2 is possible with a single 1080 Ti, then I'm in. Otherwise I will likely downgrade to the 8K before the end of the Kickstarter.
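Question 2 can at least be bounded with some quick bandwidth arithmetic. A minimal sketch (my own back-of-the-envelope numbers, not from Pimax; blanking overhead is ignored, so real link requirements are somewhat higher, and the DP 1.4 payload figure assumes HBR3 over four lanes with 8b/10b encoding):

```python
def uncompressed_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel payload in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

per_panel_90 = uncompressed_gbps(3840, 2160, 90)   # one 4K panel at 90 Hz
both_panels_90 = 2 * per_panel_90                  # the 8K X's two panels

# DisplayPort 1.4 (HBR3, 4 lanes) carries ~25.92 Gbit/s of payload
# after 8b/10b encoding (32.4 Gbit/s raw).
DP14_PAYLOAD_GBPS = 25.92

print(f"one 4K panel @ 90 Hz : {per_panel_90:.1f} Gbit/s")    # ~17.9
print(f"two 4K panels @ 90 Hz: {both_panels_90:.1f} Gbit/s")  # ~35.8
print(f"fits one DP 1.4 cable: {both_panels_90 <= DP14_PAYLOAD_GBPS}")
```

So uncompressed 2x4K@90 Hz simply doesn't fit down a single DP 1.4 cable, which would explain the dual-DP design of the 8K X in the first place; a single panel at 90 Hz (~17.9 Gbit/s) does fit per cable.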
P.S.: One interesting link: http://www.analogix.com/en/system/files/AA-004263-PB-5-ANX7530_0.pdf
There the manufacturer of the upscaler chip states that the maximum input resolution of the ANX7530 is 4Kx2K@60 Hz. This might explain why Pimax isn't giving us a straight "yes" regarding whether the 8K X will support upscaling. If 4K@75 and 4K@90 Hz turn out to be impossible through the upscaler, even with a dedicated DP + scaler per panel, then they might have to bypass the upscalers for native 2x4K resolution and only use them for lower ones. Which might be tricky/expensive(?).
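To put that datasheet limit into numbers, here is a quick sketch of the active pixel rates involved (my own arithmetic, blanking ignored, so the real timings are somewhat higher):

```python
def pixel_rate_mpix(width, height, hz):
    """Active pixel rate in Mpix/s (blanking ignored)."""
    return width * height * hz / 1e6

# Datasheet maximum input for the ANX7530: 4Kx2K @ 60 Hz.
spec_limit = pixel_rate_mpix(3840, 2160, 60)
print(f"4K@60 (spec limit): {spec_limit:.0f} Mpix/s")   # ~498

for hz in (75, 90):
    rate = pixel_rate_mpix(3840, 2160, hz)
    print(f"4K@{hz}: {rate:.0f} Mpix/s -> over spec: {rate > spec_limit}")
```

4K@75 is ~25% and 4K@90 is ~50% over the stated limit, so feeding the scaler a native 4K@90 stream would exceed the spec even with one chip per panel, consistent with the bypass concern above.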
There are still some days left before the end of the Kickstarter, though - keeping my fingers crossed that they can come up with a feasibility study before then and tell us whether it's definitely possible to add the upscaler to the 8K X or not.
Yeah, my understanding is that it allows a single 4K feed that is then split across the two panels, but that is likely my own misunderstanding… If we are talking about dual 4K feeds, one to each panel, it's likely to be something 2-4 generations AFTER Volta that could deliver this.
I'm also still unsure whether I fully understand this "Brainwarp" thing. One way to interpret it would be that the GPU alternately calculates an image for the left and the right eye and sends this to the HMD as a 180 Hz input stream, which is later split into two 90 Hz streams by the scaler chip. If the panels are then offset by half a frame, this might create the illusion of a 180 Hz (or at least higher-than-90 Hz) refresh rate. But it wouldn't relieve any stress on the GPU - it would still have to calculate 2x90 images per second.
The other interpretation would be that the GPU calculates 45 frames per second per eye. The panels run at 90 Hz each - synchronously this time - and alternate between blank and calculated images. That way we would get the illusion of 90 Hz with less stress on the GPU. (Their explanatory images of "Brainwarp" point more in this direction, but then I have no clue how they arrive at an illusory 180 Hz.)
Perhaps it is something in between, where the sync is offset by half a frame and, for each eye, only every second image is calculated and the in-between one is interpolated.
No clue - just guessing - only Pimax can answer this.
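Just to make the "offset by half a frame" idea concrete, here is a toy timeline - purely my own interpretation, not confirmed by Pimax. It only shows why offsetting the panels doubles the rate at which *some* eye gets a new image; it says nothing about how many of those images the GPU actually renders versus interpolates:

```python
# Each panel refreshes at 90 Hz; the right panel is offset by half a
# refresh period. Count the distinct update instants in one second.
PANEL_HZ = 90
PERIOD = 1.0 / PANEL_HZ

left_updates = [i * PERIOD for i in range(PANEL_HZ)]     # t = 0, 1/90, 2/90, ...
right_updates = [t + PERIOD / 2 for t in left_updates]   # offset by half a frame

merged = sorted(left_updates + right_updates)
gaps = [b - a for a, b in zip(merged, merged[1:])]

print(f"updates/s across both eyes: {len(merged)}")          # 180
print(f"gap between updates: {gaps[0] * 1000:.2f} ms")       # ~5.56 ms (1/180 s)
```

So the "180 Hz" figure would be an eye-alternating perceptual rate; whether the GPU load is 180, 90, or 45 rendered frames per second depends entirely on which of the interpretations above is the real one.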
It will also be interesting to see how the two-DP solution impacts Brainwarp (depending on what it actually is). E.g., is it still possible to guarantee that the panel syncs are offset by half a frame if these are now two completely independent hardware paths? Can the 180 Hz illusion still be maintained?
I hope they explain soon, as it makes all the difference to pledges, I guess… Future tech is great, but not if it cannot be used at all for x number of generations, ha ha.
Hence we really need Pimax @PimaxVR @Pimax-Support to clarify ASAP.
There is a good explanation of how Brainwarp works in another thread of this forum. Use search.
You can search for the answer in this forum.
Pimax support has said, “Sorry for the later response due to limit support guys.
Yes, we’re internally trying the possibility of 8K standard version support the input resolution higher than 4K. but need some time to get the conclusion.”
It will get done; they just need some time.
^^Re: Brainwarp - @mp4x did a great job in the other thread visualizing the options for how Brainwarp could be understood (interpretations from the community). I haven't found a clarification yet, though, of which of these variants it actually is (i.e., how many fps per eye does the GPU calculate, how is the offsetting done, is there interpolation involved, can Brainwarp also work on the 8K X with dual DP/dual upscalers, etc.).
Re: max resolution of the standard 8K - this is also an interesting question, indeed. This thread is more about the dual-DP 8K X and the compromises we have to expect compared to the 8K (will it be a full superset of the 8K, or lack some of its capabilities, e.g. upscaling - see the text above).
I do not think Pimax wants to reveal their trade secrets just yet. Most of this is still in the later stages of development, and I just don't think it's finalized. Even if it were, why does everyone seem to think this is an open-source device? Pimax has never stated that to be the case. They are open to user input about how to improve the product, but they have never said that their engineering data would become widely available. I have been around VR for a very long time (the '90s) and have yet to see a company that freely gave away all of its design data before the release of a product. All this trying to get them to do so is futile at the present time. If you want to be in on the design, engineering, and development, I suggest you get a job with them and contribute that way.
I fully understand that no one likes being uninformed, especially when it comes to committing a substantial sum of money to an unknown. Such is the risk of supporting a Kickstarter project.
Sometimes you just have to wait to get the answers.
This one I can answer: yes!
The Pimax HMD driver does its own rendering. SteamVR just forwards the game-rendered left-eye and right-eye images, and then it's up to the Pimax HMD renderer to do the distortion rendering, interpolation, etc. So it's not up to SteamVR what happens after that; it's the Pimax HMD driver that sends the data over DP. That's also why you can't dump HMD-rendered images from within the SteamVR compositor: SteamVR simply never gets them!
SteamVR => Pimax HMD driver => DP => HMD
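That handoff can be sketched as a toy model (all names here are hypothetical stand-ins; the real components are native drivers, not Python). The only point it illustrates is that distortion happens *after* SteamVR hands off the per-eye images, so the compositor never sees the final frames:

```python
def steamvr_compositor(left_eye_img, right_eye_img, hmd_driver):
    # SteamVR just forwards the game-rendered per-eye images ...
    return hmd_driver(left_eye_img, right_eye_img)

def pimax_hmd_driver(left, right):
    # ... and the vendor driver does the distortion rendering, then
    # puts the result on the DisplayPort link(s).
    distorted = (f"distort({left})", f"distort({right})")
    return send_over_dp(distorted)

def send_over_dp(frames):
    # Stand-in for the physical link; SteamVR never touches this data.
    return {"link": "DP", "frames": frames}

packet = steamvr_compositor("L0", "R0", pimax_hmd_driver)
print(packet["frames"])  # ('distort(L0)', 'distort(R0)')
```

With two DP cables the last stage would simply carry one distorted stream per panel, but the compositor-to-driver handoff above stays the same, which is why SteamVR compatibility shouldn't depend on the cable count.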