Pimax 8K VR Frequently Asked Questions

I'm not being aggressive. I want them to shoot straight with us about what will work on day 1 of the Kickstarter with those early backer units. I don't need to be aggressive; the industry itself is aggressive enough.

If they are not as clear as possible about what the headset can do on day 1, Kickstarter backers will spit out their bones.

I don't say this to be harsh, but because I want to see this company succeed.

In fact, I was in Bigscreen the other day promoting this HMD and their Kickstarter campaign for almost 2 hours, telling people to come and back the Kickstarter, and now the company decides to tell us, right before the KS goes live, that it only supports 1440p input out of the gate?

They tell people a minimum spec of a 980? Nobody who wants to back this HMD is doing it for a so-so VR product. People want it because it's high end, or at least claims to be.

I just want clarity from them so they succeed.


Pimax 8K only requires a 980/1070 for 8K resolution and eliminates SDE

We can offer native 4K input, but please note that not every computer can run native 4K, so we have to offer it in a different version.

You keep avoiding the question: why doesn't the 'regular' 8K version support native 4K input? Why are you talking about 2x DP connectors when 1 DP is all you need? The 'regular' 8K version has all the hardware to support 2x4K@90Hz input, so what am I missing here?

Been following this headset for a while. I already have an Oculus Rift and a 1080 Ti; if this isn't significantly better, with native 4K out of the box, I might as well give the Kickstarter a miss and wait and see how things pan out.


You have to input "half" 8K in any case (7680x2160, i.e. 2x3840x2160). IMO they have just verified that a single DP 1.4 cannot handle that, either because of problems with Display Stream Compression (DSC) or simply because this format drops the refresh rate due to limited bandwidth. Hence they need 2 DP.
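For anyone who wants the raw numbers behind that, here is a rough back-of-envelope sketch. These are my own assumptions, not anything Pimax has published: plain 24 bpp RGB, blanking intervals ignored, a DP 1.4 HBR3 link.

```python
# Back-of-envelope check: can a single DisplayPort 1.4 link carry
# 2 x 3840x2160 @ 90 Hz uncompressed? Assumptions (mine): 24 bits per pixel,
# blanking overhead ignored (including it only makes things worse).

def payload_gbps(width, height, panels, refresh_hz, bits_per_pixel=24):
    """Raw video payload in Gbit/s, ignoring blanking intervals."""
    return width * height * panels * refresh_hz * bits_per_pixel / 1e9

dual_4k_90 = payload_gbps(3840, 2160, panels=2, refresh_hz=90)  # ~35.8 Gbit/s
dp14_payload = 4 * 8.1 * 8 / 10  # 4 lanes x 8.1 Gbit/s, 8b/10b coding -> 25.92 Gbit/s

print(f"needed: {dual_4k_90:.1f} Gbit/s, DP 1.4 payload: {dp14_payload:.2f} Gbit/s")
print("fits uncompressed:", dual_4k_90 <= dp14_payload)  # False -> needs DSC or 2 cables
```

So on these numbers, uncompressed 2x4K@90Hz simply does not fit down one DP 1.4 cable, which is consistent with either DSC or a second cable being required.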

Anyone who wants this feature would/should know that their GPU has to be top of the line or very close to it in demanding titles.

What's wrong with having the choice to use it?


@deletedpimaxrep1 when you say "version", do you mean new PiPlay software or new headset hardware will be needed to enable full 4K input?

If the Kickstarter unit has DisplayPort 1.4, it should only need 1 cable, unless you are doing like Dell and using two earlier DisplayPort 1.2 cables with an adapter?

They don't have to halve anything. Just send 4K resolution at double the framerate to the headset. DP 1.4 supports 4K at up to 240 Hz (with DSC). That's enough to send 4K at 180 Hz, then divide it among the 2 panels: first frame for the left eye, 2nd for the right, etc.

One DP isn't sufficient for 2x4K@90Hz uncompressed, and they haven't got compression working. Whether that's a limitation of their receivers, GPUs, drivers or firmware hasn't been indicated. The specs for the ordinary 8K version remain one DP, with the full-rate version (8K X) using dual DP for its dual panels.

What's disappointing is that it's taken this long to get a straight answer about what input it actually uses: 1440p. It still beats the Vive and Rift with more subpixels, a higher fill ratio and a wider field of view, but trying to sell the 4K panels when there's no way to drive that resolution is plainly deceptive. That said, we're talking 22.1 million subpixels compared to the Rift/Vive at 5.18 million; even spread across the much larger FOV, that's an improvement. At this point, though, it sounds to me like there's no real reason for the 8K model to exist; only the 5K and the 8K X.
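Those subpixel figures check out if you count the way I assume below: the Pimax at its 2x2560x1440 input resolution with RGB-stripe panels, the Rift/Vive at their 2160x1200 combined panel resolution with PenTile's 2 subpixels per pixel. These are my assumptions, not official specs.

```python
# Sanity check on the 22.1M vs 5.18M subpixel comparison. Assumptions (mine):
# Pimax counted at 2 x 2560x1440 input with RGB stripe (3 subpixels/pixel);
# Rift/Vive counted at 2160x1200 combined with PenTile (2 subpixels/pixel).

pimax_input = 2 * 2560 * 1440 * 3  # 22,118,400 -> ~22.1 million subpixels
rift_vive   = 2160 * 1200 * 2      #  5,184,000 -> ~5.18 million subpixels

print(f"Pimax 8K at 1440p input per eye: {pimax_input / 1e6:.1f} M subpixels")
print(f"Rift/Vive:                       {rift_vive / 1e6:.2f} M subpixels")
print(f"ratio: {pimax_input / rift_vive:.1f}x")  # ~4.3x
```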


If you check the forum, a lot of the theory behind this has been discussed - interesting stuff. There's a lot of great info on it buried in posts; you just have to find it.

DP 1.4 comes with compression! It supports DSC out of the box, and all the 1070/1080 cards that have DP 1.4 have DSC. But thinking about it… the Pimax 4K supports DSC, but there is a possibility that their new panels don't support DSC. That could be the reason…
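If DSC really did work end to end, the bandwidth would be comfortable. A rough sketch, assuming the commonly quoted ~3:1 visually lossless DSC ratio on a 24 bpp source; these figures are mine, not Pimax's:

```python
# If DSC worked end to end, a single DP 1.4 link would have plenty of headroom.
# Assumptions (mine): ~3:1 visually lossless DSC ratio, 24 bpp source,
# blanking ignored.

uncompressed = 2 * 3840 * 2160 * 90 * 24 / 1e9  # ~35.8 Gbit/s for 2x4K@90Hz
with_dsc     = uncompressed / 3                 # ~11.9 Gbit/s
dp14_payload = 4 * 8.1 * 8 / 10                 # 25.92 Gbit/s after 8b/10b

print(f"{with_dsc:.1f} Gbit/s needed with DSC vs {dp14_payload:.2f} Gbit/s available")
```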

Well, if problems with DSC decoding are the reason for the 2x QHD input only, then it should still be possible to provide 2x4K@60Hz without DSC. Maybe not a high enough framerate for every game, but enough for a virtual desktop, enough for watching movies, enough for slower games.
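For what it's worth, the raw numbers back that up, under the same rough assumptions as above (24 bpp, blanking ignored), so treat the margin as approximate:

```python
# The 2x4K@60Hz-without-DSC idea at least passes the raw-bandwidth test.
# Assumptions (mine, same as before): 24 bpp, blanking ignored; reduced
# blanking keeps the real figure close to this, but the margin is small.

dual_4k_60   = 2 * 3840 * 2160 * 60 * 24 / 1e9  # ~23.9 Gbit/s
dp14_payload = 4 * 8.1 * 8 / 10                 # 25.92 Gbit/s

print("fits uncompressed:", dual_4k_60 <= dp14_payload)  # True, with little margin
```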


And if the panel doesn't support DSC, then why not simply add a decoder chip? It still makes no sense.

60 Hz low-persistence mode would be just fine.

There is no reason not to allow 4K@60 on this HMD.

The frame buffer in SteamVR at 1.5x supersampling is already just below 4K resolution, so I don't see any good reason to start it at 1440p per eye.
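A very rough illustration of that point, assuming the 1.5 multiplier is applied per axis to a Vive-class 1080x1200-per-eye panel; the exact render-target sizing differs between SteamVR versions, so this is only a sketch:

```python
# Rough illustration of the SteamVR supersampling point. Assumptions (mine):
# 1.5x applied per axis to a 1080x1200-per-eye panel, both eyes side by side.
# Exact render-target math varies between SteamVR versions.

per_eye_w, per_eye_h, ss = 1080, 1200, 1.5
render_w = int(2 * per_eye_w * ss)  # -> 3240
render_h = int(per_eye_h * ss)      # -> 1800

print(f"render target ~{render_w}x{render_h} vs 4K (3840x2160)")
```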

I wanted this HMD for a legitimate resolution improvement, not just for less SDE.

I want virtual desktop to look close to 720p definition. The SDE in the Rift doesn't bug me; it's that all the content looks like 480p in the headset that bugs me. Lol


Per-game foveated rendering. The pure resolution, though, is a different story.

Hehehe, you keep going on about foveated rendering. Games do NOT support it and most likely will not in 2018. And even if they do support it, it does NOT reduce bandwidth at all.

Foveated rendering doesn't require game support for the resolution part, but it does for other things like post-processing, shaders and culling.

Yes, it heavily reduces bandwidth! Google Clay Bavor's SID 2017 speech on YouTube.
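To put a number on the kind of saving being claimed, here is an illustrative two-region foveation sketch; the region sizes and scale factors are mine, not from the talk:

```python
# Illustrative only (numbers are mine, not from the Bavor talk): how much a
# simple two-region foveation scheme cuts the pixels that have to be shaded
# or transmitted for one 3840x2160 eye.

full_w, full_h  = 3840, 2160
fovea_frac      = 0.25  # central 25% of each axis kept at full density
periphery_scale = 0.5   # periphery at half resolution per axis

full_pixels   = full_w * full_h
fovea_pixels  = (full_w * fovea_frac) * (full_h * fovea_frac)
periph_pixels = (full_pixels - fovea_pixels) * periphery_scale ** 2

total = fovea_pixels + periph_pixels
print(f"~{total / full_pixels:.0%} of the full-resolution pixel count")  # ~30%
```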

Well, there might not be one that does all the things needed. The latency might be too high or the input resolution might be too exotic. Such a decoder might support 8K@60Hz and 4K@120Hz but not 2x4K@90Hz or 4K@180Hz, because it is an ASIC with fixed functions.
A custom design would therefore have to use either an FPGA, which is expensive and inefficient, or a custom ASIC, which requires a minimum production run of about a million units just to get started.

Shaders, that's game rendering. The only thing that can be done without the game supporting it is the distortion rendering. But again, it has nothing to do with bandwidth.