What is the resolution shown in SteamVR when SS=1 in 5K+ and 8K?

All current VR devices use an internal SuperSampling ratio, so the rendered resolution with Supersampling=1.0 is not the physical display resolution.
With the Rift (2160x1200), when you set SS=1 the render target is 2688x1600.
other devices here:

I would like to know the resolution shown by SteamVR (below the SS slider bar) when SS=1.0 (ie 100%) for the Pimax 5K+ and 8K.


8K, a screenshot from last year (prototype)


…that must be for an 8K X. The whole point of the 8K is to not have to render 4K per eye. Unless it has something to do with PiTool’s funny business of having its own render/sample settings.

Read the first post carefully to understand the issue; it’s explained very well. The screenshot is from the 8K, as it clearly shows 80Hz :slight_smile: The 8K X doesn’t exist yet.
p.s. The whole point of the 8K is to achieve 4K per eye, although it can’t receive it fully. That doesn’t mean it should not or can’t render at this resolution. As noted, it was a screenshot from a prototype. It can happen that the final version will use less data.

I understand fully. My point is that the 8K is supposed to render at a lower resolution, because if you are going to render at 4K per eye then the 8K and its upscaler (what differentiates it from the X) are pointless. If your PC is capable of rendering two 4K images at once with a playable frame rate, then the X is for you.


Sure, although there is no choice so far. We will find out the real resolution starting September 16th. It could be that the resolution in the screenshot was supersampled by the Pimax software. They were showing that the prototype runs stably at 80Hz. The point of such a high resolution was unknown.

From what I’ve read that seems right. To get the most out of the 8K you still need to supersample, so you’d send in a 4K-per-eye signal, downscale it to about 2.5K per eye to get it down the DP cable, and the headset chip then upscales it back to 4K per eye.

The 8K X would be the better choice if you’ve got a PC that can handle this, but what’s lacking is the bandwidth of the DP cables to get that signal to the headset, which is why the 8K X will use dual DP cables and won’t need the upscaling chip, which is stuck at 80Hz.

(Hopefully somebody can confirm this is correct but this is how I’ve understood it)

That’s pretty much it, yes. We don’t know the limitations of the upscaler chip precisely, but the receiver chip ANX7530 isn’t HBR3 capable. So in practice (without at least special GPU drivers) it’s stuck at DisplayPort 1.2 HBR2 speeds, meaning 17.28Gbps. Divide that by 3 channels (red, green, blue), 8 bits per channel, 2560x1440 resolution, and 2 eyes: You arrive at 97.6Hz maximum capacity. Unfortunately CVT standard timings don’t let you use that full capacity. Remove the resolution factor and we get a maximum pixel clock of 720MHz; a reduced blanking v2 mode for 5120x1440 90Hz requires 714.23MHz, so that’s just barely within spec.
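For anyone who wants to verify those numbers, here’s a quick back-of-the-envelope sketch (it assumes 4-lane HBR2 with 8b/10b coding and ignores blanking intervals, which is why the practical CVT limit is a bit lower):

```python
# DP 1.2 HBR2 payload: 4 lanes x 5.4 Gbps, minus 8b/10b coding overhead.
link_bps = 4 * 5.4e9 * 8 / 10          # 17.28 Gbps usable
bits_per_pixel = 3 * 8                 # RGB, 8 bits per channel

max_pixel_clock = link_bps / bits_per_pixel        # 720 MHz
max_hz = max_pixel_clock / (2560 * 1440 * 2)       # both eyes, no blanking

print(f"max pixel clock: {max_pixel_clock / 1e6:.0f} MHz")   # prints 720 MHz
print(f"max refresh (no blanking): {max_hz:.2f} Hz")         # prints 97.66 Hz
```

(The 97.6Hz above truncates rather than rounds; either way, blanking overhead eats into it before you reach a real refresh rate.)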

So what does that mean for the 8K X? Standard timing for 3840x2160 at 90Hz would require 811.44MHz pixel clock, decidedly not available. So we need to compromise somewhere. In fact, just the pixels without any sync times would require 746.5MHz pixel clocks. We’re well out of the bounds of what DP 1.2 can do. Lowering the frame rate to 80Hz does get us back in, and this is likely to be the origin of the 8K model scaler chip limitation; 3840x2160 80Hz requires a pixel clock of 717.76MHz. But the 8K X was supposed to be the no compromises model; I think it may be prudent to await next year when the ANX7538 chip launches with HBR3 speeds (DP1.3 HBR3), compression (VESA DSC, included in DP1.4), and built in scaling (so no separate scaler chip needed). That still involves a compromise as it would rely on lossy DSC to make up the roughly 47% capacity missing (raw HBR3 should permit 1.08GHz pixel clocks, but 7680x2160@90Hz RB2 requires 1.59GHz); it’s quite possible the dual cable plan will be used as well.
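The pixel clock figures quoted above can be reproduced with a simplified CVT reduced-blanking estimate. This is my own sketch, not the full VESA CVT algorithm: it assumes a fixed 160-pixel horizontal blank and grows vertical blanking until it covers the 460µs minimum blanking period.

```python
def cvt_rb_pixel_clock(width, height, refresh_hz,
                       h_blank=160, min_vblank_us=460.0):
    """Simplified CVT reduced-blanking pixel clock estimate.

    Grows the vertical blanking line count until it satisfies the
    minimum vertical blanking time, then returns htotal * vtotal * hz.
    """
    v_blank = 1
    while True:
        v_total = height + v_blank
        line_time_us = 1e6 / (v_total * refresh_hz)
        if v_blank * line_time_us >= min_vblank_us:
            break
        v_blank += 1
    return (width + h_blank) * v_total * refresh_hz

# The modes discussed above:
print(cvt_rb_pixel_clock(5120, 1440, 90) / 1e6)   # ~714.23 MHz (5K transfer, both eyes)
print(cvt_rb_pixel_clock(3840, 2160, 90) / 1e6)   # ~811.44 MHz (4K per cable, 90Hz)
print(cvt_rb_pixel_clock(3840, 2160, 80) / 1e6)   # ~717.76 MHz (the 80Hz fallback)
print(cvt_rb_pixel_clock(7680, 2160, 90) / 1e6)   # ~1590.42 MHz (both eyes on one link)
```

All four outputs match the figures in the post, which also makes clear why 80Hz fits under the 720MHz HBR2 ceiling while 90Hz doesn’t.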


It has been said somewhere on the forum that the SteamVR value was misread. What is relevant is what is set in the PiPlay software. I hope we will get the exact values by the 16th at the latest.

So you have to run games internally at 4K? I thought 2.5K (2560x1440) was the best resolution to run games at for the 8K, internally at least.

That’s the image transfer resolution (for the 5K/5K+, at least). The render target is typically at a higher resolution because of lens compensation. The resolution is chosen to roughly match pixel density at the centre of the display, and ends up higher than necessary around the edges, which is where lens-matched shading may help. Doc Ok has explained much of this well.


So for the 8K we have to run games at almost 11 Mpix (4096x2657), and DP will only send a 3.7 Mpix (2560x1440) image?
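Taking those figures at face value (per eye), the arithmetic works out to roughly a 3:1 ratio of rendered to transferred pixels:

```python
render = 4096 * 2657       # 10,883,072 px  (~10.9 Mpix render target)
transfer = 2560 * 1440     #  3,686,400 px  (~3.7 Mpix sent over DP)

print(render / transfer)   # ~2.95x more pixels rendered than transferred
```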

Depends on what resolution is set in PiTool.

“That’s [2560x1440?] the image transfer resolution (for the 5K/5K+, at least).” – So are you implying (as I think many other discussions do) that since 2560x1440 is native for the 5K it is somehow exempt from the need to SS (just let it render at 2560x1440), even though it has the same lens compensation needs as the 8K?

And for simple image-enhancement SS, i.e. anti-aliasing, I would think the 5K has the greater (or at least equal) need, since it has the lower pixel resolution.

The whole upscale-downscale-upscale process for the 8K seems crazy to me. A heavier oversampling burden for the 8K was not suggested back at the start of the Kickstarter. God, what I’d give to have an HMD in hand so I could see the difference. I’m glad we can change to the 5K if we want, but it’s a difficult blind choice. Reducing the GPU burden is VERY important if you don’t want to really dumb down your graphics. With a 1080 Ti my system already struggles to render rFactor 2 into a Vive. We need the testers to give some really detailed info. Through-the-lens shots would be great!


Couldn’t YCbCr be used to spare the bits per channel? Sacrifice a bit of colour fidelity to gain resolution and refresh rate.

[quote=“ricknau, post:14, topic:8043, full:true”]So are you implying (as I think many other discussions do) that since 2560x1440 is native for the 5K it is somehow exempt from the need to SS (just let it render at 2560x1440), even though it has the same lens compensation needs as the 8K?[/quote]
Certainly not I - quite the opposite. Both of them require lens compensation, so it makes sense that both would use higher render resolutions. Apparently I wasn’t clear enough; it was a reply to MicroM’s post, whose author appeared to think the transfer and render resolutions would be the same. That won’t be the case until we do native ray tracing, and even then we’ll want antialiasing.

Yes, if you have a matching decoder. In fact, this is one of the steps within DSC, which uses YCoCg-R. However, the established YCbCr protocols such as 4:2:2 chroma subsampling don’t save channel depth but go directly for halving resolution (holdovers from analog YUV video), and you’d actually want L*a*b* anyway for more accurate estimates in quantization (but that task is done in the GPU). It won’t help if our bottleneck is actually the output pixel clock of the scaler, as that is normally where the decoding would occur.

Chroma subsampling to YCbCr 4:2:2 would mean pairs of pixels share colour, similar to how PenTile panels don’t show full colour per pixel, since a pair is needed to contain both red and blue. It is routinely done in video formats and has a decent chance of going unnoticed, though I’ve seen completely broken implementations, such as Samsung 4K TVs that would ignore 3/4 of the colour information if sent 4:4:4 (whether in YCbCr or RGB). This issue shouldn’t apply to Pimax if they were to add YCbCr, since the GPU would perform the subsampling and should do the proper blending; the Samsung device simply lied about its capabilities.

4:2:2 does have the advantage over 4:2:0 that it sends complete lines; 4:2:0 can’t be decoded without an additional line buffer. DisplayPort specifies that 4:4:4 and 4:2:2 must be supported if YCbCr is supported at all, at colour depths of 8bpc or more, so 4:4:4 doesn’t save us anything. On the other hand, HDR support would be great in VR; it’s just that we have this bitrate issue to handle already.
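To put rough numbers on the subsampling trade-off, here’s a sketch of the average bits per pixel for each format and the resulting pixel clock ceiling on a single HBR2 link (my own estimate; it ignores blanking and assumes the link, not the scaler’s output pixel clock, is the limit, which as noted above may not hold):

```python
link_bps = 17.28e9   # DP 1.2 HBR2 usable payload

# Average bits per pixel at 8 bits per channel:
formats = {
    "RGB / YCbCr 4:4:4": 3 * 8,   # full colour per pixel: 24 bpp
    "YCbCr 4:2:2": 8 + 8,         # Y per pixel, Cb+Cr shared by 2 pixels: 16 bpp
    "YCbCr 4:2:0": 8 + 4,         # Cb+Cr shared by a 2x2 block: 12 bpp
}

for name, bpp in formats.items():
    max_clock_mhz = link_bps / bpp / 1e6
    print(f"{name}: {bpp} bpp -> max pixel clock {max_clock_mhz:.0f} MHz")
    # 4:4:4 -> 720 MHz, 4:2:2 -> 1080 MHz, 4:2:0 -> 1440 MHz
```

So even 4:2:2 wouldn’t get a single HBR2 link to the ~1.59GHz needed for 7680x2160 at 90Hz, though 4:2:0 comes considerably closer.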