4:2:0 chroma subsampling would allow 2x the refresh rate at full res and look better than 1440p too

I see some others have mentioned this, on the sidebar there, but let me throw my hat in: Chroma subsampling IS AWESOME AND SMART.

No, it’s not dumb, and it doesn’t suck. It’s brilliant. You DO NOT NEED FULL RGB NATIVE RES for 99% of content. Even with 4:2:0, you can use bilateral filtering to solve the coloured-text issue. Virtually every TV now has built-in, high-quality chroma upscaling filters which make video content (including VR games) virtually indistinguishable from RGB / 4:4:4 signals, except with a “free” doubling of the refresh rate.

This post is my gift to you. If you haven’t already realized what virtually all the major movie studios have long figured out, it’s that video content doesn’t need full chroma resolution: the brain can’t tell the difference 99% of the time, and for the rest, just use better chroma upscaling filters. I’d much rather have an 8K X at 4K 160 Hz in 4:2:0 than 4K 80 Hz in RGB. And if you were to compare 1440p 120 Hz in RGB vs 2160p per eye at 120 Hz in 4:2:0, the 2160p version would look WAY SHARPER despite the reduction in chroma res. These are basic, basic facts about human vision and video encoding which have been well known and studied for ages. Hence why virtually all video encoded and delivered to consumers, including at 4K and 8K, is 4:2:0. Because it’s smart, it works, and it’s the right thing to do.
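To put rough numbers on that comparison, here is a back-of-the-envelope sketch (my own arithmetic, assuming 8 bits per component and ignoring blanking intervals and link overhead) of the raw pixel payload per eye for the modes mentioned above:

```python
# Raw per-panel payload in Gbit/s, assuming 8 bits per component and
# ignoring blanking intervals and DisplayPort/HDMI protocol overhead.
def payload_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

RGB_BPP = 24     # 4:4:4 / RGB: three full-res components per pixel
YUV420_BPP = 12  # 4:2:0: full-res luma + quarter-res chroma averages 1.5 components

modes = {
    "1440p 120 Hz RGB":   payload_gbps(2560, 1440, 120, RGB_BPP),
    "2160p 120 Hz 4:2:0": payload_gbps(3840, 2160, 120, YUV420_BPP),
    "2160p  80 Hz RGB":   payload_gbps(3840, 2160,  80, RGB_BPP),
    "2160p 160 Hz 4:2:0": payload_gbps(3840, 2160, 160, YUV420_BPP),
}
for name, gbps in modes.items():
    print(f"{name}: {gbps:.1f} Gbit/s")
```

2160p 160 Hz in 4:2:0 costs exactly the same raw payload as 2160p 80 Hz in RGB, and 2160p 120 Hz in 4:2:0 lands in the same ballpark as 1440p 120 Hz in RGB.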

Thanks for listening.

EDIT by @MReis: dear @Alan.sun, @SweViver, it would be interesting to know if this is an option and on the team’s list?

5 Likes

Are you talking about wireless? I’m sure this is being done for the wireless module.

For anything else, it doesn’t help.
Refresh rate and resolution are either limited by the panel or by the performance of the graphics card.

Sending a super-resolution image (higher than native) from the GPU to the headset (e.g. 4K to a 1440p headset) would reduce fidelity, not increase it.
That’s because the cable protocol sends the image losslessly, unlike any video you watch on Netflix etc., which is heavily lossy.
So of course an Internet video will look better in 4K on a 1080p screen than the “native” 1080p video, simply because the bitrate is higher on the 4K one.
Again, this is not the case with display protocols.

4 Likes

That’s not what he’s talking about. Video formats from the studios are chroma subsampled. If you go to the store and buy the latest 4K UHD Blu-ray, the video is 4:2:0 and not 4:4:4. However, on your PC (and perhaps the latest consoles or upcoming consoles, I’m not sure), where the rendering takes place locally, 4:4:4 makes more sense, because why not. You’re rendering in real time and not worrying about making a video stream fit a particular media size. OP is suggesting that chroma subsampling would reduce throughput requirements, presumably so that more frames could fit down the pipe.

2 Likes

Yes, that’s correct. 4:2:0 makes a lot of sense when you try to fit a several-hour movie on a 25 GB disc or when you try to stream a 4K video through a 20 Mb/s internet connection. But for sending a rendered image through a 30 Gb/s DP cable, this doesn’t yield any benefit.

3 Likes

Although I have to add that I agree with OP that with an 8K X, we are getting close to the uncompressed bandwidth limits of widely used standards like DP 1.4.
When we reach those limits, the obvious solution is to use DSC compression, which is supported by the DP interface of the GPUs. This can also use 4:2:0 or 4:2:2 encoding to save bandwidth.
But as far as I know, the limitation is the 4K panel refresh rate at the moment, not the cable bandwidth.

3 Likes

The cable bandwidth is definitely the main bottleneck and limiting factor for the 8KX.

4 Likes

I would expect the bandwidth to be sufficient for the 8K X at the moment.
To transmit two 24-bit uncompressed UHD images 72 times per second, you need 14.9 Gb/s. DP 1.3 should be capable of a 25.9 Gb/s net transfer rate, after accounting for protocol overhead.

So native resolution at 120 Hz on the 8K X should theoretically be possible on DP 1.3 (which has been around for a while now).

Although I can imagine there are still many cables out there that would not deliver the full speed at 10 meters or so.

DP 2.0, which will probably be used on the new Nvidia and AMD GPUs being released this year, will triple the effective bandwidth of DP 1.3, so that should also bring some relief to the situation in the future.

2 Likes

They had to work really hard on a high-spec and expensive DP 1.4 cable to manage the native res at 75 Hz. And only one cable has been found that can extend it. If the cable wasn’t a factor they could output at 120 Hz in native 4K, which would be awesome!

1 Like

Even DSC compression allows chroma subsampling to be used simultaneously, because chroma subsampling is a cheap and easy way to reduce bandwidth. At the same cable bandwidth, you can trade 50% of the chroma res for double the refresh rate, stereo 3D, or passing depth along with colour (something the new UHD Blu-ray 3D standard does).

I’d hardly call 2x the refresh rate not worth having as an option. Nobody’s forcing anyone to use 4:2:0 for games or PC use or VR use. It’s just dumb not to have it as an option, when Pimax is trying to offer high-refresh modes using an objectively inferior lower res.

Take a given res X × Y; the pixel clock bandwidth is B ∝ X · Y.

4:2:0 gives B’ = B / 2.

To reduce X and Y such that B’ = B / 2 while staying in RGB, you need X’ = X/√2 and Y’ = Y/√2.

Roughly speaking, 1440p is a factor of √2 smaller than 4K in both dimensions, so when you square it, the factor is 2. Easy, basic arithmetic.
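As a quick sanity check of that arithmetic (my own numbers, not from the thread):

```python
import math

X, Y = 3840, 2160                            # 4K per eye
print(X / math.sqrt(2), Y / math.sqrt(2))    # ~2715 x 1527: the equal-bandwidth RGB res
print((2560 * 1440) / (X * Y))               # 1440p has ~0.44x the pixels, close to the 0.5 target
```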

Now, as an engineer, what’s a worthwhile tradeoff? According to vast amounts of research by both movie studios and game companies, people cannot tell the difference in video content most of the time when chroma subsampling is used. There are places where chroma subsampling is indeed used in games, such as the well-known YCoCg compact frame buffer paper, if you are starved for GPU memory (say, on a console). Google it; it’s implemented in many game engines. They’ve done studies: unless you’re pixel-peeping a static screenshot with 1-pixel-wide text, you can’t tell the difference.
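To illustrate the basic idea (this is my own minimal sketch, not the exact scheme from that paper): convert RGB to YCoCg, keep luma at full res, store the two chroma planes at half res in each dimension, and upsample them again on reconstruction.

```python
import numpy as np

def rgb_to_ycocg(rgb):
    # RGB -> YCoCg: full-res luma Y plus two chroma planes Co, Cg.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.25 * r + 0.5 * g + 0.25 * b
    co =  0.50 * r            - 0.50 * b
    cg = -0.25 * r + 0.5 * g - 0.25 * b
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    # Inverse transform back to RGB.
    return np.stack([y + co - cg, y + cg, y - co - cg], axis=-1)

def subsample(plane):
    # Halve a chroma plane in both dimensions by averaging 2x2 blocks (4:2:0-style).
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(plane):
    # Nearest-neighbour upsampling; real TVs/scalers use much better chroma filters.
    return plane.repeat(2, axis=0).repeat(2, axis=1)

# Toy image: a smooth colour gradient (chroma varies slowly, like most real content).
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w] / max(h, w)
rgb = np.stack([xx, yy, 1.0 - xx], axis=-1)

y, co, cg = rgb_to_ycocg(rgb)
# Keep Y at full res, chroma at quarter area: 24 bpp effectively becomes 12 bpp.
reconstructed = ycocg_to_rgb(y, upsample(subsample(co)), upsample(subsample(cg)))
print("max reconstruction error:", np.abs(reconstructed - rgb).max())
```

The luma plane, which carries the detail your eye is most sensitive to, is untouched; only the slowly varying chroma gets averaged.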

So, 4:2:0 at X × Y, or RGB at X/√2 × Y/√2? Which is better? Easy: 4:2:0. You need to compare like for like, which means comparing at the same bandwidth; the extra bandwidth spent on full-res RGB could easily be used for something better, like doubling the refresh rate.

I highly doubt that at 4K-per-eye res, with your head constantly moving, you can tell the difference from RGB. So let people decide what they want. Nobody’s forcing anti-chroma-subsampling folks to use it. It’s just a smart tradeoff.

Full-res luma + 1/2-res chroma is better and sharper than dropping both x and y by √2 for luma and chroma alike.

What do you think looks better, 720p or 1080i? Same thing. 1080i is half the bandwidth of 1080p, but still quite good in terms of spatial resolution at least. On old CRTs, interlaced modes were brilliant because, there too, you could save half the bandwidth and get twice as many channels without compromising vertical resolution (your visual system persists this detail over time, so it’s hardly noticeable). On a progressive display, interlaced is inferior at the same Hz, so the main reason one would use it is if you wanted to double the max refresh rate over the same cable.

This stuff has been studied thoroughly. Chroma subsampling is smart and here to stay. There are even test images of EXTREME chroma subsampling, where the entire picture is black and white except for a few vertical and horizontal lines that are in full colour, and it’s amazing: your brain fills in the colour gaps. You could practically render your game in black and white and only apply colour in a few places and it would still look full-colour (albeit with some imperfections, to be sure). But 4:2:0 is not perceptible, especially at 4K. Maybe in a VR headset, but losing out on double the refresh rate you could’ve had is worse than losing out on a tiny bit of colour detail.

If I had an OLED TV capable of 4K 240 Hz internally, but my HDMI 2.1 cable and circuits couldn’t pass more than 120 Hz at 4K, I’d much rather run 4K 240 in 4:2:0 than 4K 120 in 4:4:4 for fast-paced games. Maybe 4:4:4 matters for static games with lots of text like RTS, turn-based strategy, or MMOs with lots of small icons and text. But in VR, your head is constantly moving, and this is before you even consider what types of awesome chroma upscaling filters are available now in hardware with ultra-low latency.

But I’d take bilinearly upscaled chroma with double the refresh rate any day, especially at 4K for VR uses. 150/160 Hz is going to be massively superior to 75/80 Hz for VR presence.

3 Likes

Willy, I was talking about overcoming the cable bandwidth. Indeed, most games are rendered in RGB, but there are various tricks applied like temporal AA, reprojection, etc., which means even the raw uncompressed frame buffer isn’t typically native res. And as we all know in VR, people use DPI scaling to try and stuff in more quality and get rid of aliasing on lower-res headsets, which is a pity, because quite often the same GPU with supersampling could have been used to generate a native 4K-per-eye signal.

Aliasing is always an issue, but the higher your native res, the less of it there is. I’d always apply AA even with an 8K or 16K headset, or monitor, because specular highlights can result in glint detail that is 100x smaller than a 4K pixel. So yeah, you always need anti-aliasing.

We’re all on the same side here! Trust me, if you had the option to compare 1440p 120 in RGB with 4K 120 in 4:2:0, you’d conclude that 4K 4:2:0 was sharper. No question in my mind.

The games are still rendered in RGB; we’re just talking about overcoming the cable bandwidth limitations, or wireless streaming. I would always prefer a higher refresh rate over higher chroma res, except maaaybe if I were using the VR headset for virtual desktop with lots of text and typing.

Once we get eye tracking in there, though, I’d definitely say 4:2:0 with 2x the refresh rate should be used all the time, even for text. Because of ocular microtremors your eye supersamples temporally, which doesn’t work well with refresh rates < 100 Hz. A 4:2:0 image at 120 Hz that tracks your eye movements precisely would likely look sharper than an RGB image at 60 Hz, due to spatio-temporal supersampling and the way our perception works.

I agree with what you’re saying. Nobody is arguing against using lossy compression methods (like 4:2:0) for video transmission, where we are bandwidth-starved. But with DP 1.3 or higher, we should have sufficient bandwidth for dual UHD panels at refresh rates as high as they can go.

And if not, then you’re right, they should use DSC, which can use 4:2:0. It’s not really up to Pimax to implement the chroma subsampling; it’s the DP interface’s job to do that.

3 Likes

Sorry man, this is just objectively wrong (when talking about the 8K X). Doubling your max refresh rate is always worth it if you are bandwidth-limited. Which they are.

If you have more cable bandwidth than your display can handle, sure, use RGB everywhere.

But in this case, Pimax is giving people the option of dropping to 1440p to get higher refresh rates, which is a strictly inferior way of allowing higher Hz compared to chroma subsampling.

Bottom line: if you care about detail and overall quality, don’t drop your luma resolution if you don’t absolutely have to. Luma res matters more than chroma res. Dropping RGB res by √2 in both dimensions is worse than dropping chroma res by half but keeping luma res the same.

It’s a pity the other 4K native HMDs didn’t allow the option of chroma subsampling to pass full luma res to the display, instead of forcing the scaler board to upscale to the panel res using RGB.

I guess it was a limitation of the scaler, but I’d be shocked if even VR input scaler boards don’t support chroma upscaling. I have a DIY LCD HDMI input board meant for DIY VR HMDs that supports chroma upscaling. It costs 50 bucks for the LCD including the board.

You need to know the specs of the bridge chip. With, for example, the 8K and 5K+, from what I recall, you could run one 4K panel at 120 Hz, or two panels at a lower res/refresh.

So there is more to it than just playing with the input stream.

Now, in theory, if the advertised BW feature ever comes out, it might be possible with Alternate Eye, as only one panel is being sent a picture at a time.

1 Like

I was assuming we’re not bandwidth-limited by the cable.
DP 1.3 is rated at 25.92 Gbps of uncompressed video data net payload.
Every GPU of the last few years, even low-end ones, has DP 1.3 or usually even DP 1.4 outputs.
The Pimax 8K X currently uses 14.9 Gbps of that bandwidth, which is only a bit more than half of that.
A quick Google search reveals you can easily buy up to 50-meter DP cables which are rated at 21.6 Gbps transmission.
I was assuming the bottleneck is somewhere else, namely the scaler or DP interface in the Pimax.

But if it’s indeed the cable for some reason, then it definitely makes sense to try to use DSC with 4:2:0.

1 Like

If the scaler / bridge chips can do chroma upscaling (I would be surprised if they couldn’t; it’s been a pretty standard feature on such chips since the mid-2000s), then you could feed both panels at the higher resolution and refresh rate for the price of halving the chroma res. It’s very much a worthwhile tradeoff when you are bandwidth-starved, as VR headsets definitely are.

But yeah, it depends on the hardware. It’s possibly just a firmware update that’s needed.

Remember when Nvidia added 4:2:0 to HDMI 1.4-based Kepler GPUs so they could pass 4K60 in 4:2:0? 4:2:0 isn’t officially part of the HDMI 1.4 spec; it was only officially added in HDMI 2.0. That didn’t stop them, and it works. AMD added VRR and HDR to HDMI 1.4 output chips via firmware updates too (original PS4s can output HDR now, IIRC).

I believe supporting 2x refresh via chroma subsampling modes might be possible for the Pimax 8KX, via firmware update.

I usually don’t quote myself but I just realized something:

“A 4:2:0 image at 120 Hz that tracks your eye movements precisely would likely look sharper than an RGB image at 60 Hz, due to spatio-temporal supersampling and the way our perception works.”

I’m now wondering whether eye tracking is even necessary for high Hz to benefit perceived resolution in VR scenarios. It would need testing. Even if you keep your head very still, the delta between each frame due to the head pose shifting even slightly would result in the chroma subsampling error being erased from your mind.

4:2:0 is good enough even for 24 FPS movies, but with fast, constant, full-camera movement at very high refresh rates, it should be even less perceptible. Movies and shows typically have static cameras while foreground objects move; in most scenes you’re watching people or things move, but the camera rests static relative to the background. Even despite this (where there is no motion blur), 4:2:0 isn’t perceptible. I believe that with head movement and persistence-based motion blur, the difference between 4:4:4 and 4:2:0 is going to be negligible, especially since the head / camera itself is moving constantly, even if ever so slightly. So high Hz is a big benefit to VR, I think. I’ve never used a headset faster than 90 Hz so I’m only conjecturing here, but I do have a 144 Hz and a 240 Hz display, so I know high Hz does wonders.

Again, that depends on whether the bridge chip supports the refresh rates. It is possible Pimax may already be using this on the 5K+ models. Then of course there are also the panel limitations to take into consideration, which is likely why the 5K+ serial 20s are so far capped at 110 Hz, whereas the 203/204 can do 120 Hz & 144 Hz.

1 Like

Oh yeah, obviously there are potential showstoppers here that might make this a non-starter, but the interface to the panels is always at the native res, regardless of the input signal coming off the wire. Any signal-processing stage before the panels themselves might not be bypassable, though, or might have other limits. So if a panel can do refresh rate X, the scaler’s max pixel clock becomes the main issue.

1 Like

The calculations have been done several times around here. Yours is off for some reason; I think there are other overheads involved you aren’t factoring in. The cable bandwidth is the reason the 8K X can only run at 75 Hz and not 120 Hz, and it is using DSC to achieve even that, and then only at a pinch. So it takes a custom cable and is basically not extendable with any other cable on the market bar one.
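For what it’s worth, a rough sketch of the raw arithmetic (my own numbers: active pixels only, dual 3840×2160 panels at 24 bpp, ignoring blanking intervals and other link overhead) over DP 1.4 / HBR3:

```python
# DP 1.4 (HBR3): 32.4 Gbit/s raw link rate, minus 8b/10b encoding ~= 25.92 Gbit/s payload.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10

def dual_panel_gbps(width, height, hz, bpp=24):
    # Raw active-pixel payload for two panels, in Gbit/s (no blanking, no overhead).
    return 2 * width * height * hz * bpp / 1e9

for hz in (75, 90, 120):
    need = dual_panel_gbps(3840, 2160, hz)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "needs DSC / subsampling"
    print(f"2x UHD @ {hz} Hz: {need:.1f} Gbit/s vs {DP14_PAYLOAD_GBPS:.2f} available -> {verdict}")
```

Even before blanking and protocol overhead, dual UHD at 75 Hz already exceeds the DP 1.4 payload, which is consistent with DSC being needed even for the native 75 Hz mode.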

1 Like