Even with DSC compression, chroma subsampling can still be applied on top, because chroma subsampling is a cheap and easy way to reduce bandwidth. At the same cable bandwidth, you can trade 50% of the chroma resolution for double the refresh rate, stereo 3D, or passing depth along with colour (something the new UHD Blu-ray 3D standard does).
I’d hardly call 2x the refresh rate not worth having as an option. Nobody’s forcing anyone to use 4:2:0 for games, PC use, or VR. It’s just dumb not to have it as an option when Pimax is trying to do high-refresh modes using an objectively inferior lower resolution instead.
Take a given res X × Y; the pixel clock bandwidth scales as B ~ X * Y.
4:2:0 gives B’ = B / 2.
To get B’ = B / 2 while staying RGB, you need X’ = X/√2 and Y’ = Y/√2, since (X/√2) * (Y/√2) = X*Y/2.
Roughly speaking, 1440p is a factor of root 2 smaller than 4K in both dimensions (the exact per-axis ratio is 1.5, so 2.25x fewer pixels, close to 2x). Square the per-axis factor and you get the factor of 2 in bandwidth. Easy, basic arithmetic.
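To make that arithmetic concrete, here’s a quick Python sketch (toy numbers only: it ignores blanking, bit depth, and DSC):

```python
import math

# Toy model from above: per-frame bandwidth B ~ X * Y, in relative units.
X, Y = 3840, 2160                 # 4K
B = X * Y                         # ~8.29 Mpx per frame in RGB

# Option A: keep full 4K res, use 4:2:0 -> half the data per frame.
B_420 = B / 2                     # ~4.15 Mpx-equivalents

# Option B: stay RGB and shrink both axes by sqrt(2) to hit the same budget.
Xr, Yr = X / math.sqrt(2), Y / math.sqrt(2)
print(round(Xr), round(Yr))       # 2715 1527 -- an awkward, non-standard mode
print(round(Xr * Yr))             # 4147200 -- same budget as 4K 4:2:0

# 1440p is the nearest common mode, and it actually sits a bit under that budget:
print((X * Y) / (2560 * 1440))    # 2.25 -- so "half of 4K" is only roughly right
```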
Now, as an engineer, what’s a worthwhile tradeoff? According to vast amounts of research by both movie studios and game companies, people cannot tell the difference in video content most of the time when chroma subsampling is used. Chroma subsampling is even used inside games, such as the famous Compact YCoCg Frame Buffer paper, when you’re starved for GPU memory (say, on a console). Google it; it’s implemented in many game engines. They’ve done studies: unless you’re pixel-peeping a static screenshot with 1-pixel-wide text, you can’t tell the difference.
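For the curious, here’s a minimal numpy sketch of that compact frame-buffer idea, assuming the standard YCoCg transform and a naive 4-neighbour reconstruction (the actual paper uses an edge-aware filter):

```python
import numpy as np

# Sketch of the compact YCoCg frame buffer: full-res luma plus checkerboarded
# chroma, so only 2 channels are stored per pixel instead of 3.

def rgb_to_ycocg(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return r/4 + g/2 + b/4, r/2 - b/2, -r/4 + g/2 - b/4

def ycocg_to_rgb(y, co, cg):
    return np.stack([y + co - cg, y + cg, y - co - cg], axis=-1)

u = np.linspace(0, 1, 8, dtype=np.float32)
gx, gy = np.meshgrid(u, u)
rgb = np.stack([gx, gy, gx * gy], axis=-1)      # smooth test colours
y, co, cg = rgb_to_ycocg(rgb)

# Y everywhere; Co on "black" checkerboard squares, Cg on "white" ones.
ix, iy = np.indices(y.shape)
on_co = (ix + iy) % 2 == 0
chroma = np.where(on_co, co, cg)

# Recover each pixel's missing chroma channel from its 4 neighbours,
# which all store the other channel (np.roll = toy wraparound borders).
neigh = (np.roll(chroma, 1, 0) + np.roll(chroma, -1, 0) +
         np.roll(chroma, 1, 1) + np.roll(chroma, -1, 1)) / 4
out = ycocg_to_rgb(y,
                   np.where(on_co, chroma, neigh),
                   np.where(on_co, neigh, chroma))
print(np.abs(out - rgb).max())  # interpolation error only, worst at the wrapped borders
```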
So, 4:2:0 at X × Y, or RGB at X/√2 × Y/√2? Which is better? Easy: 4:2:0. You need to compare like for like, which in this case means at equal bandwidth, and the bandwidth RGB burns could easily go to something better, like doubling the refresh rate.
I highly doubt that at 4K-per-eye res, with your head constantly moving, you can tell the difference from RGB. So let people decide what they want. Nobody’s forcing the anti-chroma-subsampling folks to use it. It’s just a smart tradeoff.
Full-res luma + 1/2-res chroma >>> dropping to 1/√2 res in x and y, where luma and chroma alike get blurred. It’s better and sharper.
What do you think looks better, 720p or 1080i? Same idea. 1080i is half the bandwidth of 1080p, but still quite good in terms of spatial resolution at least. On old CRTs, interlaced modes were brilliant because there, too, you could save half the bandwidth and fit twice as many channels without compromising vertical resolution (your visual system persists that detail over time, so it’s hardly noticeable). On a progressive display, interlaced is inferior at the same Hz, so the main reason to use it is to double the max refresh rate over the same cable.
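The pixel-rate arithmetic backs this up (rough numbers, active pixels only, ignoring blanking):

```python
# Pixel rates for 60 Hz signals; 1080i60 sends 540-line fields 60 times a second.
p1080 = 1920 * 1080 * 60        # 1080p60: ~124.4 Mpx/s
i1080 = 1920 * 540  * 60        # 1080i60: ~62.2 Mpx/s
p720  = 1280 * 720  * 60        # 720p60:  ~55.3 Mpx/s

print(p1080 / i1080)            # 2.0 -- 1080i is half the bandwidth of 1080p
print(i1080 / p720)             # ~1.13 -- about the same pipe as 720p,
                                # but with 1080 lines of static spatial detail
```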
This stuff has been studied thoroughly. Chroma subsampling is smart and here to stay. There are even test images of EXTREME chroma subsampling, where the entire picture is black and white except for a few vertical and horizontal lines in full colour, and it’s amazing: your brain fills in the colour gaps. You could practically render your game in black and white, apply colour in only a few places, and it would still look full-colour (albeit with some imperfections, to be sure). But 4:2:0 is not perceptible, especially at 4K. Maybe in a VR headset, but losing the double refresh rate you could have had is worse than losing a tiny bit of colour detail.
If I had an OLED TV capable of 4K 240 Hz internally, but my HDMI 2.1 cable and circuits couldn’t pass more than 120 Hz at 4K, I’d much rather run 4K 240 in 4:2:0 than 4K 120 in 4:4:4 for fast-paced games. Maybe 4:4:4 matters for static games with lots of text: RTS, turn-based strategy, MMOs with lots of small icons and text. But in VR your head is constantly moving, and that’s before you even consider the awesome chroma-upscaling filters now available in hardware with ultra-low latency.
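Run the numbers on that example and the two modes move the exact same payload (rough calc at 8 bpc, ignoring blanking and link overhead):

```python
# Raw video data rate: 4:2:0 at 8 bpc is 12 bits/px (8 luma + 4 shared chroma),
# RGB/4:4:4 at 8 bpc is 24 bits/px.
def gbps(x, y, hz, bits_per_px):
    return x * y * hz * bits_per_px / 1e9

print(gbps(3840, 2160, 240, 12))   # 4K 240 Hz 4:2:0 -> ~23.9 Gbps
print(gbps(3840, 2160, 120, 24))   # 4K 120 Hz 4:4:4 -> ~23.9 Gbps
# Identical payload: 4:2:0 buys exactly the 2x refresh on the same cable.
```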
But I’d take bilinear-upscaled chroma with double the refresh rate any day, especially at 4K for VR use. 150/160 Hz is going to be massively superior to 75/80 Hz for VR presence.
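And that bilinear chroma round trip is trivial. Here’s a deliberately naive numpy sketch of it (a real hardware scaler handles sample phase and edges properly; this toy does not):

```python
import numpy as np

# Toy 4:2:0 round trip: luma stays full-res, chroma is box-filtered down 2x
# and brought back with a bilinear upsample.

def down2(plane):
    # Average each 2x2 block -> half-res chroma plane.
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up2_bilinear(plane):
    # Zero-stuff, then convolve each axis with the [0.5, 1, 0.5] tent kernel:
    # stored samples pass through, in-between samples become neighbour averages.
    h, w = plane.shape
    z = np.zeros((h * 2, w * 2))
    z[::2, ::2] = plane
    k = np.array([0.5, 1.0, 0.5])
    z = np.apply_along_axis(np.convolve, 0, z, k, mode="same")
    z = np.apply_along_axis(np.convolve, 1, z, k, mode="same")
    return z

u = np.linspace(0.0, 1.0, 32)
chroma = np.outer(u, u)                 # smooth plane, like most natural chroma
err = np.abs(up2_bilinear(down2(chroma)) - chroma)
print(err[2:-2, 2:-2].max())            # small vs the 0..1 range, away from the naive borders
```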