HDMI 2.1 spec (updated today) can handle 4K @ 120Hz and 8K @ 60Hz

Not sure how it helps Pimax, but at least it is now an official standard. The Enhanced Refresh Rate specs sound interesting…

Read more here: SPECIFICATIONS - HDMI Forum


Pimax are using DP 1.4 which already meets these specs. In fact DP 1.3 met these specs back in Sept 2014, which makes me wonder why 120Hz 4K monitors have never become a thing.
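A quick back-of-envelope check of that DP 1.3 claim (my own arithmetic, not from the spec text; assumes a 4-lane link, 8 bpc RGB, and an approximate CVT-R2 reduced-blanking timing):

```python
# Does DP 1.3 (HBR3) have enough payload bandwidth for 4K @ 120 Hz, 8 bpc?

LANES = 4
HBR3_LANE_GBPS = 8.1            # raw line rate per lane (DP 1.3/1.4)
ENCODING_EFFICIENCY = 8 / 10    # DP 1.3/1.4 use 8b/10b encoding

payload_gbps = LANES * HBR3_LANE_GBPS * ENCODING_EFFICIENCY  # 25.92 Gbit/s

# 4K @ 120 Hz with CVT-R2-style reduced blanking (approx. +80 px, +~60 lines)
pixel_clock_hz = (3840 + 80) * (2160 + 60) * 120
required_gbps = pixel_clock_hz * 24 / 1e9   # 24 bits per pixel (8 bpc RGB)

print(f"HBR3 payload: {payload_gbps:.2f} Gbit/s")
print(f"4K120 needs : {required_gbps:.2f} Gbit/s")  # ~25.06, so it just fits
```

So 4K @ 120Hz squeezes under the HBR3 payload with almost no margin, which is consistent with DP 1.3 having "met these specs" on paper.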

DP 1.4 is only marketing so far; the ANX7530 chip used on the HMD (V3) supports only DP 1.2 speeds (21 Gbit/s raw)…

To be perfectly correct, the ANX7530 is marketed as DP 1.4 “ready” (which is something some gfx cards use as well) and supports HBR2.5 (6.75 Gbit/s per lane), which is a non-standard speed (notice the 2.5 after HBR), as only HBR2 (DP 1.2) and HBR3 (DP 1.3, DP 1.4) are defined by the standard. I believe this topic has already been discussed many times on this forum; if you want to follow up on it, just search the corresponding topics.
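The three link rates mentioned above can be compared directly. A small sketch (the 6.75 Gbit/s HBR2.5 figure is from the post; the HBR2/HBR3 lane rates and 8b/10b overhead are standard DP values):

```python
# Raw vs. usable (payload) bandwidth for the DP link rates discussed above.

rates_gbps_per_lane = {
    "HBR2 (DP 1.2)": 5.4,
    "HBR2.5 (Analogix, non-standard)": 6.75,
    "HBR3 (DP 1.3/1.4)": 8.1,
}

for name, lane_rate in rates_gbps_per_lane.items():
    raw_total = lane_rate * 4        # 4-lane link
    payload = raw_total * 8 / 10     # 8b/10b encoding overhead
    print(f"{name:32s} raw {raw_total:5.1f} Gbit/s, payload {payload:5.2f} Gbit/s")
```

The HBR2 row (21.6 Gbit/s raw) matches the "21G/s" figure quoted earlier in the thread.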


Not really. The time between a spec being finalized and accepted and the time it is implemented in hardware is usually several years. Even now, when we already have DP 1.4 established, full DP 1.4 support is hard to find (as seen, for example, in the aforementioned chip from Analogix).

The devil is in the details. So while Pimax uses Analogix “marketing speak” to promote DP 1.4 compatibility, that does not mean the chip supports all DP 1.4 speeds, and unfortunately for Pimax the fastest speeds are not supported (http://www.analogix.com/en/products/dp-mipi-converters/anx7530).

So technically, depending on the gfx card support, even when using DP 1.4 signaling the port can still run at DP 1.2 speeds. Which seems to be (one of) the problems preventing Pimax from reaching 90Hz refresh.


Yep, I know about that. A Pimax SW engineer wrote 21 Gbit/s, so HBR2 it is. It is DP 1.4 marketing only, because it is outside the DP 1.4 specs: no HBR3, no DSC support :confused: . There is an idea to tell the chip that the screen is 80% resolution, since only 80% is used by the HMD (to allow the HW IPD setting); LoneTech suggested that to Pimax, and maybe it will solve some problems. We are waiting for an update on it this week.
Edit:
I also have an idea in case that trick of reducing the input resolution to, for example, 2304x1440 per eye (which in theory would allow a ~10% faster refresh rate, from 82 to 90) doesn’t work. It would require a HW solution: an adapter/breakout box. To take advantage of HBR2.5 speeds, we could make a custom box that accepts HBR3 (DP 1.3+) from the GPU but outputs the signal using the HBR2.5 protocol, if Analogix can help define that spec. I hope someone understands the idea. So the breakout box has DP 1.3/1.4 input and “DP 1.25” output :slight_smile: . @Matthew.Xu @bacon thoughts?
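The scaling argument in that post can be sketched as follows. This assumes refresh rate scales roughly inversely with pixel count at a fixed link bandwidth (blanking overhead ignored), and that 2560x1440 per eye is the full input resolution:

```python
# At fixed link bandwidth, refresh rate ~ 1 / pixel count per frame.

full_px = 2560 * 1440      # assumed full per-eye input resolution
reduced_px = 2304 * 1440   # reduced per-eye resolution from the post

speedup = full_px / reduced_px
print(f"pixel-count reduction buys ~{(speedup - 1) * 100:.0f}% refresh headroom")
print(f"82 Hz * {speedup:.3f} = {82 * speedup:.1f} Hz")
```

That is, 2304/2560 = 0.9 of the width gives roughly an 11% refresh bump, which is just enough to lift 82Hz past 90Hz.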


According to PC Gamer, 2.1 can eventually take 10K @ 120Hz… 10K gaming at 120Hz will be possible with HDMI 2.1—but not for a while | PC Gamer

The problem, of course, is a chipset that can run at and accept that input.

Thanks @SpecsReader We will think about it. :slight_smile:


Thanks. Our current solution is using DP, not HDMI, so the data cable is not the limit.


Great. Are we going to get an update on the 90Hz progress this week? Some new findings at least: what is the bottleneck (if @bacon is correct that HBR2 (DP 1.2) speeds are enough for a stable 2x1440p @ 90Hz)?
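For what it's worth, a rough check of that HBR2-is-enough claim (my own arithmetic; assumes 8 bpc RGB and a CVT-R2-style reduced-blanking timing on a combined 5120-wide frame, which may differ from the HMD's actual timings):

```python
# Is HBR2 payload bandwidth enough for 2 x 2560x1440 @ 90 Hz?

hbr2_payload_gbps = 4 * 5.4 * 8 / 10   # 4 lanes, 5.4 Gbit/s, 8b/10b -> 17.28

# combined frame with approx. reduced blanking (+80 px, +~60 lines)
pixel_clock_hz = (5120 + 80) * (1440 + 60) * 90
required_gbps = pixel_clock_hz * 24 / 1e9   # 24 bits per pixel (8 bpc RGB)

print(f"HBR2 payload     : {hbr2_payload_gbps:.2f} Gbit/s")
print(f"2x1440p@90 needs : {required_gbps:.2f} Gbit/s")
```

With these assumed timings it comes out to ~16.85 vs. 17.28 Gbit/s, so it fits, but with so little margin that the exact blanking intervals the hardware uses would decide it.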


The whole VR industry is driving new demands and thus challenging all methods of visual input. It is a rapidly changing landscape but I have my own predictions…

Let’s assume that the traditional monitor will be obsolete in the next 10 years due to VR/AR. That means that the shift (as we see it now) moves to micro panels for all our digital visual input.

Let’s now assume that micro panels could hit 10K @ 120Hz. That means that the GPU as we know it today has no chance of supplying that amount of pixels at 120Hz. That article said what, 50x more pixels than a 1080p display? I don’t think GPUs will advance that fast, not in 10 years; it would need foveated rendering, which needs eye tracking. And as demand drives eye-tracking research forward, so advances light-field tech like Avegant’s, so I think panels large or small may become completely obsolete too. Of course we can just throw more power at the problem (multi-GPU, huge PSUs, etc.), but in a world that wants to get greener that is not the optimum choice, and certainly not cost-effective for the mainstream.
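The multiple depends a lot on what "10K" means and which refresh rates you compare. A quick sanity check, assuming 10K is 10240x4320 (not a standardized resolution):

```python
# Pixel and pixel-throughput multiples of an assumed 10K panel vs. 1080p.

px_10k = 10240 * 4320
px_1080p = 1920 * 1080

pixels_ratio = px_10k / px_1080p                       # per frame
throughput_ratio = (px_10k * 120) / (px_1080p * 60)    # pixels per second

print(f"10K has {pixels_ratio:.0f}x the pixels of 1080p")
print(f"10K@120 pushes {throughput_ratio:.0f}x the pixels/sec of 1080p@60")
```

So per frame it is ~21x, and ~43x in pixels per second against a 60Hz 1080p baseline; the "50x" figure presumably uses slightly different assumptions.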

I am just musing here, none of that might happen in the next 10 years :slight_smile:

Because most PCs can’t run them?

But we don’t necessarily need the rendering resolution to be as high as the pixel resolution. We need the pixel resolution to be high to eliminate SDE, but the rendering resolution can be whatever and just be scaled up as GPUs progress.

SDE is one part of the problem. They may find ways to get around that with the panel technology, like how the subpixel array is structured in combination with the lenses themselves. If not, then higher PPI is the brute-force solution. The other problem is clarity, which comes with higher resolution, so increasing resolution and rendering at native solves that, as long as you have the power. So yes, I agree: you can render at 75% of native and scale to 100% as a way to improve frame rate at the cost of clarity.
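Note that the 75% figure is a linear scale, so the pixel savings are larger than they sound. A small worked example (the 4K-per-eye native resolution here is just an assumption for illustration):

```python
# Rendering at 75% of native linear resolution cuts pixels per frame to 0.75^2.

native = (3840, 2160)   # assumed per-eye panel resolution
scale = 0.75

render = (int(native[0] * scale), int(native[1] * scale))
savings = 1 - (render[0] * render[1]) / (native[0] * native[1])
print(f"render at {render[0]}x{render[1]}: {savings:.0%} fewer pixels per frame")
```

Roughly 44% fewer pixels per frame, which is why modest render-scale reductions buy a disproportionate amount of GPU headroom.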

We have changed to DP; we are not using HDMI anymore.
