New AMD GPUs and Pimax, any news on compatibility?

If that’s the case, why does the first 8KX work like a charm with AMD GPUs? I have one working with a 6800XT without any problem at native 90Hz. But the latest models are not compatible.

I hope that Pimax only uses standard protocols in the Crystal (like DSC) and not proprietary ones from any brand.

2 Likes

Hi, please re-read my post. I am specifically asking for support for the new RDNA 3 graphics cards. This is a Pimax problem, not an AMD problem. While RDNA 2 support is also highly desired, I understand they were bandwidth limited which made them harder to support even with DSC (although, they were actually initially supported by the first 8KX…).

Whatever reason they can muster for not continuing RDNA 2 support, it absolutely cannot be true for RDNA 3. The Radeon 7000 series has the same or higher data throughput than NVIDIA RTX 4000 series and can easily handle 90Hz, 120Hz, etc.

1 Like

Then there shouldn’t be a problem either. There are standards for scalers and panel drivers, but they are not always backwards compatible with older generations.

Sorry that I only read half of your wall of text.

1 Like

Indeed, especially since we know AMD does support VR on other headsets and is fully supported on the PlayStation.

1 Like

Yep, in fact on the original 8KX, 90Hz worked from the first beta FW on, whereas it was hit and miss on NVIDIA for a couple of FW versions, and IIRC it’s still not working for some NVIDIA users on the 2073, 2074, and 2075 models.

True, but NVIDIA’s implementation is much, much better than AMD’s, which, afaik, is the main reason AMD is not supported by the 2076 8KX, Arpara 5K, or Varjo Aero. It’s more on AMD than the headset manufacturers.

I do also hope that AMD gets full support but Pimax still don’t even have any of the new AMD cards (from what I’ve heard) while they had the 4090 for months before launch. Pimax have said they are working with AMD to improve things but at some stage I think the onus shifts more to AMD than Pimax.

DP 2.1 is great and would solve the issues that I’m aware of, but then they would be limiting themselves to an incredibly small market; having a DP 2.1 headset in the next year or two would be suicide. I think 2024 is the earliest, if even then, that we’ll see hardware that actually outright requires DP 2.1. Maybe they can work it so that the 12K has DP 2.1 for AMD but is still backwards compatible with NVIDIA via 1.4 and their DSC implementation? Something like that could work.

1 Like

The real question is whether Pimax has, this time, provided AMD an 8KX and Crystal to work on their part, or whether it’s like the KS, where they were going to make AMD wait until backers had theirs.

An AMD contact secured my original 5K+ to be able to work on their end.

The more interesting thing is that the 8KX 2073, 2074, and 2075 all had better 90Hz support on AMD GPUs, even with the first beta FW that was advertised for NVIDIA RTX.

So it is more likely that NVIDIA has given incentives, like before, when Pimax was briefly selling RTX Pimax bundles.

The revised 8KX is likely leveraging more NVIDIA proprietary features.

2 Likes

I want to expand a bit more on HDMI 2.1 vs DP 1.4 as the more research I do and think about the implications, the more necessary I believe HDMI 2.1 is for a $2,500 VR 3.0 device (prepare for another wall of text). With DP 1.4, we’re at risk of having a premium device

  • with a bulky set of cables (two display cables, one data cable, and one power cable); or
  • a less bulky set of cables (one display cable, one data cable, and one power cable) but with gimped resolution and refresh rates…

The standard specs are as follows:

  • DP 1.4 = 32.4 Gb/s max bandwidth (26 Gb/s usable)
  • HDMI 2.1 = 48 Gb/s max bandwidth (42.6 Gb/s usable)

That translates to the following max resolution and refresh rate support:

  • DP 1.4 without DSC = 8K@30Hz with 4:2:0 subsampled colors / 4K@60Hz
  • DP 1.4 with DSC = 4K@120Hz / 8K@60Hz
  • HDMI 2.1 without DSC = 8K@30Hz with full 4:4:4 colors / 4K@144Hz
  • HDMI 2.1 with DSC = 4K@240Hz / 8K@120Hz

The Pimax 12K displays should be about 5760x3240 each. That’s 18,662,400 pixels per eye, for a total of 37,324,800 pixels for a Pimax 12K headset. Since they advertised HDR support, we can assume a 10-bit color depth. If we calculate that out with a refresh rate of 90Hz, we get a total signal bandwidth output of 120.94 Gb/s. With DSC, we have a 3:1 compression ratio for a visually lossless result of 40.31 Gb/s, which puts us within HDMI 2.1’s usable bandwidth of 42.6 Gb/s.

For comparison, let’s calculate the total signal bandwidth divided by the DSC compression ratio for the Pimax Crystal (using 2880x2880 per panel, 10bit color depth, 120Hz refresh rate). The result is 23.89 Gb/s, which puts us within DP 1.4’s usable bandwidth of 26 Gb/s.
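For anyone who wants to check these figures, the arithmetic above can be sketched in a few lines. Note that the quoted totals only work out if you include roughly 20% transport overhead on top of the raw 10-bit (30 bits/pixel) signal; that overhead factor is my assumption to make the math line up, not something spelled out in the post.

```python
# Back-of-envelope check of the bandwidth figures above. Assumptions
# (mine, chosen to match the quoted numbers): 30 bits/pixel (10-bit RGB),
# ~20% link/transport overhead, and DSC compressing at 3:1.

def link_bandwidth_gbps(width, height, panels, bits_per_pixel, refresh_hz,
                        overhead=1.2, dsc_ratio=1.0):
    """Required link bandwidth in Gb/s for a given display configuration."""
    bits_per_second = width * height * panels * bits_per_pixel * refresh_hz * overhead
    return bits_per_second / dsc_ratio / 1e9

# Pimax 12K: 5760x3240 per eye, two panels, 10-bit color, 90Hz, 3:1 DSC
pimax_12k = link_bandwidth_gbps(5760, 3240, 2, 30, 90, dsc_ratio=3)

# Pimax Crystal: 2880x2880 per eye, two panels, 10-bit color, 120Hz, 3:1 DSC
crystal = link_bandwidth_gbps(2880, 2880, 2, 30, 120, dsc_ratio=3)

print(f"12K @ 90Hz, 3:1 DSC:      {pimax_12k:.2f} Gb/s")  # ~40.31, fits HDMI 2.1's 42.6
print(f"Crystal @ 120Hz, 3:1 DSC: {crystal:.2f} Gb/s")    # ~23.89, fits DP 1.4's 26
```

Drop `dsc_ratio=3` to see the uncompressed figure (~120.93 Gb/s for the 12K), which is where the “two DP 1.4 cables” problem comes from.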

This means that a single DP 1.4 cable with DSC is enough for the Pimax Crystal. At the same time, the Pimax 12K will require two DP 1.4 cables with DSC to achieve its full resolution per eye at 90Hz (base numbers, not accounting for any other proprietary tech they use on top of DSC).

That being said, they may use the Tobii eye tracking along with NVIDIA VRSS 2.0 for foveated transport to push the refresh rate at the Pimax 12K’s max resolution per eye from 90Hz to 120Hz. This is nice for us with NVIDIA graphics cards (if implemented with a focus on high res and refresh rates…), but that doesn’t mean they shouldn’t implement support for AMD Radeon 7000 cards at 90Hz.

On the other hand, they may use foveated transport to try and barely squeeze the bandwidth on just one DP 1.4 cable to avoid the bulkiness and inconvenience of two DP 1.4 cables. In an interview with MRTV, the COO briefly mentioned that there are two prototypes: one with one DP 1.4 cable and another with two DP 1.4 cables. While neither situation is ideal, using one DP 1.4 cable is even worse than using two DP 1.4 cables. If they use only one cable, then foveated transport would be required to just barely achieve 90Hz at full resolution. Higher refresh rates will come at the cost of heavily reducing resolution (and FOV).

I think it’s a massive mistake not to go with a single HDMI 2.1 cable. They should take the time to do two very important things: add AMD Radeon 7000 GPU support and use HDMI 2.1 instead of DP 1.4.

P.S. Some other features of HDMI 2.1 that will improve the Pimax 12K even further if implemented:

  • HDMI Cable Power - Enables active HDMI cables to be powered directly from the HDMI connector without attaching a separate power cable. This would potentially future-proof the Pimax 12K, ditching the power supply so that only two cables (HDMI and USB) are required on supported graphics cards.
  • Variable Refresh Rate (VRR)
  • Quick Frame Transport (QFT)
  • Quick Media Switching (QMS)
5 Likes

That all…makes a lot of sense. Nvidia are very aggressive on the mindshare tactics.

1 Like

Isn’t that only with nVidia’s implementation of DSC? I thought AMD wasn’t at 3:1 but rather just 2:1. Looking into it further, though, I don’t see any sources for that; not sure where I got that idea from. I think it just came up in discussion, but I’m not actually sure it’s true.

I don’t think HDMI is on the table for any headsets going forward, not sure why, but I only ever see DP being mentioned.

This is already the case on current headsets. Has Pimax mentioned needing a power cable? Also, the new headsets have a built-in battery (which I would assume doesn’t drain during wired PCVR use, if required).

HDMI 2.1 is interesting; I’m guessing there must be some reason (beyond just licensing) that no one uses it in modern headsets.

1 Like

At a guess? Possibly because, with the Pimax HMDs at least, the audio goes over USB, perhaps freeing up display-cable bandwidth to be dedicated to visuals.

Not sure if other HMDs are using USB to transmit audio.

Whereas, IIRC, HDMI does audio, video, and has added networking?

DSC is a VESA standard, so I don’t see why they both wouldn’t have access to the 3:1 compression ratio.

The older Pimax models needed an external power adapter (I had a Pimax 5KX), but it looks like the new 8KX accomplishes that with a USB cable instead (one DP 1.4 and two USB 3.0). Regardless, that’s not the same thing; HDMI Cable Power would remove the need for that second USB cable. It’s not as big of a deal now that they don’t use external power adapters like back in the day, but it’s still an improvement.

High-end displays support HDMI 2.1, so the display chips exist. I would also like to know why it’s being ignored so far.

1 Like

AMD does not have half of the market. They have more like 20% of the market overall, and at the high end their market share may be even less. Perhaps that will change with the 7000 series GPUs. There’s a lot of speculation out there that nVidia is in trouble because of pricing and power connector issues and yada yada, and AMD is supposedly going to swoop in and take over. I don’t think things are actually very likely to turn out like that. In fact, Ada Lovelace is turning out to be so good with an unusually large generational improvement that nVidia’s near monopoly may actually strengthen further rather than weakening. But however things turn out, if it does go really well for AMD then realistically it would translate into gaining a few percentage points of market share, nothing like a massive jump to half the market share or more in a single generation.

I bring this up because it’s surely the primary driver behind the decisions of companies like Pimax and Varjo about AMD GPU support. This thread is contemplating the possible engineering issues that may be barriers to AMD support. But these are not impenetrable barriers and anyway the true technical barriers are probably internal details that we don’t know about from just the specifications.

The equation is how much market share does AMD GPU support represent versus how much effort does it take to overcome the technical hurdles of that support versus are there other tasks Pimax’s engineering could be working on which will have a more important impact for the expenditure of resources.

I have little doubt that Pimax will eventually support AMD GPUs again.

I think what’s happened is Pimax initially supported AMD, but when it came to the new chipset in the updated 8KX, it was going to require significant work to rewrite the support again. Pimax’s engineering focus will have been on the Reality series, and they likely wanted to divert as little as possible to having to rewrite the 8KX firmware over again. Supporting nVidia was required but AMD GPU support would likely have relatively little impact on keeping the 8KX alive while waiting for the Reality series launch. And so a lower priority.

Would it make sense for Pimax to delay the release of the Crystal or 12K in order to include support for AMD first? I don’t think so. That would be a very hard argument to make. But that’s really the question. Surely Pimax would like to support AMD GPUs. But what should they sacrifice to get such support sooner?

I think what we’re going to continue to see is firmware engineering resources at Pimax continuing to focus on the Reality series for a while. And then, once their more critical tasks are done at some point within the next few months, they will turn to AMD GPU support. And I think we’re likely to see AMD GPU support in the Reality series before we see it in the 2076 8KX.

1 Like

It’s also worth noting that so far we have had a single revision of one headset not work with AMD; other than that, they have been compatible across the board afaik, so AMD compatibility is the default for Pimax for the most part. I do think there might be some hurdles, or AMD support coming later, considering all the super-high-res headsets like the Aero, XTAL, Arpara, etc. don’t support AMD. That could just be down to diminishing returns or such.

Fingers crossed AMD does get supported though, I would love to get the 7900xtx, looks far more suitable for me personally than the 4090.

3 Likes

Pimax 12K could have one DisplayPort 1.4 cable with USB 4.0 :smiley::point_up:

Especially since, IIRC, aren’t the 7000 series GPUs coming with the newer DP standard?

For Varjo it makes sense to focus on NVIDIA, as their corporate customers likely use NVIDIA, and, like consoles, it’s easier to develop for a restricted hardware profile.

Pimax needs to be clear with those reserving a Crystal or buying current HMDs with the idea of the 12K trade-in.

As for Arpara, at present they look dead in the water, as even their KS page has been deactivated. Even the dead StarVR One focused on NVIDIA support; not sure if they had any AMD GPU support.

HTC did well with the VP2, supporting both.

By the way… AMD are apparently pitching the 7900XTX as a competitor to RTX4080 – not 4090, and seem to suggest that they are leaving it to partner card makers, to crank the clocks, in trying to get closer to the green side’s top end - likely at similar power draw costs; In terms of rasterisation at least – hardly getting anywhere near when it comes to raytracing…

This just makes me feel all the more that I’d probably be interested in something that uses more silicon, instead of higher frequencies.

The actual computation is apparently still done on a single monolithic chiplet, at the moment, with the six surrounding ones being all about memory; But even with that conservative approach, I think I’d be keen on at least a dual-GPU card (…or revival of SLI/Crossfire, if nothing else) , which finally fulfils the old promise of splitting the left and right eye rendering work between two GPUs, and letting them run relatively cold, rather than having a single one racing to manage both frames.

…if ONLY all the software would support it. sigh :7

1 Like

According to some channels like Moore’s Law Is Dead, AMD are holding back quite a bit and could release a crazy card like the 4090, but are likely waiting to see whether they should, based on sales etc. and the inevitable 4090 Ti. Could be total nonsense, though, but it has been said by quite a few.

Won’t touch RT performance of course but it could possibly match or surpass 4090 in rasterization if they went all out.

1 Like

It’s really easy to get the impression the players are taking turns taking small steps, waiting to see what the other guys do, before committing to take another one… :stuck_out_tongue:

MLID is not a good source and has been wrong about many things; he even deletes videos where he makes wrong predictions to hide the evidence. I would not remotely trust them as a primary source without other places verifying it.

3 Likes