What can we expect and wish for in a future Pimax 8K-X HMD?

Moore's law has ended; graphical improvements each generation are minimal while using the same process nodes and fabrication methods.

That's why ray tracing, chiplets, and 3D-stacked Intel processors are coming. My prediction is that the new tech may not offer backwards compatibility, so the 1080/2080 Ti series may end up being the last and best GPUs for rasterised games. Future tech will have to be better, but it will only work better on new games coded to take advantage of the new architecture. (Older games were coded to take advantage of monolithic dies and one to two cores, not chiplet or multi-GPU architectures, and we've reached the limits of what we can produce and improve with a monolithic die.)

With current improvement gains of around 10-20% each generation, the hope of more than doubling the pixel count to native 4K x2 at a stable 90 Hz with no tricks seems like an unrealistic pipe dream, at least for older games in my opinion. At roughly 10% improvement per generation, now that Moore's law has ended and returns are diminishing, it could take on the order of 15-20 years to get twice the performance of the 1080 Ti.
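
As a rough sanity check on that timeline, here is a minimal sketch; the fixed per-generation gain and the two-year release cadence are my own assumptions, not published figures:

```cpp
#include <cmath>
#include <cstdio>

// Rough estimate: how many GPU generations (and years) until performance
// doubles, given a fixed relative improvement per generation. The gains
// and the two-year cadence are assumptions for illustration only.
int main() {
    const double years_per_generation = 2.0; // assumed release cadence
    for (double gain : {0.10, 0.20}) {
        double generations = std::log(2.0) / std::log(1.0 + gain);
        std::printf("%2.0f%% per generation: %.1f generations, ~%.0f years to double\n",
                    gain * 100.0, generations, generations * years_per_generation);
    }
    return 0;
}
// Output:
// 10% per generation: 7.3 generations, ~15 years to double
// 20% per generation: 3.8 generations, ~8 years to double
```

At a steady 10% per generation, doubling indeed takes well over a decade, which is the order of magnitude claimed above.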

The fact that the 1660 Ti is even a product proves my point: Nvidia is now selling less powerful tech.

2 Likes

Indeed, the original concept of the 8K-X was the result of backer requests during the Kickstarter sign-up phase. Pimax advised that native input wouldn't make sense due to the excessive GPU demand, but they then agreed to offer a limited number of units for those lunatics among the backers pressing them for such a device.

However, as evidenced by Kevin's recent interview on MRTV's channel, Pimax have changed their attitude towards the 8K-X in the meantime, and will likely not just produce the 400 units and then forget about it; it is supposed to become their flagship product. I suppose this is partly because a year has passed and the next generation of GPUs will be around the corner by the time they eventually release the 8K-X, and partly because Brainwarp with Smart Smoothing, lower refresh rates, etc. does seem to work decently, which may make a native 2x4K feed less unrealistic than it seemed two years ago. And of course the 8K did not quite live up to the standard one had hoped for, beaten on clarity by the lower-res 5K+.

So there are a number of factors giving Pimax good reasons to make the 8K-X the high-end device in their commercial portfolio. However, this means it is supposed to be commercially viable, i.e. it should address not only a very small circle of enthusiasts but ideally as big a market as you can get for this kind of device.
So while for a limited 400-unit run they could easily have said that whoever backed the device will surely get a VirtualLink-enabled graphics card, they may no longer feel comfortable excluding all users with older graphics cards, if with some neat Brainwarp tricks the headset could even make sense, say for certain applications with slightly lower GPU demands, to owners of a 1080 Ti or of RTX cards which do not feature the new connector. Keeping your product accessible is an important consideration if you plan to sell it in numbers. The question is, will it really make sense for owners of graphics cards without this connector? Won't they upgrade the graphics card soon anyhow if they go for Pimax's expensive flagship HMD?

3 Likes

I'm well aware of Moore's law and of ray tracing being the future. However, that doesn't change the fact that they talked about the 8KX not being for gaming and being primarily for desktop use until GPUs catch up.

2 Likes

The best CPU and GPU technology out today may be about as future-proofed as it can be for older games, which take advantage of older architectures. Newer tech will likely only show improvements on newer architectures, and I believe we will see a dip in performance in older games that can't take advantage of chiplet or multi-CPU/multi-GPU designs. Newer cards will probably never be as good at rendering older games, because a multi-GPU design is weaker in single-core performance; they will only benefit from newer games that support multicore architectures.

2 Likes

And the problem with what they are trying to do is exactly what you just said. Making the best anything on the market while trying to make it a mass-market item is the very definition of a dichotomy. That's why different products exist: the 5K+ for low-end VR enthusiasts and the 8KX for the extremists.

1 Like

@anon74848233 @PimaxUSA The 8KX needs to wait for a single-display-cable solution, because two cables are far messier and can cause more screen problems.

1 Like

We're going to need tricks like foveated rendering and Brainwarp to get the performance improvements required. I don't think it's possible with raw GPU improvements, at least in games coded for older tech.
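
To put a number on why foveation is such a big lever, here is a minimal sketch; the fovea size and the peripheral downscale factor are arbitrary assumptions of mine, not Pimax's figures:

```cpp
#include <cstdio>

// Rough pixel savings from fixed foveated rendering: a central region is
// kept at full resolution while the periphery is rendered at reduced
// resolution. Both parameters below are illustrative assumptions.
int main() {
    const double fovea_area_fraction = 0.25; // assumed: central 25% of the image
    const double periphery_scale = 0.5;      // assumed: half resolution per axis
    double pixels_shaded = fovea_area_fraction
        + (1.0 - fovea_area_fraction) * periphery_scale * periphery_scale;
    std::printf("Pixels actually shaded: %.0f%% (a %.0f%% saving)\n",
                pixels_shaded * 100.0, (1.0 - pixels_shaded) * 100.0);
    return 0;
}
// Output: Pixels actually shaded: 44% (a 56% saving)
```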

1 Like

Well, in truth the 8kX was originally meant to be the 8k. But after realizing they couldn't do 4K native per eye due to bridge-chip and GPU limitations, they had to go with an upscaled solution like the 4k. Due to the community's request, a limited run known as the 8kX came to be.

Though it would seem that, thanks to advances like Analogix's new bridge chips with built-in flex scalers coming out this year, the 8kX, though still needing dual video inputs, will no doubt be king.

4 Likes

In terms of older games and newer tech, it will depend on how the new tech is implemented. Think of emulators: the PS3 was, due to its Cell processor, difficult to emulate on a PC. Now, not so much. But you're right that an API front-end layer will likely be needed for older games to be able to take advantage of newer architectures.

1 Like

As if they could manufacture GDDR6 for computers XD

2 Likes

Isn't HBM different from DDR?

1 Like

Put simply, GDDR RAM is designed to work with graphics workloads generated by (a) GPU(s) and DDR RAM is designed to work with computing workloads generated by (a) CPU(s).

GDDR is optimized for bandwidth, while DDR is optimized for latency.

3 Likes

But HBM is supposed to replace GDDR memory. Even Nvidia uses it in their non-consumer cards. I recall reading that GDDR was supposed to be at its limits back with GDDR5.

1 Like

Well, all RTX cards are using GDDR6 ^^

1: GDDR5 and HBM1 are both VRAM.

HBM: higher bandwidth (4 times that of GDDR), smaller form factor thanks to multiple stacked layers, and lower voltage (around 10% less).

GDDR: higher clock speed (around 7 times), widely available and less expensive because it has been in use for decades.
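
To put rough numbers on the bandwidth side, here is a minimal sketch; the specific parts and per-pin rates below are illustrative assumptions, not the specs of any particular card:

```cpp
#include <cstdio>

// Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
// The example parts (dual-channel DDR4-3200, a 256-bit GDDR6 card at
// 14 Gbps, one HBM2 stack at 2 Gbps) are illustrative assumptions.
struct Memory {
    const char* name;
    int bus_width_bits;   // total interface width
    double gbps_per_pin;  // per-pin transfer rate
};

int main() {
    const Memory parts[] = {
        {"DDR4-3200, dual channel", 128, 3.2},
        {"GDDR6 @ 14 Gbps, 256-bit card", 256, 14.0},
        {"HBM2 @ 2 Gbps, one 1024-bit stack", 1024, 2.0},
    };
    for (const Memory& m : parts) {
        double gbytes_per_s = m.bus_width_bits / 8.0 * m.gbps_per_pin;
        std::printf("%-35s %6.1f GB/s\n", m.name, gbytes_per_s);
    }
    return 0;
}
// Output:
// DDR4-3200, dual channel               51.2 GB/s
// GDDR6 @ 14 Gbps, 256-bit card        448.0 GB/s
// HBM2 @ 2 Gbps, one 1024-bit stack    256.0 GB/s
```

HBM's advantage is bandwidth per clock and per watt via a very wide bus; a wide, fast GDDR6 bus can still reach higher totals, which is part of why the two coexist.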

2 Likes

Isn't HBM2 even higher? As far as I know, Nvidia only uses GDDR for its consumer cards.

1 Like

A year ago it was already being said that HBM3 and HBM4 would make GDDR memory obsolete, but I have not seen that happen yet.

1 Like

Wanna know what the 8K-X will be like?

Get the 8K and run it in 64 Hz mode. That will give you exactly what the 8K-X will give, just at a slightly lower refresh rate.

I was wondering why the headset had to reboot when changing the Hz in PiTool, and the only reason I can think of, after recent testing in 64 Hz mode (which I had ignored earlier), is that it switches the scaler chip's mode. I am under the impression that the scaler chip is bypassed when running at the lowest Hz, because there is no longer a bandwidth issue with DP 1.4, and because even the Pimax logo screen viewed through the headset, let alone VR titles, shows much better pixel definition, colour and contrast.
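
For reference, a back-of-the-envelope check of that bandwidth argument; this sketch assumes 24 bits per pixel and ignores blanking intervals and any compression, and uses DP 1.4's roughly 25.9 Gbit/s usable HBR3 payload as the ceiling:

```cpp
#include <cstdio>

// Uncompressed video bandwidth: width * height * eyes * refresh * bpp.
// Assumes 24 bpp and ignores blanking/compression -- my simplifications.
double gbit_per_s(int w, int h, int eyes, double hz, int bpp = 24) {
    return double(w) * h * eyes * hz * bpp / 1e9;
}

int main() {
    const double dp14_payload = 25.92; // usable HBR3 payload, Gbit/s
    std::printf("8K input,  2x2560x1440 @ 80 Hz: %5.1f Gbit/s\n",
                gbit_per_s(2560, 1440, 2, 80));
    std::printf("Native 4K, 2x3840x2160 @ 64 Hz: %5.1f Gbit/s\n",
                gbit_per_s(3840, 2160, 2, 64));
    std::printf("Native 4K, 2x3840x2160 @ 90 Hz: %5.1f Gbit/s\n",
                gbit_per_s(3840, 2160, 2, 90));
    std::printf("DP 1.4 payload:                 %5.1f Gbit/s\n", dp14_payload);
    return 0;
}
// Output:
// 8K input,  2x2560x1440 @ 80 Hz:  14.2 Gbit/s
// Native 4K, 2x3840x2160 @ 64 Hz:  25.5 Gbit/s
// Native 4K, 2x3840x2160 @ 90 Hz:  35.8 Gbit/s
// DP 1.4 payload:                  25.9 Gbit/s
```

On these simplified numbers, a native dual-4K feed at 64 Hz would sit just under the DP 1.4 payload while 90 Hz clearly would not, so the bandwidth intuition is at least plausible, even if the scaler itself stays in the path (see the reply below).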

The only real issue to be concerned about with the 8K at the moment is the front plastic housing. The head strap isn't that bad, the foam, although it could be improved slightly, isn't too bad, and side distortion is better without padding it up.

Anyway, hopefully before mid-year, lighthouses and controllers will be available and Pimax will have complete bundles for sale and ready to ship. As for down the track: by year's end or next year there will be good alternative panels that can be used for an updated version, and also higher-bandwidth single-cable solutions. No, I am not willing to have wireless at the required speeds strapped to my head :partying_face::joy:

1 Like

Unfortunately not. Pimax has stated that the 8K scaler chip can NOT be bypassed. Running at a lower refresh rate would allow you to increase the super-sampling amount, assuming you’ve edited the SteamVR config to allow rendering at more than 4096 pixels, but the maximum input res of the 8K is the same as the 5K+: 2560x1440.
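
For anyone wondering how much supersampling headroom a lower refresh rate actually buys, here is a minimal sketch; it assumes GPU cost scales roughly linearly with pixels rendered, which is a simplification:

```cpp
#include <cstdio>

// Frame-time budget at a given refresh rate, and the extra pixel headroom
// gained by lowering it -- assuming GPU cost scales linearly with pixels.
int main() {
    const double high_hz = 90.0, low_hz = 64.0;
    std::printf("Budget at %.0f Hz: %.1f ms; at %.0f Hz: %.1f ms\n",
                high_hz, 1000.0 / high_hz, low_hz, 1000.0 / low_hz);
    std::printf("Roughly %.0f%% more pixels affordable at %.0f Hz\n",
                (high_hz / low_hz - 1.0) * 100.0, low_hz);
    return 0;
}
// Output:
// Budget at 90 Hz: 11.1 ms; at 64 Hz: 15.6 ms
// Roughly 41% more pixels affordable at 64 Hz
```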

2 Likes

“Unfortunately not. Pimax has stated that the 8K scaler chip can NOT be bypassed. Running at a lower refresh rate would allow you to increase the super-sampling amount, assuming you’ve edited the SteamVR config to allow rendering at more than 4096 pixels, but the maximum input res of the 8K is the same as the 5K+: 2560x1440.”

Not that the scaler chip can be bypassed, but its functioning can be turned off. I am well aware of being able to run higher supersampling at lower Hz, but what I am finding is much better image quality at the same or lower supersampling compared to what I ran at 80 Hz.

PiTool render settings and SteamVR settings were kept the same.
There is a lot that we do not know about the Pimax headsets. We are told that the 8K panels are 4K per eye, but we don't know whether that is 3840 × 2160 per eye or 4096 × 2160.

It is hard to deduce from the per-eye render target information in PiTool and SteamVR. I might have to purchase fpsVR and check that out. The other question is how many of the pixels are being utilised by the headset and what is left in reserve for IPD image-adjustment shifting and FOV changes.
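
For what it's worth, the recommended per-eye render target can also be queried directly through the OpenVR API without buying anything; a minimal sketch (note it reports the compositor's recommended render resolution, which already includes PiTool/SteamVR scaling rather than the raw panel resolution, and SteamVR must be running):

```cpp
#include <openvr.h>
#include <cstdio>

// Ask SteamVR for the recommended per-eye render target size. This
// reflects the current supersampling settings, not the physical panels.
int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* system = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) {
        std::fprintf(stderr, "OpenVR init failed: %s\n",
                     vr::VR_GetVRInitErrorAsEnglishDescription(err));
        return 1;
    }
    uint32_t width = 0, height = 0;
    system->GetRecommendedRenderTargetSize(&width, &height);
    std::printf("Recommended render target per eye: %u x %u\n", width, height);
    vr::VR_Shutdown();
    return 0;
}
```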

I have thus also decided to give up on SteamVR adjustments, leave them at 100% and 100%, and make any changes within PiTool's render adjustment itself. Anything under PiTool 1.0 gets some AA applied; above that, I turn AA off.

Some decent documentation on what the headsets are doing with what they have would be greatly appreciated.

2 Likes