The 8K-X concept

I have to say that while I am an 8K(X) backer, I don’t expect it to be the device I use all the time, but rather a special-purpose device. It may also be fine for other use cases without a built-in upscaler if the upscaling happens on the PC side, but that demands additional GPU resources, so it may perform a bit worse, with less aliasing as a bonus.

And you say you expect to get a StarVR or XTal for 600,- in one year? Or, if you assume retail pricing, for 1,000? Sorry, no chance. The prices won’t drop significantly, and they would still be beyond competing prices anyhow (assuming the current expectations of very high prices are correct).

And I saw your comment on adding the GPU cost to the equation: the StarVR would be a competitor to the 5K+/8K because it will probably offer a comparable experience in terms of visual quality (I don’t remember it in detail, tbh), but with OLED blacks & colors and without some of the quirks of the Pimax headsets, though likely at an incredibly steep price. But it will not compete with native 2 x 4K for those interested in a major bump in resolution, so what’s the point of mentioning it in a context where it is all about increasing resolution beyond the 5K+/8K level?

So will the 8K(X) (as currently specified) be the better daily-use headset? No, probably not: it will be ahead of its time resolution-wise and put an almost insatiable demand on the GPU, one you cannot satisfy for many use cases. And it will share many of the issues of the 5K+/8K headsets.

Unless, of course, Pimax rethinks the whole approach to the 8K(X) and takes it in a new direction, either by including an upscaler that can be toggled on or off (an ‘8K+’, say), or by actually making foveated rendering or something comparable work. But the latter would require many other factors to fall into place that are entirely outside of Pimax’s control, so it doesn’t seem realistic to expect it to happen.
On the 400-unit argument: that isn’t really relevant. It will only remain a 400-unit product if it sucks; otherwise it will serve as the blueprint for their next commercial product.

2 Likes

Exactly

The GPU can use much more advanced upscaling algorithms and can probably do it with lower latency, too.

It will have optional software upscaling, same as all HMDs and all monitors.

If you play League of Legends or something, you can choose in the game settings to use 720p even if you have a 1080p monitor.
The game will be drawn at 720p and then upscaled to fit the monitor.

For the 8K-X you could do the same thing (or not, as you prefer).

The problem with the 8K is that we render the game at something like 3K (or more), then downsample it to 1440p (this is what SS means) so that it can be transmitted over the cable, and then the chip in the HMD upscales it to 4K.
This is obviously not as good as rendering the game at 3K and then upscaling it directly to 4K.
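
To put rough numbers on that chain, here is a back-of-the-envelope per-eye pixel count (the 1.4x supersampling factor is my own assumption for illustration, not an official figure):

```python
# Rough per-eye pixel counts for the two scaling chains described above.
# The 1.4x-per-axis supersampling factor is an assumption for illustration.

def mpix(w, h):
    return w * h / 1e6  # megapixels

ss = 1.4                                              # assumed SS scale per axis
render_w, render_h = int(2560 * ss), int(1440 * ss)   # roughly the "3K" render target

render = mpix(render_w, render_h)   # what the GPU actually draws
cable  = mpix(2560, 1440)           # 8K: downsampled to fit the cable
panel  = mpix(3840, 2160)           # native 4K panel

print(f"8K  : render {render:.1f} MP -> cable {cable:.1f} MP -> HMD upscales to {panel:.1f} MP")
print(f"8K-X: render {render:.1f} MP -> GPU upscales to {panel:.1f} MP -> cable {panel:.1f} MP")
```

Every 8K frame is squeezed through the ~3.7 MP cable bottleneck before the scaler blows it back up; on a native path the full render would feed the upscale directly.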

2 Likes

Simple truth: even the new RTX cards are only advertising 4K gaming.

Dual input won’t change that; it will just make bandwidth more stable. The Analogix 38/39 chips coming in 2019, with an integrated flex scaler, will be needed, as I believe that is closer to a desktop monitor’s scaler.

1 Like

Np, it can run on a single cable, or dual cables for the StarVR One.

2 inputs, yes, as the v2 had 2 HDMI. But mGPU support is still sparse.

I think too many people get defensive in debates, resorting to calling things a strawman argument when examples are presented.

If they use Analogix’s new chips it will be native & have a built-in flex scaler.

2 Likes

Yes, you would sacrifice software compatibility.

StarVR seem to think it is possible, as they mention an SLI mode in the feature list on their website. But nobody knows how that works yet; they do have an SDK that needs to be compiled in to support their unique rendering, I think. It is not an automatic feature of GPU drivers, that’s for sure. NVIDIA have an SDK that supports multi-GPU VR too, but none of this is native in game engines (yet).

I think the problem is harder than it sounds when you need synchronized low latency. All the existing multi-GPU work (GPU rendering, stitching, compute, etc.) just runs as fast as it can without much care for low-latency synced timings (an effect similar to VSync on/off). Maybe this is why Brainwarp development is still ongoing too, as anything rendered per eye needs to be timed correctly or the result will be instant nausea.
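
To make the timing requirement concrete, here is a minimal sketch, with two Python threads standing in for two GPUs (this is not StarVR’s or NVIDIA’s actual API, just the rendezvous idea):

```python
import threading
import time

# Two worker threads stand in for two GPUs, one per eye. The barrier forces
# both halves of a stereo pair to finish before either is "presented", so
# neither eye can ever show an older frame than the other.
present_barrier = threading.Barrier(2)

def render_eye(eye: str, frame_time: float) -> None:
    for frame in range(5):
        time.sleep(frame_time)   # pretend-render; real GPUs won't be identical
        present_barrier.wait()   # rendezvous with the other eye
        print(f"frame {frame}: {eye} eye presented")

left = threading.Thread(target=render_eye, args=("left", 0.011))
right = threading.Thread(target=render_eye, args=("right", 0.013))
left.start(); right.start()
left.join(); right.join()
```

The catch is visible even in this toy: the pair can only run at the speed of the slower GPU, which is exactly the frame-pacing problem a driver would have to solve at 90 Hz.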

Raw Data developers wrote this:

Please realize that because mGPU support requires changing the graphics pipeline, including shaders, even when SLI/Crossfire is not used, those changes are still in the game lying dormant. As a result, these changes can (and right now do) affect the stability, performance and visuals of the game even with one card. This is all further complicated by the fact that there are different LiquidVR and GameWorks implementations, which can have undesired consequences/complicate merging, and on top of that different features in GameWorks alone can have unintended consequences for other GameWorks features on some specific things in the game. We are actively working with nVidia and AMD to resolve these issues, however it will take time, as we will only release it if we are sure that we can guarantee a positive experience for all our players.

So you can see this puts a lot of work on the developer, which is why game-engine support is rare.

1 Like

It doesn’t need per-game support: in the early days of 4K monitors they needed 2 cables too, and NVIDIA supported it just fine via drivers.

That’s not the same thing; it was due to I/O bandwidth. Splitting a 4K image down the middle and outputting it via two DP ports to one monitor from one card has far fewer timing issues than trying to drive two displays from two cards and still maintain sync at 90 Hz.
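
A rough raw-bandwidth calculation (24-bit pixels, ignoring blanking and link overhead, so these are lower bounds) shows why the split was an I/O workaround:

```python
# Raw pixel data rates in Gbit/s at 24 bits per pixel; blanking and link
# overhead are ignored, so treat these figures as lower bounds.
def gbps(w: int, h: int, hz: int, bpp: int = 24) -> float:
    return w * h * hz * bpp / 1e9

print(f"4K monitor @ 60 Hz  : {gbps(3840, 2160, 60):.1f} Gbit/s")
print(f"  per half (2 links): {gbps(1920, 2160, 60):.1f} Gbit/s")
print(f"2x 4K HMD @ 90 Hz   : {gbps(2 * 3840, 2160, 90):.1f} Gbit/s")
print(f"  per eye (2 links) : {gbps(3840, 2160, 90):.1f} Gbit/s")
```

Each early link simply took half the panel’s pixels; one card still rendered one frame, so there was nothing to keep in sync across GPUs.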

1 Like

It is the same thing: the early 4K monitors had 2 controllers on their end, and each controller drove half the panel. It was effectively like having 2 panels in one casing, with a DP controller for each.

I’m not talking about SLI, I am talking about using a single GPU.

We are talking about multi-GPU VR here, not a single card splitting a signal.

1 Like

FRC didn’t mention multi-GPU, that was just you. I didn’t reply to you; I was just saying that, in general, SteamVR / games don’t need to support 2 DP connections to one device, because it’s been done before.

1 Like

I wonder how things will stay synced given the silicon lottery, thermal protection, and power protection. Without some sort of sync, even identical cards will run at slightly different framerates. It’d be weird if your left eye showed a slightly older frame than your right eye.
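
A quick sketch of that drift, with made-up frame times (the 0.2 ms gap between cards is an arbitrary assumption):

```python
# Two "identical" GPUs with slightly different sustained frame times
# (numbers invented for illustration). Without a sync point, the frame
# counters drift apart and one eye ends up showing an older frame.
frame_time_left = 11.1e-3    # seconds per frame, GPU A
frame_time_right = 11.3e-3   # seconds per frame, GPU B (silicon lottery)

for t in (1, 10, 60):        # seconds of play
    lag = t / frame_time_left - t / frame_time_right
    print(f"after {t:3d} s: right eye is {lag:.1f} frames behind the left")
```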

2 Likes

I think he was implying it. But maybe you are right. @Frc?

1 Like

It’s been done before, true. But that was also before we “improved” DX and Vulkan with low-level GPU access. You just can’t do as much in the drivers as you used to.

1 Like

Monitor support has nothing to do with the Vulkan or DX version; as far as the game cares, it just needs to know what resolution to render at.

Sorry, I was stuck thinking about SLI.

1 Like

I backed the 8K-X for a couple of reasons. And please don’t judge with the knowledge we have right now, but think about the situation as it was while the Kickstarter campaign was running.

  1. I thought it’s always better to drive a display at its native resolution; otherwise, as you can test with your own display, the picture gets blurry. Sure, there are scaler chips which are good, but none creates a perfect image. So the 8K was ruled out for this reason alone.
  2. It was clear from the beginning that it would need 2 cables. And honestly, why should I care, as long as it works? And why isn’t this progress? Does Pimax have to invent a new transfer method just to use a single cable? For what reason? It is enough that they bring out a headset with this resolution to pressure others into making an even better one. But I laughed a little at the plug/unplug statement; this is definitely my least concern.
  3. About wireless, I share your thoughts. I cannot imagine a way to transfer that amount of raw data to the headset wirelessly (a rough estimate follows after this list). But could I have imagined 50/60(?) GHz WLAN 2-3 years ago? We simply don’t know what the future will bring, be it a higher frequency, very good compression, or whatever. We will just see. At the time I backed, I didn’t think of a wireless connection as a feasible thing and accepted that. But I didn’t expect it for the 5K and 8K either.
  4. Apart from some rare exceptions (< 10 games), we can’t make use of SLI in VR games. It might then be only 30 Hz even with a 2080 Ti. But who said the games have to be rendered at maximum detail? And who said the 8K-X can’t be used for other things, like development? Virtual Desktop? Movies, maybe? For these use cases, 90 Hz should be achievable.
  5. I am not sure StarVR or XTal were realistic choices while the Kickstarter campaign ran. And even if they were, they weren’t actually on sale, nor were the prices known. Also, I didn’t want to wait for a headset any longer, and bet on Pimax.
  6. About the ‘broad’ market support: the wide FOV of the 8K doesn’t have it either. And the 8K-X and 8K should not differ from the perspective of a game developer. It has a higher resolution, but that’s it. There is no other difference, and don’t tell me ‘but the dual cable’. Yeah, in a perfect world that is handled by the driver.
  7. About colors and blacks: well, sure, OLED is better in terms of colors and blacks. And still, I sit in front of an LCD for my daily work and gaming sessions on my PC. LCD is okay, not perfect, but when was perfection necessary? Also, OLED isn’t Jesus, bringing us salvation in every aspect: it’s more expensive, depending on your usage it can ‘burn in’ pixels, and apparently at the time of the campaign the needed display size wasn’t available, or just not affordable.
  8. I am not sure what you want to say with ‘New LCD required actual 8K panels not desirable’.
  9. The aberration and distortion apply to the 8K and 5K+ as well; aren’t we talking about the 8K-X? The same goes for using the Normal FOV.
  10. I really hate SDE, so my biggest request is to have as little of it as possible.
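
On point 3, a back-of-the-envelope estimate of the wireless gap (raw 24-bit pixels, no blanking or compression; the ~4.6 Gbit/s figure is the commonly quoted 802.11ad peak rate, taken as an assumption here):

```python
# Raw per-second video data for a dual-4K HMD vs. a 60 GHz WiGig link.
# 24-bit pixels, no blanking, no compression; ~4.6 Gbit/s is the commonly
# quoted 802.11ad peak rate (an assumption, not a measurement).
hmd_gbps = 2 * 3840 * 2160 * 90 * 24 / 1e9   # two 4K panels at 90 Hz
wigig_gbps = 4.6

print(f"raw HMD stream    : {hmd_gbps:.1f} Gbit/s")
print(f"802.11ad peak     : {wigig_gbps:.1f} Gbit/s")
print(f"compression needed: ~{hmd_gbps / wigig_gbps:.0f}x")
```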

In the end, I bet that Pimax will create an 8K-X, that it will have an acceptable level of SDE, and that it ‘might’, just might, be usable as a monitor replacement. I will see in a few months whether I lose my bet. And even if I do, the fact that only 400 pieces of this headset exist eases my mind: as long as it isn’t a complete disaster, even if it doesn’t fit my use case I should be able to sell it on with little to no loss.

5 Likes