8K-X and RTX 4080: Slight image lag when moving the head

That’s probably because there’s a lot of variability involved both in how much latency is present in any particular case and how sensitive people are to it.

I think something to understand here is that there has always been latency and there always will be. It will never be 0 ns. The goal of all VR headsets is to keep latency below human perception, but that threshold of perception varies from person to person.

nVidia made some sort of change in the 40 series which has had the effect of increasing latency by some amount in some scenarios. Whatever is going on there seems to be amplified by monitoring software like MSI Afterburner and fpsVR. In some cases this can reportedly also cause flickering, which I might guess could be some kind of buffer underrun related to the latency.

It’s hard to isolate what specific scenarios it happens in because it does seem related to specific hardware, firmware versions (not just on the VR headset and GPU but possibly also motherboard components), etc. And in many cases the latency is increased to a range where it is right around the threshold of perception and users aren’t sure whether it’s really there or not.

There are a lot of factors here, which makes this a difficult problem for the companies involved to track down and solve. Early on it was questionable whether the problem even existed. By now it’s been well established that it does exist, but its parameters are not well understood, at least not by the public.


I need videos.

I have noticed some flickering that I didn’t see with my 3090, but it isn’t too noticeable right now. The pure joy of having a 4090 powering the 8KX like it does on my system overshadows that right now :joy:


I generally agree with you.
But as you said: a latency of 0 is neither physically possible nor necessary, as long as it is low enough that it cannot be perceived.

This has been the case for me with a lot of graphics cards, Pimax HMDs, CPUs and mainboards over time:

  • i5-3550, Radeon Fury & Pimax 4K
  • i5-3550, NVidia 1080 & Pimax 4K
  • Ryzen 2600X, NVidia 1080 & Pimax 5K+
  • Ryzen 2600X, NVidia 2080 Ti & Pimax 5K+
  • Ryzen 2600X, NVidia 2080 Ti & Pimax 8K-X
  • Ryzen 5600X, NVidia 3080 & Pimax 8K-X
    (just can’t remember all the mainboards right now, but there have been at least three as well)

… and I always had MSI Afterburner running in the background.

So, it is certainly something related to the 40 series specifically.

Unfortunately, VR is still a niche, and enthusiast HMDs such as those from Pimax even more so.
That doesn’t make things easier compared to a broader spectrum of mass-market hardware.

Is it related to Pimax headsets only? Or do HP Reverbs, HTC Vives and Mixed Reality HMDs suffer from this as well?

Vive Pro 2, some G2 and Index too


Can We Get Acknowledgement of the Pimax Specific RTX 4090 Latency / Lag Issue please? - OpenMR | Community

I may have found a helpful solution


Bro, turn off Afterburner!! It makes the problem 10x better, although it doesn’t fix it outright.

Well, I certainly could :+1:t3:
But I don’t really want to:

I am using Afterburner to limit the power target:
Although the 4080 draws much less power than my former 3080, and although I haven’t seen loads higher than ~280-320 W (under 3DMark), the 4080 is generally able to draw up to 420 W. But I “only” have a six-year-old 750 W power supply. As long as the 4080 doesn’t peak above 350 W (and so far it stays far below that), I don’t have to worry. But no one knows whether there won’t be single peaks >400 W now and then - and I neither want my power supply to get fried, nor do I want to invest 200€ in a new, more powerful one if it isn’t really needed. And judging by normal gaming power consumption, it isn’t - not even with a 4090, if adjusted via Afterburner.

Last but not least, I’m not yet set on the 4080: I will return it first, go back to my 3080, wait for the 7900 XTX next week and test that one, too, before making my final decision. It might even be that I’ll have to pay through the nose and go for a 4090 in the end… :see_no_evil: although hopefully not :wink:

And finally: during my recent benchmark test drives in ATS, I noticed that I’m getting somewhat used to the issue. It’s still neither comfortable nor satisfying, but a little less annoying than during my first sessions.

:slight_smile:

The 4090 has the same issue so that won’t solve it. And AMD cards are incompatible with the new 8kx and all upcoming pissmax headsets.

Speaking of which, I’m powering my 4090 with an 850 W PSU; you will have plenty of headroom if you’re on an AMD CPU.

Good luck anyhow whatever choice you make.

I’m testing the different graphics cards for performance reasons: at these prices, I simply want to find the GPU that’s optimal for me personally without spending 2000€ or more if possible.
If, in the end, the 4090 is the only GPU that satisfies me - well, I’ll have to deal with that somehow. As well as with the lag issue we’re talking about, of course.

Why do you think the 7900 XTX won’t work with my 8K-X?
Because there’s no confirmation from either Pimax or AMD yet?

:slight_smile:

We already know it doesn’t work with the revised 8KX or any future headsets; Pimax have confirmed it a few times.

I don’t have a revised one; mine is from the first batch, Nov/Dec '20.

Then you’ll be fine with it but unable to upgrade afaik.

I don’t think we know that either model of 8KX will work with a 7900 XTX. The earlier model 8KX does work with AMD, but whether or not there are compatibility problems with the new AMD GPUs and whether Pimax will address any such problems that appear is not a known quantity.

Okay, thanks for that hint :+1:t3:

But since I haven’t followed that issue, just briefly:
What has changed so that AMD GPUs no longer work with Pimax headsets? And, as far as I’ve understood, not with some other headsets either?

I know that VR is still a niche - but I wasn’t aware of just how much of a niche it still is…! :dizzy_face:

Well, I don’t even know whether upgrading my HMD to a 12K would make sense for VR simulation anyway:
AFAIK, the 4090 is the only GPU so far that is able to maintain stable 75 FPS “everywhere”, or perhaps even 90 FPS in some simulations, on the 8K-X.

=> So: why should I buy a 12K for approx. 2,000€ +/- if there won’t be any GPU out there in the coming years that is able to fully “fuel” it…?
And: even if there were such a GPU, I’d probably first use the additional headroom for supersampling on the 8K-X :wink:

Not that I wouldn’t want a 12K :smiley: :heart_eyes:

But I think GPUs, at least, are still much too weak to fully benefit from its advantages over my “old” 8K-X. Unfortunately :disappointed_relieved:

Well, I think in the worst case we’d be able to downsample the rendering with a 4090 and still get a much better picture than on the 8KX.

  • 6K panels will produce better display quality than 4K panels at the same rendering resolution. It’s like oversampling that you get all the time without a GPU performance hit.
  • The 12K has eye tracking and foveated rendering, which are integral to its design. This reduces the load on the GPU by ~40% without perceived reduction in display quality (see the sketch below for where a number like that can come from).
  • Better lenses will improve the display quality over the 8KX without any increase in GPU load.

The result of all of this is that the 12K converts GPU oomph into display quality with substantially higher efficiency than the 8KX. For any given GPU and game, the 12K will look substantially better at the same frame rate.
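To give a rough feel for where that ~40% could come from, here’s a toy estimate - my own simplification, not Pimax’s published methodology: render a central region at full resolution and the periphery at half resolution per axis, i.e. a quarter of the pixels.

```python
# Toy model of foveated-rendering savings (my assumption, not Pimax's
# published method): the periphery is rendered at half resolution per
# axis, i.e. 1/4 of the pixel count of a full-resolution render.
def relative_cost(center_fraction: float) -> float:
    """Fraction of full-resolution pixel work, given the share of the
    image area kept at full resolution."""
    periphery = 1.0 - center_fraction
    return center_fraction + periphery / 4.0

for f in (0.25, 0.35, 0.45):
    print(f"center {f:.0%} at full res -> ~{1 - relative_cost(f):.0%} fewer pixels to shade")
```

A full-resolution region covering roughly a third to half of the image already lands in the ballpark of that ~40% figure, and eye tracking lets that region stay small because it only ever needs to cover where you are actually looking.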

Whether or not the GPU can “max out” the VR headset is actually kind of immaterial. It’s like saying “These high-performance tires are rated for 200 mph, but my sports car can only reach 140 mph at most, so they’re a waste to buy!” Except that those tires provide increased grip at all speeds, which is the actual point.

Keep in mind that those 6k are spread out over a larger FOV than the 4k, so higher effective resolution is not a guarantee…

It is 1.5 times the resolution (6k/4k) and 1.33 times the claimed per-eye FOV (160-ish/120-ish, maximums), so the former still outgrows the latter, but one needs to take into account that the wider the FOV is, the more pixels are typically wasted out in the periphery, where the tangent of the angle avalanches.

This concern may very well not be an issue; it all comes down to how the HMD optics warp the image. Hopefully the new lenses do a really good job of concentrating display pixels toward the front and tapering the resolution off toward the edges.
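To put a quick number on that “avalanche”, here is a back-of-the-envelope sketch. It assumes a single flat, rectilinear render target (real wide-FOV headsets use canted projections and lens distortion, which is exactly why the optics matter):

```python
import math

# With a flat (rectilinear) projection, horizontal position on the
# render target grows as tan(theta). The share of pixels covering the
# central +/-inner degrees of a +/-outer degree view is therefore:
def central_pixel_share(inner_half_deg: float, outer_half_deg: float) -> float:
    return math.tan(math.radians(inner_half_deg)) / math.tan(math.radians(outer_half_deg))

print(f"+/-30° of a 120° view: {central_pixel_share(30, 60):.0%} of the pixels")
print(f"+/-30° of a 160° view: {central_pixel_share(30, 80):.0%} of the pixels")
```

On that naive projection, widening the view from 120° to 160° drops the central ±30°’s share of the pixels from about a third to about a tenth - the periphery swallows the rest, which is why where the optics concentrate the panel’s pixels dominates perceived sharpness.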

Haha, I don’t recommend that, as Pimax doesn’t support AMD.