Indeed, why does nobody think of a correctly done eye-tracking feature? It should significantly help with bandwidth.
Although not very much:
With foveated rendering, Werner says, there will be a 30 to 70 percent decrease in the amount of pixels drawn, a processing power saving that can translate to higher framerates and the ability to achieve high quality output with 4k headsets as opposed to the 24k level needed to meet natural human vision level.
But if this article has some valid points, then 80 Hz + 70% = 136 Hz, which is not that bad. So decreasing the rendered pixel count is, I guess, already technically possible with the current ET module; if they update the firmware, maybe users with ET will get higher refresh rates, say 100 or 110 Hz, or at least 90.
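A quick back-of-envelope check of that arithmetic (a minimal sketch; it assumes, like the post above, that the refresh rate scales directly with the pixel saving, which real render pipelines won't match exactly):

```python
# Rough refresh-rate headroom from foveated rendering, assuming the
# rate scales linearly with pixels saved; fixed per-frame costs
# (geometry, compositor, transport overhead) are ignored.
base_hz = 80.0

for pixel_saving in (0.30, 0.70):   # the 30-70% range quoted above
    estimated_hz = base_hz * (1.0 + pixel_saving)
    print(f"{pixel_saving:.0%} fewer pixels -> ~{estimated_hz:.0f} Hz")

# 30% fewer pixels -> ~104 Hz
# 70% fewer pixels -> ~136 Hz
```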
You're missing part of my point: it's not only ET that's the problem, and yes, many people think about it, but the market isn't ready… I'm talking about the transport link. The minimal 30% gain, which is by the way not negligible, would put the 8K-X at a 90 Hz+ refresh rate (up from its native 75 Hz) if gained on bandwidth.
If not dependent on the DisplayPort or HDMI spec, other optimizations could be tried (see the sketch after this list):
- Variable refresh rate per panel portion.
- A one-pass lossless compression algorithm targeting only the central DFR region of interest.
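As a rough illustration of the compression idea, here's a minimal sketch of the payload saving from a hypothetical scheme that keeps a central region of interest at full density and downsamples the periphery 2× per axis (all names and numbers are assumptions for illustration, not any actual Pimax or DisplayPort mechanism):

```python
# Illustrative per-frame payload for a foveated transport scheme:
# full density inside a central region of interest (ROI), periphery
# downsampled 2x per axis. Numbers are hypothetical.
WIDTH, HEIGHT = 3840, 2160      # one 4K eye image
BITS_PER_PIXEL = 24             # 8 bpc RGB

def payload_bits(roi_fraction: float, downsample: int = 2) -> float:
    """Bits per frame when only `roi_fraction` of the area is full density."""
    total = WIDTH * HEIGHT
    roi = total * roi_fraction
    periphery = (total - roi) / downsample**2
    return (roi + periphery) * BITS_PER_PIXEL

full = WIDTH * HEIGHT * BITS_PER_PIXEL
foveated = payload_bits(roi_fraction=0.25)
print(f"saving: {1 - foveated / full:.0%}")   # saving: 56%
```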
My point is that the actual link to the display is optimized for video, not VR. This includes the video controller on the LCD.
But anyway, it's preferable to invest in a 3080 for high refresh rates in the short term, and the long cable may disappear with IEEE 802.11ay.
Isn't Nvidia's VirtualLink supposed to help with that?
I see it says that transfer rates are about DP 1.4a level; however, I'm not sure whether it's possible to run your own protocol through that port so HMD manufacturers could optimize the amount of data that passes through. As you suggested: 2 images cut by a circular mask. Or a mapped array where you pass only the pixels, and the firmware takes those and puts them in high-, mid-, and low-density regions according to a custom-made protocol.
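For illustration, a minimal sketch of the circular-mask idea (the frame size, mask radius, and flat packing are all hypothetical, not any real HMD protocol):

```python
import numpy as np

# Keep only pixels inside a circle around the lens center and pack
# them into a flat array; the HMD firmware would unpack them again.
# The inscribed-circle radius is an illustrative choice.
H, W = 2160, 3840
ys, xs = np.mgrid[0:H, 0:W]
mask = (ys - H / 2) ** 2 + (xs - W / 2) ** 2 <= (H / 2) ** 2

frame = np.zeros((H, W, 3), dtype=np.uint8)   # stand-in for one eye image
packed = frame[mask]                          # only the visible pixels
print(f"pixels kept: {mask.mean():.0%}")      # pixels kept: 44%
```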
The problem will remain at the receiving end, which has to decode and translate/transmit the data to the MIPI lanes. Maybe it's possible, though I'm no expert here, but it doesn't seem to be the case.
quote: " The available bandwidth is estimated to be equivalent to DisplayPort 1.4 (32.4 Gbit/s, up to 4K @ 120 Hz with 8 bpc color) for video and 10 Gbit/s of USB 3.1 Gen 2 data.[3]"
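That 4K @ 120 Hz figure roughly checks out against DP 1.4's usable payload (a quick sanity check; blanking intervals are ignored, so the real requirement is somewhat higher):

```python
# DP 1.4 sanity check: HBR3 is 4 lanes x 8.1 Gbit/s = 32.4 Gbit/s raw;
# 8b/10b line coding leaves 80% of that for actual pixel data.
payload_gbps = 32.4 * 8 / 10                 # 25.92 Gbit/s usable
video_gbps = 3840 * 2160 * 120 * 24 / 1e9    # 4K @ 120 Hz, 8 bpc RGB
print(f"{video_gbps:.1f} of {payload_gbps:.2f} Gbit/s")  # 23.9 of 25.92
```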
And VirtualLink doesn't seem to be a great commercial success.
Is anyone still planning on getting a 3090 or 3080 immediately, or are you going to wait until November to see if AMD can match the 3080?
It seems Nvidia 3080 AIB models will offer 20 GB versions, which will be more expensive than Nvidia's own 10 GB 3080, which seems "accidentally" weak on memory considering the 2080 Ti had 11 GB.
To my untrained eye, the 20 GB 3080 variant is a good compromise for those unable or unwilling to buy a 3090.
I wonāt be buying new hardware until I have to. That is, when DCS World no longer works on what I have, and the newer stuff is sufficiently faster than this ~2045MHz RTX 2080 Ti.
As for DP 2.0, that is only really interesting if we can get 8K-per-eye headsets at something like a 180 Hz refresh rate, with Smart Smoothing that works as well as Oculus ASW did. At that point, we might actually be able to really see major performance gains from cranking down the supersampling.
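For scale, a rough uncompressed estimate of what dual-8K at 180 Hz would need versus DP 2.0's fastest mode (8 bpc assumed, blanking ignored), which is why something like DSC compression would be mandatory:

```python
# Uncompressed bandwidth for two 8K panels at 180 Hz, 8 bpc RGB,
# vs. DP 2.0 UHBR20 (80 Gbit/s raw, 128b/132b encoding).
needed_gbps = 7680 * 4320 * 2 * 180 * 24 / 1e9
dp20_payload_gbps = 80 * 128 / 132
print(f"need ~{needed_gbps:.0f} Gbit/s, DP 2.0 carries ~{dp20_payload_gbps:.0f}")
# need ~287 Gbit/s, DP 2.0 carries ~78
```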
That's not happening for a while, so I'm not really excited about the new hardware. At best, it's just a way to burn more of my cash.
For now, the 8K X itself is the biggest VR upgrade we have. And maybe hand tracking; I'm very excited to be able to mix my physical HOTAS/keyboard/mouse with native VR "hands".
For that reason, I expect to upgrade both CPU and GPU at the same time, which would be a lot easier because, while I'm at it, I plan to construct a custom case with water cooling, all as compact and low-noise as possible. Perhaps with a built-in LiPo battery backup…
At this point, I'm planning to wait until next year, when the Super versions are released. By then, the prices should have settled down and hopefully, nVidia's next rev will be on TSMC instead of Samsung chips, which should reduce the power requirements. And yes, there's a good chance that I'll get a 20 GB 3080 variant.
Oh, my delidded 7700K, clocked at 5.0 GHz, runs with a cute, simple 130 W air cooler in a mATX case without any problems, since I have the GPU under water.
Earlier, when the CPU was still under water and the GPU under air, the GPU often hit the temp limit.
It is really a very small case; there's no space for a bigger radiator, or for two, to cool both with water. But now all is fine: GPU (2080 Ti, OC 300 W BIOS) maxes at 78 °C and the CPU at 80 °C, at a 26 °C room temperature.
Reportedly the i9-10900K (which I plan to upgrade to unless something better is on the horizon) breaks the 300 W barrier… that's definitely too much for the Noctua NH-D15S.
Besides, an advantage of water cooling is an external chiller port. That should add a percentage point at least to core clocks…
Any reason why you think this? Last gen, AMD were not a threat; this time I think Nvidia are relatively nervous.
If they don't push the specs, there's a big chance AMD will hurt them, and they won't be able to charge so much. Even then, it's possible AMD will be able to compete with the 3080.
While I'm likely to get the 3090, simply because AMD historically has had less-than-stellar VR/Pimax compatibility out of the gate, and because of the 24 GB of VRAM, I am going to wait until testers and reviewers have had their hands on it… maybe.
In the end, I'd prefer not having to deal with driver issues on top of Pimax issues.
I'm waiting to see how much the 3090 is supposed to offer above the 3080. If it truly is just 15% more performance for nearly double the cost, I'll stick with the 3080. If it's a substantial improvement, I may just get it.