No DisplayPort 2.0 on Nvidia Ampere 3090 🤨 🧐?

Indeed, why does nobody think of a properly implemented eye-tracking feature? It should significantly help with bandwidth.

Although not very much:

With foveated rendering, Werner says, there will be a 30 to 70 percent decrease in the number of pixels drawn, a processing-power saving that can translate to higher framerates and the ability to achieve high-quality output with 4K headsets, as opposed to the 24K level needed to match natural human vision.

Also, eye-tracking technology will make it possible to reduce graphics distortion caused by not taking eye position into account when rendering VR graphics, Werner adds.

But if this article has some valid points, then 80 Hz + 70% = 136 Hz, which is not bad at all. So decreasing the number of rendered pixels is, I guess, already technically possible with the current ET module; if they update the firmware, maybe users with ET will get higher refresh rates, say 100 or 110 Hz, or at least 90.
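As a back-of-the-envelope sketch (my own arithmetic, with assumed numbers), there are actually two ways to read "N% fewer pixels" on a fixed-bandwidth link:

```python
# Assumed numbers: 80 Hz baseline, fixed link bandwidth.

def naive_refresh(base_hz, saving):
    # Reading the saving as a direct rate bump: 80 Hz + 70% -> 136 Hz.
    return base_hz * (1 + saving)

def strict_refresh(base_hz, saving):
    # Stricter reading: each frame needs only (1 - saving) of the pixels,
    # so the same link carries base_hz / (1 - saving) frames per second.
    return base_hz / (1 - saving)

print(naive_refresh(80, 0.30), naive_refresh(80, 0.70))    # -> 104.0 136.0
print(strict_refresh(80, 0.30), strict_refresh(80, 0.70))  # -> ~114.3 ~266.7
```

The stricter reading is even more optimistic than the simple one: a frame that needs only 30% of the pixels can, in principle, be sent more than three times as often over the same link.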


You're missing part of my point; it's not only ET that's the problem, and yes, many people think about it, but the market isn't ready… I'm talking about the transport link. The minimal 30% gain, which is by the way not negligible, would result in a 90 Hz+ refresh rate for the 8K-X if applied to the bandwidth.

If not dependent on the DisplayPort or HDMI spec, other optimizations could be tried:

- Variable refresh rate per panel region.
- A one-pass lossless compression algorithm targeting only the central DFR region of interest.

My point is that the actual link to the display is optimized for video, not VR. This includes the video controller on the LCD.

But anyway, it's preferable to invest in a 3080 for high refresh rates in the short term, and the long cable may disappear with IEEE 802.11ay.


They're working on 90 Hz for normal mode? Do you have a link to the post?


No, I thought they were; I haven't actually heard or seen anyone state it. I understand that higher refresh rates may be possible with lower resolutions.

Isn't Nvidia's VirtualLink supposed to help with that?

I see it says that transfer rates are about DP 1.4a level; however, I'm not sure whether it's possible to run a custom protocol through that port, so HMD manufacturers could optimize the amount of data that passes through. As you suggested, two images cut by a circular mask. Or a mapped array where you pass only the needed pixels, and the firmware takes those and places them into high-, mid- and low-density regions according to a custom-made protocol.
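A rough illustration of how much a circular mask alone could save (my own sketch, not any actual Pimax or VirtualLink protocol; panel size assumed):

```python
import math

def circular_mask_fraction(width: int, height: int) -> float:
    """Fraction of a width x height frame covered by the largest
    inscribed circle (roughly the region visible behind the lens)."""
    radius = min(width, height) / 2
    return math.pi * radius ** 2 / (width * height)

kept = circular_mask_fraction(3840, 2160)  # assumed per-eye panel size
print(f"{kept:.1%} of pixels kept, {1 - kept:.1%} masked out")
# -> 44.2% kept, 55.8% masked out
```

So on a 3840x2160 per-eye panel, over half the frame would never have to cross the link at all, before any foveation is even applied; the real lens circle is larger than the inscribed one, so treat this as an upper bound on the saving.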


I thought VirtualLink had been canceled.


The problem will remain at the receiving end, which has to decode and translate/transmit the data to the MIPI lanes. Maybe it's possible, though I'm no expert here, but it doesn't seem to be the case.

Quote: "The available bandwidth is estimated to be equivalent to DisplayPort 1.4 (32.4 Gbit/s, up to 4K @ 120 Hz with 8 bpc color) for video and 10 Gbit/s of USB 3.1 Gen 2 data."
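Those figures check out against the standard DP 1.4 numbers; a quick sanity check (my own arithmetic, ignoring blanking intervals):

```python
# Standard DP 1.4 figures; active pixels only, blanking ignored.

raw_gbps = 4 * 8.1             # 4 lanes x HBR3 (8.1 Gbit/s per lane) = 32.4
payload_gbps = raw_gbps * 0.8  # 8b/10b line coding eats 20% -> ~25.9

# Uncompressed 4K @ 120 Hz @ 8 bpc (24 bits per pixel):
video_gbps = 3840 * 2160 * 120 * 24 / 1e9

print(round(payload_gbps, 2), round(video_gbps, 2))
# -> 25.92 usable vs 23.89 needed, so 4K @ 120 Hz just fits
```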

And VirtualLink doesn't seem to be a great commercial success :wink:


Is anyone still planning on getting a 3090 or 3080 immediately, or are you going to wait until November to see if AMD can match the 3080?

It seems Nvidia's 3080 AIB models will offer 20 GB versions, which will be more expensive than Nvidia's own 10 GB 3080; the latter seems 'accidentally' weak on memory, considering the 2080 Ti had 11 GB.

To my untrained eye, the 20 GB 3080 variant is a good compromise for those unable or unwilling to buy a 3090.


I won't be buying new hardware until I have to; that is, when DCS World no longer works on what I have and the newer stuff is sufficiently faster than this ~2045 MHz RTX 2080 Ti.

As for DP 2.0, that is only really interesting if we can get 8K-per-eye headsets at something like a 180 Hz refresh rate, with Smart Smoothing that works as well as Oculus ASW did. At that point, we might actually be able to see major performance gains from cranking down the supersampling.
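For a sense of scale (assumed numbers, my own arithmetic): even DP 2.0 at its full UHBR20 rate is nowhere near an uncompressed 8K-per-eye, 180 Hz signal, so that scenario would lean heavily on compression and/or foveation:

```python
# Assumed numbers: 7680x4320 per eye, 180 Hz, 8 bpc, no compression.

pixels_per_eye = 7680 * 4320
needed_gbps = pixels_per_eye * 2 * 180 * 24 / 1e9   # two eyes

# DP 2.0 UHBR20: 4 lanes x 20 Gbit/s, 128b/132b coding (~3% overhead):
dp20_gbps = 4 * 20 * 128 / 132

print(round(needed_gbps, 1), round(dp20_gbps, 1))
# -> ~286.7 Gbit/s needed vs ~77.6 Gbit/s available: only viable with
#    DSC-style compression and/or foveated transport
```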

That's not happening for a while, so I'm not really excited about the new hardware. At best, it's just a way to burn more of my cash.

For now, the 8K-X itself is the biggest VR upgrade we have. And maybe hand tracking; I'm very excited to be able to mix my physical HOTAS/keyboard/mouse with native VR 'hands'.


The 3080 won't be faster than the 2080 Ti either; only the 3090 will be. It's the same game as with the last change from the 10xx to the 20xx series.

But for DCS/XP11 I would rather have a CPU running at 9 GHz on one single core. All problems solved.

For that reason, I expect to upgrade both CPU and GPU at the same time, which would be a lot easier because, while I'm at it, I plan to build a custom case with water cooling, all as compact and low-noise as possible. Perhaps with a built-in LiPo battery backup…


Just put the computer with thick fans outside the window and annoy the neighbors. That's less work.


Loud fans are not the problem. Air cooling will not be a good solution for achieving a full overclock on next-generation hardware, especially on the CPU side.


At this point, I'm planning to wait until next year, when the Super versions are released. By then, prices should have settled down and, hopefully, Nvidia's next revision will be on TSMC instead of Samsung chips, which should reduce the power requirements. And yes, there's a good chance that I'll get a 20 GB 3080 variant.

I will get the 3090 immediately!

There are so many games that I can't enjoy right now in 4K with my 2080 Ti.


Oh, my delidded 7700K, clocked at 5.0 GHz, runs with a cute, simple 130 W air cooler in a mATX case without any problems, since I have the GPU under water.
Earlier, when the CPU was still under water and the GPU under air, the GPU often hit its temp limit.

It is really a very small case; there's no space for a bigger radiator, or two radiators, to chill both with water. But now all is fine: GPU (2080 Ti, OC 300 W BIOS) max 78, CPU max 80 degrees Celsius @ 26 degrees room temp :slight_smile:


Reportedly the i9-10900K (which I plan to upgrade to, unless something better is on the horizon) breaks the 300 W barrier… that's definitely too much for the Noctua NH-D15S.

Besides, an advantage of water cooling is an external chiller port. That should add at least a percentage point to core clocks…


Any reason why you think this? Last generation AMD wasn't a threat; this time I think Nvidia is relatively nervous.

If they don't push the specs, there's a big chance AMD will hurt them and they won't be able to charge so much; even then, it's possible AMD will be able to compete with the 3080.

While I'm likely to get the 3090, simply because AMD historically has had less-than-stellar VR/Pimax compatibility out of the gate, and because of the 24 GB of VRAM, I am going to wait until testers and reviewers have had their hands on it… maybe.

In the end, I'd prefer not having to deal with driver issues on top of Pimax issues.


I'm waiting to see how much the 3090 is supposed to offer over the 3080. If it truly is just 15% more performance for nearly double the cost, I'll stick with the 3080. If it's a substantial improvement, I may just get it.
