I need to replace my GTX 980 to drive the Pimax 5K+.

I have to choose between the MSI GTX 1080 Ti (DUKE) and the new

Gigabyte GeForce RTX 2080 GAMING OC WHITE.

The new 2080 will cost around $100 more.

Based on your experience using the Pimax 5K+,

would you please be so kind as to recommend the best one to buy?

Is there much of a difference between 8 GB and 11 GB of VRAM, or not?

And the same question goes for ray tracing on the new RTX.

The 2080 has it and the 1080 Ti does not.

Thanks in advance for the suggestions.

Intel I9 9900K

16GB 3000 DDR4


I would go for the 2080, as it might support various features which could theoretically be useful in the future. Performance-wise they are the same.


I’d go for the 1080 Ti… it has 11 GB of VRAM, while the 2080 only has 8 (which means you can get better textures) :wink: RTX isn’t really supported in any games yet, and for sure isn’t supported in any VR titles, and by the time it is, this will be a sub-entry-level card anyway :wink:
Raytracing atm = gimmick:
“a trick or device intended to attract attention, publicity, or trade”
Real-life 3 GB more texture memory, or a gimmick for $100 more? It’s your choice lol


Well, I would say it depends on where your focus is…

The 20 series adds new features the 10 series does not have; if you like them, that’s a plus point for the 2080.
The 1080 Ti has 11 GB, a plus point if you need it (take a look at the games you play: do they max out the 8 GB?)

I was in the same situation and went for the 20 series, but I do play flat games and love the fact that there is finally some new stuff going on in games (RTX) after what feels like a decade of nothing really new.

You can check out benchmarks if it helps… but in that case I guess it’s about what you prefer more.


I would go for the 2080 (with an “if”…): if VRS is used for foveated rendering, then the 2080 is the only option…
Maybe @Sean.Huang or @Alan.sun can tell us whether VRS is planned to be used…


I would say you’d want to weigh cost and performance. If you can get a 1080 Ti, new or used, for a great price, knowing that you will want to upgrade later anyway, it could be a good choice with PCIe 4 or 5 coming.

If you want newer features and possible performance gains now, and don’t mind the price, a 20 series might be better.


I would not go with a card from Gigabyte, but that’s just me.


I used to go with ASUS (not happy with them anymore), but I’ve kind of discovered EVGA now. They have a 3-year warranty, and on their site you can extend that to 5 or even 10 years. I have two products from them now and they are quite good; it’s the kind of enthusiast niche hardware maker I was missing. They are into the tech and have great customer support, not all that marketing bullshit road the others went down. IMHO.


I would probably get the Radeon VII over the 2080.


IMHO: I don’t see RTX as a gimmick; for me it’s the next stepping stone in GFX that I was waiting for. I use it, and games do look better and more natural, and still run at 60–90 FPS at WQHD. I don’t want to argue; it’s just a matter of taste, that’s all I’m saying (and I hope I’m not taking the focus away from this thread with that).

Do you need it? No, but who really needs any of the stuff we are into here?

For me, it’s kind of like the FOV with the Pimax: you put it on, it just feels more natural, and you don’t notice the true benefit until you put an old HMD on again. Then you know what you are missing.


It’s the next stepping stone when it has wide adoption; sadly, at this point it’s early adoption. The problem is that by the time raytracing is supported by more than 0.1% of the games you play, the 2080 Ti will be an entry-level card or below. Let’s say it takes another 3 years for raytracing to be in 80% of games: you’re now on a 3-year-old card and we’re up to the RTX 6080 Ti :slight_smile: This isn’t my first rodeo.


And the 2080 and 2080 Ti really aren’t powerful enough to “fully” raytrace games. The few current games which offer raytracing use it for effects, not entire scenes. I think it will be a few more GPU generations before we see raytracing finally reach its potential.


Bingo. The thing is, we’ll have to see how Nvidia goes with this; they may just make one generation of cards supporting it and then swerve it. If they are absolutely determined to force RTX on everyone, then we will have no choice but to buy RTX cards in the future. It’s quite difficult to find anyone stocking a 1080 Ti these days, but easy to buy a 2080, so they can control the market. What annoys me most about the 20 generation of GPUs is that Nvidia sacrificed potentially giving us 50% more raw power for damned raytracing; they sacrificed the thousands of games you already play for the ones you might play in the future once they’re created lol. Thanks, Nvidia :confused:


I would go for a 1080 Ti, as it performs as well as (sometimes even better than) the 2080. I think the benefits of more RAM are greater than those of software/driver mechanics that “could” be used and would have to be supported by game developers… I use VR mainly for sim racing, and no developer there has implemented anything from VRWorks (e.g. SMP), even though it has been available for the GTX series for years now. I guess by the time developers are implementing RTX-specific mechanics, the next GPU generation will already be available… just my opinion.


Personally, I think raytracing is here to stay, mainly because it’s part of DirectX, not just an Nvidia thing. It also fixes all the problems with the current clever dev tricks. In the long run (and I mean the long run) it saves developers and level designers a lot of time, because they don’t have to spend as much effort faking everything to make it look real.

AMD will support it too in due course, because it’s part of DirectX (I believe they said they would support it in some way during one of their presentations).

Raytracing for VR may be possible with Brainwarp, although a 2080 might be pushing it a bit, now that we have seen it’s possible to get the FPS up to 60+ in Battlefield V (a non-VR game).

Certainly the future generations will do this much better.

Of more interest in the 20XX cards at the moment is DLSS. I haven’t seen a discussion on how this would help VR; it could be a really good tech. It’s pretty straightforward for developers to implement, but the main problem is that it’s Nvidia-only.
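To show why DLSS-style upscaling could matter for VR, here’s a back-of-envelope sketch: render at a lower internal resolution and upscale to the display. The per-eye resolution and scale factor are my own illustrative assumptions, not DLSS’s actual settings.

```python
# Rough arithmetic on DLSS-style upscaling: shade fewer pixels internally,
# then upscale to the display resolution. Numbers are assumptions.

display = (2560, 1440)   # assumed per-eye target resolution
scale = 2 / 3            # assumed internal render scale, "quality"-style

internal = (int(display[0] * scale), int(display[1] * scale))
saving = 1 - (internal[0] * internal[1]) / (display[0] * display[1])

print(f"internal render: {internal[0]}x{internal[1]}, "
      f"~{saving:.0%} fewer pixels shaded per eye")
```

Whether the upscaled image looks good enough in an HMD is exactly the open question; the arithmetic only shows why the idea is tempting.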


The RTX 2080 already performs better in most cases, and this will improve drastically as the drivers mature and games begin to adapt. Don’t listen to people who have no clue about the many drastically improved architectural designs of the RTX cards beyond raytracing and DLSS.

One example is foveated rendering. RTX offers a very easy way to implement it.
In fact, Pimax just announced that they got their fixed foveated rendering working at the driver level with RTX cards. 10x0 cards may follow, but they still won’t offer the same performance, as the RTX cards can deal with it much better.
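To give a feel for why fixed foveated rendering helps, here’s a toy model of the shading savings from VRS-style coarse shading. The region sizes and rates are illustrative assumptions of mine, not Pimax’s actual PiTool configuration.

```python
# Toy model of fixed foveated rendering savings with VRS-style coarse
# shading. All region sizes and rates below are illustrative assumptions.

def shaded_fraction(regions):
    """regions: list of (fraction_of_screen_area, shading_rate), where a
    rate of 1.0 means one shader invocation per pixel and 0.25 means one
    invocation per 2x2 pixel block."""
    assert abs(sum(area for area, _ in regions) - 1.0) < 1e-9
    return sum(area * rate for area, rate in regions)

# Assumed layout: full rate in the central 40% of the image, 2x2 coarse
# shading (1/4 rate) in the middle ring, 4x4 (1/16 rate) in the periphery.
regions = [(0.40, 1.0), (0.35, 0.25), (0.25, 1.0 / 16)]

frac = shaded_fraction(regions)
print(f"pixels shaded vs. full rate: {frac:.1%}")
```

Under these made-up numbers, roughly half the shading work disappears, which is the kind of headroom the wide-FOV Pimax panels need.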


Also, if memory serves me right, the 20XX cards have the ability to render multiple views in a single pass, which reduces GPU load for VR content (again, if the game supports it).


Another huge potential is texture space shading, especially for VR. In simple terms, it can be used to “recycle” a lot of computations that are redundant when doing stereoscopic rendering (two eyes), or for textures that don’t change much over time. The potential savings could improve performance by dozens of percent.


Pimax just released the news that they’re implementing Fixed Foveated Rendering in PiTool; at first it will only be supported on 20xx-series GPUs.

As for raytracing, it works well in BF5 with a 2080 Ti:
a good framerate (~100 fps) even with the eye candy turned up to ultra at 2K resolution. In single player, at least.


Hey, it’s a great card; don’t take it personally if you bought one.
Your points do all pivot on could-be’s, maybes, eventually’s, at-some-points, etc. You do see that? And these things take a lot of time. Foveated rendering, for example: zero software supports it, and you’re chatting about RTX. I do find that quite funny, sorry for laughing.
We have to look at how far hardware can dictate software.
Hardware is great, but it needs software to do anything meaningful; without that it’s a useless lump of silicon, and software takes time.
We can always talk about the “potential” for something, but if the device you own is 5 years old before developers realise that potential, you kinda missed the boat :confused:

I wouldn’t be surprised if Nvidia actually ditched raytracing if developers don’t use it; people won’t want to pay a premium for a feature that’s not really used. We will see over the next few iterations of cards. Maybe it’s here to stay, and I hope it is. I will enjoy it in 2021 :wink: