11-series NVIDIA cards coming around July. Prepare your wallets

11-series NVIDIA cards coming around July. Sounds like these might be perfect for the 8k, what do you guys think?

5 Likes

Seems that I’m stuck with my 970 until July ^^

2 Likes

If Hynix starts production in July, I would expect to see the new cards 2-3 months after, i.e. September/October. If, however, AMD does not come up with anything substantial by then, they may as well delay the introduction to the Christmas season.

I have previously held out for Ti models. Has anybody done the maths to determine the bang/buck ratio between xx80 and xx80Ti cards, and the consistency of this over time? Is it significantly more economical, in that sense, to go with vanilla models each generation? :7
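The ratio itself is trivial to compute; the hard part is agreeing on representative prices and benchmarks. Something like this, run once per generation, would show whether the Ti premium holds up. The figures here are made-up placeholders, not real measurements:

```cpp
#include <cstdio>

// Placeholder figures only -- swap in real MSRPs and benchmark averages.
// None of these numbers are actual measurements.
struct Card { const char* name; double priceUsd; double avgFps; };

int main()
{
    const Card cards[] = {
        { "xx80   (hypothetical)", 599.0,  90.0 },
        { "xx80Ti (hypothetical)", 799.0, 110.0 },
    };
    for (const Card& c : cards)
        std::printf("%-24s %.4f fps/$ (%.2f $/fps)\n",
                    c.name, c.avgFps / c.priceUsd, c.priceUsd / c.avgFps);
    return 0;
}
```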

If you have a 1080ti I guess you could skip a generation, but with the Pimax 8k, who knows how fast you’ll feel the need to upgrade.

2 Likes

We-eell… I found the 1080Ti insufficient right at its launch. Elite Dangerous needs that juicy supersampling, and those shadows at the maximum setting. :7

It should hold its own against a vanilla 1180, if earlier generation handovers are anything to go by, but the question, in that light, is whether it’s cheaper (per unit of performance) to skip the entire 11xx generation, go for the 1280, and then stick with vanilla, or to get the 1180Ti when it comes and keep Ti-ing… :7

Things like RTX will be restricted to Volta, too… On the other hand, almost nobody ended up using the Pascal-gated “VRWorks”, despite how beneficial it can be, especially when you have a wide FOV, like the 8k – I could see the same happening with functions/APIs that lean on tensor cores… :7

OpenXR can’t get hammered down soon enough. If I were a developer, I wouldn’t tie myself down to proprietary stuff that may be deprecated in a few months’ time, either. :7

I feel ya. Let’s wait and see what the benchmarks say.

I had one of my 980s die on me, so I ended up shelling out (at the worst possible time) for a 1080ti. For whatever it’s worth, if you aren’t keen on paying the crypto premium on a new card, definitely sign up for EVGA’s auto-notify. Now that bitcoin is a little unstable, 1080ti’s are no longer selling out in minutes. You’re paying full MSRP, but at least there are no premiums on top of that.

Steve Burke over at GamersNexus has been a great source of coverage on the new cards. GDDR6 is expected to COST manufacturers another 20%. So expect to see that as a minimum; more likely there will be a profit margin on top as well. My guess is the 1080ti equivalent of the 11 series will be a pretty expensive card, particularly when crypto starts to bounce back.

1 Like

I’ll wait for the 1180Ti this time. It was quite the bummer when they decreased the price of the 1080 just to release the 1080Ti for the same money the 1080 cost at launch.

My 1080 will have to make do till then.

The 1080Ti is the first Ti I’ve bought, and I have been using NVIDIA since the 780s.

I got it just before the bitcoin craze, and funnily enough I ended up offloading my old 1080 for more than the price I bought it at. Lol.

The speed difference was noticeable and made my Rift perfectly playable.

I suspect that I will wait for the 1180Ti, unless NVIDIA introduces some newfangled tech that makes it a clear winner, just like the 10 series was far faster than the 9 series at 4K.

2 Likes

One architecture will address the PC gaming market, while the other will be the company’s new entry for the AI and compute markets.

What I don’t like about this announcement is the separation of compute cards and gaming cards.

Many parts of the software industry have slowly been converting routines to use the huge power of parallel processing (CUDA/OpenCL compute) for physics, realtime ray-tracing, shaders, compression, special effects, etc., so if NVIDIA splits this into two products, anything with high core counts will end up in their pro range of cards, along with the premium support/driver costs that go with them.

I hope that is not the case!

They don’t deserve my money. It’s an unethical company, as bad as it gets. And they rip off gamers just because they can.

Well, I’m not sure you can call it ‘unethical’, but their strategy of delaying products because they don’t NEED to release them earlier, given the non-existent competition from AMD, really sucks, and I do hate them for that. They’re basically giving us the middle finger. BTW, I still highly doubt we’ll have those cards this quarter; I’m guessing Q3.

2 Likes

www.hardocp.com/article/2018/03/08/geforce_partner_program_impacts_consumer_choice/

They just continue down this reckless path to make everybody pay more and more.

1 Like

I understand how expensive this 12 nm process must be. I don’t care who does it, but we really need a VR-dedicated, from-the-ground-up architecture. And we need NVIDIA to implement ATW/ASW at the hardware level, with options in the NVIDIA Control Panel. I just want NVIDIA to do more for VR directly.

1 Like

Interesting links. And annoying as hell to read. I am a strict NVidia guy, having been bitten by AMD’s BS marketing on too many previous occasions. But that doesn’t mean I want NVidia to monopolise the GPU market.

1 Like

Having always had AMD GPUs in my rigs until the Vive arrived… had to go 1080ti, luckily before prices exploded… really need AMD to step up… just doesn’t seem like it’s going to happen… :frowning:

Our lives, especially considering the Pimax 8k, would be so much easier if VR software developers supported dual-GPU setups. AMD and NVIDIA documented how to enable it for DX11 and DX12 back in 2016, but uptake has been minimal… although it’s a different kettle of fish compared to SLI/CrossFire for 2D displays…
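For reference, the DX12 path they documented is “explicit multi-adapter”. A bare-bones sketch of the unlinked variant, showing only the one-device-per-GPU enumeration step (error handling and the actual per-eye rendering are left out):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

// Create one ID3D12Device per hardware GPU ("unlinked" explicit
// multi-adapter). An engine could then map devices[0] to the left
// eye and devices[1] to the right eye.
std::vector<ComPtr<ID3D12Device>> CreateDevicePerGpu()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```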

NVidia built VR-SLI into their VRWorks SDK, but as you say it is not popular. And I totally agree that VR-SLI is something that would solve many issues for consumers, as the bottleneck will always be single-card speed; even a 1080Ti can struggle.

You would think that, with two displays in an HMD, they would invest a little more in the drivers to detect the setup and make one card per eye just work at the driver level, so developers don’t need to do anything bespoke on their end.

4 Likes

No, that’s not how it works. The application (e.g. the game engine) has to define the render target; that is where VRAM is allocated for a specific size and layout of the displayed image, and in VR there are two of them.

So it really boils down to the engine developers of VR titles giving an option to split the render-target jobs onto two GPUs. It’s rather straightforward compared to AFR for a single render target. The blame is really on the devs of game engines and their priorities; on their end, VR is obviously not even a second-class citizen.
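To illustrate the shape of that split from the engine’s side, a toy sketch. All the types here (GpuQueue, RenderTarget, drawScene) are hypothetical stand-ins, not any real engine API:

```cpp
#include <cstdio>

// Hypothetical stand-ins for engine concepts -- not a real API.
struct GpuQueue     { int gpuIndex; };      // one queue per physical GPU
struct RenderTarget { int width, height; }; // one render target per eye

// Stand-in for submitting one eye's view of the scene to one GPU.
void drawScene(const GpuQueue& q, const RenderTarget& rt, int eye)
{
    std::printf("eye %d -> GPU %d (%dx%d)\n",
                eye, q.gpuIndex, rt.width, rt.height);
}

int main()
{
    GpuQueue     gpu[2] = { {0}, {1} };
    RenderTarget eye[2] = { {3840, 2160}, {3840, 2160} }; // per-eye targets

    // The two eye views render the same scene from slightly different
    // camera positions but are otherwise independent, so each can be
    // dispatched to its own GPU and run in parallel.
    for (int i = 0; i < 2; ++i)
        drawScene(gpu[i], eye[i], i);
    return 0;
}
```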

1 Like