11 series Nvidia cards coming around July. Sounds like these might be perfect for the 8k, what do you guys think?
Seems that I'm stuck with my 970 until July ^^
If Hynix starts production in July, I would expect to see the new cards 2-3 months after that, i.e. September/October. If, however, AMD does not come up with anything substantial by then, they may as well delay the introduction to the Christmas season.
I have previously held out for Ti models. Has anybody done the maths to determine the bang/buck ratio between xx80 and xx80Ti cards, and how consistent it is over time? Is it significantly more economical, in that sense, to go with vanilla models each generation? :7
If you have a 1080 Ti, I guess you could skip a generation. But with the Pimax 8k, who knows how fast you'll feel the need to upgrade.
We-eell… I found the 1080Ti insufficient right at its launch. Elite Dangerous needs that juicy supersampling, and those shadows at maximum setting. :7
It should hold its own against a vanilla 1180, if earlier generation handovers are anything to go by, but the question, in that light, is whether it's cheaper (per performance) to skip the entire 11xx generation, go for a 1280, and then stick to vanilla, or to get the 1180Ti when it comes and keep Ti-ing… :7
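To make concrete what I mean by "the maths", here's a minimal sketch. The prices are launch MSRPs as I remember them; the perf numbers are placeholder indices (980 = 100), not real benchmark results - you'd substitute averaged benchmark figures:

```cpp
#include <cstdio>

// Rough perf-per-dollar comparison. Prices are launch MSRPs from memory;
// perf numbers are made-up relative indices (980 = 100), NOT benchmark data.
struct Card { const char* name; double price_usd; double perf_index; };

int main() {
    const Card cards[] = {
        {"980",     549, 100},  // baseline
        {"980 Ti",  649, 125},  // placeholder uplift
        {"1080",    599, 165},  // placeholder uplift
        {"1080 Ti", 699, 215},  // placeholder uplift
    };
    for (const Card& c : cards)
        std::printf("%-8s %4.0f USD  %.3f perf/$\n",
                    c.name, c.price_usd, c.perf_index / c.price_usd);
}
```

Run that across a few generations with real benchmark averages and it would show whether the Ti premium holds up in perf per dollar. :7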
Things like RTX will be restricted to Volta, too… On the other hand, almost nobody ended up using the Pascal-gated "VRWorks", despite how beneficial it can be, especially when you have a wide FOV like the 8k. I could see the same happening with functions/APIs that lean on tensor cores… :7
OpenXR can't get hammered out soon enough. If I were a developer, I wouldn't tie myself down to proprietary stuff that may be deprecated in a few months' time, either. :7
I feel ya. Let's wait and see what the benchmarks say.
I had one of my 980s die on me, so I ended up shelling out (at the worst possible time) for a 1080 Ti. For whatever it's worth, if you aren't keen on paying the crypto premium on a new card, definitely sign up for EVGA's auto-notify. Now that bitcoin is a little unstable, 1080 Tis are no longer selling out in minutes. You're paying full MSRP, but at least there are no premiums on top of that.
Steve Burke over at GamersNexus has been a great source of coverage on the new cards. GDDR6 is expected to COST manufacturers another 20% over GDDR5. So expect to see that as a minimum; more likely there will be a profit margin on top as well. My guess is the 1080 Ti equivalent of the 11 series will be a pretty expensive card, particularly when crypto starts to bounce back.
I'll wait for the 1180 Ti this time. It was quite the bummer when they decreased the price of the 1080 just to release the 1080 Ti for the same money the 1080 had cost at launch.
My 1080 will have to make do till then.
The 1080 Ti is the first Ti I've bought, and I have been using Nvidia since the 780s.
I got it just before the bitcoin craze, and funnily enough I ended up offloading my old 1080 for more than the price I bought it at. Lol.
The speed difference was noticeable and made my Rift perfectly playable.
I suspect that I will wait for the 1180 Ti unless Nvidia introduces some newfangled tech that makes it a clear winner, just like the 10 series was far faster than the 9 series at 4K.
One architecture will address the PC gaming market, while the other will be the company's new entry for the AI and compute markets.
What I don't like about this announcement is the separation of compute cards and gaming cards.
Many parts of the software industry have slowly been converting routines to use the huge power of parallel processing (CUDA/OpenCL compute) for physics, real-time ray tracing, shaders, compression, special effects etc., so if NVidia splits this into two products, anything with high core counts will end up in their pro range of cards, along with the premium support/driver costs that go with them.
I hope that is not the case!
They don't deserve my money. It's an unethical company, as bad as it gets. And they rip off gamers just because they can.
Well, I'm not sure if you can call it "unethical", but their strategy of delaying their products because they don't NEED to release them earlier, thanks to the non-existent competition from AMD, really sucks, and I do hate them for that. They basically give us the middle finger. BTW, I still highly doubt we'll have those cards this quarter; I'm guessing Q3.
www.hardocp.com/article/2018/03/08/geforce_partner_program_impacts_consumer_choice/
They just continue down their reckless path to make everybody pay more and more.
I understand how expensive this 12 nm process must be. I don't care who does it, but we really need an architecture dedicated to VR from the ground up. And we need Nvidia to implement ATW/ASW at the hardware level, with options in the Nvidia Control Panel. I just want Nvidia to do more for VR directly.
Interesting links. And annoying as hell to read. I am a strict NVidia guy due to being bitten by AMD BS marketing on too many previous occasions. But that doesn't mean I want NVidia to monopolise the GPU market.
Having always had AMD GPUs in my rigs until the Vive arrived… I had to go 1080 Ti, luckily before prices exploded… We really need AMD to step up… it just doesn't seem like it's going to happen…
Our life - especially considering the Pimax 8k - would be so much easier if VR software developers supported dual-GPU setups. AMD and NVIDIA documented how to enable it for DX11 and DX12 back in 2016, but the uptake has been minimal… although it's a different story when compared to SLI/Xfire for 2D displays…
NVidia built VR-SLI into their VRWorks SDK, but as you say it is not popular. And I totally agree that VR-SLI is something that would solve many issues for consumers, as the bottleneck will always be single-card speed; even a 1080 Ti can struggle.
You would think that, with two displays in an HMD, they would invest a little more into drivers to detect this and make one card per eye just work at the driver level, so developers don't need to do anything bespoke at their end.
No, that's not how it works. The application (e.g. the game engine) has to define the render target; that is where the VRAM is allocated for a specific size and layout of the displayed image, and in VR there are two of them.
So it really boils down to the engine developers of VR titles offering an option to split the render target jobs onto two GPUs. It's rather straightforward compared to AFR for a single render target. The blame is really on the devs of game engines and their priorities; on their end, VR is obviously not even a second-class citizen.
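Roughly the shape of what I mean, as a purely illustrative sketch - "Gpu" and "RenderTarget" here are stand-ins for whatever the engine actually uses (D3D12 adapters, Vulkan devices, ...), not a real API:

```cpp
#include <cstdio>
#include <thread>

// Illustrative only: one render target per eye, each submitted to its own
// GPU, no AFR-style frame juggling needed.
struct RenderTarget { int width, height; };

struct Gpu {
    int index;
    void render_eye(const RenderTarget& rt, const char* eye) const {
        // A real engine would record and submit command buffers here.
        std::printf("GPU %d: rendering %s eye at %dx%d\n",
                    index, eye, rt.width, rt.height);
    }
};

int main() {
    const Gpu gpu0{0}, gpu1{1};
    const RenderTarget left{3840, 2160}, right{3840, 2160};  // e.g. the 8k's per-eye panels

    // Same scene and pose data, independent render targets, one thread per GPU.
    std::thread t0([&] { gpu0.render_eye(left,  "left");  });
    std::thread t1([&] { gpu1.render_eye(right, "right"); });
    t0.join();
    t1.join();

    // A real engine would then copy one eye's image to the GPU that owns the
    // HMD swap chain and present both views together.
}
```

The part the engine still has to own is copying one eye's image over to the GPU that drives the HMD swap chain and keeping both submissions paced to the same frame, which is presumably where the actual work (and the reluctance) lies.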