But really, most things said in the past will look ridiculous when viewed from the future.
A perfect example is the Christopher Reeve Superman movies: great at the time of release, but very poorly written stories by today’s standards. Just look at the Kryptonian Rolex Jor-El was wearing. Lol
No, it’s not.
The Tom’s Hardware article referenced WCCFTech simply to point out that they had already written about the rumors.
Our source is not WCCFTech. The information is accurate, though the people who told us aren’t authorized to share that info.
I guarantee you are wrong about GPUs.
There’s still tremendous room for advancement. We’re nowhere near the maximum capabilities of graphics systems.
Also, sound card advancement didn’t end when 16-bit cards came out. They have continued to evolve. However, the hardware has shrunk to a level that is sustainable for onboard components. There’s no need for add-in sound cards anymore, just as there’s no need for add-in network cards anymore.
Discrete graphics cards aren’t going away any time soon. Hell, Intel just scooped up a dream team to build its own GPUs.
There’s a case to be made for integrating GPU hardware into CPUs, as we’ve seen recently with the new Intel chips with AMD GPUs and the new AMD APUs with Ryzen graphics. However, those aren’t likely to overtake discrete add-in GPUs any time soon.
The big horsepower GPUs will remain independent cards for a long time still.
People who are satisfied with today’s graphics hardware are not the people who would buy next-gen hardware right away. But don’t mistake a few comments on a news post for a barometer of market demand. There’s still a large community of gaming enthusiasts who will happily buy the best GPU money can buy, and there always will be.
Nvidia isn’t sitting on old stock. The company has barely been able to keep up with the demand for high-end GPUs because of cryptocurrency mining.
Nvidia also said that it would be slowing down production of GPUs because it anticipated the crypto-mining crash that we’ve just gone through. The company still has stock of existing cards, but it probably isn’t sitting on a massive overstock right now.
Well, that’s true of course. But look at VR games like Skyrim VR: they already look very detailed. The only thing I miss on my Vive Pro is a little more sharpness, so just a slightly higher pixel count. Sure, you can keep improving, but at some point people won’t care much anymore. Look at TVs: I bought an 80" UHD TV, and while it’s all nice, I usually don’t even notice much difference between 1080p and 4K content. At some point it’s just ‘good enough’ for most people. Sure, there will always be a subset of people who want the latest and the best, but that’s just a subset of the market.
My point is: the increment in ‘perfection’ gets smaller with each generation, and thus people will start to care less. I’m pretty sure this is one of the main reasons NVIDIA and AMD are taking it easy now.
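To put rough numbers on that ‘good enough’ point, here’s a minimal sketch of the angular-resolution math. The setup is an assumption on my part (an 80" 16:9 panel, ~1.77 m wide, viewed from 3 m), as is the ~60 pixels/degree acuity figure; nothing here is measured.

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, viewing_distance_m):
    # Horizontal field of view the screen subtends from the couch,
    # then how many pixels get packed into each degree of it.
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * viewing_distance_m)))
    return h_pixels / fov_deg

width_m, distance_m = 1.77, 3.0  # assumed 80" 16:9 panel, assumed viewing distance
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: ~{pixels_per_degree(px, width_m, distance_m):.0f} pixels/degree")
# 20/20 vision resolves roughly 60 pixels/degree, so 1080p (~58) is already
# near that limit at this distance and 4K (~117) sails well past it.
```

If those assumptions hold, the arithmetic matches the “I barely notice 1080p vs 4K” experience: both resolutions land at or beyond what the eye can pick apart from the couch.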
Uh-oh, just look at Skyrim VR and then check out something like Witcher III on a 4K monitor, and tell me you wouldn’t love to see that quality in VR.
Hell, I thought Comanche looked great at the time with its voxel tech and I was fully immersed, but I know now that there was actually room to improve…
Skyrim VR sucks. It’s a port of an old game. I’ve managed to get Far Cry 5 working somewhat on an Odyssey using VorpX, and it’s pretty freaking cool. It makes Skyrim look hokey. It’s not very playable yet, but I bet someone fixes that. But it is really neat to just go sightseeing.
It’s not entirely clear. There are rumors, though. Moving to 7nm will reduce the size of the chip, which translates to higher profits (assuming yield is good), plus less heat and faster calculations. I’d actually like that; since the initial yields might be low, Nvidia would likely call it the 2080 Ti.
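Since the “shrink = higher profit” claim is really just dies-per-wafer arithmetic, here’s a minimal sketch of it. The die areas and defect densities below are made-up illustrative numbers, not actual Nvidia die sizes or foundry yields, and the Poisson yield model is just one common simplification.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic estimate: gross wafer area over die area,
    # minus dies lost around the round edge of the wafer.
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Probability that a die catches zero killer defects.
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical numbers: a big die on a mature node vs. a shrunk die on a new
# node with a worse defect rate (new processes usually start out dirtier).
for label, area, d0 in [("mature node, 500 mm^2", 500, 0.002),
                        ("new 7nm node, 300 mm^2", 300, 0.004)]:
    dpw = dies_per_wafer(area)
    good = dpw * poisson_yield(area, d0)
    print(f"{label}: {dpw} candidates/wafer, ~{good:.0f} good dies")
```

Even with double the defect density on the new node, the smaller die comes out ahead on good dies per wafer in this toy example (~59 vs. ~41), which is where the higher-profits-assuming-decent-yield logic comes from.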
Oh, that sucks big time!! We need the 1180 to release soon so we only have to wait 6-8 months for the Ti version. This sounds as if they will milk the 1080 Ti for another six months.
Hmm, another interesting tidbit, though: Nvidia has changed its entry on the Hot Chips schedule.
“NVIDIA’s Next Generation Mainstream GPU” has turned into “TBD”
Yeah, @CDaked noted that above. At first I was still hoping this was to keep it a secret for as long as possible, but now it seems much more likely that they’re actually NOT going to release the cards anytime soon.