PIMAX 5K+ GTX or RTX?

Well, maybe a miracle can happen and there is some hack to achieve foveated rendering out of the box for almost all games on Maxwell and Pascal. But I don’t know of any feature of those architectures that would allow it.

So one thing is sure: It will never be as easy to implement as with Turing (RTX)!

First you complain that SLI wasn’t supported as much as you hoped, and now you hope that some miracle will make foveated rendering happen on hardware that, even in Nvidia’s most optimistic outlooks, isn’t able to do it…
Quite the irony. And it’s pointless to discuss in such a nebulous manner. If you have substantial ideas about how to achieve this with GTX tech, feel free to post them.

Well, Hyperion has trouble specifying which driver he means unless you absolutely push him. He’s actually talking about the Pimax driver, not the GPU driver, as you might assume (as I did).

Here’s how it is: if the 20 series has new instructions exposed at the driver level, made specifically to implement foveated rendering, and Pimax knows how to use them, then he might be right.

I think, though, that Pimax are using their own Pimax driver to do something a lot simpler, something that really doesn’t seem to require much graphical overhead; in fact it’s graphical underhead, which is undersampling the periphery. I see absolutely no reason why this would be difficult or impossible to implement on a 1080 card if this is the case, and I think it is the more likely scenario. I could be totally wrong, but unlike him I don’t get offensive about it :wink:
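To make the idea concrete, here is a minimal sketch of what "undersampling the periphery" could mean on the compositor side. The function name, thresholds and factors are entirely invented for illustration; this is not anything from PiTool:

```cpp
// Hypothetical illustration only, not PiTool code. The thresholds and
// factors are invented to show the idea of "graphical underhead":
// spending fewer samples the further a pixel is from the view centre.
#include <cstdio>

// Map angular distance from the view centre (in degrees) to a
// supersampling factor: full quality in the fovea, fewer samples outside.
float SupersamplingForEccentricity(float eccentricityDeg) {
    if (eccentricityDeg < 20.0f) return 1.0f;  // foveal region: full sampling
    if (eccentricityDeg < 45.0f) return 0.7f;  // mid-periphery
    return 0.5f;                               // far periphery: undersampled
}

int main() {
    for (float e : {0.0f, 30.0f, 60.0f})
        std::printf("%5.1f deg -> %.1fx supersampling\n",
                    e, SupersamplingForEccentricity(e));
}
```

PiTool already varies effective supersampling across the screen for the distortion profile, so conceptually this would just extend that same curve downward in the periphery.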


Now here’s the problem with your argument:

undersampling the periphery

This is not possible without GPU driver/API support.

You were passive-aggressive from the beginning. If you were serious, you would have read about Variable Rate Shading by now and would know that Turing actually has instructions to enable foveated rendering without games necessarily having to implement it explicitly in their own code.
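For those who prefer reading an API over arguing about it: this is roughly how the feature surfaces to developers through Direct3D 12 on Turing-class hardware. A hedged sketch of per-draw Variable Rate Shading, not Pimax’s or Nvidia’s actual driver code:

```cpp
// Sketch of per-draw Variable Rate Shading in D3D12 (VRS Tier 1).
// Illustrative only; assumes the device reports
// D3D12_VARIABLE_SHADING_RATE_TIER_1 or better when queried via
// CheckFeatureSupport with D3D12_FEATURE_D3D12_OPTIONS6.
#include <d3d12.h>

void DrawSceneWithCoarsePeriphery(ID3D12GraphicsCommandList5* cmdList) {
    // Peripheral geometry: one pixel-shader invocation per 2x2 block.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... record peripheral draw calls here ...

    // Foveal geometry: back to full shading rate.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    // ... record foveal draw calls here ...
}
```

The render targets are untouched; the rasterizer simply invokes the pixel shader less often for the marked draws. That is what makes it feasible to layer on without the game’s cooperation, and it is this rasterizer hook that Maxwell and Pascal lack.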

It must be possible, because PiTool already does something like it to make up for the distortions in the periphery, or so I am told.


Nah, I wasn’t. I was just a little miffed at your complete and utter fanboyism, and at you practically telling people the sky would fall in if they bought a 10 series.
We’ve established that supersampling already varies around the periphery because of stretching; PiTool has to compensate, and it changes the supersampling across the screen. So why on earth can a 2080 do it and a 1080 can’t? We’re already doing it now, is the point. Literally all Pimax have to do is undersample the periphery.


Have a look at EVGA!
Gigabyte and Asus had been my choice since the late '90s, but I really don’t like them anymore. Products are left to die faster than you can read the articles about the new ones, and I mean “high-end” products; that’s just not customer friendly. I used to think that I pay a bit more partly to get better support, but it seems that after little more than a year they only focus on the new stuff (drivers, BIOS updates and so on are rarely released after that).


Hard to say. We might see one of the pci-X cards by summer/fall. Perhaps sooner.

And now Intel is regretting that thought. :joy:

The premise of AMD’s new GPU design is a beginning. It won’t be long. And Nvidia being an early adopter, getting AMD GPUs before most, speaks volumes about these two companies.

Yes, for some gains; not to be confused with the full gains of a game having native support.

Indeed, Pimax is working on FFR for the 10 series, and likely Pimax and AMD will do the same on AMD GPUs as well.


AMD has only recently turned their CPU division around and taken the fight to Intel, and thank goodness for that too, as Intel are a horrible corporate entity. They took advantage of their market dominance to screw every last cent from their customer base, and they also held back innovation while forcing upgraders to buy new motherboards when upgrading their CPUs, as the sockets would change and become incompatible.

AMD didn’t and still doesn’t have the cash flow that either Intel or Nvidia has, but now, with the Ryzen CPUs, they have returned to profitability. This is allowing them to invest more money back into research and development, and hopefully this will show returns soon in their GPU developments.

One hopes that with some competition, and a customer base that chooses better company practices, open standards and better tech over proprietary tech and software, Nvidia will change their ways. Just look at Nvidia’s recent drivers finally allowing the use of FreeSync over DisplayPort on their cards. They were trying to hold customers hostage to their proprietary G-Sync system, gouging an extra $200 per monitor for no benefit over the open and free implementation of FreeSync. Also look at their current GTX price gouging.

As a company, Nvidia is not one I really want to support, and now, with my 1070 Max-Q laptop thrashing away to drive my Pimax 8K, I’m tossing up whether to succumb to Nvidia’s shoddy behavior or bite my tongue and wait to see what AMD’s Navi brings to the table.

It would be nice if there were other companies that could offer high-end CPUs and GPUs; the market is dominated by too few providers. Crony capitalism.


And we owe things like FreeSync to AMD, who often releases open-concept technologies.


I am done with people who are resistant to facts. If someone is not willing to read about the subject under discussion, but instead resorts to calling the people explaining the facts “fanboys”, there is not much left to say, except to conclude that they are either a troll or hopelessly ignorant.

So I just leave here an overview, from an actual game developer, of the new techniques in the Turing (RTX) architecture, for those willing (and able) to read:
https://www.reddit.com/r/hardware/comments/9g0ppv/interesting_new_features_on_the_turing/

Variable Rate Shading: This is also a huge feature, but its biggest use is for VR. It allows the developer to change the “resolution” of parts of the screen at will. The fun part is that the internal images are unchanged, so it’s not only extremely easy to implement, but it could be done as a driver-level toggle.
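And the “driver-level toggle” part is not hand-waving. With VRS Tier 2 the shading rate comes from a small screen-space image, which is exactly the shape of a fixed foveated pattern. A rough sketch, again assuming Turing-class D3D12 support; the names and setup are mine, not from any shipping driver:

```cpp
// Sketch of image-based VRS (D3D12 Tier 2). A small screen-space texture
// selects a shading rate per tile (tile size is queried from the device,
// typically 16x16 pixels), so a foveated pattern can be applied to draws
// that know nothing about it. 'rateImage' is assumed to be an R8_UINT
// texture of (screenW / tileSize) x (screenH / tileSize) texels, filled
// with fine rates in the centre and coarse rates toward the edges.
#include <d3d12.h>

void ApplyFoveatedRateImage(ID3D12GraphicsCommandList5* cmdList,
                            ID3D12Resource* rateImage) {
    cmdList->RSSetShadingRateImage(rateImage);

    // Combiners: pass the per-draw/per-primitive rate through, then let
    // the screen-space image override the result.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_OVERRIDE
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
}
```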

Now go log into Reddit and call this Nvidia fanboy out for his fake-news agenda! Hahahaha

PS: Oh, and congrats on the outrageous feat of mental gymnastics: calling someone an “Nvidia fanboy” for arguing against buying an Nvidia GTX 1080 Ti. :smiley:


I too am in need of a GPU and am debating between a 1080 Ti and a 2080. Though I freely spend thousands on my hobbies, I refuse to get bent over paying double for a 2080 Ti that isn’t that great, knowing it could be obsolete if “TRUE” next-gen GPUs come next year, from AMD or Nvidia.

I see talk that the 1080 Ti is much cheaper than the 2080, but I don’t see this. I only see 1080 Tis used now, and at most only a couple hundred less than the 2080; in many cases the same price. The same price as a new 2080 for a USED 1080 Ti seems like a bad deal!

Talk of FFR being implemented on 2080 cards pushes me that way currently, though I’m surprised the gains from FFR are 18% or less. I would have expected bigger gains…


Why? It’s the same speed as the 2080, doesn’t have the extra Tensor and RT cores, sucks up 300 watts, and costs the same.

Are these 2 related?

Sounds like another case of G-sync vs Freesync.

I would make use of 16 GB of memory more than I would Tensor cores.

And why does this not surprise me? Where possible, I will support products that make use of open standards: FreeSync over G-Sync, Vulkan over DirectX 12, OpenVR over any locked-in proprietary VR.

One of the reasons I also went with Pimax. They are working on an open platform and seem to care about pushing VR tech, not just their bottom line.

Also, good point by Wmacky about purchasing a mediocre-performance product now at high prices, versus holding off for the products coming down the pipeline later in the year.

Even so, with tweaking I can get my VR stuff to run at acceptable performance on my 1070, and the experience will be better than most alternatives thanks to the Pimax. So I might just sit it out, see what comes down the line, and then get a huge jump in VR performance from what I have now.

I’m having fun anyway and I get called “Hammer Head” by my family :laughing:
