PIMAX 5K+ GTX or RTX?

I second the ASUS. I've been using the ASUS Strix 1080 for 2 years and it's been rock solid. My ASUS Strix 2080 is arriving today. The best I could find was a used 1080 Ti for only about $200 less than a brand-new 2080, and paying $200 extra is worth it to me for the new tech.

3 Likes

Lol, I had those kinds of problems with MSI and had 2 Asus boards die. In my general experience ASRock (used to be owned by Asus), Asus, Gigabyte & MSI were the top boards (poor-quality VRMs & chokes were an issue in the past).

MSI typically was not great for overclocking (though they seem to make good Nvidia cards).

Gigabyte usually has a boring BIOS, and depending on which GPU you buy, overclocking is not always good; e.g. my R9 390 G1 is a good card but limited for overclocking compared to others.

Asus is usually good, but I doubt their old reputation of a less-than-1% failure rate is still intact.

ASRock is good value for overclocking.

But none of them are as good as they used to be, tbh. Like JBL & Cerwin Vega, they now sell on their name.

1 Like

Pascal (GTX 10x0) may be able to do fixed foveated rendering, but:

  1. the quality and performance will be worse than on RTX
  2. it will not be able to do real dynamic foveated rendering!

Pascal is now almost 3 years old. A 1080 Ti only makes sense as a filler card to bridge the gap to the next node shrink, but you could end up waiting 2 long years, during which a lot can and will happen.
Pimax claims that RTX is already able to do fixed foveated rendering at driver level (so it works with almost all games out of the box). From that point, doing real dynamic foveated rendering is only a small step for RTX, but impossible for any GTX card.
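For some context on what is actually at stake, here is a rough back-of-the-envelope calculation of the shading work foveation can save. All the numbers (per-eye resolution, region size, periphery density) are assumptions for illustration, not Pimax's or Nvidia's actual settings:

```cpp
// Back-of-the-envelope pixel budget for fixed foveation.
// All numbers below are illustrative assumptions, not Pimax/Nvidia figures.
#include <cstdio>

int main()
{
    const double eyeW = 2560.0, eyeH = 1440.0; // assumed per-eye render target
    const double centreFrac  = 0.6;            // centre region: 60% of each axis at full rate
    const double edgeDensity = 0.5;            // periphery shaded at half linear resolution

    const double fullPixels      = eyeW * eyeH;
    const double centrePixels    = (eyeW * centreFrac) * (eyeH * centreFrac);
    const double peripheryPixels = (fullPixels - centrePixels) * edgeDensity * edgeDensity;
    const double shadedPixels    = centrePixels + peripheryPixels;

    std::printf("naive shading work:    %.0f pixels\n", fullPixels);
    std::printf("foveated shading work: %.0f pixels\n", shadedPixels);
    std::printf("saving:                %.1f%%\n", 100.0 * (1.0 - shadedPixels / fullPixels));
    return 0;
}
```

With eye tracking the same maths applies, but the full-rate region can be made much smaller because it only has to cover where you are actually looking, which is where the real headroom comes from.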

Mark my words: You will heavily regret your 1080 Ti purchase once your eye tracking module arrives!

4 Likes

See? One day they say "possible features will be implemented", and another day: "FF rendering is in beta, currently using 20** models". You'll never know how new things will evolve; don't stick with old technology just to save $100 on a card that costs around $700. The difference isn't worth it, as you're buying a card for approximately 2 years or longer. You won't buy a new 1080 Ti "much" cheaper like others are saying, or it will be the cheapest one, which might be slower than a 2080.

2 Likes

And RTX purchasers may regret their purchases when PCIe 4.0 or 5.0 comes out. :laughing:

The only regret I have is that I couldn’t get the Ti version (because it wasn’t readily available, not due to expense). I’d really like a bit more speed than my overclocked 2080 provides.

Like many, I would have preferred a cheaper, faster card, without the raytracing and AI features.

Why? Turing is cutting edge tech. There is no improvement on the horizon. AMD is in a coma. Nvidia is in no hurry to deliver a node shrink, just to compete against itself.

2 Likes

You're right, Nvidia didn't recently cave and add support for FreeSync. And obviously you were not paying attention to CES. AMD is not in a coma; in fact, Nvidia is also on AMD's early-adopter list.

AMD already has real-time ray tracing in Radeon Rays.

1 Like

I wouldn't mind a source for your claims, tbh. I see Pimax has started work on the 2080 for fixed foveated rendering first, but they didn't give any particular info about whether it was at driver level or in PiTool, or any indication that it would be any more difficult to implement on the 10 series, so I'm wondering where you got that; I may have missed something?
One could also argue that by the time we get our eye tracking module we will be on the 30 series of cards [still waiting for foam and headstrap]. By then even a 2080 Ti could be an entry-level card.

1 Like

And when is that happening, in 2-3 years?

1 Like

It could be sooner than that; it could be that the next 30 series drops ray tracing altogether.
If devs don't add it to enough games, it will cease to be a thing.
Remember the release of SLI? Everyone thought, great, I can simply buy another card and everything will be 2x as fast. That didn't happen; Nvidia promised software fixes ad infinitum and initially doubled their card sales, but to this day it's still unsupported by a great many games, there's no driver-level support to make all games work, and there are absolutely no VR titles that I can think of, despite the fact that each card could theoretically drive one screen each. :/ This is one reason why I'm giving RTX a wide berth.

1 Like

I bet Intel might have been thinking exactly the same thing for the past 6 years :slight_smile:

3 Likes

You are right, I must have missed something: What graphics architecture was shown at CES 2019 that can beat the RTX 2080 Ti?

(/s)

1 Like

In the quote you can clearly see that I literally wrote “Pimax claims”.

This only makes sense if it works at driver level, and even if it did not, the effect is the same.

Maxwell (GTX 9x0) and Pascal (GTX 10x0) can divide the image into different rectangular regions and then apply different sampling rates to those areas.
Turing (RTX) can change the sampling rate easily, freely and dynamically all over the image (see the sketch after the list below):

  1. Multi-Res Shading (Maxwell)
  2. Lens Matched Shading (Pascal)
  3. Variable Rate Shading (Turing)
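To make the difference concrete, here is a minimal sketch of how a foveation mask for Turing's Variable Rate Shading (D3D12 Tier 2 VRS) could be built. The region radii and shading rates are made-up illustration values, and this is not Pimax's actual code:

```cpp
// Minimal sketch: fill a CPU-side foveation mask for D3D12 Tier 2 Variable Rate Shading.
// One byte per screen tile; each byte holds a D3D12_SHADING_RATE value.
// Region radii and rates below are illustrative assumptions, not Pimax's settings.
#include <cmath>
#include <cstdint>
#include <vector>
#include <d3d12.h>

std::vector<uint8_t> BuildFoveationMask(uint32_t width, uint32_t height,
                                        uint32_t tileSize,        // D3D12_FEATURE_DATA_D3D12_OPTIONS6::ShadingRateImageTileSize
                                        float gazeX, float gazeY) // gaze point in [0,1] x [0,1]
{
    const uint32_t tilesX = (width  + tileSize - 1) / tileSize;
    const uint32_t tilesY = (height + tileSize - 1) / tileSize;
    std::vector<uint8_t> rates(tilesX * tilesY);

    for (uint32_t ty = 0; ty < tilesY; ++ty) {
        for (uint32_t tx = 0; tx < tilesX; ++tx) {
            // Distance of the tile centre from the gaze point, in normalised screen units.
            const float dx = (tx + 0.5f) / tilesX - gazeX;
            const float dy = (ty + 0.5f) / tilesY - gazeY;
            const float d  = std::sqrt(dx * dx + dy * dy);

            // Full rate in the fovea, coarser towards the edges.
            // (4x4 requires the AdditionalShadingRatesSupported cap.)
            const D3D12_SHADING_RATE rate =
                d < 0.15f ? D3D12_SHADING_RATE_1X1 :
                d < 0.30f ? D3D12_SHADING_RATE_2X2 :
                            D3D12_SHADING_RATE_4X4;
            rates[ty * tilesX + tx] = static_cast<uint8_t>(rate);
        }
    }
    // Upload 'rates' into a DXGI_FORMAT_R8_UINT texture and bind it with
    // ID3D12GraphicsCommandList5::RSSetShadingRateImage() before drawing.
    return rates;
}
```

For fixed foveated rendering you would build this mask once, centred on the lens; for real dynamic foveated rendering you simply rebuild it every frame with the eye tracker's gaze point. The Maxwell/Pascal techniques instead carve the image into a handful of fixed rectangles, which is why they cannot follow the eye in the same way.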
1 Like

Right, so they didn't actually say that then, you just inferred it. There are other ways it may be implemented with PiTool, simply down to their own sampling, as we've discussed; I think that is more likely. I think we will see that when they dial it in for the 10 series and there's hardly any difference :slight_smile: I like a good internet jaunt; it will be fun to look back at this thread when they release it to us and see who was right :slight_smile:

1 Like

I am not sure if you are just trolling now. Their implementation of fixed foveated rendering works via their Pimax driver, so what’s the point of arguing semantics? Please feel free to invent a new name for what they are doing.

You sound like someone who is totally clueless, only understands parts of a discussion, and then grabs the small bits he misunderstands (for lack of context) to make a laughable argument.

You do know there's more than one driver, right? The Nvidia GPU driver would be the one whose instruction sets they would have to use, if they existed, for it to be at that driver level… If you're talking about PiTool, yes, it's a driver, but it's not the driver. You seem to be getting upset because I don't agree with you. Like I said, we will see when PiTool supports both 20 and 10 series cards for fixed foveated rendering, as I have said it probably will; so crack on being offensive and I'll just wait for it to happen and prove you've been chatting cobblers.

1 Like

No, you are not able to disagree, because you have no clue. It was already obvious that you have no clue about graphics card architecture, but now I even question your ability to understand colloquial English. So in simpler terms: when I say "Feel free to call it what you want", it means I don't care what you think or what you say, because it's totally irrelevant.

1 Like

Like I said, buddy, the clock is ticking. If Pimax don't support foveated rendering for the 10 series, I will come back and apologise; will you be man enough to apologise to me when that happens?

1 Like

So Hyperion thinks GTX cards will not be able to do foveated rendering "at driver level". Based on the supported technologies in GTX vs RTX, I agree with that. At the very least, they won't be able to do it with the same efficiency, I think.

Maybe Pimax could surprise us though, so I don’t want to take any hard stances. If they can do it then cool. But VRS is better suited for foveated rendering.

1 Like