@Heliosurge Thanks again for your help, and yes, what you say makes sense to me.
You're very welcome, always glad to help where and when I can.
Thanks, that looks promising. Perhaps the next generation of cards can do this (and perhaps a little more) at stock frequencies and thus reach playable fps for the 8K X's native 2x4K resolution in some games.
Wouldn't have thought that resolution could have so little impact on fps. It probably depends on the game, though (how much resolution-independent stuff it calculates, etc.).
P.S.: Pity you couldn't upload that result. It would have topped the current highest 1080 Ti based result by 7 fps.
P.P.S.: Found a result with an 1800X and 1080 Ti at factory clock speeds; probably not playable yet, but close enough that the next gen might close that gap: NVIDIA GeForce GTX 1080 Ti video card benchmark result - AMD Ryzen 7 1800X, ASUSTeK COMPUTER INC. PRIME X370-PRO
I imagine I wasn't able to upload it because it was custom; I expect it to be a bit lower than 7700K systems.
This was a custom run at those settings, not the benchmark run.
Orange Run: 9989
Blue Run: 3327
Sometime next year I might consider switching to whatever Intel's refresh of Coffee Lake is, but we'll see. I know the 8K X is a stretch to use right now, but I fully accept that lol
Not all games suffer quite as badly on AMD systems, losing out on that many raw frames; that's largely due to just how Futuremark's stuff runs.
I honestly don't know how my first run at 5120x2880 got that fps; it's odd, because the second time I ran it I got the magic 73 or so.
Actually, I was thinking: if they were using the brainwarp thing… would it mean that the video card just needs to be able to push 4K at a high frame rate?
Yes, "just" 4K@180 Hz (even the top line of today's GPUs is still quite some way away from this goal for nontrivial applications).
Brainwarp is just because the displays are effectively running at 180 Hz while displaying 90 Hz of input; a black frame is inserted between frames to eliminate ghosting and make it easier on the eyes.
Unfortunately Noam is wrong about needing 180 Hz to run these displays, so just think of them as 90 Hz displays with some extra magic sauce.
My understanding was that the panels each run at 90 Hz, shifted sync-wise by half a frame time, with a long black phase between briefly showing the actual image.
As far as I understood, Aneurism asked whether you could see this as the GPU calculating many (180) 4K frames per second, provided alternately to the left-eye and right-eye panels, instead of calculating one 7680x2160 image at 90 Hz for both eyes at the same time.
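To put rough numbers on that question: the raw fill-rate load is identical either way, and what changes is the per-frame overhead. A quick back-of-the-envelope check (illustrative arithmetic only, nothing here is confirmed by Pimax):

```python
# Pixel throughput for the two rendering interpretations above.
# Illustrative arithmetic only; not confirmed by Pimax.

width, height = 3840, 2160                 # one 4K panel

# Interpretation A: 180 single-eye 4K frames per second, alternating eyes
pixels_a = 180 * width * height

# Interpretation B: one 7680x2160 stereo frame at 90 Hz for both eyes
pixels_b = 90 * (2 * width) * height

print(pixels_a)   # 1492992000
print(pixels_b)   # 1492992000 -> same shading load either way
```

So the GPU shades the same number of pixels per second in both cases; the difference is that 180 separate frames would mean twice the draw calls and engine overhead, which is exactly the concern raised below.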
Interesting. Well, I wonder how it will process then, because a lot of games and processors will struggle to hit 180 frames; I understand they are alternating sync-wise.
So Left 1 Right Black, then Left Black Right 1, Left 2 Right Black, and Left Black Right 2.
Would that mean it's effectively running at 180 Hz at 4K (3840x2160), shifting eyes every frame? If that's the case, it would likely run poorly in a lot of cases, because it would require games to run at 180 fps, which is a lot of physics and per-frame calls.
Or maybe it somehow just staggers the frames: to the game it effectively runs at 90, but each eye's frame is staggered by one slot.
i.e. render Left 1, pause, then render Right 1 (a sketch of this reading follows below).
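A minimal sketch of that staggered reading, just to make the timing explicit (the slot length and numbering are my assumptions, not anything Pimax has stated):

```python
# Staggered schedule: the game renders stereo pairs at 90 fps, the panels
# show them offset by one 1/180 s slot, with black in the idle eye.
# Purely illustrative; slot timing is assumed, not confirmed.

SLOT_HZ = 180  # display slots per second

def staggered_schedule(n_pairs):
    for frame in range(1, n_pairs + 1):
        left_slot, right_slot = 2 * frame - 2, 2 * frame - 1
        print(f"slot {left_slot} (t={1000 * left_slot / SLOT_HZ:5.1f} ms): Left {frame}, Right black")
        print(f"slot {right_slot} (t={1000 * right_slot / SLOT_HZ:5.1f} ms): Left black, Right {frame}")

staggered_schedule(2)
# slot 0 (t=  0.0 ms): Left 1, Right black
# slot 1 (t=  5.6 ms): Left black, Right 1
# slot 2 (t= 11.1 ms): Left 2, Right black
# slot 3 (t= 16.7 ms): Left black, Right 2
```

Under this reading the game only ever produces 90 stereo pairs per second; the 180 Hz exists purely on the display side.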
Obviously I'm still confused by some of the concepts they're proposing.
@Neokolzia, yes, that's why I am so keen on having options to subsample etc. to make realistic use of the 8K X.
If we just rendered 4K @ 90 Hz and alternated that between the eyes, then you would effectively have 45 Hz per eye. Even with the brainwarp trick of alternating them, I'm not convinced that this would be fun. And it wouldn't explain Pimax's "perceived 180 Hz" claim; this would, at best, be "perceived 90 Hz" (or, more likely, a not-so-virtual puking fest; I cannot imagine that's what they mean).
But Pimax didn't contribute to any of the threads where people were asking for details, so everybody is left with best guesses.
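For what it's worth, here is the arithmetic behind the 45 Hz worry above, with both readings side by side (these are just the two interpretations discussed in this thread, nothing official):

```python
# Effective per-eye refresh under the two readings of "brainwarp".
# Hypothetical numbers; Pimax hasn't confirmed either scheme.

render_fps = 90  # single-eye 4K frames the GPU actually produces per second

# Reading 1: those 90 frames are split alternately between the eyes
per_eye_split = render_fps / 2      # -> 45 Hz per eye (the worry above)

# Reading 2: 90 stereo pairs per second, the display merely staggers the eyes
per_eye_staggered = render_fps      # -> 90 Hz per eye, offset by half a frame time

print(per_eye_split, per_eye_staggered)   # 45.0 90
```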