Anyone sticking with the 8k?

After seeing the test videos I definitely want a 5K+ instead of the 8K, at least until I receive my 8K-X. The reasons are simple:

  • To get the 8K to the same image-quality level as the 5K+ takes 20-30% more supersampling.
  • But even current GPUs already have problems delivering the needed FPS, so 20-30% more SS might already be too much.
  • And even with more supersampling, identical image quality isn’t guaranteed. As said before, the upscaling process will blur the image on the 8K anyway.
  • Sure, the 5K+ has more SDE, but a much crisper image.
  • And it runs at 90 Hz instead of 80 Hz.
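To put rough numbers on the bullet points above, here is a small Python sketch. The 2560×1440 per-eye base render target and the assumption that supersampling scales pixel *count* linearly (so each axis scales by the square root) are illustrative guesses, not confirmed SteamVR or Pimax figures:

```python
# Hypothetical per-eye base resolution; not an official Pimax/SteamVR figure.
def render_pixels(width, height, ss):
    # Assumption: SS scales total pixel count linearly,
    # so each axis is scaled by sqrt(ss).
    scale = ss ** 0.5
    return int(width * scale) * int(height * scale)

base = render_pixels(2560, 1440, 1.0)     # 5K+ at SS 1.0
boosted = render_pixels(2560, 1440, 1.3)  # 8K needing ~30% more SS
print(f"extra GPU load: {boosted / base - 1:.0%}")  # ~30% more pixels to shade
```

Under those assumptions, "20-30% more SS" really does mean roughly 20-30% more pixels for the GPU to shade every frame, which is why a card that is already borderline at 80-90 FPS may not keep up.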
3 Likes

It’s always hard with individual subjective preferences.

Take photo printing, for example. Kodak printers were cheap and had decent quality, depending on the end user. From a technical angle, though, the reality is that the photo was not printed at optimal quality.

But is the result horrible? Yes and no are both true, subjectively. :laughing:

How can I know, not having tested the 2080 Ti? Still, the 8K image is beautiful, and don’t forget that head motion produces different results than clean still images. When you move your head, the brain also interpolates the information between pixels.

2 Likes

The 4K can’t use Brainwarp as it only uses one panel. I imagine once the headset is finalized we will see Brainwarp in the near future. @crony posted some interesting ideas on overdriving the 4K to further reduce ghosting, and ATW might also help; the 4K doesn’t currently utilize post-processing. Run Unity-Chan and press the spacebar.

The ‘real’ questions are:

  • Will it be possible to upgrade the 5K+ and/or 8K to an 8K-X?
  • And when will SLI be usable in VR games in a broad manner?

That might be a good reason to stick with an 8K, in case it’s possible.
But even then, I would assume the upgrade makes no sense until foveated rendering is possible and/or the next, maybe even next-next, generation of GPUs is out. If we had an 8K-X right now, it would actually perform worse than a 5K+/8K because of the dramatic need for GPU power.

2 Likes

I think I’ll stick with the 8K and hope for Brainwarp, reprojection, motion reprojection or ASW…

If nothing comes of it, that would be a big drawback.

1 Like

@neal_white_iii mentioned sharpening the image before upscaling to improve visual quality, something he said he does with photos to reduce the softening effect.

I am quite surprised by the results of the M1/M2 testers. But do you really think it would come fast enough to make sticking with the 8K worthwhile, instead of e.g. getting the 5K+, using it until the necessary software is available, and then selling the 5K+ and buying an 8K?

Indeed. Only the 8K was offered to possibly give backers an option to upgrade to the 8K-X. Upgrading the 5K+ would imho be expensive, since you have more components to change.

While a game itself might not support multi-GPU, the VR compositor/headset driver can. While this will not give the gains of a game also supporting mGPU, there will be some gains once the VR driver takes over.

Native resolution is good, but a grid is a grid. The smaller the object, the bigger the impact of the grid width, i.e. the distance between pixels.

That might be the reason why distant objects are easier to identify on the 8K: it has higher DPI even though the image is upscaled. On the 5K the grid width is more apparent, no matter how sharp the pixels are.

As a result, the 5K is sharper overall because of native rendering (or the 8K’s poor upscaling), while tiny objects may be easier to identify on the higher-density 8K.
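The grid-width argument above can be made concrete with a quick pixel-pitch calculation. The 5.5-inch panel diagonal is an assumption for illustration only, not a confirmed Pimax spec; the point is the ratio, which depends only on the resolutions:

```python
import math

# Sketch of the "grid width" argument: pixel pitch shrinks as panel
# resolution rises. The 5.5-inch diagonal is an illustrative assumption.
def pixel_pitch_mm(w, h, diagonal_inch):
    diag_px = math.hypot(w, h)          # diagonal length in pixels
    return diagonal_inch * 25.4 / diag_px

pitch_5k = pixel_pitch_mm(2560, 1440, 5.5)  # per-eye 5K+ panel
pitch_8k = pixel_pitch_mm(3840, 2160, 5.5)  # per-eye 8K panel
print(f"5K+ pitch: {pitch_5k:.3f} mm, 8K pitch: {pitch_8k:.3f} mm")
```

Same assumed panel size, so the 8K’s gaps between pixels are 1.5x narrower in each direction, which fits the observation that the grid is less visible on small distant objects even though the image itself is upscaled.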

The overall sharpness certainly matters, but SDE can be pretty destructive to immersion. If both matter equally to you, the answer probably depends a lot on the capability of the upcoming RTX Ti: if it can deliver the extra apparent sharpness via SS, at the expense of power, then the 8K could give you the best of both worlds at the moment.

Just my guess.

1 Like

I have been using the 4K for a year and a half now. I don’t think I could go back to a 1440p panel; it’s the 8K for me.
I just can’t stand SDE. Blurry edges I can do something about with software or settings; the SDE is permanent.

5 Likes

I can’t imagine how awesome the 8K-X will be then :slight_smile: Crisp and sharp like the 5K+, but with (nearly) no SDE either.

2 Likes

Almost as much as I can’t imagine the GPU required to power it! :grin:

2 Likes

Isn’t the 4K using one 4K panel, and the 5K using 2 × 1440p panels? I think the total resolution is slightly lower, but not by that much?

1 Like

While the two 1440p panels have about the same number of pixels as the single 4K panel, they are spread over about twice the surface area, which means bigger pixels, aka worse SDE. (Also, the 5K+ pixels are arranged in straight lines, which makes it more noticeable, in my opinion/experience.)
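A quick sanity check of the "same pixels, twice the area" claim. The factor-of-two area ratio is taken from the post above as an assumption, not a measured spec:

```python
# Rough check of the "same pixels, twice the area" argument.
pixels_4k = 3840 * 2160          # single 4K panel
pixels_5k = 2 * 2560 * 1440      # two 1440p panels
print(pixels_5k / pixels_4k)     # ~0.89: about the same pixel count

# Spread roughly the same pixels over twice the area and each pixel's
# area roughly doubles, so its linear size grows accordingly.
area_ratio = 2.0                 # assumed from the post, not measured
pixel_size_growth = (area_ratio * pixels_4k / pixels_5k) ** 0.5
print(f"pixel edge grows ~{pixel_size_growth:.2f}x")
```

So under these assumptions each pixel edge on the 5K is around 1.5x longer than on the 4K, which is why the SDE looks worse despite the nearly identical pixel count.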

3 Likes

Ah, that makes sense. If only the 4K had roomscale, it could really have made a huge impact on the first gen. Really impressive specs apart from that!

1 Like

With the old 4K it’s possible to add roomscale via a Vive Tracker. There are a variety of other options, but the Vive Tracker should in theory be easiest.

The 5K will have more SDE due to its lower PPI compared to the 4K’s.

1 Like

Careful, I’ve already changed my mind from the 8K to the 5K+, and now the 4K is starting to sound interesting :stuck_out_tongue:

I had actually originally backed the 8K-X too, before changing my mind when I thought about how much I’d need to spend to hope to power it.

1 Like

I am now leaning towards the 8K, basically for future-proofing (possible upgrade to the 8K-X) and the fact that I have an RTX 2080 Ti coming next week.

2 Likes

Truthfully, the current iteration of the 4K is showing its age, much like the Rift and OG Vive are compared to current offerings (old-gen M1 FoV). :beers::wink::+1::sparkles: