Better Screens for VR

This could be even better than the screen the Pimax 8K is using!

4 Likes

And even better, there’s this: https://www.roadtovr.com/int-announces-2228ppi-high-pixel-density-amoled-display-vr-headsets/

2 Likes

There are three.

2 Likes

There are even more. For example, there’s Kopin with 2000x2000-per-eye panels. They’re licensing production capacity from “OLiGHTEK”, but they’re just like Pimax: delays, delays, delays, LOL. In their recent conference call on 8 May they said “the line is running, and I think, that it will start operational third quarter this year.”

Then there’s BOE, building a whole new factory for their ultra-high-res OLED panels, which should be finished at some point next year. And then there’s of course Samsung; they’ve been holding their cards close to their chest, but I wouldn’t be surprised if they had next-gen panels ready later this year.

So the future looks bright. One thing, though, is that we’ll need new display controller chips. Google developed their own, but I don’t think they’re licensing it out yet. Analogix, which Pimax is using, still hasn’t released a next-gen chipset.

3 Likes

But will Kopin present something at Display Week 2018? I think not.

1 Like

Nope, you’re right about the three at Display Week. I just wanted to give a short overview of the upcoming panel tech!

3 Likes

Ok, ok, I just thought we’d left out a display from that event.

2 Likes

Nope. Exciting times though. I’m especially excited about the Google/LG HMD; I really hope they’ll give a timeline.

4 Likes

But to drive displays like these: right now hardware can’t do it at decent frame rates, and not all engines are set up to handle such resolutions. Plus, unless there is suddenly a huge bulk buy of these displays for AR/VR, I can only imagine we won’t see them hit the market due to sheer cost, at least on the consumer side.

1 Like

I’m already worried about driving the 8K. I’m currently planning to wait for the 1180Ti, which at this rate won’t even be released this year.

1 Like

At the very least they would be using fixed foveated rendering for these kinds of displays, so your GPU wouldn’t be rendering at the full resolution.

Foveated rendering is still an experimental technology. Personally, I’m not sure it will work well without specific support in each game/app.

sounds good on paper…

not consumer ready at this time.

maybe in the future

The Oculus Go uses it right now, and considering how good the reviews are, I think it clearly works well.

Correct me if I am wrong…

By foveated rendering we mean something totally different from what the Oculus Go is doing.
What you’re looking at there is what they call fixed foveated rendering: the high-quality region is fixed in place and will not change.
So what does that mean? Only where your normal gaze rests do you get the best resolution/quality; all the rest is lower quality. (Only the middle part of the screen has the best quality/resolution; beyond that it is less, and that cannot change.)

That is not what we’re actually referring to, although you could use that term…
The main difference is that (eye-tracked) foveated rendering follows wherever you look: that spot is rendered at full quality, and as your eyes move left or right the rendering follows the eye movement. (The screen itself offers the same quality/resolution everywhere, so nothing is wasted, it’s just used smarter.)
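
To make the difference concrete, here is a tiny Python sketch (my own toy example, not code from any real headset SDK): both schemes pick a render-quality scale per screen region, but only the eye-tracked variant moves the full-quality spot with the gaze.

```python
import math

def foveation_scale(pixel_xy, focus_xy, inner_radius=0.15, outer_radius=0.5):
    """Return a resolution scale in [0.25, 1.0] for a pixel (normalised 0..1
    screen coordinates) given the current focus point."""
    dist = math.dist(pixel_xy, focus_xy)
    if dist <= inner_radius:
        return 1.0          # full resolution in the foveal region
    if dist >= outer_radius:
        return 0.25         # quarter resolution in the far periphery
    # linear falloff between the two radii
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t

# Fixed foveated rendering: the focus point never moves from screen centre.
fixed_focus = (0.5, 0.5)

# Eye-tracked foveated rendering: the focus point comes from the eye tracker
# each frame (hypothetical gaze value here).
tracked_focus = (0.8, 0.3)

for label, focus in [("fixed", fixed_focus), ("eye-tracked", tracked_focus)]:
    print(label, round(foveation_scale((0.8, 0.3), focus), 2))
# With the gaze off-centre, the fixed scheme renders that spot at reduced
# quality (~0.55), while the eye-tracked scheme keeps it at full quality (1.0).
```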

I can’t help feeling they’re tricking consumers: using a technique, but in some way not giving consumers the best specs. Let me explain what I mean by that. Example: you buy a 4K Pimax screen, and the full screen is all the same quality. We could also call that fixed foveated rendering, and that goes for all HMD screens. (Or are we only going to call it foveated rendering when one part is locked to full quality and the rest is not?)

But with this Oculus Go, it feels like they unlock full quality only for a fixed region in the middle, and the rest gets lower-quality rendering that you can’t change at all. Why do they do this? Because their hardware is not strong enough, and this way you can give consumers the best quality fixed in the middle (read: do not move your eyes sideways!!), and for the rest you tell consumers: don’t look at it, use your head to turn and not your eyes to see.

Look, the point is we basically don’t need foveated rendering. It is only a technique to make the GPU do less work. If at some point our GPUs are able to handle VR with ease at the highest settings, foveated rendering is useless; it is only useful as a technique while the hardware driving the HMD is lagging behind in power. So people can shout “foveated rendering is awesome, we NEED it!!”, but I’d argue we don’t need it eventually. It can be useful for demanding stuff, but once GPU power catches up it doesn’t matter at all.

And basically, every year new GPUs are most of the time 20-30% better than the previous generation.
You can talk about 8K screens or 16K screens etc., but for now I believe GPU power is ahead of screen technology.

just my thoughts

Have I somehow missed what you mean?

Current GPUs can’t even power our Rifts and Vives at full settings in most games.
Current GPUs can’t even power 4K monitors at full settings.
How is it that GPUs are ahead of screens?

1 Like

The generation that comes out this year will be on par. And I don’t expect 8K or 16K very soon, so yes, 4K is pretty much in the bag this year, but you have to pay for top-notch cards.

Last year’s top card can already run some games at 60 fps on a 4K monitor; one can assume the next release will do even better at 4K. We will see in a couple of months.

I also wonder: do any of you VR users overclock your cards for another 10-15% bump in power?

Uh-oh, now that is quite a bold statement.

Not only will PC-based GPU/CPU power struggle for quite a while to feed VR/AR demands, and thereby restrain the development of technically feasible higher-resolution HMDs, but if you think about mobile VR/AR, this will be an issue for even longer due to the limitations of mobile processing power.

In addition, we want photo-realistic quality, ideally with real-time ray tracing. So even if 16K at today’s image quality level for flat gaming is achieved, we will demand much higher quality by then. Otherwise the only occasions for requiring more GPU power would have been the introductions of VGA, XGA, HD, UHD/4K. Correct me if I am wrong, but I believe there was some development in GPU power in between, even while the resolution did not change…

So do not expect the considerations behind the use of foveated rendering to go away anytime soon. Unless we see streaming powered by cloud computing, which could alleviate the GPU bottleneck, or somebody comes up with another brilliant solution.

Well, it’s just a thought, and I’m trying to be realistic here.

2018-2022 will just be focused on 4K, with 8K maybe coming and peaking around 2022.
It’s fine that the tech exists, but that doesn’t mean it will be consumer-ready/friendly.

Or better said, they need to milk you dry with 4K first before they throw out 8K. And this is why I believe GPU power will always be ahead in terms of resolution.

Because let’s be honest: in today’s market we can finally say that, for the last two years, every household has had 1080p monitors and TVs. When it comes to 4K monitors or TVs, it’s, I think, way under 50% of all households: maybe 20% for monitors, and those people have a GTX 1080 and up. For 4K TVs it will be a lot higher, since it’s not a hobby but more of a family thing, maybe 40% at this time or close to that number.

Knowing the money-making mindset and the milking attitude, 4K needs to be adopted by a much larger percentage first before they throw 8K in your face.

So, just talking about TV now: how many TV channels/programs are currently broadcasting in 4K?
How can we expect 8K very soon when 4K is nowhere near as fully adopted as 1080p is today, and 1080p also took quite a long time?

Now, talking about the PC: not a lot of different 4K monitor models are brought out every year, and those that are out are pretty expensive. Don’t expect PC households to adopt 4K monitors when most of them can’t afford such things, and if they would buy them, they also need an expensive GPU to go with it, and now we’re talking about an even smaller number of people.

I believe we will be in the 4K era for PC and TV for the next two years at least. In those years the price has to drop, and by 2020 the GPUs coming out will be able to handle 4K pretty easily with a mid-range card. Then another year of milking the mainstream so everyone can adopt the 4K setup.

Now take a look at content, 4K Blu-ray movies for example: still a very small percentage compared to the now-mainstream 1080p. That is also a signal telling us how far along the market is at this stage.

I won’t say you are wrong, I’m just pointing out the realities/issues as of today.

I cannot believe 8K is coming very soon when 4K is not widely adopted yet; companies need to make money, and for now 4K is their cash cow. But 8K technology exists, that I am sure of, and 16K is maybe in the making as we speak.

There is no use bringing out 8K when you don’t even have a medium to put it on. Something newer than Blu-ray needs to be made, and then new players need to be brought out. They first need to sell the new “Blu-ray players”, which might also cost a lot since it’s “new tech”, and since sales will be low, the price needs to be high, and this again creates slow adoption among the mainstream.

So I believe in the next four years it is going to be 4K HMDs/TVs/monitors etc.

No use bringing out 8K HMD screens when there is not enough content, not enough power, and not enough people to buy them.

Anyway, the Pimax 8K should keep us pretty warm for the next two years at least, if it doesn’t break :smiley:

We know the 1180 Ti is on the horizon, for example this summer. What if tomorrow Nvidia said: guys, this summer we release the 1180 Ti for $700 with 14 teraflops, and in Q1 2019 we will release the 2080 Ti with 20 teraflops?

I think most people would not buy the 1180 Ti and would just wait for the 2080 Ti, unless maybe you’re rich or have a well-above-normal salary. Ordinary people will sit it out, wait, and want bang for buck.
So I’m just writing down my thoughts on why I believe GPU power will soon be ahead of resolution. Once we can run 4K monitors at 120 fps with a high-end card, then you might expect 8K rumours to start.
So I don’t think we will hear a lot about 8K in the next few years; maybe they will even make a little step in between??

6K? More milk, more cows??

And one more point to ponder: how much space is an 8K movie going to take? We need new codecs, a new medium, new tech, and new and bigger storage for PCs. How many GB would a native 8K game need for its install? If it takes something like 100 GB, people will need to start upgrading to at least 10 TB drives.
How much space does a 4K Blu-ray rip take these days, 20 GB?
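
Just a back-of-envelope sketch of that question (my own assumptions: the ~20 GB 4K rip mentioned above, and file size scaling roughly with pixel count, which real codecs like HEVC/AV1 only approximate):

```python
# Rough storage estimate by scaling a known 4K rip size with pixel count.
resolutions = {
    "1080p": 1920 * 1080,
    "4K":    3840 * 2160,
    "8K":    7680 * 4320,
}

baseline_res = "4K"
baseline_size_gb = 20  # the ~20 GB 4K rip mentioned above (assumed figure)

for name, pixels in resolutions.items():
    est = baseline_size_gb * pixels / resolutions[baseline_res]
    print(f"{name}: ~{est:.0f} GB")
# Prints roughly 5 GB for 1080p, 20 GB for 4K and 80 GB for 8K; treat this
# only as an order of magnitude, not a codec-accurate prediction.
```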

I just want to point out that you have to take every aspect into account…

Foveated rendering might stay or it might not, but it’s a tech that can be used if needed. I just can’t see HMDs taking the lead in resolution over TVs/monitors, simply because the market is so small; things are going to be hella expensive if they do. We’d be looking at a $2000 HMD to make it profitable for companies if they were to push a native 8K HMD with foveated rendering within the next year or two.

First of all, I would not expect the computer resolution market to depend on the TV market. And the second question is: will VR/AR development really be dominated by flatscreen monitor resolution development?

I agree, the flatscreen monitor move from HD to 4K is not being adopted by customers all too eagerly, because honestly, if you have a 24" screen, 4K is not really required, as you have to look very closely to appreciate the extra resolution. And many people shy away from increasing the size of the monitor on their desk beyond, say, 28". So here there may be a saturation issue, where further advancements are no longer viewed as required.

But at the same time, the current top-of-the-crop GPUs are just barely managing to handle current games at 4K, so anybody who hasn’t purchased a 1080 or higher will think twice about upgrading to 4K. Now, let’s assume you could easily play a newly released game with great graphics at 4K on your 1060. Would more people consider a 4K monitor? I believe yes, but the saturation issue is still there: why invest lots of money in hardware if you do not notice the improvements so much any more?

In VR we would ideally need 16K; remember the Abrash outlook from a couple of years ago. You want to fill your entire FoV at a pixel density comparable to that of your flatscreen monitor, so where that monitor only covers a fraction of your FoV with its HD/4K, you need additional resolution for the entire surroundings at the same density. That’s A LOT of resolution in total.
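
A rough calculation of why that adds up so quickly (my own assumed numbers for the viewing setup and headset FoV, not Abrash’s exact figures):

```python
# Spread a desktop-monitor-like pixel density across a wide headset FoV.

# A 27" 4K monitor viewed from ~60 cm spans roughly 50 degrees horizontally,
# giving about 3840 / 50 ≈ 77 pixels per degree (hypothetical viewing setup).
ppd = 3840 / 50

# Per-eye field of view of a wide-FoV headset (assumed values).
h_fov_deg = 140
v_fov_deg = 110

h_pixels = ppd * h_fov_deg
v_pixels = ppd * v_fov_deg
print(f"~{h_pixels:.0f} x {v_pixels:.0f} per eye "
      f"({h_pixels * v_pixels / 1e6:.0f} MP per eye)")
# Comes out around 10,700 x 8,400 per eye (~91 MP), i.e. well past "8K" per
# eye, which is why full-FoV, monitor-class pixel density is such a GPU problem.
```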

In the longer run I do not believe screen resolution is the bottleneck: you see new prototypes of much higher resolution/ppi displays being shown, but they will not result in products tomorrow, because nobody would be able to utilize them due to a lack of GPU power. Do you see any GPU prototypes which offer anything even faintly near the power required to drive such a monster screen? No, it rather seems as if GPU progress is slowing down; they are at 12 nm FinFET, looking to move to 7 nm. They need to come up with new concepts for how to increase performance.

I consider the computational power issue to be the major bottleneck for VR visual improvements for the coming one or two decades. Streaming from the cloud would be a solution, if the 5G network is really up to the task in terms of latency and reliability.

1 Like