Display panel tech

That doesn’t mean it’s the same one you’re thinking of.

I remember noticing flicker more on CRT screens when viewed out of the corner of my eye, but those would have been 60 Hz.

When I put on the Gear VR, bright colors like a white background shimmer in my peripheral vision; I can sense the flickering at 60 Hz. In the Vive I couldn’t sense it at 90 Hz.

I will have to see what 80 Hz is like when the Pimax arrives.

I can see (and distinguish between) CRTs running at refresh rates of 60, 75, and 85 Hz. A 60 Hz CRT was highly annoying to me, while 85 Hz had no noticeable flicker. A 60 Hz “slow” LCD monitor doesn’t bother me, but that might not be true of the “fast refresh” LCDs Pimax will use in their 8K. 80 Hz will probably be OK for me, but it certainly might bother others.

3 Likes

Called it! When you’re right, you’re right, Lillo. Well done, take the win.

Heh, I used to own a multisync CRT monitor with an absolutely ridiculous decay time - the mouse cursor left a loong smooth wake behind itself. :9

This did let me run my Amiga at PAL interlaced resolutions with hardly any flicker, though. :7

2 Likes

I had both the Vive and Pimax 4K, and the Vive made me feel really dizzy after 20 minutes or so, whereas the 4K never made me feel dizzy. In fact, after using the Vive, I had to lie down for an hour or so in order to recover. It was a shame, because I found the in-game image far more pleasing to the eye (far less strain) with the Vive than with the Pimax 4K.

It’s going to be interesting to see how I fare with the 5K and 8K. I’m praying they will be perfectly fine for me. I believe the actual issue for me was the IPD range, as far as strain on the eye is concerned. As for what caused the dizziness with the Vive, I have no idea, but it’s probably not the refresh rate, unless my eyes don’t like anything higher than 60 Hz, or whatever the 4K runs at.

2 Likes

It is a night and day difference. My DK2 gave me nausea so bad that I couldn’t even look at the headset without feeling nauseous, and I was developing on it, so I had to keep going. It was an amazing yet horrible experience all at once, lol. Even thinking about it gives me shudders. But then you get your sea legs, and things like moving outside tracking volumes seem to have less long-term effect. So although 80 Hz is not ideal for new VR users, people who already have and use VR will hopefully not experience the same degree of nausea.

The testing units will of course reveal this in the next month or so.

Our ability to detect differences in refresh rate varies between individuals, but it also follows a logarithmic scale rather than a linear one. The higher you go, the less difference each step makes. E.g., 60 Hz vs 75 Hz is a notable difference. 75 Hz to 100 Hz is an improvement, but not as much as you would think. 100 Hz to 120 Hz makes almost no difference to me. Of course, if you make the gaps bigger, you see it again, like 75 Hz to 144 Hz, but we are not talking about a big difference here. A 10 Hz difference at the higher end may not be very noticeable, IMO. I hope not, anyway.
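
To illustrate the idea, here’s a back-of-the-envelope sketch in Python, assuming a Weber-Fechner-style logarithmic response (my simplification, not an established flicker-sensitivity model): the perceived “step” between two rates scales with the log of their ratio, so equal ratios feel like equal steps.

```python
import math

def perceived_step(f1, f2):
    # Toy model: perceived difference grows with the *ratio* of the
    # rates, not the absolute Hz gap (logarithmic, Weber-Fechner style).
    return math.log2(f2 / f1)

for f1, f2 in [(60, 75), (75, 100), (100, 120), (75, 144), (80, 90)]:
    print(f"{f1:>3} Hz -> {f2:>3} Hz: {perceived_step(f1, f2):.2f} octaves")
```

On that toy model, 100 to 120 Hz is a smaller step than 60 to 75 Hz despite the bigger absolute gap, which matches the intuition above; 80 to 90 Hz comes out smaller still.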

2 Likes

Hi Chucksta, what IPD setting did you use on the Pimax 4K, and which one on the Vive?

1 Like

No idea what the Vive one was. It worked fine out of the box.

I tried all the settings on the Pimax. The problem was that my IPD is way outside the Pimax 4K’s range (a lot higher). I think I would have needed to do one of those mods I have seen on the forums here. I know that the 5K and 8K(X) are within my range, so I am confident they will be fine for me, IPD-wise.

I don’t have the HMDs anymore. I got rid of the Vive because it made me dizzy/ill, and I got rid of the 4K because I had visual problems.

1 Like

Okay, let me add something here, as I see some are starting to get confused and mix up completely different things, or take the example of CRT screens when the problem is simply different on different screens.

CRT screens worked in a completely different way from current liquid crystal displays: the cathode ray tube was fast, but the phosphor elements excited by the electron gun (the pixels) had a short persistence, even when they were updated at high speeds. That was the primary reason for the flicker they had at the time.
Let’s leave aside the older CRT monitors of the home computing era, because they were just TVs with very low refresh rates and even used interlaced PAL/NTSC modes; the latest flat-screen CRT models got as high as 100 Hz at SVGA resolutions, but none of that matters anymore, since they have now been phased out of the market completely.

LCDs, on the other hand, work in a completely different way, because they are not really refreshed continuously: the liquid crystals maintain their state and only transition to a new state when told to change. This is a good thing for content that doesn’t have many fast-changing pixels per frame (good enough for watching TV or movies, but not for fast-paced games like racing games). The problem lies in their much slower transition times; they can’t really go as fast as current gaming requires. Hence the various techniques used to “add” more frames or alternate black screens to pump out more “supposed” frames. This doesn’t solve the problem; it only tricks the user’s perception into seeing a smoother image and gives the impression of more frames.
This is why almost any LCD screen will show the alternating lighter and darker bands that are clearly visible in the pictures I have posted, scrolling up or down, if you watch video taken with a camera whose shutter speed is fast enough. This very fast strobing effect is not noticeable to the naked eye, even for most trained users, but the user’s brain perceives it in any case.
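
To make the timing concrete, here’s a minimal sketch (the 30% duty cycle is an illustrative assumption, not a measured figure for any particular panel): each frame is lit only briefly, and a camera whose exposure lands inside the dark interval records a dark band.

```python
def strobe_timing(refresh_hz, duty=0.3):
    # A low-persistence / black-frame-inserted display lights each
    # frame for only a fraction (`duty`) of the frame period.
    frame_ms = 1000.0 / refresh_hz
    return frame_ms, frame_ms * duty

for hz in (60, 80, 90):
    frame, lit = strobe_timing(hz)
    print(f"{hz} Hz: frame {frame:.2f} ms, lit {lit:.2f} ms, dark {frame - lit:.2f} ms")
```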

It looks simple, but it is not so simple to understand in depth.

DLP is a great technology with lightning-fast transition times, but it is not well suited to some applications, including VR, and it is very difficult to build a micromirror matrix that goes beyond Full HD panel resolution, so we’ll leave it out of the equation.

OLED panels, on the other hand, are more similar to cathode ray tubes in some regards, except that every pixel is an independent light source and can refresh at much higher speeds (the most advanced current research has reached 1000 Hz). Even though every pixel needs to be refreshed for every frame, they effectively allow flicker to be eliminated if very fast refresh rates are applied (120 Hz or more) and no other interfering or “dirty” frames are added.

Now, knowing the above, please try to keep the focus on the problem, as I see some are still not grasping what we’re talking about… The problem is NOT that some humans can detect flickering/strobing images while others can’t. That is somewhat relevant to understanding what we’re talking about, but it does not identify the problem itself, so the PC Gamer article has almost no relevance here and can even divert attention more than be useful.

Yes, some people, like avid and experienced gamers, have a more trained eye for detecting differences in refresh rates, ghosting, tearing, stuttering images, etc., but every human being perceives these things on an unconscious, subliminal level. If some people experience nausea, it is not because they are worse than others, but rather because they are better at picking up the warning messages their body is sending them. So, just as some soldiers are conditioned and trained to sustain very high stress levels while keeping their concentration and focus on the target, some people will feel no effect, but the downside is that they will not be aware of some of the input they are being subjected to.

I hope this helps clarify what we’re talking about. Now…

The first Vive is probably one of the worst headsets for inducing nausea (the DK2 was somewhat bad too; I owned one in the past). Even though it has an OLED screen, it has one of the worst strobe effects of any commercially available VR set, and it is artificially added strobing. Just watch how many YouTube through-the-lens videos show it: you will easily notice a dark band scrolling up or down (black frames inserted at very high speed).

Strangely, the CV1 is much better regarding this problem, and the Samsung Odyssey is better still.

It’s the same problem I’m talking about; the difference can be easily noticed here:

This guy didn’t use a very high shutter speed camera, but the effect is clearly noticeable anyway.

4 Likes

The lights are going out

The strobing is indeed intentional, and a good thing; it is used by both the Vive and the Rift. The black bars across the video are an artefact of the rolling shutter of one’s camera (EDIT: …not matching the rate of the display, or in this case, the rate of the display’s global refresh).

Why do you think it’s a good thing? I’m curious why you think so.

The artifact only appears if the camera shutter is faster than the panel’s refresh rate; if it is the same or close, it doesn’t appear at all, as in my OLED TV picture, where exactly the same camera at 100 fps was used as for the other IPS screen pictures.

Also, why is it almost absent on newer panels like the Odyssey?

If you have full persistence (i.e. if every frame stays visible for the full period, until it is replaced by the next), you perceive a noticeable “jump” in motion across the screen between frames. This comes across as “judder”: your brain expects continuous motion, but it arrives in discrete steps, and your perception gets screwed up; a sprite will appear to “jerk” across the screen instead of moving smoothly, and to ghost at the edges. The ghosting is created entirely by the brain, and it becomes doubly problematic in a VR context, when the screen is stuck to your face and you get to look around, with the picture having to update the view accordingly: you turn your head, and the view moves along with you rather than staying put, being, as it is, the virtual world around you. A major cause of nausea.
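
To put rough numbers on that smear, here’s a toy calculation (the 11 pixels-per-degree figure and the head speed are my ballpark assumptions, not specs for any particular HMD):

```python
def smear_px(head_deg_per_s, persistence_ms, px_per_deg):
    # While the eye tracks a world-fixed point and the head turns, a
    # frame that stays lit for `persistence_ms` slides across the
    # retina by (angular speed x time), here converted to panel pixels.
    return head_deg_per_s * (persistence_ms / 1000.0) * px_per_deg

full = smear_px(100, 1000 / 90, 11)  # full persistence at 90 Hz
low = smear_px(100, 2, 11)           # ~2 ms low-persistence flash
print(f"full persistence: ~{full:.0f} px smear; low persistence: ~{low:.1f} px")
```

A dozen pixels of smear on every head turn versus roughly two is the whole argument for low persistence in a nutshell.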

In lieu of having to render hundreds of real frames per second to produce full, proper animation, strobing is a good workaround: your brain will assume things move along between “flashes” once the conflicting information is removed (EDIT: think of it as the temporal equivalent of what you have spatially when you draw a slanted line using dots spaced a bit apart, rather than contiguous pixels; the latter will proudly exhibit its jaggies, while for the former your brain just sees a straight line, albeit dotted). As long as the flash rate is high enough, you will not notice the strobing, and your retina will receive the same net amount of photons over time as with persistent light, just as you generally don’t notice it with good quality LED light bulbs. (Volvo cars from a certain range of years had rather low strobe rates on their rear lights, which you may have noticed if you have ever driven behind one at night, possibly a bit tired as well: unpleasant POV ghosts of them sticking to your retina when you look around, especially in one’s periphery.)
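
The “same net photons” point implies the strobed panel must be driven brighter in proportion to how briefly it is lit. A minimal sketch (idealized linear model, ignoring panel nonlinearities; the figures are illustrative):

```python
def drive_nits(target_nits, duty):
    # To average `target_nits` over the whole frame while being lit
    # for only `duty` of it, the on-window must be 1/duty brighter.
    return target_nits / duty

print(drive_nits(100, 0.2))  # 500 nits during a 20% on-window
```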

The judder is a problem that was introduced with LCD screens. Unlike the CRTs before them, whose phosphor excitation would decay well before the next field was scanned out (the flicker was not in every way a bad thing), LCDs are driven with capacitors that hold a charge to keep their state stable, producing their flicker-free, greater persistence. Remove the capacitors, and there is no reason you couldn’t directly PWM an LCD at good rates, just the way you might an LED. Not that anybody does it, AFAIK, other than digital holographers.

There is less… hmm… “banding” the closer your camera rate matches your display, yes, but it does not just plateau and end; you can still be out of phase with the screen above the matching frequency, and then it becomes a matter of both the persistence of the display and the exposure/shutter speed setting of your camera.
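
The visible consequence is easy to estimate: in the idealized case, the band crawls through the footage at the beat frequency between the two rates (a simplification that ignores persistence and exposure, per the caveats above).

```python
def band_scroll_hz(display_hz, camera_fps):
    # Idealized: the dark band drifts at the beat between the display
    # refresh and the camera frame rate; zero beat = frozen band.
    return abs(display_hz - camera_fps)

print(band_scroll_hz(100, 100))  # 0  -> band frozen (the OLED TV shot)
print(band_scroll_hz(90, 100))   # 10 -> band crawls through 10x per second
```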

In CRT days, there used to be people who ran businesses that did nothing but provide display readouts for in-scene monitors on TV shows. They did not only run the monitors at the same PAL/NTSC rates as the cameras, because that in itself is not enough; they also genlocked the monitors, so that their VBlank matched that of the cameras.

How much strobing you perceive in an HMD (and you are my sole, and brand new, source for the purported perceived difference between these devices on that account) can depend on several things: the refresh rate, the on/off period ratio, the luminosity, whether the user is tired (or otherwise affected), and so on.

1 Like

Perfect!!! I agree with almost everything you explained here; it is basically correct. But you also have to consider that all of this has been done to compensate for the latencies and disadvantages of the LCDs, and of the LEDs used for backlighting them, because they are too sluggish in any case, and this mostly applies to these panels only! We can’t even talk about persistence in these panels, since it is continuous; it is the transition that is so slow and needs everything you explained (and more…) to eliminate the judder.

Yes, but it is not a matter of persistence, since LCDs don’t need that, but of TRANSITION. And it also proves that the LCD panel we’re analyzing has a slower refresh than the camera shutter. Even if the panel manufacturer declares, for example, 90 Hz, in reality there are no smooth transitions from one frame to the next at that “declared” frequency, because most of these panels cannot switch at more than 60 real, CLEAN frames per second.

What I’m saying is that this strobing created for LCDs unfortunately introduces another type of problem, one that is not consciously perceived by the user. In the end it is not a problem for the casual user or gamer who just wants to play and doesn’t give a fuck about everything else, but the true story is: it affects the brain in an unhealthy way.

There are dozens of serious studies and research papers now viewable from various sources, including the US Patent Office site, some of them containing until very recently “classified” information, documenting these adverse effects and how they can affect human behaviour and brain waves. It is no secret, not anymore.

OLEDs of the advanced generation that we have yet to see in a consumer product, if driven correctly by the panel logic and by a graphics card pumping out very high frame rates correctly in sync with the display, can avoid all of that, be it strobing, ghosting or judder.

Well… the effect we’re talking about, and the difference, is visible and discernible to anyone willing and open-minded enough. I’m certainly not saying I can see it with the naked eye, nor that I have any special sensitivity that lets me see it, but I can cite two or three instruments that can show the difference, and the numbers; the examples I posted are easily replicable by anyone who has the different panels to test on.

That’s all.

[quote=“Lillo, post:223, topic:6092, full:true”]
Perfect!!! I agree with almost everything you explained here; it is basically correct. But you also have to consider that all of this has been done to compensate for the latencies and disadvantages of the LCDs, and of the LEDs used for backlighting them, because they are too sluggish in any case, and this mostly applies to these panels only! We can’t even talk about persistence in these panels, since it is continuous; it is the transition that is so slow and needs everything you explained (and more…) to eliminate the judder. [/quote]

Hmm, try it the other way around: the slower your transition times, the smoother the in-place transition, producing ghosting that actually happens on the screen, rather than the “in the viewer’s brain” kind that I mentioned. Slow transitions are not the cause of judder; full persistence is, although they are the cause of the aforementioned temporal smear, which, as you suggested, we do not want to have to watch.

That the strobing masks these transitions is more a happy side effect than anything else, and certainly not the fundamental reason it is implemented, which is, again, to deal with the persistence issue and its resulting psychovisual “judder” effect.

These slow transitions are such by design, basically implemented by engineers stuck in an analogue mindset; the actual transmissive liquid crystal materials can transition much, much faster when the driving circuitry is not deliberately made with latency incorporated.

So the strobing thing is independent of the display technology; the main reason we got OLEDs in the Vive and Rift in the first place is that their rapid transitions made strobing possible, without needing to rewire any backlight, much less any active matrix drivers.

Today we have LCD panels where we do have better control over the LED backlight (rather than the older and less responsive CCFL), even if the active matrix remains sluggish and analogue in kind. That makes them a viable, cheap alternative, since we can strobe them just like we already do with the OLEDs, and consequently also get global refresh (as opposed to a rolling scan-out), with a full frame that is a snapshot in time and cannot adjust on the fly for your head movements while the refresh scans out, just like with them.

[quote=“Lillo, post:223, topic:6092, full:true”]
Yes, but it is not a matter of persistence, since LCDs don’t need that, but of TRANSITION. And it also proves that the LCD panel we’re analyzing has a slower refresh than the camera shutter. Even if the panel manufacturer declares, for example, 90 Hz, in reality there are no smooth transitions from one frame to the next at that “declared” frequency, because most of these panels cannot switch at more than 60 real, CLEAN frames per second. [/quote]

Frankly, TV advertising spouting utter nonsense is beside the point; that marketers do so is an absolute given. They will do it at any given chance; it’s in their DNA. If they say that a blank period between frames obviously counts as a proper frame, that is not the same as the engineers having had that exact thought in mind back when they implemented it.

Transition, you should see not as a black band, but as a crossfading gradient (thankfully, I suppose the strobing got rid of that).

If I have a 60 Hz rolling-shutter monitor with less than full persistence, and a camera with the same, and they end up so unfortunately synced that the camera leads the monitor, I will get just a black screen in my footage. If the monitor is 120 Hz and still low persistence, I will get the effect of the monitor raster-scanning twice in the time I’m capturing (scanning down) one frame. Persistence matters, and having a display that is speedier than one’s camera does not necessarily mean I’ll get a flawless picture.
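
Here’s a toy model of that rolling-shutter capture (the row exposure, flash length and row count are illustrative assumptions): rows are exposed at successively later times, and only the rows whose exposure overlaps a display flash come out bright.

```python
def rows_lit(cam_fps=60, readout_rows=20, disp_hz=120, flash_ms=2.0):
    # Toy rolling shutter: each camera row starts its (short) exposure a
    # bit later; a row reads bright only if a global display flash
    # overlaps its exposure window. Returns one char per row.
    cam_frame_ms = 1000.0 / cam_fps
    row_exp_ms = 1.0  # assumed per-row exposure
    period = 1000.0 / disp_hz
    out = []
    for r in range(readout_rows):
        t0 = r * cam_frame_ms / readout_rows  # row exposure start
        k = int(t0 // period)
        lit = any(t0 < k2 * period + flash_ms and k2 * period < t0 + row_exp_ms
                  for k2 in (k, k + 1))
        out.append("#" if lit else ".")
    return "".join(out)

print(rows_lit(disp_hz=120))  # the strobe sweeps past twice per captured frame
print(rows_lit(disp_hz=60))   # once per captured frame (plus a wrap at the end)
```

Doubling the display rate doubles the number of bright bands per captured frame, which is exactly the “raster scanning twice” effect described above.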

If we want more than 60 real frames, we’re going to need higher-than-60-fps content, or at least interpolated frames (…which make up a whole pallet of cans of worms in themselves: “soap opera effect” and all that). :P

[quote=“Lillo, post:223, topic:6092, full:true”]
What I’m saying is that this strobing created for LCDs unfortunately introduces another type of problem, one that is not consciously perceived by the user. In the end it is not a problem for the casual user or gamer who just wants to play and doesn’t give a fuck about everything else, but the true story is: it affects the brain in an unhealthy way.

There are dozens of serious studies and research papers now viewable from various sources, including the US Patent Office site, some of them containing until very recently “classified” information, documenting these adverse effects and how they can affect human behaviour and brain waves. It is no secret, not anymore. [/quote]

I take it, then, that you only have incandescent lighting in your home, and in fact prefer the slow LCDs, which do not need high refresh rates to hide their strobing. Fluorescent lights have always flickered noticeably to the naked eye at mains frequency (although today HF electronic ballasts are replacing the traditional electromagnetic ones), and LEDs are PWMed at what is usually a significantly higher-than-mains frequency, but their phosphors have much less sustain, making the amplitude swing of their flicker much deeper.
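
The usual way to quantify that amplitude swing is modulation depth; a quick sketch with illustrative numbers (my assumptions, not measurements of any particular bulb):

```python
def modulation_depth(l_max, l_min):
    # Standard flicker metric: (Lmax - Lmin) / (Lmax + Lmin).
    return (l_max - l_min) / (l_max + l_min)

# An incandescent filament barely cools between mains half-cycles;
# a hard-PWMed LED with a fast phosphor swings all the way to off.
print(modulation_depth(100, 90))  # ~0.05: gentle ripple
print(modulation_depth(100, 0))   # 1.0: full-depth square-wave flicker
```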

Heck, traditional film cinema projectors multiply their shutter rate: every frame is shown three times, in order to increase the strobing from 24 Hz (the next frame is fed into position during the blackout) to 72 Hz. (…And this is something cinematographers have discovered they have to back away from, now that they have begun shooting higher-framerate movies, since the repeated showing of the same frame causes the persistence judder we are becoming familiar with, especially given that footage at 48 or more frames per second has very little motion blur compared to the blur from the long exposure times that were necessary to make 24 fps palatable motion-wise.)

[quote=“Lillo, post:223, topic:6092, full:true”]
OLEDs of the advanced generation that we have yet to see in a consumer product, if driven correctly by the panel logic and by a graphics card pumping out very high frame rates correctly in sync with the display, can avoid all of that, be it strobing, ghosting or judder. [/quote]

It’s going to take significantly higher frame rates than we have today, though. If not through sheer brute force, it could happen through on-HMD frame synthesis, or by at long last throwing out the legacy “frame” paradigm and updating individual pixels on the screen the very moment they finish rendering, which could become viable once we have raytracing that is fast enough.

[quote=“Lillo, post:223, topic:6092, full:true”]
Well… the effect we’re talking about, and the difference, is visible and discernible to anyone willing and open-minded enough. I’m certainly not saying I can see it with the naked eye, nor that I have any special sensitivity that lets me see it, but I can cite two or three instruments that can show the difference, and the numbers; the examples I posted are easily replicable by anyone who has the different panels to test on. [/quote]

If you have data from properly conducted measuring, I’d certainly love to see it.

1 Like

I agree on almost everything; it seems to me we’re talking about the same things, just using slightly different terminology and coming from slightly different backgrounds, perhaps due to knowing one or two elements that could strongly influence the… let’s say… “big picture”.

I will post some material that sheds light in this direction…

This could be interesting:

Former DARPA executive talks about shady Facebook project.

Remember what that genius Mary Lou Jepsen said? It has already been researched by her and others, and it is already a reality; nothing to be really impressed about… at least for insiders like them and people in the technology think tank.

This is the REAL reason FB became so invested in Oculus and VR technology, and why we must guard closely what direction this technology takes, and possibly take the driver’s seat ourselves, rather than just getting caught up in it because of the hype and false/misleading advertising and tech talk.