I hope we eventually see displays tailored for motion resolution, for bandwidth constraints, for how our GPUs draw images, and for low persistence, as opposed to the raw frames-per-second, sample-and-hold approach the industry has been chasing for the last 20 years.
Due respect to the earlier discussion about how many frames we can actually see.
Keep in mind that a lot of that research, at least early on, was done with displays that are basically a particle accelerator shooting electrons at phosphor on glass at kilohertz line rates.
In other words, a CRT is way faster at drawing an image, and it works in a way that meshes better with the quirks of our visual system.
It's 100% on point to say that a human being can only see about 140 frames per second, but the real question is not how many frames: what kind of display is it? How does it draw an image? And, importantly, how rapidly can it draw that image?
640×480 at 120 frames per second on a good tube with a fast phosphor decay time is in fact amazing, and most people who don't have eagle vision would be completely happy with it.
The way CRTs worked let them give us a passable image at 50, 60, 70, and 80 Hz because they weren't sample and hold. They drew fewer images, but drew them insanely fast, relying on our persistence of vision to "see" a solid image.
Totally different ball of wax with any kind of sample and hold flat panel…
I see this discussion on the Blur Busters forums a lot, where the Chief will talk about LCDs eventually running in excess of 2 kHz, i.e. 2000 fps, for blur-free sample and hold. Until I see it, that's a pipe dream in my opinion.
Take that new Asus 500 Hz TN monitor: if you could hit 500 Hz without dropping frames, you'd have about 2 milliseconds of visibility per frame without blur reduction or BFI. That's a ton of frames.
Contrast that with the 60 Hz OLED in the old Gear VR Innovator Edition for the Note 4.
It also had 2 ms of frame visibility time, but it got there running at 60 Hz with BFI on an OLED with near-instant response time.
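If you want to sanity-check that, the persistence math is simple. Here's a rough sketch in Python; the 12% BFI duty cycle is my own ballpark, chosen to match the 2 ms figure:

```python
# Per-frame visibility time ("persistence"). For pure sample and hold,
# a frame stays lit for the whole refresh interval; with BFI/strobing,
# it is only lit during the flash, i.e. a fraction of the interval.

def persistence_ms(refresh_hz: float, duty_cycle: float = 1.0) -> float:
    """Approximate time each frame is visible, in milliseconds."""
    return (1000.0 / refresh_hz) * duty_cycle

print(persistence_ms(500))       # 500 Hz sample and hold -> 2.0 ms
print(persistence_ms(60, 0.12))  # 60 Hz OLED + BFI flash -> 2.0 ms
```

Same 2 ms of persistence, reached two completely different ways.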
Do you see the dilemma?
Different displays draw images in totally different ways, and have different inherent characteristics and limitations.
For a sample-and-hold flat panel to have the look of, say, an OLED or a tube display, it needs brute force in the form of more frames, and it needs very good GtG and pixel response times. Put simply, flat panels are not a particle accelerator shooting electrons at glass at a fraction of the speed of light and making phosphor glow.
Think of an OLED's pixels like a building full of windows, where a very spry 20-year-old runs around very fast, opening and closing them.
An LCD is the same building, but with a grandmother opening and closing the windows.
A CRT is not someone opening and closing windows at all; it's a dude with a particle accelerator shooting electrons to make windows glow or not glow. No matter how fast you can open and close a shutter, it's not going to be the same as the particle accelerator.
I frequently comment to the Chief how insane the idea of a blur-free sample-and-hold display is, given all the half-baked workarounds, interpolation technology, eye tracking, and trade-offs needed just to force it to work.
Even with machine-learning-based frame rate amplification, interpolation, or AI enhancement, you can realistically only double the raw frame rate without incurring significant visual artifacts, to say nothing of the fact that our eyes just don't like the feel of sample and hold, and that bandwidth is always at a premium. The pandemic illustrated that in spades.
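To see why interpolation falls apart so quickly, here's a toy illustration (entirely mine, not anyone's actual algorithm) of the naive blending that interpolators degrade to wherever motion estimation fails:

```python
import numpy as np

# A 1-pixel bright object moving across an 8-pixel scanline.
frame_a = np.zeros(8); frame_a[2] = 1.0   # object at x = 2
frame_b = np.zeros(8); frame_b[5] = 1.0   # object at x = 5

# Naive in-between frame: blend the two neighbors 50/50.
midpoint = 0.5 * frame_a + 0.5 * frame_b
print(midpoint)
# Result: two half-bright ghosts at x = 2 and x = 5, not one object
# at x = 3.5. Real interpolators use motion vectors to avoid this,
# but wherever the vectors are wrong, you get exactly this ghosting.
```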
Display engineers understood these things in the days of cathode ray tubes, and the Chief Blur Buster is trying to get flat panels to run with the motion clarity of a tube display.
An emissive display that drew the image line by line, top to bottom, at a multi-kHz line rate was ideal for our eyes, and our entire display pipeline still carries the assumptions of raster-driven CRTs.
As Carmack once said, "you still have crufty bits inside of the OLED controllers waiting for the raster like it's on NTSC."
You didn't incur blur on tubes beyond what the phosphor decay time introduced, motion resolution was great, and, most important of all, the bandwidth on those old displays was very manageable.
It's hilarious that we see consumers unironically asking, "When am I going to get 16K at 120 frames per second, with ray tracing, and wireless?"
Not soon, if ever; at least not wirelessly.
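Just to put numbers on it, here's a quick back-of-the-envelope calculation, assuming a 15360×8640 "16K" raster and 10-bit RGB (my assumptions, since there's no settled 16K spec):

```python
# Uncompressed bandwidth for "16K at 120 fps".
width, height = 15360, 8640
fps = 120
bits_per_pixel = 30  # 10 bits per channel, RGB

gbit_per_s = width * height * fps * bits_per_pixel / 1e9
print(f"{gbit_per_s:.0f} Gbit/s")  # ~478 Gbit/s, before blanking overhead

# For scale: DisplayPort 2.1 UHBR20 tops out at 80 Gbit/s of raw link
# bandwidth, and wireless links are far below even that.
```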
A game like CS:GO can run at 300 fps on my 1060 3GB, which could reasonably be upped to 600 Hz with FRAT (frame rate amplification technology).
I can run my copy of GTA V at 4K 120 on the same card with a modified ini file. Workarounds like that are a pain, though.
The problem nobody is talking about is the bandwidth required for that brute-force blur reduction on sample and hold. Even strobing doesn't get around it, and it has drawbacks of its own on a sample-and-hold display, like double images and far less brightness.
The best relatively artifact-free strobed LCD is the display inside the Quest 2, which uses a 0.3 ms strobe flash for great low persistence.
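The reason that 0.3 ms number matters: for eye-tracked motion, perceived blur is roughly persistence times on-screen speed. A quick sketch, with a panning speed I picked purely for illustration:

```python
# Blur trail length for eye-tracked motion: roughly persistence x speed.

def blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Approximate smear in pixels while the eye tracks a moving object."""
    return speed_px_per_s * (persistence_ms / 1000.0)

speed = 4000.0  # fast pan, in pixels per second (illustrative value)
print(blur_px(16.7, speed))  # 60 Hz full sample and hold -> ~67 px smear
print(blur_px(2.0, speed))   # 2 ms persistence           -> 8 px
print(blur_px(0.3, speed))   # Quest 2 style 0.3 ms flash -> ~1.2 px
```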
I've read the Chief Blur Buster say that there will be perceptual benefit to 2000 Hz displays, and that some LED jumbotrons actually run up near that rate. What's absurd is thinking you're ever going to get that in the consumer space at a reasonable price.
The best OLED available is the $30,000 Sony BVM-X300, which can only manage 800 lines in fast motion with Sony's special BFI implementation, on an OLED that barely manages 1000 nits full screen. That's 800 lines in motion out of a display capable of thousands more lines of resolution, but only for still images!
And that's even with OLED's near-perfect GtG response time and near-instantaneous pixel switch time. No LCD comes close to OLED, but even that isn't perfect.
A 4K display that can only manage 800 lines of resolution in motion without interpolation technologies is ludicrously inefficient.
And that monitor has heat sinks and active fans, and it's roughly the size of a small CRT; forget about an HMD.
We need displays that work the way the old ones did. New, of course, but with similar behavior.
I wrote a message a while back that I sent to @PimaxUSA, @PimaxVR, @SweViver, and others with an idea I had for a display. I'm not an engineer, but I'm amazed nobody has thought of it, because it seems like it would be possible.
A MEMS laser scanning system, using tech like what's in the Nebra AnyBeam. Instead of being a pico projector, though, the scanning beam would illuminate quantum dots on a glass substrate. The MEMS scanning laser would basically act the way an electron gun in a CRT does.
ETA Prime on YouTube did a review of the Nebra AnyBeam. It's a 720p 60 Hz pico projector.
That does not sound impressive, but when he was playing a racing game, you could actually read the road signs on the side of the track, without any blur. Even though the refresh rate and frame rate were that low, it did not matter!
That's because MEMS scanning uses lasers and, in effect, mimics how old CRT displays drew.
Such a display could be very low persistence and could accept variable resolution (resolution would be limited by the dot size of the laser, the scan rate, and the quantum dot mask).
It would be plenty bright, and it would essentially be able to mimic some of the better aspects of earlier displays without some of the downsides.
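For a sense of the scan rates such a display would need, here's a ballpark sketch using the AnyBeam's 720p60 spec (all numbers are rough, and I'm ignoring blanking intervals):

```python
# CRT-style scan budget for a MEMS laser display at 720p60.
lines, cols, fps = 720, 1280, 60

line_rate_khz = lines * fps / 1e3          # horizontal scan rate
spot_rate_mhz = lines * cols * fps / 1e6   # laser modulation rate

print(f"{line_rate_khz:.1f} kHz line rate")   # 43.2 kHz
print(f"{spot_rate_mhz:.1f} MHz spot rate")   # ~55.3 MHz
# Each spot is lit only for an instant, so persistence comes from the
# phosphor/quantum-dot decay time, not the refresh rate, which is why
# even 60 Hz can look blur-free on a scanning display.
```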