Pimax Crystal QLED VR Headset Revealed - Everything You Need to Know! [NextGenVR]

So, all modern displays are mass-manufactured on a flat mother glass that is then cut into the various display sizes. With more specialty mobile phone and VR displays, there is a kind of wafer they are printed on (in various forms), and then, like CPUs, they get cut and binned.

QD-OLED has only just come out. Samsung has had the ability to do classic RGB OLED in mobile displays since the Note 3, but that manufacturing process could not be scaled to large displays at acceptable yield or cost.

It's one reason why QD-OLED is the new darling of displays. It's a happy balance between profit and quality.

Truly curved displays can't be made cheaply either way yet, at least until inkjet-printable OLED or electroluminescent quantum dot becomes super cheap.

Then you have the problem that even if the display itself can be curved, the curvature is limited by the panel circuitry. You would need to be able to print the display and the electronics on something like a fabric or an extremely pliable plastic substrate. It's just not there yet.

Yet another reason why, above, I mentioned an idea for a laser-based MEMS mirror system shining lasers onto a phosphor-coated screen. You could do really interesting display and lens geometries with that sort of setup. I wrote about using it as a backlight, but you could use a MEMS system to do projection, or just directly drive a panel like an old-school CRT.
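To make the CRT comparison concrete, here is a minimal sketch (my own illustration, not any real product's firmware; the resolution and mirror scan rate are assumed numbers) of turning a framebuffer into a laser-intensity timeline for a bidirectional MEMS raster scan:

```python
# Toy model of a MEMS laser raster scan: the fast-axis mirror sweeps one line
# per half-period (so alternate lines are drawn in opposite directions), and
# the laser's intensity is modulated with the framebuffer contents as it goes.
# Assumes a linear sweep for simplicity; a real resonant mirror moves sinusoidally.
import numpy as np

WIDTH, HEIGHT = 1280, 720      # framebuffer size (assumption)
H_SCAN_HZ = 18_000             # fast-axis mirror frequency (assumption)

def laser_timeline(frame: np.ndarray):
    """Return (timestamps, intensities) for one frame of a bidirectional scan."""
    line_time = 1.0 / (2 * H_SCAN_HZ)     # one sweep = half a mirror period
    times, power = [], []
    for y in range(HEIGHT):
        line = frame[y]
        if y % 2:                          # odd lines are drawn right-to-left
            line = line[::-1]
        t0 = y * line_time
        times.extend(t0 + np.linspace(0.0, line_time, WIDTH, endpoint=False))
        power.extend(line)
    return np.asarray(times), np.asarray(power)

# Example: a flat mid-grey frame
ts, laser_power = laser_timeline(np.full((HEIGHT, WIDTH), 0.5))
```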

If you look up the Nebra Anybeam or Show WX projectors, they were called “focus free” because they could project on any surface without distortion or blur. Really cool stuff.

3 Likes

Indeed, someone should remind him that the head of Half Dome quit after he shelved it. Lol

3 Likes

I think Zuckerberg wants the perception out there that HE is inventing the metaverse, that he is the godfather of VR.

Do you notice that we very seldom hear from Abrash, or Carmack, or any of the guys that had actual passion at Oculus anymore?

I was on MTBS 3D forums back when Palmer Luckey was building his prototype kit project, and Carmack just appeared on the board for nerdy reasons. It was nuts, especially when Palmer sent him his prototype. Massive shot in the dark there.

Contrast this with Meta, which is acting like the 2006 Intel of VR. He's spending millions on this research, but it likely won't become a product until you can do all the things from that video on a mobile SoC. That is a decade away, at minimum.

Carmack had the idea at the first Oculus Connect for getting HDR on a then-current mobile device by making a deeply interlaced, variable-persistence display, where you could change the persistence of the display line by line and tone map into it.

(It would exploit our visual system's tendency to adapt to light. Sit in a dark room and flash a bright light really quickly. Even if it's a 2-watt bulb, it appears blinding because you are adjusted to the low light.)
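A hedged sketch of how I picture that working; this is my reading of the idea, not Carmack's actual design, and the persistence numbers are placeholders. Each scanline's persistence is scaled to its brightest pixel, and the pixel values are renormalized so the panel only ever drives ordinary SDR code values while the time-integrated light carries the HDR range:

```python
import numpy as np

MAX_PERSISTENCE_MS = 2.0   # longest any line stays lit per refresh (assumption)

def per_line_persistence(hdr_frame: np.ndarray):
    """hdr_frame: (H, W) linear luminance, 1.0 = SDR white, >1.0 = highlight.

    Returns (sdr_lines, persistence_ms): renormalized per-line pixel values,
    plus how long each line should stay illuminated this refresh.
    """
    peaks = np.maximum(hdr_frame.max(axis=1), 1e-6)   # brightest pixel per line
    # Lines containing highlights get long persistence; dim lines get a short flash.
    persistence_ms = MAX_PERSISTENCE_MS * np.clip(peaks / peaks.max(), 0.05, 1.0)
    # Tone map each line so its own peak uses the panel's full drive range.
    sdr_lines = hdr_frame / peaks[:, None]
    return sdr_lines, persistence_ms
```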

There is no way to get a true HDR 1000 standard device using LEDs and today’s batteries.

The reason that prototype was hanging is probably because it needs an outlet. Not to mention the custom 3D-printed sintered-metal heatsink and fans that thing needed. lol

2 Likes

The Razer Phone, at least IIRC, is HDR10-compliant and 120Hz QHD.

1 Like

It worked for Apple. Everyone knows that Steve Jobs invented smartphones.

1 Like

It would have worked for Zuckerberg if he hadn't changed the name. Oculus was already synonymous with VR for so many people, and then, like a moron, he changed the name to Meta.

Yeah, but that HDR standard is measured on highlights, and the battery naturally doesn't last long with HDR content. What they really need for mobile HDR are better polarizers. As the Meta guys said in the video, the HDR prototype is only getting 1% of the light to your eyes. This thing is tied to the ceiling, with a metal heatsink holding its optics, pushing light through two LCD layers, and you are seeing 1% of it.
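Back-of-the-envelope with made-up (but plausible-looking) stage transmissions, just to show how quickly stacked LCD layers and optics get you down to roughly 1%:

```python
# All figures are assumptions for illustration, not Meta's published numbers.
backlight_nits = 20_000                    # assumed very bright HDR backlight
stages = {
    "first LCD layer (polarizers + aperture)": 0.08,
    "second LCD layer": 0.25,
    "lens / remaining optics": 0.50,
}

throughput = 1.0
for stage, transmission in stages.items():
    throughput *= transmission

print(f"net transmission: {throughput:.1%}")              # ~1.0%
print(f"luminance at the eye: {backlight_nits * throughput:.0f} nits")
```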

Give me color accuracy in SDR before worrying about HDR. Current HMDs are trash tier at Rec. 709 color reproduction, which should be a cakewalk by now.

I haven’t tested a Varjo but the rest are awful.

What was the Crystal's refresh rate at full resolution?

My assumption would be 160Hz with the 42 PPD lens at the lower FOV, and 90 to 110Hz with the 35 PPD lens at the higher FOV. @PimaxUSA care to clue us in?

The resolution of the panels doesn’t change because of the lens. Therefore the frame rate shouldn’t be affected by which lens you choose.

The Crystal has the same number of pixels as the 8KX, and the 8KX can do 90Hz at that large resolution. The 5K Super only achieved its high frame rates by narrowing the FOV substantially. But the Crystal is already starting with a narrow FOV, and it seems unlikely that Pimax's approach here would be to narrow it further to reach 160Hz, though it might. That would be silly, though. Look at our 160Hz postage stamp!

I think the key factor for enabling 160Hz on the Crystal is foveated transport. I think it’s actually going to offer full resolution at full frame rate. But only for games where foveated rendering and foveated transport are working. Otherwise it will have to drop back to 90Hz.

There's precedent there. The upscaling mode on the 8KX is kind of like foveated transport, in a way, at least in terms of transporting fewer pixels across the DisplayPort cable at 110-120Hz and then expanding them to native resolution at the headset. This could be a second generation of that kind of feature, able to achieve 160Hz with near-zero visible quality loss thanks to DFR.
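For a rough feel of the savings, here's a toy calculation; the fovea size and periphery scale are my assumptions, not anything Pimax has published. Send a native-resolution fovea plus a downsampled periphery and count the pixels that actually cross the cable:

```python
def foveated_pixels(width, height, fovea_frac=0.25, periphery_scale=0.5):
    """Pixels per eye per frame with a simple two-region foveated transport.

    fovea_frac: fraction of each axis kept at native resolution (assumption).
    periphery_scale: linear downsample factor for everything else (assumption).
    """
    fovea = (width * fovea_frac) * (height * fovea_frac)
    periphery = (width * height - fovea) * periphery_scale ** 2
    return fovea + periphery

native = 3840 * 2160 * 2                    # two 4K-class panels, full resolution
foveated = foveated_pixels(3840, 2160) * 2
print(f"per frame: {native/1e6:.1f}M pixels -> {foveated/1e6:.1f}M "
      f"({foveated/native:.0%} of native)")
```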

2 Likes

I hope it's 120Hz on 35 PPD so I can play Skyrim capped at 60 and smoothed. That would be a dream.

2 Likes

Of course I understand that lol. :-) What I was meaning to get at (and you mentioned) was that the 160Hz refresh rate is probably only going to work at 110 horizontal, so using the lens optimized for that FOV would be best.

On current headsets you can only use the highest refresh rate at the lowest FOV, for bandwidth reasons, and I suspect Pimax might artificially do that even if they don't strictly have to on the Crystal, because they're really going for the premium experience.

I agree that it might be 160Hz with foveated rendering/transport or DSC, but here's hoping it can just do 160Hz full out.

Hoping Pimax has learned by now that when it comes to sourcing components for a next-gen VR device, you get as much bandwidth as you can. Don't cut corners.

Please, Pimax: no substandard bridge chips, cabling, driver boards, or scalers. Give the Crystal the best components and the most bandwidth you can.

At the asking price, it needs to give the full premium experience.

I am excited for the Crystal.

But that wouldn’t be how that works. The narrow lens gets 110 HFOV at full frame. If the Crystal disables using the outer edges of the panel to increase FPS, that would make for something much less than 110 HFOV. Using the narrow lens on top of narrowing the FOV by disabling pixels would make the resulting FOV even narrower.

I think it’s moot though, because I don’t think that’s what Pimax is doing on the Crystal anyway.

I initially assumed, without thinking about it, that they would do this on the Crystal the same way they do on the 5K Super. And then your post caused me to think about it and realize it wouldn't make any sense. The FOV of the Crystal will already be narrow to begin with. If you start chopping off the edges, you would quickly reach a silly-small FOV before eliminating enough pixels to reach 160Hz that way. I can't imagine what use case such a mode would be for.

The 5K Super only reaches its claimed 180Hz at potato FOV. But potato FOV, while much smaller than we’re used to from Pimax, is still roughly equivalent to the Quest 2 FOV. So fine. Clearly that’s a usable FOV. But the same technique on the Crystal would arrive at a much smaller FOV than that. And that’s kind of useless.

That’s when I realized that with the technology they’ve talked about Pimax wouldn’t need to do it that way. The math works out for the Crystal to theoretically be able to do 160Hz without narrowing the FOV.

I do expect the 12K to implement this technique, however. The 12K will have FOV to sacrifice to the purpose. The Crystal does not. On the 12K, you’d narrow the FOV and use foveated transport on top of that to reach that claimed 200Hz.

Impossible for the cabled link. DisplayPort 1.4 can't carry enough pixels to keep up with 8KX resolution at 160Hz without using foveated transport. It's already barely managing 90Hz on the 8KX. The 8KX itself is demonstrably able to run a higher frame rate than 90Hz, but it's bottlenecked by the cable.
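The raw numbers bear that out. Ignoring blanking and any compression, two 3840x2160 panels against a DisplayPort 1.4 link (four lanes of HBR3, roughly 25.92 Gbps of payload after 8b/10b encoding) works out like this; note that even 90Hz already exceeds the uncompressed payload, which fits with the 8KX needing every trick available just to hold 90Hz:

```python
PIXELS_PER_FRAME = 3840 * 2160 * 2     # both eyes, native resolution
BITS_PER_PIXEL = 24                    # 8 bits per channel, uncompressed
DP14_PAYLOAD_GBPS = 25.92              # 4x HBR3 (32.4 Gbps) after 8b/10b overhead

for hz in (90, 120, 160):
    needed = PIXELS_PER_FRAME * BITS_PER_PIXEL * hz / 1e9
    print(f"{hz}Hz: {needed:.1f} Gbps needed vs {DP14_PAYLOAD_GBPS} Gbps available")
```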

100%, it would be useless, but I don't ever put it past Pimax to do something silly like that and then reverse course on it when called out by the community. lol We have too much experience of exactly that kind of thing happening. (I don't think it will actually happen again; I was just being a bit pessimistic.)

They have a history of saying something grandiose for the spec sheet and then having a really bad implementation that takes months to get fixed.

With the first 8K HMD unveiled for the Kickstarter, the X was just a pipe dream that had not even been planned.

Even the 5K Super didn't exist yet. The HMD didn't have the electronics to drive the panels at native resolution, and its bridge chip limited the refresh rate, even with a cable that could have handled it given the right electronics.

They had 1080p or 1440p as the 8K's initial base resolution, software options carried over from the 4K for upscaling, but the scaler was terribly limited, and the most they could squeeze out was 80Hz.

I remember I suggested 1920x2160 as a better base resolution for the scaler, to get rid of some of the upscaling artifacts, but their scaler was too limited, so even though the bandwidth was there, it was a no-go.

NeoSkynet brilliantly wanted them to try a kind of subpixel rendering at the hypothetical 1920x2160, because the OG 8K had a PenTile structure that looked god-awful when scaled. The idea was to fake an RGB structure with a shader, so you could bypass some of the scaler's limitations. Basically a pre-process before sending anything to the panel.
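A hedged sketch of the kind of pre-process I understood that to mean: resample each colour channel at a slightly different horizontal offset so the image handed to the scaler already anticipates the panel's subpixel layout. The offsets below are placeholders, not the OG 8K's actual PenTile geometry:

```python
import numpy as np

# Assumed per-channel horizontal offsets, in fractions of a pixel (placeholders).
CHANNEL_OFFSETS = {"r": -0.33, "g": 0.0, "b": +0.33}

def subpixel_preprocess(img: np.ndarray) -> np.ndarray:
    """img: (H, W, 3) float image. Returns a channel-shifted copy."""
    out = np.empty_like(img)
    h, w, _ = img.shape
    x = np.arange(w)
    for c, offset in enumerate(CHANNEL_OFFSETS.values()):
        for y in range(h):
            # Linearly resample this channel's row at x + offset.
            out[y, :, c] = np.interp(x + offset, x, img[y, :, c])
    return out
```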

Adjustable refresh rate and FOV settings, along with specialized lenses for each, were my initial suggestion to them as a stretch goal when testers were experiencing distortion, bad scaling, poor compatibility, and wobble.

When we knew that a hardware solution wouldn't happen, I asked if it could be done in software. They put out a poll asking if we wanted these as software options (as a band-aid).

It served as a stopgap measure to fix distortion and make it easier to drive on 2016-era hardware, and, as we can see, better for things like supersampling. I can't stress enough how awful the upscaled image was.

You're probably right that they will use foveated transport to get to 160Hz.

I do vaguely remember back in the day that Pimax had a 160Hz spec on the initial 8K via a concept they called asynchronous refresh: 80Hz, but staggering the refresh, doing one eye and then the other rapidly. @Heliosurge, remember that?
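My reading of what that spec meant (an assumption on my part, reconstructing old marketing) is something like the schedule below: each eye refreshes at 80Hz, offset by half a period, so some eye updates 160 times a second even though neither panel ever exceeds 80Hz:

```python
period = 1 / 80                                  # each eye refreshes at 80Hz
left = [i * period for i in range(8)]            # left-eye refresh times (s)
right = [t + period / 2 for t in left]           # right eye offset by half a period

events = sorted(left + right)
gaps = [round(b - a, 5) for a, b in zip(events, events[1:])]
print(gaps)   # uniform 6.25 ms gaps -> 160 combined updates per second
```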

I was never a backer of the initial 8K because of those shenanigans. I am really optimistic about the Crystal though.

The "X" was the original plan for the 8K model. What they discovered, maybe a month before launching the KS, was that native 4K/eye did not seem possible with the hardware available at the time.

So the upscaled 8K became the 8K model. At the community's request, an experimental native 4K/eye 8KX was queued for a limited run of 400 units.

4 Likes

This is late, but there was this: Diagram of "Same Platform" as 12K Series Diagram
Also shown on the Pimax Crystal Reveal Video just after: 18:48 in the video

and the source Reddit post by Quorra: OG Pimax Crystal Reddit Post

The context of what I mean is simply that “Wi-Fi 6E” and “60GHz” are shown as being separate pipelines.

Edit: Clearer Links and better wording.

1 Like

That's very interesting. I hadn't caught that, even though I've seen that diagram before.

It makes sense, though. I think what it shows is that the 60GHz link goes through the same pipeline as the fiber-optic cable… as in, 60GHz has the bandwidth to carry the video in a more raw, low-latency way. It specifically mentions DFR enabled, so I think that means foveated transport, and possibly some minor real-time compression: running essentially the same protocols over 60GHz as over the fiber-optic.

Whereas the 6E pipeline runs through the XR2 and would be something like Quest 2 Air Link, where it uses a more substantial and lossy video codec like H.264/H.265, which offers a lot of compression but has high latency and processing requirements. Presumably this pipeline needs to drop the resolution and frame rate substantially versus the other one, probably due more to limitations of the XR2 than to the wireless bandwidth.
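To put rough numbers on why the two pipelines would behave so differently; every throughput figure below is an assumption for illustration, not a published spec:

```python
# Uncompressed rate for two 3840x2160 panels at 90Hz, 24 bits per pixel.
raw_gbps = 3840 * 2160 * 2 * 24 * 90 / 1e9          # ~35.8 Gbps

links = {
    "60GHz (WiGig-class)": 8.0,    # assumed usable throughput, Gbps
    "Wi-Fi 6E": 1.2,               # assumed usable throughput, Gbps
}
for name, usable_gbps in links.items():
    ratio = raw_gbps / usable_gbps
    print(f"{name}: ~{usable_gbps} Gbps usable -> needs ~{ratio:.0f}:1 data reduction")

# ~4:1 is within reach of foveated transport plus light compression;
# ~30:1 implies a full video codec like H.265, with the latency that brings.
```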

Now, during the MRTV interview about the 12K, they had mentioned that 60GHz was already operational, but that 6E support was still being developed and was proving harder to get working well.

So what I’m thinking is that Pimax did not claim 6E support for the Crystal because they didn’t expect that pipeline to be ready by release. And they mean to go ahead and release without it. Possibly it would be enabled by later firmware updates.

In that case, the slide mentioning 6E is still there in the presentation because that is part of the intended design even if they didn’t mean to officially support that feature on the Crystal at launch. They may have reused that diagram without considering that it mentions 6E.

2 Likes

Try looking for an enemy aircraft in a BVR or BFM fight in DCS; that's a scenario where I am sure it can matter…

But sure, to play Vanishing Realms (don't get me wrong, I like that one a lot) you are not going to need it. For such games the HMD would be a waste, until, as always, such titles also evolve toward higher-resolution textures and polygon counts.

1 Like

Source? The main reason I get a Pimax is for both worlds!