Shanghai Surprise (Neoskynet Director's Cuts)

Hello, my name is Neoskynet. I am a Pimax tester and a fan of writing reviews and comparisons of projectors. As you might expect, image quality is my top priority, and the ideal for me would be to use the Pimax 8K not only for games but also to watch 3D ISO/SBS and 2D 4K/UHD movies.

Some projector reviews (in Spanish, if you have time to read them):

I also want to say that I have done countless tests with the M1 and have asked Pimax for major hardware and software upgrades, but I cannot go into detail because I am under NDA.

In this series of chapters I will explain, without breaking the NDA, the general tests I have done to better understand how virtual reality headsets work and, based on that knowledge, what we can expect from the Pimax 8K and what is impossible to achieve with current technology.

I hope it will serve to increase the knowledge and understanding of all backers, and that Pimax can improve the Pimax 8K into the headset that marks the beginning of the second generation of virtual reality, able to compete directly with companies like Oculus, HTC, Sony, Microsoft, etc.


I wish all the best for the Pimax company and especially for the testers and backers.

Chapter 1: Understanding Human FOV

Have you done the test? I’ll explain it to you:

1.- Hold your head straight, facing forward.

2.- Close your left eye and, with your right eye, look to the left. You will see that one part of the image is blocked by your own nose, while in the other part you will see the object on your left. You can explore the limits of your vision, but be careful: it will even hurt your eye a little when you try to hold that position, which shows you are straining the ocular muscles. The right eye corresponds to the red lines in the picture above.

3.- With your left eye still closed, focus your right eye on an object to your left that is almost hidden by your nose. This marks the limit of your right eye’s vision when looking to the left.

4.- Move your head up and down slowly, keeping your right eye fixed on the object. You will see that when you move your head upwards the object is covered by the tip of your nose, and when you look downwards there comes a point where the object is covered by your right eyebrow (and your eye will hurt more, because you are straining it).

5.- Put your head back straight and, while looking at the same object (close to your nose) with your right eye, open your left eye and focus on it with both eyes (you will see that your nose magically disappears). You will see the object clearly. Try opening and closing your eyes alternately while keeping the object in focus. You will also notice that you are straining your eye muscles, and your eyes will even hurt a little if you keep the object almost covered by your nose. That’s because you are forcing your eyes: you don’t usually look at this angle, unless a pretty girl walks by and you’re standing next to your wife.

6.- Now comes the interesting part. Close your left eye again, keep looking at the object on your left with your right eye, and move your head slightly to the right, so that your nose covers the object.

7.- Now, while you keep looking toward the object on the left (which is now covered by your nose), open your left eye without moving your head and try to focus on the object. You will see that you cannot, unless you are effectively a mutant homo-cameleon, in which case you can join the circus, earn a lot of money and invest it all in Pimax to become rich (or poor, depending on whether the Pimax engineers manage to make a good Pimax 8K). The important thing is that at that point your peripheral vision begins, and you have seen for yourself that it is impossible to focus there: everything looks very blurry.

Conclusion: physically, your eyes can only focus within certain FOV angles; all of the peripheral FOV is completely blurry. That is exactly what the Pimax 8K lenses and screens would have to reproduce, because it would be like reality.
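The experiment above can be put into rough numbers. These are commonly cited approximate values for human horizontal vision, not measurements from any headset; treat them as a sketch:

```python
# Rough numeric sketch of human horizontal vision, using commonly
# cited approximate values (they vary per person; treat as estimates).
nasal = 60      # degrees each eye sees toward the nose (nose-limited)
temporal = 100  # degrees each eye sees toward the temple

total_fov = 2 * temporal                # ~200 deg combined horizontal FOV
binocular = 2 * nasal                   # ~120 deg overlap seen by both eyes
monocular_only = total_fov - binocular  # ~80 deg one-eye peripheral zones

# Only the fovea (a few degrees wide) is truly sharp; everything in the
# monocular-only zones stays blurry no matter what, as the test shows.
print(total_fov, binocular, monocular_only)  # 200 120 80
```

So even a ~200° headset would spend roughly 80° of that width on regions a healthy eye can never bring into focus.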

Chapter 2: Screens, lenses and other infernal gadgets.

In this chapter we will explain what happens when you combine different screens and lenses to make a VR headset, and how this relates to FOV, SDE, visible pixels, etc. This is general information and is not related to any VR headset specifically.

1.- Looking at a 4K/UHD TV screen.

Technically speaking, 4K is not UHD. Most consumer screens are actually UHD (3840x2160), but people use the commercial term "4K". You can learn more at: 4K resolution - Wikipedia

A 4K/UHD TV has 3840x2160 = 8,294,400 pixels, and every pixel is formed by three RGB subpixels (red, green and blue). I have counted them with a x20 magnifying lens, I swear by Snoopy. Then my wife kicked me out to sleep in the dog’s house.

I have to explain that not all TV screens have pure RGB subpixels. Some screens add a white (blank) subpixel in between, which creates greyish blacks and poor colors; these are called false 4K/UHD, and some LG TV screens use this layout. Avoid this type of screen. Look for VA panels, which have very good blacks and vivid colors but a limited viewing angle. There is another kind of LCD panel, IPS, with better colors and good viewing angles but poor blacks. QLED panels are an evolution of VA panels. In OLED panels every subpixel emits its own light, and they have the best blacks, colors and contrast, but they tend to crush the black range too much. Every technology has its pros and cons.

When you look at a 4K/UHD LCD TV screen, you can see 3840x2160 pixels with each eye (this is not a joke, I’ll explain later). If you close the right eye, you see 3840x2160 pixels with the left eye, and the same with the other eye. But if you watch the 4K/UHD TV with both eyes open, you don’t see 2 x 3840x2160 = 16,588,800 pixels. The image from each eye is fused by your brain into a single 3840x2160 image.

Your brain actually uses both images to create a 3D/stereoscopic image and calculate the depth of objects. This is very useful for monkeys picking fruit from trees (I caught you thinking about the Fruit Ninja VR game). Carnivorous animals have their eyes at the front of the face to better focus on their prey, calculate distances and hunt them. Herbivorous animals have their eyes on each side of the face, giving a wider monocular (non-stereoscopic) field of view to better monitor their surroundings and avoid being hunted. This is just general information, but it is interesting to know where our visual system comes from and how it works.

Your eyes do not have an infinite ability to resolve detail. If you view a 65" TV from 2.6 meters away, a 3840x2160 picture and a 1920x1080 picture will look the same to you. In other words, you need to be closer than about 2.6 meters to benefit from a resolution of 3840x2160.
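That 2.6 m figure follows from the usual 1-arcminute acuity rule of thumb for 20/20 vision. A quick sketch of the calculation, assuming a 16:9 screen:

```python
import math

def max_benefit_distance_m(diagonal_in, h_pixels, acuity_arcmin=1.0):
    """Distance beyond which one pixel subtends less than the eye's
    acuity, i.e. extra resolution stops being visible (16:9 screen)."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch_m = width_m / h_pixels
    return pixel_pitch_m / math.tan(math.radians(acuity_arcmin / 60))

# 65" screen: beyond ~2.6 m a 1080p panel is already at the acuity
# limit, so 4K only pays off when you sit closer than that.
print(round(max_benefit_distance_m(65, 1920), 2))  # ~2.58 m
print(round(max_benefit_distance_m(65, 3840), 2))  # ~1.29 m
```

The function and its name are mine, just to show the geometry; real perception also depends on content and processing, as the next paragraph notes.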

Some image processing technologies, like sharpness enhancement, HDR, etc., can alter the comparison and change the perceived resolution at a given distance. For example, my 1080p (1920x1080) DLP projector with sharpness enhancement ON shows a better perceived resolution than a 4K/UHD (3840x2160) projector with no sharpness enhancement, at 4 meters on a 150" screen.

On the other hand, if you have a 1920x1080 FHD LCD TV and sit 0.6 meters away, you’ll see the SDE and individual pixels, while with a 3840x2160 4K/UHD LCD TV at the same distance you’ll see solid colors, pixel-free and SDE-free. I have four TVs in my house, FHD and 4K/UHD, and I have tried everything I have explained. I am now writing on a 50" Samsung 4K/UHD HDR TV with a VA panel, which I use as a monitor for work, films and video games. Playing at 0.6 meters with an i7 and a GTX 1070, the detailed image and immersion in Elite Dangerous are incredible.
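The same idea can be expressed as pixels per degree (ppd), which is also the number that matters for headsets. A sketch for the 50" desk setup described above, assuming a 16:9 panel viewed head-on (around 60 ppd is the usual "retina" threshold):

```python
import math

def pixels_per_degree(diagonal_in, h_pixels, distance_m):
    """Average horizontal pixels per degree of visual angle for a
    16:9 screen viewed head-on from its center."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    h_fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / h_fov_deg

# 50" panel at 0.6 m: FHD lands around 22 ppd (visible pixels/SDE),
# while 4K/UHD doubles that to ~45 ppd, much closer to the limit.
print(round(pixels_per_degree(50, 1920, 0.6)))  # ~22 ppd
print(round(pixels_per_degree(50, 3840, 0.6)))  # ~45 ppd
```

This is an average over the whole screen (angular density actually varies a bit from center to edge), but it shows why FHD at 0.6 m has visible SDE and 4K does not.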

About the illumination


In progress


2.- Looking at a VR headset.

VR headsets usually have a screen derived from a mobile phone panel, with some advanced features such as a high refresh rate, fast pixel switching, reduced ghosting, etc.

Chapter 3: The FOV is overvalued.

In progress


Cheers,
Neo

24 Likes

Reserved 1

Reserved 2

Reserved 3

Glad you turned up :slight_smile: Do you have confidence in Kopin’s displays?

2 Likes

Ey! The ‘Shanghai Surprise’ joke was mine!

2 Likes

Finally someone has put it in the right perspective the easy way: what I was trying to point out all summer long, that it is useless to have such wide FOV angles.
Nice to have it come from a tester, thank you my friend.
nice to have it come from a tester, thank you my friend.

1 Like

I’m not sure I follow the logic here. Isn’t saying the big FOV angles in the headset are useless the same as saying that in real life you would forgo your FOV because it’s not useful enough? I wear glasses, and I certainly wouldn’t want to lose the FOV outside my glasses just because it’s blurry.

3 Likes

I love this @Neoskynet you have a lot of useful knowledge, great idea to share it so that backers also will get an idea what to look for in the Pimax 8k. If they understand the tests you did, you’re not breaking any NDA, yet you’re giving them a good idea what they could look for themselves. And it might be the start of a very interesting discussion.

3 Likes

I apologize immediately that I haven‘t read your post carefully, but this sentence jumped out at me. Why would the Pimax screen have to show any part blurry, just because we humans have blurry peripheral vision?

A tree standing at 90 degrees to me is as detailed and sharp as ever, I only see it blurred. The same should apply to the sharp image on the peripheral part of the HMD screen, no?

The only reason why we wish it would be blurry when we don‘t gaze at it is to allow for less GPU usage.

And another point I would like to add: I see that you are very concerned about visual quality and would like to use the device for watching movies. Although I would like to use it for that purpose too, I think we should be careful not to expect the 8K to do everything perfectly for all purposes.
Keep in mind that Oculus and HTC haven‘t been brilliant, although they are much bigger companies, and all of the competitors like StarVR, Xtal etc. also have a number of downsides. So it appears to be more difficult to create the perfect 2nd-gen HMD than many of us would believe.

If the 8K is a clear improvement in a number of aspects over the current generation, please consider letting go of the wish-list when it becomes clear that Pimax won‘t do it, and take a fresh look at whether the 8K, while not perfect, may still be quite a good headset after all.

At the moment we get the impression that Pimax believes the 8K is good to go save for a bit of software work, while the testers believe it is far from ready and, if offered as-is, will not be enjoyed by the majority of backers, who will be coming from Rifts and Vives.

2 Likes

Correct, it doesn’t have to be blurry, yet it would be a waste of rendering resources to display it in full detail. However, I doubt that foveated rendering will become a reality anytime soon.

I think the point Neo is going to make is regarding the lenses though. Let’s see.

Your "understanding human FOV" experiment is a great example of how people overrate their ability to see things in wide-FOV peripheral vision. As we have already discussed, peripheral vision is 95% for awareness, an increased sense of speed and increased immersion, not for reading text or distinguishing tiny objects at 70-80+ degrees from the center point. We never do it in real life, and we will definitely not do it in VR, whether or not full clarity over a 160+ degree FOV is possible in VR.

Of course, there are exceptions. Such as people with mutant homo-cameleon eyes :slight_smile: But I’m not one of them


8 Likes

You don’t see all that much in focus even when looking straight ahead.
Make this full screen and you will see what I mean.

4 Likes

Exactly! Great example :slight_smile:

I would say, in terms of awareness, you CAN see things moving in the far periphery (above 150-160 degrees), for example opponents moving at a distance in Onward or Pavlov. But you can’t really distinguish what it is until you turn your head and look at it.

1 Like

I am sure there are some ups and some downs here, as everywhere else.

Seems to me that what is being pointed out with the final few points is not so much that the periphery is blurry, but that the accommodation reflex happens in the binocular area, because that’s where you have convergence.

Now; There are two matters that may be worth remembering here:

First, that with current HMDs the focal distance is fixed, so we are suppressing our accommodation anyway.

Second, that even though our foveas can never reach past a certain angle, which means anything beyond will forever be stuck in low-cone-density land, there is also the temporal sampling aspect - we will take up detail from how cones transition, as imagery moves across the retina, so how well the image out there works, depends on how much aliasing it exhibits, regardless of how blurry it is.

1 Like

Doesn’t this explain why there is motion sickness in VR? The fact that we don’t have a virtual nose? Without the nose, the eyes have nothing to guide them and anchor perception. The nose helps create a sense of self-awareness?

1 Like

yup, no point in having the outer 20-30 degrees on each side in full resolution; it’s an ideal target for fixed foveated rendering and performance savings. That said, that area is important for motion detection, so I wouldn’t say it is entirely unimportant, and at the end of the day it is your FOV and contributes to the feeling of not having blinkers on.

1 Like

It doesn’t HAVE to, but it would be useful for foveated rendering, where the outer edges of your vision are rendered at a lower resolution. Your peripheral vision is highly sensitive to motion, so you want to avoid aliased edge “crawling”. Blurring those areas is a simple way of fixing that.

2 Likes

“That’s exactly what Pimax 8K lenses and screens would have to reproduce, because it would be like reality.”

Reality is not blurry; your perception is. Full resolution everywhere would be reality, but what we want is to save GPU resources, so we should avoid rendering at full res something we won’t be able to see at that res anyway.

I tried to explain it several times on the forum (here, for instance: http://community.openmr.ai/t/pimax-will-the-eye-tracking-module-severely-limit-fov-as-much-as-the-latest-photo-s-prototype-show/6258/18)

We can save a lot of resources if we don’t render the 30° on each side at full res (I don’t know how much that would be vertically).

I think VRgineers did just that with their XTAL headset.

Why would the Pimax screen have to show any part blurry, because we humans have blurry peripheral vision

No, they don’t have to show it blurry; you don’t have to blur something you are going to see blurry anyway.

What we want is those parts rendered at a lower res to save resources.

Oculus is working on a varifocal system that reproduces depth-of-field blur to add comfort and realism, but that occurs in the binocular region, where things should be rendered at full res, and it involves foveated rendering and dynamic multi-resolution. Here we are talking about fixed rendering (or "fixed foveated rendering", as they call it on the Oculus Go) of the monocular peripheral vision.
Not a big deal for a 100° headset, but for our Pimax it is.
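To put a hypothetical number on the savings: assuming a 170° horizontal FOV with the outer 30° per side rendered at quarter resolution (half in each axis), the fill-rate cost drops to roughly three quarters. These figures are illustrative assumptions, not Pimax specs:

```python
# Sketch of the fill-rate saved by fixed foveated rendering.
# All numbers are illustrative assumptions, not Pimax specifications.
fov_deg = 170          # assumed horizontal FOV of the headset
outer_deg = 30         # low-res band on each side
low_res_cost = 0.25    # quarter resolution = half res in each axis

full_frac = (fov_deg - 2 * outer_deg) / fov_deg   # rendered at full res
outer_frac = 2 * outer_deg / fov_deg              # rendered cheaply
relative_cost = full_frac + outer_frac * low_res_cost

print(f"relative fill-rate cost: {relative_cost:.0%}")  # ~74%
```

The model treats cost as proportional to horizontal angular coverage only; real savings also depend on the vertical extent of the low-res bands, which, as noted above, is harder to pin down.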

1 Like