The Problem with VR

Hi @PimaxVR ,

I have a feature request for the next VR headset/minor update after* the production release.

Have you ever noticed that everything you see in VR is in focus? That’s because the display panel / lens is positioned just right for each eye to be in focus. That’s not right. Not everything should be in focus. And that’s only for people who have 20/20 vision. If you don’t have 20/20 vision then you kinda have to make do with what the headset provides, or wear your glasses. My glasses won’t fit in my Vive, and I refuse to buy those lens inserts. The headset should have independent focus for each lens/panel. But I digress.

Close one eye. Position a pencil tip just in front of the open eye, just close enough that you can focus on it, and line up the tip so that it’s level with something in the distance. Now, with the open eye, focus on the pencil tip. Notice that things in the distance are out of focus? Now focus on the things in the distance. Notice that the pencil tip is out of focus? In the current implementation of VR, both the pencil tip and object in the distance are in focus when you’re viewing it with one eye. I don’t know about you, but this has always irked me about VR and takes a massive suspension of disbelief to get me immersed.

How do we fix this? Well, we could create a revolutionary display that contains miniature, flexible lenses in front of each pixel. Each lens could bend or straighten, changing the focal point of its pixel and creating a three-dimensional effect, complete with blurriness for anything outside the viewer’s focal point. I kid, of course. This type of technology is years in the future, if it’s even possible.

Or, along with eye tracking, we can observe the cornea of each eye and measure its flex. This would have to be calibrated, of course. That information can be passed back to the rendering software, which then blurs objects that are not in focus, as appropriate.
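As a minimal sketch of that blurring step, assuming a simplified thin-lens model (the aperture and focal-length numbers below are illustrative stand-ins, not measured eye parameters):

```python
def blur_radius(object_depth, focus_depth, aperture=0.004, focal_length=0.017):
    """Circle-of-confusion size (thin-lens approximation), in metres.

    object_depth / focus_depth are distances in metres; aperture and
    focal_length roughly mimic a human pupil and eye, but are illustrative
    assumptions, not measured values.
    """
    # Blur grows with the difference between the reciprocal depths:
    # exactly in focus -> 0, far from the focal plane -> larger radius.
    return aperture * focal_length * abs(1.0 / focus_depth - 1.0 / object_depth)
```

A renderer would feed the measured accommodation distance in as `focus_depth` and blur each fragment by this radius in a post-processing pass.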

Again, I can only speak for myself, but this would greatly aid the effort in suspension of disbelief and better contribute to a believable virtual reality environment.

What do you think? Is this even possible?

Can anyone else think of little things like these that can benefit virtual reality?

Thanks for your consideration.

EDIT: I changed “to” to “after” the production release because I don’t want people getting up in arms about adding another feature to the current planned production. x.x

This kind of stuff is not easy, to say the least. Right now the closest we are to variable focus is a patent by Oculus: Oculus Research to Present Focal Surface Display Discovery at SIGGRAPH


I see you have a LOT of belief in Pimax. Once they have resolved that one, I’d like them to tackle world peace and make every child happy all the time. Not too much to ask, is it? After all, I am a backer!

Okay, on a more serious note, this is a way more complex task than what you can expect them to be a forerunner on, if you ask me.

This is the one stand-out feature which is said to set the mysterious Magic Leap goggles apart from the rest of the players. They certainly are making a big deal out of it, but due to their non-communication policy it isn’t clear yet what they have achieved and whether it delivers the big wow factor everybody is expecting from it.

Oculus are also working on it, and if I recall correctly, even the CV2, if released in 2019/2020, would not necessarily have such capabilities yet; it is rather long-shot research they are conducting.

Here’s the article from RTVR on Oculus’ efforts:


Yeah, I’ve been racking my brain trying to figure out how to accomplish this. Specifically my first suggestion. I’m a vanguard thinker. :smiley:

I was thinking of a transparent material whose refractive index changes with an electric charge. Pair this with a pixel and you now have a 3D display that doesn’t require glasses.

I found the below link. Physics is so cool.

Kerr Effect
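For a sense of scale, the index change induced by the (DC) Kerr effect is Δn = λ·K·E². Plugging in the textbook Kerr constant for nitrobenzene and an arbitrary 1 MV/m field (both purely illustrative numbers, not a proposal for a real panel):

```python
def kerr_delta_n(wavelength, kerr_constant, field):
    """Kerr-effect change in refractive index: delta_n = lambda * K * E^2.

    wavelength in metres, kerr_constant in m/V^2, field in V/m.
    """
    return wavelength * kerr_constant * field ** 2

# 589 nm light, nitrobenzene's Kerr constant (~4.4e-12 m/V^2), 1 MV/m field
dn = kerr_delta_n(589e-9, 4.4e-12, 1e6)  # ~2.6e-6, a tiny index change
```

Even a strong field only shifts the index in the sixth decimal place, which hints at why per-pixel Kerr lenses would be a hard road.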

Wouldn’t it be easier to use eye tracking to figure out what object in the game they’re looking at and apply depth of field blurring based on that, rather than minute changes in the eye or some kind of crazy array of microscopic variable lenses?
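That gaze-raycast idea could look something like the sketch below: intersect the gaze ray with scene geometry (a toy scene of spheres here) and use the hit distance as the focal plane for depth-of-field post-processing. None of these names come from a real engine API.

```python
import math

def gaze_focus_depth(eye_pos, gaze_dir, spheres, default=10.0):
    """Distance to the nearest sphere hit by a (normalized) gaze ray.

    spheres: list of ((cx, cy, cz), radius) tuples; 'default' is the focal
    distance to fall back on when the gaze hits nothing. Illustrative only.
    """
    best = default
    for (cx, cy, cz), r in spheres:
        ox, oy, oz = cx - eye_pos[0], cy - eye_pos[1], cz - eye_pos[2]
        t = ox * gaze_dir[0] + oy * gaze_dir[1] + oz * gaze_dir[2]  # projection onto ray
        if t <= 0:
            continue  # sphere is behind the eye
        d2 = ox * ox + oy * oy + oz * oz - t * t  # squared ray-to-center distance
        if d2 <= r * r:
            hit = t - math.sqrt(r * r - d2)  # distance to the near intersection
            best = min(best, hit)
    return best
```

The returned depth would then drive an ordinary depth-of-field blur pass, no cornea measurement required.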


I thought about that, but is focus independent of what the eyes are looking at? And if we can determine focus, the headset can automatically adjust IPD and lens correction for imperfect eyes, per eye. How cool would that be?!

It would, but then you really are just faking the thing about letting the eye focus on the object.

Yes, the artificial blurring would be fake, but it’s better than what we have now. Doesn’t it irk you that everything is in focus?

Since the sweet spot (where it’s in focus) is already so small on the Vive due to its lenses, I can’t say yes or no, as it seemingly is fine right now, but I don’t know what the Pimax will be like.

It would be cool if the lenses could move to keep the sweet spot in the center of vision. That’s a lot of movement, though, and power is a major concern.

For example:

Put the headset on, straight, slanted, or other. The headset automatically sets the IPD and continually moves the lenses so that you always have the sweet spot in your center of vision. What would it feel like having the lenses moving like this?

Artificially blurring parts of the image would be hugely annoying without eye tracking as one could look at blurry parts that never come into focus. Why on earth would that be a good thing? For any of the games I’ll be playing, I need to be able to look around with my eyes, not my head, and have targets instantly in focus in order to react correctly. That’s about as instant as I can tell in real life. I wouldn’t want a computer slowing that process down.

Besides that, once things are a certain distance away, they all appear in focus anyway, like a photo lens’s “infinity” focal distance. In fact, just looking about my office, it ‘appears’ that everything beyond 5 or 6 feet is in focus. That’s pretty close.

Of course it would have eye tracking! It wouldn’t make any sense otherwise. It’s eye tracking coupled with monitoring of the cornea to detect refraction degree.

As far as things far away appearing in focus anyway, do the pencil test. You’ll notice that while you’re looking at the tip of the pencil the objects in the distance are in fact blurred.

I just watched a video of Tobii eye tracking at CES 2018. They do exactly what you propose. And others have already done it (like FOVE, the one and only HMD so far with included eye tracking). There is also an add-on for the Vive to add eye tracking, which can do this sort of thing too. I also think that varying focus is important and absolutely necessary for “VR 2.0”, but it comes automatically with eye tracking. So hopefully Pimax fixes the flaws of V5, and then adding the eye-tracking module will be the easy part, because the libraries all already exist. I am pretty certain that at the end of 2018 we will all see VR 2.0 releases marketed primarily on the benefits of included eye tracking, also because Pimax will be the first to offer it (Pimax, do not miss this opportunity!). Foveated rendering and variable focus are already a game-changer with existing Tobii hardware, especially in interactions with NPCs or other VR multiplayer…

Does it automatically come with eye tracking, though? I haven’t seen any practical implementation.

And other than blurring out of focus things, can you think of anything else that can enhance the VR image?

The kickstarter backers receive a free eye-tracking module. :blush:

Watch Tobii’s videos on their YouTube channel. There are quite a few benefits. Besides that, imho the biggest benefit will be that NPCs will react to you when you look at them (immersion), plus the fact that you can increase the quality settings because you save roughly 30-50% with foveated rendering, where you render only the center portion at full quality and gradually decrease it from there. Hell, you could save even more by rendering the periphery in black and white, because there your eye cannot detect colors… we will see a lot of optimization there.
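That 30-50% figure is easy to sanity-check with a toy two-zone model. The fovea radius and the peripheral resolution scale below are assumptions for illustration, not Tobii’s numbers:

```python
import math

def foveated_pixel_fraction(width, height, fovea_radius_frac=0.25,
                            periphery_scale=0.7):
    """Fraction of shading work left after a simple two-zone foveation.

    fovea_radius_frac: fovea radius as a fraction of image height (assumed).
    periphery_scale: linear resolution scale outside the fovea (assumed).
    """
    total = width * height
    fovea = min(math.pi * (fovea_radius_frac * height) ** 2, total)  # full-res disc
    periphery = (total - fovea) * periphery_scale ** 2  # shaded at reduced res
    return (fovea + periphery) / total
```

With a 1440x1600 per-eye buffer, these assumptions leave roughly 60% of the shading work, i.e. about a 40% saving, right in the quoted range.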

It’s called “eye accommodation”, and Magic Leap and the Avegant Glyph do it using “light field displays”.

Know what else irks me about VR? The black space on the periphery. Yes, the Pimax 8K has made great strides in eliminating this with its vast FOV, but it’s still there. Why can’t there be some type of lighting that extends the colors at the edges of the panels? The brain will make up the rest, and it’ll look like a greater FOV.

I actually like everything being in focus with current resolution limitations. Everything being in focus is the least of my problems with VR. I’m sure in the long run realistic focus will be a boost to presence and immersion, but it’s not something I consciously miss in this non-photorealistic era. FOV is a far bigger concern for me. Until we get much higher resolution graphics or move beyond pixel-based displays, focus will remain much lower on my personal list of concerns.

I would be happy just with a VR unit that can be manually focused.
Tell me if you know of any that can do this and are affordable.