No, it doesn't need to be perfect for VR, but then the question becomes whether the benefits outweigh the cost.
As an example, rendering high detail in the center and less in the periphery is already done by NVIDIA VRWorks, without the computational cost added by an eye tracker.
True, but would the computational cost of the eye tracker be handled by the CPU or the GPU? If it were the CPU, I'm pretty sure that wouldn't be a problem…
I just think that if it's possible to save even a bit of GPU power with eye tracking, while also adding depth to gameplay and socializing in VR, it would be worth having when the headsets come out next year.
From the VR eye-tracking demos I've seen so far, it seems like that much should be possible soon. But I could be wrong.
Surely if the eye tracking has the precision for menu navigation and button clicking, then it's good enough for foveated rendering. Like the previous poster stated, just use a slightly larger area to absorb any small errors. That has to be better than rendering the whole panel at 8K.
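The "slightly larger area" idea can be sketched as: pad the full-resolution foveal region by the tracker's worst-case error, plus how far the eye could travel during one frame of latency, so the true gaze point always lands inside it. A minimal Python sketch; all the function names and numbers here (10° fovea, 1° tracker error, 300°/s saccade cap) are illustrative placeholders, not real headset or tracker specs.

```python
import math

def foveal_radius_deg(base_radius_deg, tracker_error_deg, latency_s, max_eye_speed_deg_s):
    """Pad the full-resolution region by worst-case tracker error
    plus the distance the eye can travel during one frame of latency."""
    return base_radius_deg + tracker_error_deg + latency_s * max_eye_speed_deg_s

def shading_rate(pixel_deg, gaze_deg, fovea_radius_deg):
    """Full resolution inside the padded fovea, quarter resolution outside."""
    dx = pixel_deg[0] - gaze_deg[0]
    dy = pixel_deg[1] - gaze_deg[1]
    return 1.0 if math.hypot(dx, dy) <= fovea_radius_deg else 0.25

# Placeholder numbers: 10° base fovea, 1° tracker error, 11 ms frame, 300°/s saccades
r = foveal_radius_deg(10.0, 1.0, 0.011, 300.0)
print(r)                                # padded radius in degrees (~14.3)
print(shading_rate((0.0, 0.0), (0.0, 0.0), r))   # inside fovea -> 1.0
print(shading_rate((30.0, 0.0), (0.0, 0.0), r))  # periphery -> 0.25
```

Even a generous error margin like this still leaves the vast majority of the panel rendering at reduced rate, which is where the GPU savings come from.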
There is a company, AdHawk Microsystems, making tiny sensors that don't use cameras or take processing power from the host computer; they are shipping kits to enterprise customers.
They can predict the position of the eye 50 milliseconds in advance.
I wonder how feasible it would be to skip the eye tracking and perform foveated rendering only in the center. Because of the blur (rings) from the Fresnel lenses in the Vive, I'm already used to moving my head to look at things in VR. Could something like that help? Would it even work?
It seems to be working, and the system is very accurate: it samples the eye thousands of times per second (not hundreds), capturing even the saccadic movements. The prediction is only a bonus.
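The thread doesn't say how AdHawk's 50 ms look-ahead actually works, but one common way to anticipate gaze from a kHz-rate tracker is simple linear extrapolation from the most recent samples. A hypothetical sketch, with made-up sample values:

```python
def predict_gaze(samples, lookahead_s):
    """Linearly extrapolate gaze position `lookahead_s` seconds ahead
    from the last two (t, x, y) samples of a high-rate eye tracker."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * lookahead_s, y1 + vy * lookahead_s

# Two samples 1 ms apart (a ~1 kHz tracker), eye sweeping +100°/s in x:
samples = [(0.000, 0.0, 0.0), (0.001, 0.1, 0.0)]
print(predict_gaze(samples, 0.050))  # 50 ms ahead, roughly (5.1, 0.0)
```

With thousands of samples per second even a crude extrapolation like this has very fresh velocity data to work from, which is why the high sample rate matters more than the prediction itself.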
Regarding eye tracking, I don't want to advertise, but I own and use the Tobii Eye Tracker 4C.
And I must say it works very well. I can do "everything" without a mouse, and I also get a lot of kills in FPS games. If Pimax used that kind of sensor, I would be very happy.
I wish there was an easy way to see who has backed the Kickstarter, so we'd know who to blame if we don't reach $3M. Personally, I'd rather receive the eye tracking as part of the stretch goal without having to pay for it separately, regardless of whether the technology is mature, than wait another five years for a mature product. Negative nancies need to cough up or shut up.
I think they are still a ways away from that unless they have finalized deals with their partners. They would need to lead with updated controllers first.
To be honest, I don't want another YouTuber telling us how great it is and how deeply immersed he was. I want it reviewed by someone who understands the technology and can comment on things like FPS jitter, lens warping, supersampling and rendering resolution, tracking accuracy, and the overall configuration requirements for the games I'd like to play (and that's not Fruit Ninja), and who can challenge Pimax about any features and issues he observes.