Eye Tracking Completed

G’day Pimaxians,

I noticed @mixedrealityTV has a video testing the Pimax Eye Tracking module (here: CES 2019: Trying Out The Pimax Eye Tracking Module! - YouTube ) and he said it’s complete. Do we have a realistic projection as to when backers will receive theirs?

I’m at work, so I’ll admit watching the video was a little interrupted, but I’m fairly sure Seb says that although it’s a finished product, they don’t want to release it until there’s “more content.” I have to ask what that has to do with it, as it’s a stretch goal - don’t we already own one? It looks interesting and I’m looking forward to using it in a Virtual Desktop type environment…

As a big-nose owner I’m not looking forward to moving the Pimax even further away from my face to account for it (the lenses already sit on my nose), but as a gadget it looks fun to tinker with.

I’m not sure what Pimax person to tag to this post to get an official reply so … @Matthew.Xu @PimaxVR @deletedpimaxrep1

9 Likes

I tried a Tobii demo last year and I was super thrilled with the simplest of demos, where I could see my avatar in a mirror really looking back at me with its eyes. That alone made me decide I wanted it; I don’t care that there is no other content yet, haha.

5 Likes

I’m excited too, but I can see why they wouldn’t want to release it yet. Finish letting Nvidia and SteamVR create foveated rendering, work out any unforeseen issues with the tech, and improve the hardware where necessary.

Release it to us when it’s ready. Their only competition on this front is HTC.

5 Likes

I can see why they wouldn’t want to release it to the public - selling them something they can’t plug in and use - but backers already own one. Release it and the API, and I’ll work on my own content - this is an Enthusiast HMD, after all.

9 Likes

I 100% agree. As backers we are a different breed; we want the best stuff NOW, please :slight_smile:

8 Likes

You all sound like babies.

1 Like

While I kinda get your point, given the amount of rubbish on this forum, it does seem a bit silly/strange to not release modules purely for random reasons. If they are ready, then send them to devs in the community who can start doing something with them, etc… The more people that have this tech, the better. It’s not going to help anyone to hold back here.

6 Likes

I think you’re reading my post with your own bias and tone. Projecting, maybe. I’m just asking a valid question about a product that has been created and that I already own. I’m not being impatient or demanding - I just found out one of the stretch goals has been completed as a product, and I’m asking what happens next and when.

1 Like

I have no idea what your post has to do with mine.

You’re right - I misread your post

1 Like

Who’s crying here? We’re not complaining and whining; we’re saying that Pimax waiting for more content before releasing a finished product is something we’d rather not see.

I see your point about Pimax wanting to wait until some cool stuff is developed so they can hitch a ride on that commercially, but as backers we get a front-row seat, since we helped create the company with our money and we did it for the rewards. Let retail wait, not us.

5 Likes

As much as I’d like to fiddle with eye tracking ASAP, I totally get that Pimax does not want to release it now, even to backers. It does them no good to release it with almost no content. People will bitch about it and make it less interesting for non-backers with negative input that will last even if they release cool software later. I heard that it’s going to take another three months.

1 Like

I thought only people like me who are living without electricity and internet (using candles instead of computers) don’t know that eye tracking is the next biggest thing in VR

1 Like

The hardware might be ready, but the SDK (I analysed the Unity SDK) is currently very rudimentary. It only provides x and y coordinates for both eyes. If Pimax wants useful content using eye tracking, the SDK should be enriched a lot. I posted this in the non-public Development section of the forum yesterday, but I think I should post it here as well so forum members have an idea of what is currently missing and would slow down content development.

  1. Multi-user profiles (should also be available in PiTool)
  2. Calculation of a 3D gaze cursor
  3. Access to the IR camera images
  4. Eye openness indicators (binary, or better, float values from 0.0 to 1.0)
  5. Blink detection
  6. Saccade detection/filtering
  7. Heatmap tools
  8. Pupil sizes

Also be aware that there is currently no established eye tracking standard; every company has its own SDK. I developed with the Fove 0 in the past, so I have some dev experience with eye tracking. A rough sketch of what I mean by the 3D gaze cursor calculation follows below.
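For illustration, here is a minimal sketch (plain Python, not the actual Pimax or Unity SDK) of how the per-eye x/y values could be combined into a 3D gaze cursor. The FOV mapping, IPD and coordinate conventions are assumptions made up for the example, not Pimax’s real parameters:

```python
import numpy as np

# Assumed conventions (NOT from the Pimax SDK): each eye reports a normalized
# gaze coordinate in [-1, 1], where (0, 0) is straight ahead, and we know the
# per-eye half-FOV and the IPD in metres.
HALF_FOV_X = np.radians(50.0)   # assumed horizontal half-FOV
HALF_FOV_Y = np.radians(45.0)   # assumed vertical half-FOV
IPD = 0.064                     # assumed inter-pupillary distance in metres

def gaze_dir(x, y):
    """Turn a normalized 2D gaze coordinate into a unit direction vector
    (right-handed, +Z forward) using the assumed FOV mapping."""
    d = np.array([np.tan(x * HALF_FOV_X), np.tan(y * HALF_FOV_Y), 1.0])
    return d / np.linalg.norm(d)

def gaze_cursor_3d(left_xy, right_xy):
    """Approximate 3D gaze point: midpoint of the closest points between the
    two eye rays (the rays rarely intersect exactly)."""
    o_l = np.array([-IPD / 2, 0.0, 0.0])     # left eye origin
    o_r = np.array([+IPD / 2, 0.0, 0.0])     # right eye origin
    d_l = gaze_dir(*left_xy)
    d_r = gaze_dir(*right_xy)

    # Solve for the ray parameters t_l, t_r that minimise the distance
    # between the points o_l + t_l*d_l and o_r + t_r*d_r.
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:                    # rays (nearly) parallel
        return None
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    return (o_l + t_l * d_l + o_r + t_r * d_r) / 2.0

# Example: both eyes converging slightly inward on a point ahead of the user.
print(gaze_cursor_3d((0.05, 0.0), (-0.05, 0.0)))
```

Taking the midpoint of the closest points between the two eye rays is a common trick because the rays almost never intersect exactly; a real SDK would also want filtering (saccade/blink handling) on top of this.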

8 Likes

Thanks for sharing this. I think it is very interesting to witness the step-by-step progress. Please feel free to post any news on this topic here as well; I appreciate it.

2 Likes

I respectfully disagree. No one will create eye-tracking content (games) until there is a large number of users with eye-trackers.

4 Likes

Awesome how people still poke @deletedpimaxrep1 even though she hasn’t been around for a few months now :revolving_hearts:

1 Like

There is something better to do with eye tracking: simulate the varifocal technology of Oculus, with a virtual IPD adjustment in real time.

Then you could read letters, or feel real depth in far objects, like Oculus wants to achieve with its varifocal technology.
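Just to make the idea concrete, here is a rough Python sketch of the kind of logic that could drive such a virtual IPD from eye tracking data. The symmetric-vergence model, distances and scaling curve are all assumptions for illustration; this is not how Oculus’ varifocal prototype or any Pimax software actually works:

```python
import math

REAL_IPD = 0.064   # assumed physical IPD in metres

def fixation_depth(vergence_angle_rad):
    """Estimate how far away the user is looking from the angle between the
    two gaze rays (simple symmetric-vergence model)."""
    if vergence_angle_rad <= 0:
        return float("inf")                  # eyes parallel -> looking at infinity
    return (REAL_IPD / 2) / math.tan(vergence_angle_rad / 2)

def virtual_ipd(depth_m, near=0.3, far=2.0, near_scale=0.8):
    """Map fixation depth to a rendering IPD: shrink the stereo separation a
    bit for very near fixation (easier reading), keep the real IPD beyond
    'far'. The scaling curve is an arbitrary illustration."""
    t = min(max((depth_m - near) / (far - near), 0.0), 1.0)
    scale = near_scale + (1.0 - near_scale) * t
    return REAL_IPD * scale

# Example: eyes converged by ~7.3 degrees -> fixation at roughly 0.5 m.
angle = 2 * math.atan((REAL_IPD / 2) / 0.5)
d = fixation_depth(angle)
print(round(d, 3), round(virtual_ipd(d), 4))
```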

2 Likes

Let them keep it until content comes out. This way, if any tweaks or hardware adjustments need to be done, they will get done before it’s in the public’s hands.

1 Like

Foveated rendering will be amazing. I hope it will be a universal system that every headset’s drivers can implement and that is easy to integrate into game code.

I also wonder whether foveated rendering will be based on deep learning algorithms, as shown by Michael Abrash. In that case, I wonder if the Tensor cores in the RTX GPUs will be of benefit. Perhaps I wonder too much. But the future of VR in general is very bright this year.
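As a rough illustration of what “easy to integrate into game code” might look like, here is a minimal sketch of gaze-driven shading-rate selection. The zone sizes and rate divisors are invented for the example and do not correspond to any shipping driver or API:

```python
import math

# Conceptual sketch only: pick a coarser shading rate the further a screen
# region is (in visual angle) from the tracked gaze point. The thresholds
# below are made-up numbers for illustration.
FOVEA_DEG = 5.0     # full-quality region around the gaze point
MID_DEG = 15.0      # reduced-quality ring

def shading_rate(region_dir, gaze_dir):
    """Return a (width, height) shading-rate divisor for a screen region,
    given unit direction vectors of the region centre and the gaze point."""
    cos_ecc = max(-1.0, min(1.0, sum(a * b for a, b in zip(region_dir, gaze_dir))))
    ecc_deg = math.degrees(math.acos(cos_ecc))
    if ecc_deg < FOVEA_DEG:
        return (1, 1)    # full-resolution shading
    if ecc_deg < MID_DEG:
        return (2, 2)    # a quarter of the shading work
    return (4, 4)        # heavily reduced periphery

# Example: a region ~20 degrees off-gaze gets the coarsest rate.
gaze = (0.0, 0.0, 1.0)
region = (math.sin(math.radians(20)), 0.0, math.cos(math.radians(20)))
print(shading_rate(region, gaze))
```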