Pimax could use the hand tracking to allow the user to see their keyboard and mouse. (Idea)

So, Pimax has now created their own Pimax experience software. I was thinking of Virtual Desktop and thought of a fantastic idea for the hand tracking. This would require Pimax to have their own sort of Virtual Desktop in the Pimax Experience. If you’ve ever used Virtual Desktop, one thing you may have noticed is that typing and moving your mouse can be difficult since you can’t see them when you’re in VR.

The idea is simple. We use the hand tracking to create a virtual equivalent of your table, keyboard, and mouse. Sounds difficult, right? Well, maybe not. There are only a few prerequisites. The user must know the size of their letter keys, the length of their spacebar, and the length and width of their keyboard. These numbers can easily be acquired with a ruler and are entered at the start of setup.

First, we have the user place their hands flat on the desk where they will be using their mouse and keyboard. This step lets the software determine the level at which the mouse and keyboard sit, similar to setting the floor height in room setup. Then the user touches the front edge of their desk, which tells the software where the edge of the desk is. Next, the user types out a short paragraph on their real keyboard. This paragraph contains every letter of the alphabet, the digits 0-9, and some punctuation marks. Finally, for the remaining keys, we can simply have the user press keys like Shift or the brackets. From all of this, we get the general layout of where the keys are located on the user's keyboard.
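As a rough sketch of how that typing calibration could work, here's a hypothetical Python snippet that averages the fingertip position recorded for each keypress. The event format, and the hand-tracking and keyboard-hook APIs that would produce it, are my assumptions, not anything Pimax has announced:

```python
# Hypothetical sketch: estimate where each key sits by averaging the
# fingertip positions observed every time that key was pressed while
# the user typed the calibration paragraph.
from collections import defaultdict

def estimate_key_positions(events):
    """events: list of (key_char, (x, y, z)) pairs recorded during
    calibration. Returns the mean fingertip position per key."""
    samples = defaultdict(list)
    for key, pos in events:
        samples[key].append(pos)
    return {
        key: tuple(sum(axis) / len(axis) for axis in zip(*positions))
        for key, positions in samples.items()
    }
```

Repeated presses of the same key average out hand-tracking jitter, which is why the setup asks for a whole paragraph rather than one press per key.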

Next, Pimax's software would create a simple rectangular 3D keyboard model with the same key layout and dimensions as the user's keyboard. That model would then be placed in the same location where the user originally typed the paragraph. The final result is a basic but fairly accurate model of the user's keyboard, positioned in VR exactly where the real keyboard sits.
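Generating that rectangular model could be as simple as laying the measured keys out on a grid. A minimal, purely hypothetical sketch (the row strings and the 19 mm key pitch here are stand-ins; the real values would come from the user's ruler measurements and the calibration data):

```python
# Hypothetical sketch: lay out a simple rectangular key grid from the
# user's measured key size. Real keyboards have staggered rows and
# wide keys; this ignores that for the sake of the basic model.
def build_key_grid(key_size, rows):
    """rows: list of strings, one per keyboard row. Returns a dict
    mapping each key to its (x, y) top-left corner on the model,
    in the same units as key_size (e.g. millimetres)."""
    layout = {}
    for row_idx, row in enumerate(rows):
        for col_idx, key in enumerate(row):
            layout[key] = (col_idx * key_size, row_idx * key_size)
    return layout

# Example: a 19 mm key pitch and three letter rows.
keys = build_key_grid(19.0, ["qwertyuiop", "asdfghjkl", "zxcvbnm"])
```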

For the mouse, setup is even easier. All the user has to do is place their hand on their mouse and click the left and right mouse buttons. We have the user do this in several spots on each button to get a general sense of how big the left and right mouse buttons are. Finally, a basic 3D model of the user's mouse is placed on the virtual table. The clicking step is important because it not only determines the size of the mouse buttons, but also captures crucial information about how the user holds their mouse.
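Sizing the buttons from those clicks could boil down to taking a bounding box over the sampled fingertip positions. A minimal sketch with made-up coordinate data:

```python
# Hypothetical sketch: estimate a mouse button's size from the spread
# of fingertip positions sampled as the user clicked different spots.
def button_extent(click_points):
    """click_points: (x, y) fingertip positions for one button.
    Returns (width, height) of the axis-aligned bounding box."""
    xs = [p[0] for p in click_points]
    ys = [p[1] for p in click_points]
    return (max(xs) - min(xs), max(ys) - min(ys))
```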

After setup, the user is left with a virtual table, keyboard, and mouse. Since the software has seen how the user grips their mouse, we can do a sort of "fake positional tracking". When the user's hand and finger positions are similar to when they gripped the mouse during setup, we can assume the user is holding the mouse. And since grip doesn't change much while moving a mouse, we can positionally track the mouse by tracking where the user moves their hand. Pretty cool, right? We are left with a keyboard that matches the basic layout of the user's keyboard, as well as a generally accurate, positionally tracked representation of the user's mouse.
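The "fake positional tracking" check could amount to comparing the current fingertip positions (say, relative to the palm) against the grip pose recorded during setup. A minimal sketch, assuming poses arrive as lists of 3D fingertip points; the 1 cm threshold is a made-up number:

```python
# Hypothetical sketch: decide whether the user is holding the mouse by
# comparing the current hand pose to the grip pose saved during setup.
import math

def is_gripping(current_pose, reference_pose, threshold=0.01):
    """current_pose / reference_pose: lists of (x, y, z) fingertip
    positions relative to the palm, in metres. Returns True if every
    fingertip is within `threshold` of its reference position."""
    return all(
        math.dist(cur, ref) <= threshold
        for cur, ref in zip(current_pose, reference_pose)
    )
```

While `is_gripping` returns True, the virtual mouse model would simply follow the tracked hand position, which is what makes the mouse appear positionally tracked without any puck attached to it.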

This is a little complicated, but the final result is exactly what people have wanted for a long time. Plus, it's all done without tracking pucks or special keyboards. Virtually any normal keyboard and mouse can be used, regardless of the specific layout. This could be a surprisingly killer feature of Pimax's hand tracking module. I might even consider buying one if Pimax could pull this off.


Incredible idea @arminelec @SweViver, you will have people rushing out to get a Pimax Experience! There may be a small problem when or if the keyboard moves, but that could be corrected on keystrokes. Sign me up, please and thank you!


Chances are the Leap module cameras have a good enough view of the keyboard and mouse that you wouldn't need to do any teaching or substituting.

You could probably do reprojected raw passthrough, or render out the depth map, or automagically align your substitute models with your real world peripherals.

I take it the module comes only with two only-near-IR-filtered cameras, and not the supplemental colour camera that the Kinect and the Dragonfly had?


As long as your keyboard is stable and doesn't move much, it should be all good. Good idea about using keystrokes to correct the positioning.


They are infrared, I believe. Maybe they can see keyboards that have LED keys? That would be even better.


Thank you for sharing your idea @shinytomb and thoroughly explaining it with such detail.

Although Pimax is not officially funding/supporting this project yet, I am hopeful that at some point they will understand the true power @SweViver and I are bringing to the VR world, using their headset and the PE.

Nevertheless, we had a similar idea/concept to yours for a virtual mouse and virtual keyboard. We planned to work on it after the virtual desktop feature is ready and robust. Although our approach to setting up the environment is simpler than yours, I have attached your idea to the ticket in the backlog. When the time comes to work on it, we will see what can and cannot be done, considering all the technical possibilities as well as the limitations.

Thank you for making a separate topic for it. It is always easier to read threads when they have their own proper title, rather than being merged with another topic as most of the forum's content is at the moment :slight_smile:

Regarding the module @jojon, yes it is IR only. The module that I currently have for development is not the final model that Pimax has/will include for users. Therefore, I cannot comment on the details of the module and its capabilities, limitations, performance, accuracy, etc. at this point in time.



Cool to hear! I’d love to see this idea implemented. :slight_smile:


The module projects structured light illumination (well, at least I assume it is still the PrimeSense Kinect v1 style technology, and not time-of-flight cameras), so no need - it should see everything in range that reflects the right wavelengths.

@arminelec, @SweViver: Nobody can fault you for lack of ambition. :slight_smile:


With the cameras being IR, it might be possible to use IR-reactive stickers, much along the same lines as the AntVR Cyclops roomscale carpet.


Well, IMHO, if Pimax doesn't realize and show some TRUE appreciation for how much their products benefit from your hard work, and give you guys all the TRUE support they possibly can, that would be simply ignorant and foolish. It took them (Pimax itself) so long to implement or correct even minor stuff in PiTool in the past; compared to that, you are developing Pimax Experience at the speed of light. We all know even the best hardware is worth nothing without capable and user-friendly software to drive it!

@PimaxUSA, as COO I hope you and the Pimax owner(s) give @SweViver and @arminelec the appropriate support and reward, as those employees are making Pimax look as good as the company might like to think it is, but has so often failed to be.


The Leap should work OK at detecting the keyboard and mouse, but if it doesn't, then we might need another Pimax module with stereo cameras, which would also make it Mixed Reality compatible.

