An optical head-tracking system (like OpenTrack with a cheap PS3 Eye camera) could be read by pimaxserver (OpenTrack can output its data as UDP packets, which can be read on any port) and used to deliver calibrated HMD angles and absolute 3D position, so the headset would appear to SteamVR as having full head tracking. The yaw calculated by OpenTrack could be the CALIBRATION SOURCE for the yaw in Pimax, giving near-perfect yaw tracking instead of the severe drift that happens now.
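To illustrate how little is involved on the receiving side, here is a minimal sketch of reading OpenTrack's "UDP over network" output. It assumes the common packet layout of six little-endian doubles (x, y, z in centimeters; yaw, pitch, roll in degrees) and OpenTrack's default port 4242; both should be checked against the OpenTrack version in use.

```python
import socket
import struct

# Assumption: opentrack's "UDP over network" output sends each pose as
# six little-endian doubles: x, y, z (cm), then yaw, pitch, roll (deg).
OPENTRACK_PORT = 4242  # opentrack's default output port (assumption)

def parse_opentrack_packet(packet: bytes) -> dict:
    """Decode one 48-byte opentrack UDP packet into a pose dict."""
    x, y, z, yaw, pitch, roll = struct.unpack("<6d", packet)
    return {"x": x, "y": y, "z": z,
            "yaw": yaw, "pitch": pitch, "roll": roll}

def read_poses(port: int = OPENTRACK_PORT):
    """Yield poses from opentrack as they arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(64)  # packets are 48 bytes
        if len(packet) == 48:
            yield parse_opentrack_packet(packet)
```

The decoding is just one `struct.unpack` call, so the cost on the pimaxserver side would be negligible.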
In my experience, the quality of the tracking data delivered by OpenTrack is very good, and it is free, open source, and well documented.
As a customer I think this would be a very big product improvement, and surely it would be for others too.
You only need a sticker on the device. Yes, not even LEDs.
But the tracking data has to be combined in the right way, so some coding is needed on the Pimax side to serve coherent data to the applications.
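One simple way to do that combining, sketched below as an illustration rather than a definitive design: keep using the HMD's gyro yaw for fast motion, and slowly pull a yaw offset toward the optical yaw from OpenTrack so the gyro drift is cancelled. The class name and the gain value are my own illustrative choices, not anything from Pimax or OpenTrack.

```python
def wrap_deg(angle: float) -> float:
    """Wrap an angle in degrees to the range (-180, 180]."""
    return (angle + 180.0) % 360.0 - 180.0

class YawCalibrator:
    """Complementary-filter sketch: the gyro yaw is responsive but
    drifts; the optical yaw is absolute but slower/noisier. We keep a
    drift-offset estimate and nudge it toward the optical reading."""

    def __init__(self, correction_gain: float = 0.02):
        self.offset = 0.0              # estimated gyro drift, degrees
        self.gain = correction_gain    # illustrative gain, tune per setup

    def update(self, imu_yaw: float, optical_yaw: float) -> float:
        # How far the corrected gyro yaw is from the optical "truth".
        error = wrap_deg(optical_yaw - (imu_yaw + self.offset))
        # Nudge the offset a little toward the optical yaw each frame.
        self.offset += self.gain * error
        return wrap_deg(imu_yaw + self.offset)
```

With a small gain the output stays smooth frame-to-frame but cannot drift away over minutes, which is exactly the failure mode described above.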
This is not going to provide full room-scale positioning, because the FOV of a single camera is limited, but for seated experiences it is good enough. I now understand that the ability to freely move your head in a simulation, even a few centimeters, adds a lot to the immersion factor.