My 8K+ (awaiting RMA/refund) sends OpenVR poses with all the velocities (linear/rotational) and accelerations set to 0. I assume all Pimax HMDs do the same…
It is normal to set the accelerations to 0 - if OpenVR takes them into account at all, the effect is minimal. However, it is not normal to send 0 for the velocities. OpenVR depends on receiving velocity data to perform pose prediction and with velocities set to 0, pose prediction is effectively disabled. This probably goes some way to explaining why HMD motion feels ‘laggy’ in Pimax headsets.
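To illustrate why zeroed velocities effectively disable prediction: the runtime extrapolates the pose forward by roughly the photon latency. A minimal constant-velocity sketch (the struct and function names here are illustrative, not the actual OpenVR runtime code; the real `vr::DriverPose_t` carries these fields as `vecPosition` / `vecVelocity`):

```cpp
#include <cassert>
#include <cmath>

// Illustrative stand-in for the position/velocity part of an OpenVR pose.
struct SimplePose {
    double position[3]; // metres
    double velocity[3]; // metres per second
};

// Constant-velocity extrapolation dt seconds into the future.
// With velocity zeroed out, the "predicted" pose degenerates to the
// stale last-known pose - i.e. no prediction at all.
SimplePose predictPose(const SimplePose& p, double dtSeconds) {
    SimplePose out = p;
    for (int i = 0; i < 3; ++i)
        out.position[i] = p.position[i] + p.velocity[i] * dtSeconds;
    return out;
}
```

For a head moving at 1 m/s and an 11 ms prediction interval, that is an 11 mm correction per frame that simply goes missing when the velocities are 0.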
Is there a reason that no velocity data is sent in the HMD poses? Is it a bug?
I am curious - do you have another HMD to compare against, to see what velocity data that HMD sends? Just because the API has fields does not mean they are functional or do anything.
But HTC HMDs don’t - and those are what Valve was focused on the most - it would be interesting to see whether the Index sends it. I wouldn’t be surprised if Valve’s lighthouse driver doesn’t care about this element.
WMR and Oculus use different tracking (not lighthouse based), so differences with those don’t necessarily count.
The HTC HMDs don’t send accelerations, which is entirely reasonable given the tiny effect (if any) they have on the pose prediction. However the Pimax is the only headset I’m aware of that doesn’t send velocities (the Index does), which have a very noticeable effect - OpenVR certainly does ‘care’ about the velocities…
The HMD pose as provided to OpenVR (i.e. the vr::DriverPose_t that gets passed to vr::IVRServerDriverHost::TrackedDevicePoseUpdated()). I’m doing some development that involves hooking that call to allow modification of the poses. For debugging, I often log and then plot position / velocity data, which makes it really obvious that there is no velocity data in the Pimax poses… There are just 0s where the velocities should be.
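The symptom in the logs is easy to test for. A sketch of the check, using a hypothetical stand-in struct that mirrors just the velocity fields of `vr::DriverPose_t` (the real struct also carries position, rotation and accelerations):

```cpp
#include <cassert>

// Minimal stand-in for the velocity fields of vr::DriverPose_t.
struct HookedPose {
    double vecVelocity[3];        // m/s
    double vecAngularVelocity[3]; // rad/s
};

// True if a pose carries no velocity information at all - the symptom
// seen in every logged Pimax pose, but not in Index poses.
bool velocitiesAreZero(const HookedPose& p) {
    for (int i = 0; i < 3; ++i)
        if (p.vecVelocity[i] != 0.0 || p.vecAngularVelocity[i] != 0.0)
            return false;
    return true;
}
```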
Yeah, this is definitely an issue. The workaround for my team has been to manually calculate velocity/acceleration from the positional data. Would be highly preferred if this data were populated by default, though, as I’m fairly sure it would be more accurate if pulled directly from a raw IMU sample.
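The workaround amounts to backward-differencing successive position samples. A minimal sketch (the `Vec3` type and function name are hypothetical; a real driver would also filter the result, since differentiated positional data is noisier than a raw IMU sample):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Estimate linear velocity from two consecutive position samples.
// This is the finite-difference fallback when the driver leaves the
// velocity fields zeroed.
Vec3 estimateVelocity(const Vec3& prevPos, const Vec3& currPos, double dtSeconds) {
    return { (currPos.x - prevPos.x) / dtSeconds,
             (currPos.y - prevPos.y) / dtSeconds,
             (currPos.z - prevPos.z) / dtSeconds };
}
```

Two samples 11 ms apart that differ by 11 mm in x give an estimate of about 1 m/s - usable, but with quantisation noise amplified by the division by a small dt, which is why IMU-derived velocities would be preferable.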
There’s another issue - it seems the Pimax HMDs update the pose at the display refresh rate (so 90 Hz, +/- a few tens of Hz depending on mode). That is super slow - the Vive, for example, updates its pose at ~1000 Hz. Presumably frames are being rendered using a pose that’s a full frame period old (so ~11 ms as opposed to <1 ms on the Vive). Combined with the lack of prediction, it’s no surprise it feels ‘laggy’.
That is an interesting finding. I have just checked some of my old logs from the Pimax OpenVR driver and you are right, the updates from the Pimax driver are around 11 ms apart. Which is strange, because that is neither the IMU rate (or at least, let’s hope not), which is normally 1 kHz for a headset, nor the lighthouse optical tracking rate, which is 60 Hz.
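The check on the logs boils down to the mean spacing of pose-update timestamps. A small sketch (function name is illustrative): ~0.011 s between updates points at a 90 Hz, display-rate loop; ~0.001 s would indicate IMU-rate (1 kHz) updates as on the Vive.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Mean interval between successive pose-update timestamps (seconds).
double meanUpdateInterval(const std::vector<double>& timestamps) {
    if (timestamps.size() < 2) return 0.0;
    double total = timestamps.back() - timestamps.front();
    return total / static_cast<double>(timestamps.size() - 1);
}
```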
The “regular” OpenVR headsets (Vive, Index) report the poses at IMU frequency, but it means that they are basically predicted from the last known “fixed position” determined by the lighthouse.
Kind of I guess… The IMU is the primary source. Beyond initialisation the data from the optical tracking isn’t used to generate poses directly - it’s fused with integrals of the IMU data (via some sort of extended Kalman filter) to constrain the inevitable drift in the IMU derived data.
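A 1-D toy version of that fusion, using a simple complementary-filter correction as a stand-in for the real extended Kalman filter (all names here are illustrative): the IMU is integrated at high rate, and the slower optical fix is blended in to pull the estimate back and bound the drift that pure integration accumulates.

```cpp
#include <cassert>
#include <cmath>

struct FusedState {
    double position = 0.0; // metres, 1-D for illustration
};

// High-rate step: dead-reckon from IMU-derived velocity.
void imuStep(FusedState& s, double imuVelocity, double dt) {
    s.position += imuVelocity * dt;
}

// Low-rate step: nudge the estimate towards the optical measurement.
// 'gain' (0..1) plays the role of the Kalman gain - it weights how much
// the lighthouse fix is trusted over the integrated IMU estimate.
void opticalCorrection(FusedState& s, double opticalPosition, double gain) {
    s.position += gain * (opticalPosition - s.position);
}
```

The optical data never becomes the pose directly; it only corrects the integrated IMU estimate, which is why the IMU remains the primary source.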
I assume/hope that the process described above is being carried out and it’s just that the pose update is triggered from within the same ISR/loop that updates the display instead of being handled separately, with separate timing… The missing velocities do give cause for a little doubt though…