I would like to request a smoothing slider so we can get rid of the micro jitter in our HMDs.
This causes a lot of extra aliased shimmer (in my case almost unbearable due to the jitter in certain sims), and it's a general nuisance. The reason for a slider is that different people will tolerate different levels of smoothing, and honestly you would not need much. In flight sims, for example, where aliased shimmer can be a real problem, this would be a fantastic feature I have wanted for quite some time, as I've tried just about everything at the hardware level, including recalibration etc.
Since you have never been afraid to push the tech and come up with new ideas, I really hope this is something you can implement.
Please do not compare the two issues; my thread is specifically about bad calibration / tracking issues, not micro jitter in tracking.
Though I really like your idea for those Vision series HMDs that are well calibrated and just need that extra micro jitter to go away so the shimmering disappears as well.
No probs. Removed the link so we don't end up confusing things.
I have the same issue with the Pimax 8KX. I can't stand the micro jitter, which I'm not able to get used to (I tried LH 2.0 and LH 1.0 with no improvement). I tried Oculus, and it is super steady when not moving my head, so I'm convinced that all those steady trackers must use some sort of jitter smoothing. For example, on my TrackIR (using the opentrack software), jitter smoothing is a must and works really nicely; that would be an inspiration for how to implement it.
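To make the request concrete, here is a minimal sketch of the kind of slider-controlled smoothing opentrack-style filters apply: a simple exponential moving average where the slider sets the strength. All names here are illustrative, not any real Pimax or opentrack API.

```python
# Sketch of slider-controlled pose smoothing (one axis shown).
# "strength" in [0, 1) plays the role of the requested slider:
# 0 = raw tracking data, higher values = smoother but more latency.

class ExpSmoother:
    def __init__(self, strength: float):
        # alpha is how much of each new raw sample we accept
        self.alpha = 1.0 - strength
        self.state = None

    def update(self, sample: float) -> float:
        if self.state is None:
            self.state = sample  # first sample initializes the filter
        else:
            # blend the new sample into the previous smoothed value
            self.state += self.alpha * (sample - self.state)
        return self.state

# Noisy samples jittering around 1.0 get pulled toward a steady 1.0
s = ExpSmoother(strength=0.9)
filtered = 0.0
for raw in [1.02, 0.98, 1.01, 0.99, 1.00]:
    filtered = s.update(raw)
```

Real filters (e.g. the "One Euro" style used by some head trackers) adapt the strength to movement speed so fast motions stay responsive, but the slider idea is the same.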
Found this on a Vive site.
There isn’t smoothing per se, but there are a few things at play under the hood. All VR devices, including basestation-tracked devices, experience something called “judder”, in which the pose estimate of the device will shift slightly from frame to frame, even if the device is stationary. This is a result of the sensors not being able to update fast enough and accurately enough to firmly anchor the pose IRL. On basestation-tracked devices, the judder will depend on your basestation placement and your overall environmental conditions (e.g. shiny surfaces).
In order to help reduce judder and to increase the device’s overall tracking refresh rate, sensor fusion is employed: IMU data from the controllers supplements the basestation’s fixed cycle rate. The IMUs can update much more quickly and can be used to provide pose estimations for frames in which there isn’t enough basestation data to derive an estimate, assuming that you eventually get basestation data in a future frame. This IMU update is a form of smoothing in a sense, because it smooths out the pose estimate data and adds resolution, but it’s buried really deep inside SteamVR’s hardware stack, so you can’t modify it, and the devices would have much greater judder without it. This happens at hundreds of hertz - it’s microsmoothing, not macrosmoothing.
So, overall - there isn’t a high level of smoothing applied to SteamVR tracking data, but there is some level of IMU-based smoothing via sensor fusion, which is a prerequisite for the system to be reasonably accurate, since the IMUs have a very fast refresh rate. If anything, the judder itself can pose a problem for virtual production and may require smoothing in and of itself.
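The sensor-fusion idea the quote describes can be sketched as a toy 1-D complementary filter: integrate fast IMU rate data every tick, and pull the estimate back toward an absolute fix whenever a (slower) basestation optical sample arrives. This is a hedged illustration of the concept only; SteamVR's actual filter is opaque and certainly more sophisticated.

```python
# Toy complementary filter: IMU dead-reckoning at 1 kHz, corrected by
# occasional absolute optical fixes from the basestation.

def fuse(imu_rates, optical_fixes, dt=0.001, gain=0.02):
    """imu_rates: angular velocity per 1 kHz tick (rad/s).
    optical_fixes: dict mapping tick -> absolute angle from the basestation."""
    angle = 0.0
    out = []
    for tick, rate in enumerate(imu_rates):
        angle += rate * dt                        # dead-reckon from the IMU
        if tick in optical_fixes:                 # optical update arrived:
            angle += gain * (optical_fixes[tick] - angle)  # pull toward it
        out.append(angle)
    return out

# IMU reports a small constant bias (device is actually stationary at 0.0);
# optical fixes every 10 ticks keep the estimate from drifting away.
est = fuse([0.5] * 100, {i: 0.0 for i in range(0, 100, 10)})
```

Without the optical corrections, the biased IMU alone would drift steadily; with them, the estimate stays bounded near the true pose, which is exactly why IMU-only smoothing can't replace the basestations.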
And it got me pondering: is Pimax using the data from the IMU in the HMD to smooth the tracking, or are they using the LH tracking only?
Your guess is as good as mine
@PimaxUSA @Alex.liu @Doman.Chen
Any chance of swinging this question by the engineering team?
There are two stages where the tracking data from the hardware are processed. The first is the lighthouse driver, which talks to the USB devices responsible for the optical (sensor) and motion (IMU) tracking. The driver reports a so-called “driver pose” to the SteamVR runtime, which runs additional filters, predictors, and other kinds of “magic” on it and then supplies it to the application as demanded.
The second stage is totally opaque to both the app and the driver writer and is completely up to SteamVR (Valve). While the (headset) driver has control over what it reports as the “driver pose”, since Pimax uses the stock lighthouse driver written by Valve, they can only work with what the original driver provides - which is typically a pose that is “pivoted” by the optical sensors and “extrapolated” from the IMU feed (as described in the article you quoted).
Optical sensors run at ~ 100 Hz (https://www.valvesoftware.com/en/index/base-stations), while the IMU in the headset at 1 kHz.
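Given those rates, roughly ten 1 kHz IMU samples fall between consecutive ~100 Hz optical fixes, so the driver can advance the last optical position using IMU-derived velocity until the next sweep lands. A rough sketch of that “pivoted by optical, extrapolated by IMU” idea (purely illustrative, not the actual lighthouse driver code):

```python
# Fill the gap between two ~100 Hz optical fixes with 1 kHz IMU steps.

OPTICAL_DT = 1 / 100    # basestation sweep period (~100 Hz)
IMU_DT = 1 / 1000       # IMU sample period (1 kHz)

def extrapolate(last_optical_pos, imu_velocities):
    """Advance the last optical fix by one IMU step per velocity sample."""
    pos = last_optical_pos
    poses = []
    for v in imu_velocities:
        pos += v * IMU_DT        # integrate velocity over one IMU tick
        poses.append(pos)
    return poses

# Ten IMU samples at a steady 0.2 m/s bridge one optical frame:
gap = extrapolate(1.0, [0.2] * int(OPTICAL_DT / IMU_DT))
# final extrapolated position: 1.0 + 0.2 * 0.01 = 1.002
```

When the next optical fix arrives, it re-anchors (“pivots”) the pose and any accumulated IMU error is discarded, which is why IMU bias never builds up over more than ~10 ms.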
There have, however, been reports that Pimax does not report some synthetic data in the driver pose (HMD poses have no velocity data).
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.