I have already spent some time here discussing different aspects of the Pimax headsets, usually related to how OpenVR works, how the rendering works, parallel projection, etc. There has also been plenty of speculation about the FOV of the different modes, and this, fueled by sometimes quite fuzzy and sometimes outright wrong information from the manufacturers, made me write this tool.
The sole purpose of this tool is to collect as much information as possible about every attached VR device (in OpenVR) and display it in a way more suitable for a human. Apart from all the different "properties", which include (among others) the model and firmware versions of every piece of hardware and can be invaluable for someone debugging a system problem, it also gives the complete information about the headset's view geometry and calculates all the different FOVs (individual for each eye and total for the headset).
The FOVs calculated are "technical" FOVs, i.e. they represent how the application "sees" and renders the image. This means, and I cannot stress it enough, that they are basically the construction limits of the headset. The user cannot see more than that (because anything beyond is also not seen by the app and therefore not rendered), but may see less depending on other design aspects.
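To illustrate what such a "technical" FOV means in practice, here is a minimal sketch of how per-eye FOV angles can be derived from the raw projection values OpenVR exposes (tangents of the half-angles from the view axis, as returned by `IVRSystem::GetProjectionRaw`). The tangent values used below are made up for illustration and do not correspond to any particular headset:

```python
import math

def fov_from_raw(left, right, bottom, top):
    """Compute horizontal and vertical FOV (in degrees) from raw projection
    tangents. In OpenVR convention, left and top are typically negative.
    The FOV is the angle between the two frustum planes on each axis."""
    hfov = math.degrees(math.atan(right) - math.atan(left))
    vfov = math.degrees(math.atan(bottom) - math.atan(top))
    return hfov, vfov

# Illustrative (made-up) values: symmetric tangents of 1.0 correspond to
# 45 degrees on each side of the view axis, i.e. 90 degrees total.
hfov, vfov = fov_from_raw(-1.0, 1.0, 1.0, -1.0)
print(hfov, vfov)
```

Note that the total horizontal FOV of the headset is not simply the sum of the per-eye values, since the eye views overlap (and on canted displays, such as Pimax's, are additionally rotated); the tool accounts for that when computing the totals.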
Anyway, this tool is an attempt at providing a formalized (and normalized) approach to measuring all the different FOVs.
There is a "user manual" available in the enclosed readme, and there is another article I published on the subject of FOVs in VR headsets here (VR headset rendered FOV calculation | VR Docs) which explains the idea behind it.
The tool is open source, and as such open to public scrutiny, should any doubt arise. I have already tested it extensively on my Pimax 5k+ and OG Vive, and collected outputs from others who helped me test it on the Rift CV1, Pimax 8k, and Index, but I would be very much interested in receiving more info about basically any other headset. If the info proves consistent, I would most likely publish it as a database on the web.
For exchanging and comparing all the data, the tool supports a normalized output in a JSON file, which includes all the information otherwise printed in readable form. If anyone is willing to share that info with me, have a look at this (Contributing | HMD Geometry Database) about how to do it and let me know.
You can grab the binary directly on GitHub here (Releases · risa2000/hmdq · GitHub).