Pimax 5k+ teardown + panel info

In fact I never noticed, but indeed, in the lenses there's a very small inner circle. That's where the sweet spot for an undistorted image will be. If your irises aren't exactly aligned with that sweet spot, then I assume distortion will happen. That's of course also why the vertical placement of the HMD on your face matters so much. Depending on your face shape this might not be that comfortable (in my case I need to move the HMD to a position that isn't comfortable).

So you haven't been looking through the middle of the smallest fresnel rings of both lenses until now? You're probably not the first, either.

It also raises another question: does the middle ring even align with the IPD crosshair graphic when attempting to converge both images? I'm guessing not, since the screens don't move.

Well, initially I didn't. I just put on the HMD so that it felt comfortable and I could see clearly. However, that's the "problem": the part where you can see clearly (what traditionally is called the sweet spot) is much bigger than the distortion "sweet spot". So I found out I had to move the HMD up a lot on my face for the distortion to get better. However, I could only improve it, never really solve it; I always saw some distortion.

@sjefdeklerk, can you approximately measure the distance between the lens and the panel, please? I might try some different lenses, just out of curiosity.

Well I actually just put everything together and the HMD works like it did before. Which BTW means that everything seems normal in the Pimax landscape but as soon as I start SteamVR everything is totally messed up. Which makes me think that it’s a Steam problem but I really tried everything, including deleting everything, opting out of beta, re-installing older versions etc etc etc… So not sure what’s going on.

However, it seems that I didn't damage anything when disassembling. It's actually really not that hard; you just need a steady hand and some patience. The flat-ribbon cables are actually really easy to disconnect and connect again, once you understand that you need to flip up the black part, as I explained before.

Was it the wide ribbon in the middle that was slightly pulled out? It looks off in the photos.

Yeah, I figured that one might have been the problem. But everything works like it did before. Not sure what's going on; if the Pimax environment works with tracking, then why doesn't it work in SteamVR… weird stuff.

Is SteamVR in safe mode? Or swap the USBs around.
Try some Oculus stuff that doesn't require SteamVR.

Hmm good one, hadn’t tried that. Lucky’s doesn’t start at all anymore, but then I tried “CocoVR” and that works perfectly. Not sure what to think of this … Will try on a different PC later on, does feel like a software problem

Have you reset the firmware?

Badass research, @Sjef!!! Keep going bro, uncover what Pimax hides and spring the surprise. :face_with_monocle:

You mean flashing an older FW? No I didn't, and I highly doubt that's the problem. The problem occurred on the non-beta first; then I installed the latest beta, it flashed another FW, and same thing. Maybe it's something in my system, I will try it on another system.

Just figured might have been something wonky with the firmware.

In that troubleshooting list, "Brush firmware" means flash or reflash.

Good to hear you got the headset working

The "whole" picture is a 2560x1440 PNG which, on my monitor, shows as white with a black "eye" in it. The picture is the direct output from the Pimax compositor, and is basically a mask of the image which is displayed on the panel. The black area marks the active area, i.e. the actual visible image; the white area is the unused part of the panel. The unused stripe on the right is caused by my IPD (70 mm), which shifts the left eye image to the left on the panel. I replaced the actual image with the black mask to better show the positioning.

1700 pixels is the horizontal width of the black (i.e. active/used) area. In other words, when using Normal FOV (~114° per eye), the used area on the panel corresponds roughly to 1700x1440 pixels.
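To put those numbers in perspective, a quick back-of-the-envelope calculation (using only the figures quoted above, averaged over the whole used area, ignoring the lens distortion profile) gives the rough angular pixel density and the fraction of the panel actually lit:

```python
# Rough numbers from this thread (Pimax 5K+, Normal FOV):
panel_width_px = 2560    # full panel horizontal resolution (per eye)
active_width_px = 1700   # horizontally used ("black mask") area
fov_per_eye_deg = 114    # approximate horizontal FOV per eye

# Average angular pixel density over the used area (distortion ignored).
avg_ppd = active_width_px / fov_per_eye_deg
print(f"average pixels per degree: {avg_ppd:.1f}")      # ~14.9

# Fraction of the panel actually displaying image, horizontally.
used_fraction = active_width_px / panel_width_px
print(f"panel horizontally used: {used_fraction:.0%}")  # ~66%
```

The real density varies across the image because of the distortion correction, so treat the ~14.9 ppd figure as an average, not what you see at the lens centre.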

Exceptional work, a pleasure to have so much information on the HMD. I would like to see the displays of the 8K.

@Sjef this could have been a good chance to try some different matte foils to mask the SDE of the panels? :sunglasses:

In my case the horizontal placement plays a big role as well, much more than with any other headset.
Since there is no per-eye adjustment, and there's actually a lot of unnecessary free space in the nose area, there could be a niche:

Nose-gap blockers that come with an asymmetrical shift, to compensate for asymmetrical eye distances.
Let's say a ±2 mm left and right version should already help in most cases.
Maybe @Davobkk is up for the task :slight_smile:

To bring some of Risa’s explanations down to layman level, where even somebody like myself can somewhat fathom them: The game renders the FOV it is given, at the resolution requested (usually…; At the end of the day, the game is free to do whatever it feels like), without any regard whatsoever to the native resolution, nor format, of the display panels in the HMD. Then the piserver compositor distorts this intermediate image, and places it within its designated 1700 pixel wide area on the 2560 pixel wide screen, given by one’s IPD, so that the spot within it that is at the centre of the distortion curve, is centered right under the centre of the lens. (I don’t know whether at the smallest IPD setting, the inner edge of the picture touches the inner edge of the screens, and vice versa for the outer edges, but I could easily imagine it.)
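The placement step can be sketched in code. This is a toy model, not Pimax's actual compositor: the panel pixel pitch, the IPD range, and the assumption that the image sits flush against the panel's inner edge at minimum IPD are all hypothetical, chosen only to show how the unused stripe would grow with IPD.

```python
# Toy sketch (NOT Pimax's real code) of placing the left eye's 1700 px
# image on its 2560 px panel, shifted horizontally by the IPD setting.

PANEL_WIDTH_PX = 2560
IMAGE_WIDTH_PX = 1700
PX_PER_MM = 21.0                      # hypothetical panel pixel pitch
IPD_MIN_MM = 60.0                     # hypothetical minimum IPD setting

def left_eye_image_x(ipd_mm):
    """Panel x-coordinate of the left edge of the left eye's image.
    Assumes (unverified) that at minimum IPD the image touches the
    panel's inner (right) edge, and that each eye's image shifts by
    half the IPD change."""
    shift_px = (ipd_mm - IPD_MIN_MM) * PX_PER_MM / 2
    return int(round((PANEL_WIDTH_PX - IMAGE_WIDTH_PX) - shift_px))

print(left_eye_image_x(60.0))  # 860: flush against the inner edge
print(left_eye_image_x(70.0))  # 755: unused stripe grows on the right
```

At a 70 mm IPD the image has moved left, leaving a wider unused stripe on the right of the left panel, which matches the mask picture described above.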

…so nothing extra or unnecessary is rendered for the unused part of the screens.

The avalanching performance hit, when increasing FOV, is because the game usually renders to a single flat viewplane, with an even pixel density distribution across it, for each eye, which means that the farther out to the side something is, the larger the area (and so: more pixels), of the flat viewplane, covered by a single extra degree of FOV, coming in from the position of the camera, and intersecting the viewplane at a very oblique angle (just like when slicing a cucumber diagonally, for larger, oblong, slices… :P).
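That "cucumber slicing" growth is easy to quantify: the half-width of a flat viewplane at unit distance is tan(FOV/2), so at a fixed central pixel density the horizontal render cost blows up as the FOV approaches 180°. A minimal sketch (illustrative FOV values, not Pimax's exact render targets):

```python
import math

def plane_halfwidth(fov_deg):
    """Half-width of a flat viewplane at unit distance from the camera,
    covering fov_deg horizontally."""
    return math.tan(math.radians(fov_deg) / 2)

# Relative horizontal cost vs. a 90-degree baseline, assuming the pixel
# density at the centre of the viewplane is held constant:
for fov in (90, 114, 140, 160, 170):
    rel = plane_halfwidth(fov) / plane_halfwidth(90)
    print(f"{fov:3d} deg: {rel:5.2f}x the 90 deg viewplane width")
```

Going from 90° to 170° makes the viewplane over eleven times wider; those last few degrees towards 180° are by far the most expensive, which is exactly the avalanche described above.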

(EDIT: This is also why the increase is more drastic with the Parallel Projections option on, since it swivels the view planes forwards, parallel with a line between your eyeballs, necessitating them being that much longer to cover those oblique angles: And why StarVR recommends developers split their viewplanes in two, for each eye, and set them up in a curve, so that they wrap around your head, instead of needing to stretch out infinitely to your sides, in order to catch those last degrees toward 180 degrees.)

Given that the problem with pop-in in games is often described as mostly occurring in the right eye, my general speculation is that when the games in question set up the culling frusta (or view cones, in stealth game terms) that they use to determine whether an object is visible to the camera and needs to be rendered, they do not take into account things like the cant of the camera frusta, nor their asymmetry (seeing more to the left than to the right, for the left eye, and vice versa for the right; you may recall Rift users had the same problem with Fallout4VR, in the beginning - same issue), and could in a hypothetical drastic case perhaps even do something silly, like using the left eye's field of view, or at least view vector, for both cameras, rather than their combined FOV. (Then you have the matter of whether the game culls separately for each eye, or in a single pass for both, but I suppose that would mostly be a matter of what is more efficient for any given optimisation case: how many objects do we have, and how large are they? How can one's code be parallelised? That sort of thing… :7)

(EDIT: Should you see things looking good in the periphery, and then popping down a Level Of Detail, or even disappearing, when you turn to look at them, I guess that could be the game seeing them move past the far clipping plane, and then distance-culling based on their bounding box intersecting the frustum volume, rather than checking actual distance; it is a longer way to the far corner of the frustum than to the middle of the rectangle right in front of you. :9 – They could probably also kick down a LOD and/or mipmap level for things that are off to the side…)

No, it does not, but not because of the screens not moving: the image is shifted to the sides on-screen, in lockstep with the lenses, which amounts to the same effective result (albeit with more mechanical play room) as having them as physically conjoined units.

It is because the whole of the optical assemblies are canted outwards, so when the headset is properly fitted, you will see the left side perfectly focussed when you look ten degrees to the left, and the right when looking ten degrees to the right. These two situations will also each constitute the optimal matching of distortion correction to lens distortion, for that one eye, radiating symmetrically from the centre of each lens, set at those ten degrees, per direction.

Consequently, we cannot achieve perfect optical conditions for both eyes simultaneously: when we look straight ahead, we get a more-or-less identically mirrored looking-through-the-lenses-slightly-across compromise for both eyes.

(This can, as a bit of a positive-ish side effect, be perceived as one getting a larger overall portion of the view being clear, even though each lens seen on its own may in fact fall off rather quickly, since, added together, at least one eye stays decently in focus when looking around. Again something that can be inferred from Rift user testimonials.)

(If I am too far from the lenses, I can compensate by increasing the IPD further, to higher than my actual physical one, until the lens axes intersect my eyes, but: A) that will move me out of whatever presumed single fixed eye relief distance Pimax has set for the distortion algorithm (EDIT2: …as well as the whole camera projection, to begin with), resulting in a more contracted view of the virtual world; B) I will lose FOV, since I'm farther from the "keyhole" I'm peering through; and C) when I look in any other direction than those ten degrees outwards, in the direction of each eye, I will be more off than when closer, and end up looking at the inner edges of the lenses, instead of somewhere reasonably close to their centres.)
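Point B above is simple trigonometry if you model the lens as a plain circular aperture (a deliberately crude "keyhole" model; the 25 mm radius and the eye relief values below are made-up illustrative numbers, not measurements of the Pimax lenses):

```python
import math

def fov_through_aperture(aperture_radius_mm, eye_relief_mm):
    """Toy model: horizontal FOV in degrees seen through a circular
    aperture of the given radius, at the given eye relief."""
    return 2 * math.degrees(math.atan(aperture_radius_mm / eye_relief_mm))

# Hypothetical numbers, just to show the trend as you move away:
for relief in (12, 15, 18, 22):
    fov = fov_through_aperture(25, relief)
    print(f"eye relief {relief} mm -> ~{fov:.0f} deg FOV")
```

Whatever the real dimensions, the trend holds: every extra millimetre of eye relief shrinks the angle the "keyhole" subtends, so sitting farther from the lenses costs FOV.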

Thanks for your research, @sjefdeklerk! :slight_smile:

Yeah, I'm thinking I might need something like that; my optometrist says one of my pupils sits about 2 mm more to the left than the other.

This is exactly why I believe the lenses are too small for everyone, and why some might not have clarity issues where others do, at 64 mm IPD for example.

No one is actually looking through the middle of either lens, because the minimum IPD required to do that is 70 mm.
