5K+ DIY Diffusion Filter (in-progress thread)

Hi! After buying the Samsung Odyssey+, I’m now convinced that some kind of diffusion filter or equivalent processing must be done for the 5K+. The 8K’s SDE is more noticeable than the OD+’s when I compare the screenshots, so only a hypothetical 8K+ would probably be better. Some of us here have already expressed interest in this in other discussions. I would like to compile any ideas and attempts here.

I will do preliminary testing with my iPhone and iPad Pro screens and some VR lenses or equivalent optics.

Here are some thoughts:

2018-12-05 (Frc)

  • Polarized lens: circular, like IMAX goggles: doesn’t seem to affect pixel/SDE perception;
  • Out-of-focus lens: seems to require too much defocus to blur the SDE, which is detrimental to overall image definition

Image from voodoo review

Zoom in to see the “pixel” structures; disregard the pixelization artifacts.

1: To put things in perspective, the goal of this experiment is not to strap a 50-cent pouch onto my $1K VR headset.
2: On first approach, filters applied directly to the screens would be rejected because of the difficulty of applying them to the panels.
3: This could even be applied to next-generation headsets to further reduce SDE, if needed.
4: This mod will cost $$ and will probably need crowd support to achieve the required testing and selection of filters.

Here is what I’m looking at at the moment :grinning:


It would be really helpful to have an OD+ teardown. I think the anti-SDE measure is not at the lens level but rather some kind of layer on the screens.

I’ve been messing with some shrink wrap stretched over empty glasses frames. It definitely reduces the SDE with the Vive/Vive Pro, but it also blurs the image a bit; OK for viewing a film or pictures on the Vive Pro, but not for FPS-type games.


If it is something with a component that aligns to the pixel grid, it would have to be applied directly to the screen because of parallax.

I think that’s the case; just blurring the image would seem strange to me.

You can still try laminating pouches like in this video:

It seems to me that he succeeded perfectly in removing the SDE on the ancient Oculus DK1, where the SDE was as big as an egg :full_moon_with_face:


If it’s done that way, they are using the same lithographic process they use for building the screen. This of course is impossible for us.

After looking at the magnified screenshot made by voodoo, I’m still not convinced that this filter is something that is aligned so tightly with the SDE.

Probably right, but something at the lens level is more accessible for a DIY modder.

I really don’t think a general diffusion filter is used, because the image doesn’t look blurry to me. I can still see individual pixels on a very white background; they’re just larger and more square-looking than on the original Odyssey.


I agree… I own both the original and the Plus Samsung Odyssey. I haven’t had much time to spend in VR with them since I got the Plus… but the Plus does appear more blurry to me. However, the Plus is more comfortable than the original. I haven’t fully made up my mind which one I prefer… I need a little more time, but as of right now I think I would prefer the Plus, with its comfort and its marginally improved controllers thanks to the Bluetooth built into the HMD.

EDIT… D’oh… I misread your comment… sorry… I see you said it in fact doesn’t look blurry… it does to me, for objects in the distance. Close objects, though, are clear. This is a comparison between the OG and the Plus only. The Plus is still clearer than the other WMR HMDs I have tried. Figured I would add this edit instead of deleting my comment.

I think it must be a filter/diffuser of some kind; IMO it’s way too difficult/expensive to produce and align a refractive mesh of some kind.

Indeed, but I fear that applying something at the lens level may introduce too much blurriness. Still, it would be interesting to leverage the lens holder that is supposed to be provided. I don’t think it’s provided yet.

Something similar has been done before with passive 4K 3D TVs, where a different polarization filter was added to each row of pixels… by all accounts, it didn’t add much to the price of the TV. Granted, the pixel density is much higher on a VR headset, but I think it would still be possible with precise equipment.

Why would an 8K+ be better? AFAIK it would use the same screens as the 8K, so the SDE should be the same.

Regarding the just-a-simple-sheet-of-diffusing-material-applied-directly-to-the-displays option, I can relate my own experiences with the Rift DK1 and the regular retail Vive.

On the former I tried thin laminating pouches, back in the day, and later, on the latter, matte smartphone screen protectors.

With the DK1 I was quite happy with the amount of diffusion - it really needed it. On the Vive the material, although more transparent, was a little too diffuse in relation to the resolution of the screen, so it made the pixels “bleed” into one another a little, rather than just enough to fill the space between them. In both cases we are in essence talking about intentional blurring - it’s just a matter of how much. Getting it right is somewhat harder in the latter case, given that the pentile arrangement results in the screen effectively having two different resolutions: one for the green subpixels, and one a factor of sqrt(2) lower for the red and blue ones.
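The two-resolutions point can be sketched numerically. A toy calculation, assuming an RGBG pentile layout and using the Vive’s 1080x1200-per-eye panel as an example (numbers for illustration only):

```python
import math

def pentile_effective_resolution(width, height):
    """Return (green_res, red_blue_res) per axis for an RGBG pentile panel.

    There is one green subpixel per logical pixel, but only half as many
    red and blue subpixels, so the red/blue sampling grid is coarser by
    a factor of sqrt(2) per axis (the half-density grid is diagonal).
    """
    factor = math.sqrt(2)
    green = (width, height)
    red_blue = (round(width / factor), round(height / factor))
    return green, red_blue

green, red_blue = pentile_effective_resolution(1080, 1200)
print(green)     # (1080, 1200)
print(red_blue)  # (764, 849)
```

This is why a diffusion strength that merely fills the gaps between green subpixels will already be smearing the red and blue ones.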

The big problem in both cases is that the sheets had rather large granularity to them, which resulted in the chromatic aberration of each and every one of those grains casting a pinprick rainbow, in yet another static pattern, on top of the ones the Vive and Rift CV1 OLED screens already have.

The more “filled out” pixels also constituted a pattern of their own, so that it now looked like you had a wall of colour-shifting tiles in front of you, instead of a world out there, somewhat occluded by a chain-link fence. I am thinking here that the SDE may not be exclusively a bad thing, and that it does for the spatial side of perception much of what low persistence does for the temporal one. It’s like how, if you draw a slanted line solid and one pixel wide, the jaggies are extremely apparent, but if you break it up by omitting every second pixel, your brain will fill in the intermediaries and more easily ignore the squareness of the pixels, taking the bitmap image as a proper line. In the same vein, with enough supersampling, you perceive better resolution, as more detail is revealed with minute head movements, bringing it into view from behind the SDE’s chain links.

…so it becomes a matter of tuning: you want the diffusing properties of the material matched to the DPI of the screen, and you want it as milkily smooth and homogeneous as possible.

Any turbid material is also going to cause some saturation and contrast loss, but much of that can be compensated for with image balancing.
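As a rough illustration of what that compensation could look like, here is a minimal per-pixel sketch. The gain values are hypothetical and would have to be tuned to the measured loss of the chosen diffusing material; a real implementation would run this as a shader in the compositor rather than in Python:

```python
def compensate(rgb, contrast=1.15, saturation=1.2):
    """Boost contrast (about mid-grey) and saturation of one RGB triple.

    Channel values are 0..255; the 1.15/1.2 gains are placeholders,
    not measured figures.
    """
    r, g, b = rgb
    # Contrast: scale each channel away from mid-grey (128).
    r, g, b = (128 + (c - 128) * contrast for c in (r, g, b))
    # Saturation: scale each channel away from the pixel's luma
    # (BT.601 weights), then clamp back into range.
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return tuple(
        max(0, min(255, round(luma + (c - luma) * saturation)))
        for c in (r, g, b)
    )

print(compensate((100, 150, 200)))  # (86, 155, 224)
```

The same two knobs exist in any image-balancing tool, so the correction could also be applied per-game via software like colour-profile overrides.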

More complex solutions will have other properties.


The 8K+ doesn’t exist and isn’t even publicly planned by Pimax, so the supposition is that it would use new LCDs and a new upscaler.

I’m guessing (and this is a pure guess but it’s what I would try if I had their resources) that Samsung has cunningly repurposed a technique normally used for CCD sensors in cameras.

These too have gaps between the light-sensitive areas, but for quite a long time now most sensors have included a microlens array directly above the sensor so that light that would have fallen into these gaps gets refracted onto the sensor.

The same principle should work in reverse for an OLED panel.
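As a back-of-the-envelope sketch of that reversed-microlens idea (all numbers hypothetical, just to make the geometry concrete): if an emitting subpixel of width w sits on a grid with pitch p, a lenslet above it would have to magnify the apparent emitting area by roughly p / w per axis for the subpixels to visually tile the panel with no dark gaps.

```python
def fill_factor(subpixel_um, pitch_um):
    """Fraction of the pitch that actually emits light, per axis."""
    return subpixel_um / pitch_um

def required_spread(subpixel_um, pitch_um):
    """Per-axis magnification a lenslet would need to close the gaps."""
    return pitch_um / subpixel_um

# Hypothetical example: 30 um emitters on a 50 um pitch.
print(fill_factor(30, 50))      # 0.6
print(required_spread(30, 50))  # ~1.67
```

The dark-gap fraction (1 - fill factor) is essentially what we perceive as SDE, which is why even a modest spread factor could make it vanish.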


Could someone please tell us the exact dimensions of the Pimax 5k+ lens?

I’ll measure them tonight and let you know.

Main comment in the thread updated - take a look, any comments appreciated.
@Sjef, @VRGIMP27, @D3Pixel, @Heliosurge