StarVR One - 4 Viewport Rendering Capability

I recently purchased a StarVR One headset in order to check out its ultra-wide FOV, and because it was cheap. The StarVR was designed to handle 4-viewport rendering. There is a demo called “Showdown” which has a version made especially for the StarVR One; it uses Unreal 4 and was released by Epic Games.

You can view the Showdown demo via this YouTube link - STARVR ONE - Still The Most Immersive VR Headset! THIS Is The Experience With Optimized Content! - YouTube

My question: does anyone know if there are other 4-viewport rendering demos, games, or apps out there? I’ve done searches, but maybe I’m not using the right search terms to find more demos, or even a game with 4-viewport rendering, to test with my StarVR One.

Any direction toward finding content would be appreciated, or a suggestion for exactly which search terms I should use, because I’ve already tried “4 Viewport Rendered Demos”, “4 Camera Rendering demos”, and other similar search terms and came up with nothing.

Thank you


Most likely you would need to build the project yourself. It’s not Unity, by the way; you are mixing it up with Unreal.

Showdown Unreal project is here: Showdown VR Demo in UE Legacy Samples - UE Marketplace

StarVR UE plugin is here: StarVR – Developers

You’ll probably need to install the correct version of Unreal (4.22?), load up the Showdown sample, and add the StarVR plugin.

It’s not gonna be a walk in the park if you have no Unreal experience. But the instructions on their developer website look pretty good!

They show the 4-viewport setup near the end of those instructions, so it’s probably going to do what you are looking for.


Thanks for the reply. I’m already learning, so that’s a start. In the Showdown video MRTV did for the StarVR One, he stated there was a special version of Showdown which had been compiled specifically for the StarVR One.

I’ve reached out to a few OpenMR members who own the StarVR One to see if they still have a copy. In some of the owners’ posts they mention the original Showdown and also talk about the StarVR One version.

So should I look for Unreal projects/demos with 4-viewport rendering, which would be supported by my headset via the StarVR UE plugin? Do these demos not run as an .exe file, sort of how an OpenXR demo works with a compatible OpenXR headset?

I’m trying to understand how these things work. Aren’t there people outside of the big companies who develop their own Unreal projects which you can purchase?

Thank you for any explanation you can provide.


What I meant is that if you can’t find the precompiled version anymore, you could always build it yourself. The magic that makes the 4 viewports work is in the StarVR Unreal plugin I linked above.
By default Unreal will set you up for OpenXR nowadays, but this isn’t what you want.
You want to use the StarVR plugin.

The instructions on the StarVR developer site show you how to create or load an Unreal project (it could be Showdown; I put the link to that one too), then insert the StarVR plugin and configure it.

It’s a complex task; if you are not very tech savvy, I wouldn’t attempt it, and instead would find someone with Unreal experience to do it. It would probably take them a couple of hours. I would do it for you if I had time, but I don’t.


Thanks for the time you’ve taken to explain this stuff to me. I’ll do some research and see if I want to try it, because these types of demos really showcase the true quality of the StarVR One.

Maybe I can find some small project to purchase and play around with, something similar to Air Car. Not everything has to be HL: Alyx.

Why purchase? Showdown is free, and apparently people got it working. You should try to build that one first.


Got it, go with Showdown first.


@mbucchia, I don’t want to bother you too much because you sound like a busy person, and I don’t expect you to teach me Unreal Engine via OpenMR. I do however have a question: when dealing with Unreal demos, does the Unreal Engine version have to match the demo’s exactly? I downloaded a version of the Unreal Engine that was within the version range shown on the demo’s download page (Supported Engine Versions 4.9 - 4.24).

When I attempted to open Showdown, it stated the engine I had did not match the demo’s engine and I could only open a copy, even though the engine version I had installed fell within the advertised 4.9 - 4.24 range. Does this make any sense?

How much did you pay for your StarVR One?

Did you try Alyx already? It’s awesome in that HMD!

Actually, I’ve spent last night and this morning attempting to get Showdown to work with my Varjo VR3 so I could have a reference point. I wasn’t able to use the Varjo OpenXR Unreal plugin because it was not compatible with the Unreal Engine version needed to play the demo.

I was, however, able to enable the built-in OpenXR plugin, and this allowed me to do a VR Preview from the menu. Showdown looks outstanding in the VR3, and I’ve been playing around with settings in the Varjo control software and the Unreal Engine to see which settings give me the best experience.

I found the menu item which allowed me to change the viewports from 1 to 4, but I can’t really see a difference; or maybe without the Varjo OpenXR plugin there is no difference in the experience. I think the demo looks better with the focus screen off, but it’s really hard to tell, because the demo looks good at pretty much every setting I tried, so I don’t even know if any of those settings are changing anything.

I believe the Showdown demo on StarVR used 4 viewports to render two main and two peripheral views in order to cover a large horizontal FOV, while Varjo uses 4 viewports to render two layers of the same scene: one for the regular display and one for the hi-res display.

I would expect that unless you build your own version of the demo targeting the particular features I mentioned above, it would not work.

Hmm, I am probably getting my memories of things half-heard mixed up, but it may be that both systems took the picture-in-picture approach…


That was an outstanding historical tour de force on the StarVR One. I missed this video somehow, but leave it to Thrill Seeker to do the in-depth kind of VR journalism that I enjoy.

From what I’ve read on Reddit, OpenMR, and a few other sources, there just really isn’t much discussion about the StarVR One’s 4-viewport rendering or the possibility of other demo/sample projects out there. It’s kind of the same for the Varjo VR3: quite a few gamers own it, but we really never talk about the focus screen, since outside of flight/racing sims it serves no purpose, and there just isn’t any gaming content being developed to take advantage of the two small focus screens.

I’m naturally curious about old tech because sometimes I find the engineers go overboard in the design of a product and actually give us a product that’s just waiting for the future to catch up. I plan to dig and dig until I find content which can take advantage of the 4 viewports of the StarVR One, just because I want to see what could have been.

Did anyone ever plug the StarVR One into two GPUs, or was it all just talk and speculation? Hell, everyone thought a PSVR2 would not work with a PC until someone actually plugged one in. Yes, it was treated as just another monitor, but it did work.

Could you go into details about this Picture In Picture approach?

To be honest, I only remember it vaguely, so you might be right. I also remember that the geometry of the different views was totally different.

I can give you a rough outline, as I… let me be charitable to myself, and call it: “understand”, it… :7

There is not much to it; the PiP designation pretty much gives away the whole schtick. You render a low resolution view of the whole per-eye FOV, including what’s in the middle, around your sightline (not necessarily in the middle of the bitmap, if the FOV reaches farther in some directions, relative to straight ahead, than in others). That is a wide camera frustum, what Varjo calls the “context” view. You also render a high resolution view encompassing only that small part in the middle, a narrow frustum, maybe less than 100 by 100 degrees: the “focus” view. Then you composite these images together, so that the latter replaces its direct counterpart area in the former.
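A minimal sketch of that compositing step, in pure Python (all resolutions and the paste position are made-up toy values, and real implementations do this on the GPU, usually with blending rather than a hard paste):

```python
# Toy sketch of the "context + focus" composite described above.
# Images are plain 2D lists of pixel values; all sizes are made up.

def nearest_upsample(img, out_w, out_h):
    """Nearest-neighbour resample of a 2D list to out_h x out_w."""
    in_h, in_w = len(img), len(img[0])
    return [[img[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
            for y in range(out_h)]

def composite(context, focus, focus_x, focus_y, panel_w, panel_h):
    """Upsample the low-res context view to panel resolution, then paste
    the high-res focus view over its counterpart area (no edge blending)."""
    out = nearest_upsample(context, panel_w, panel_h)
    for y, row in enumerate(focus):
        for x, px in enumerate(row):
            out[focus_y + y][focus_x + x] = px
    return out

# 4x4 context covering the whole FOV, 2x2 focus covering only the centre,
# composited into an 8x8 "panel".
context = [[0] * 4 for _ in range(4)]
focus = [[1] * 2 for _ in range(2)]
panel = composite(context, focus, 3, 3, 8, 8)
print(panel[3][3], panel[0][0])  # prints "1 0": focus pixel, context pixel
```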

So you render the focus area twice, which is a bit wasteful in itself, but at least the copy you “throw away” is the lower resolution one, and in that one you could perfectly well mask that part out in your shaders, maybe by incorporating it into the hidden area mask…

This gives you a degree of foveation in every cardinal direction; The high resolution “focus” imagery is surrounded by low resolution “context” imagery on all sides.

One does not need any special hardware (like the multiple displays in upper range Varjo headsets) to do this; it is just rendering, so it could be mapped to any HMD. In that case, for a one-display-panel-per-eye HMD (which I’m pretty sure the StarVR, too, is), the composite output bitmap does of course need to be a single resolution, which the two source images are resampled to, matching the HMD screens.

This is conceptually slightly different from slicing up the full per-eye view frustum into two or more tiling segments, each with its own view plane rotated perpendicular to its frustum partition, for less anisotropy (which manifests as the unnecessarily rendering-expensive stretching you get at the edges). Each segment can potentially be rendered at its own resolution, if one so wants, and then reprojected (and resampled) into a single contiguous view.
Setting up and rendering multiple views incurs a cost in itself, of course… NVidia actually supplied some functions for this sort of thing with VRWorks, back when the GTX 10x0 series of graphics cards launched, but hardly anybody ever used them, because they are of course proprietary and only work with NVidia GPUs. :stuck_out_tongue:
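To put a number on that edge stretching: with a single flat view plane, the screen-space footprint of one degree of FOV grows as 1/cos²(angle off-centre). This is just standard planar-projection geometry, not anything StarVR-specific:

```python
import math

def stretch(angle_deg):
    """Relative screen-space cost of one degree of FOV at the given angle
    off-centre, for a single flat (planar) projection: 1 / cos^2(angle)."""
    a = math.radians(angle_deg)
    return 1.0 / math.cos(a) ** 2

for phi in (0, 45, 70, 80):
    print(f"{phi:2d} deg off-centre: {stretch(phi):5.1f}x pixels per degree")
# 0 -> 1.0x, 45 -> 2.0x, 70 -> ~8.5x, 80 -> ~33.2x
```

This is why a single planar render covering the StarVR One’s very wide horizontal FOV would be so expensive, and why splitting it into separately rotated view planes pays off.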


I think this is the section where you are explaining how the StarVR One display works with multiple viewports, but it really went over my head. Thank you for taking the time to drop this knowledge.


You should be aware that I could of course have misunderstood the whole thing – I am only a user, myself, after all. :slight_smile:

You can render the imagery that goes to the screens in any way you desire, so long as you turn, and scale, and glue whatever disparate fragments you may have split the work up into, together into a single coherent picture, before you send it to the screen. :stuck_out_tongue:

I made quite a write-up on quad views rendering recently, as part of porting some of Varjo’s OpenXR extensions to work generically on any headset (Quest Pro first, and Pimax soon) in order to offer dynamic foveated rendering.

As risa said, I’m not 100% sure this is the usage the StarVR One made of it. Though looking at some of the StarVR One documentation, it looks like the 2 additional views do overlap (as opposed to extend), and this would work similarly to Varjo’s “Bionic Display” (higher-PPD panel near the center) or their approach to dynamic foveated rendering.

That aspect doesn’t matter too much, AFAICT. The gain from rendering so many fewer pixels (you can see some numbers on my page), something like 2/3 fewer, is overwhelming compared to the slight loss of rendering the content of the focus views twice. Also, Varjo and my implementation do a smooth alpha-blending of the peripheral and focus views near the edges, so it’s actually good to have those pixels. Because the application has no way of predicting how the platform will do this alpha-blending (which may vary with certain conditions), it cannot really create a tight stencil.
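For a rough feel of where a figure like “2/3 fewer pixels” can come from, here is some back-of-the-envelope arithmetic; all resolutions below are illustrative assumptions, not real Varjo or StarVR numbers:

```python
# Back-of-the-envelope: one full-resolution per-eye render vs. quad views.
# All numbers are illustrative assumptions, not real headset specifications.

full_w, full_h = 3000, 3000              # hypothetical per-eye render target
full_pixels = full_w * full_h            # 9,000,000

# Quad views: a context view at half the linear resolution (a quarter of
# the pixels) plus a small full-density focus view around the gaze point.
context_pixels = (full_w // 2) * (full_h // 2)   # 2,250,000
focus_w, focus_h = 1000, 1000            # hypothetical focus region
focus_pixels = focus_w * focus_h         # 1,000,000

quad_pixels = context_pixels + focus_pixels
saving = 1 - quad_pixels / full_pixels
print(f"{saving:.0%} fewer pixels per eye")  # prints "64% fewer pixels per eye"
```

The double-rendered focus area (it also appears, at low density, inside the context view) is only a small fraction of the total, which is why the overlap barely dents the saving.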