Spotlight on: Tobii (DFR) Spotlight. Will it work in all games?

I sincerely applaud you and the rest of the development team for your efforts; if anything, it makes it even clearer how vital a truly open standard such as OpenVR is as a foundation, along with properly documented SDKs for the specifics of each vendor's interpretation of those standards.

I’m not surprised 7invensun‘s SDK is what it is; after all, not even their B2C product was viable at the driver level.

Here’s to hoping that Tobii do a better job and, with regards to your OpenXR work, that more titles support OpenXR to make use of it.

(I’m looking specifically at you, ACC and iRacing!)


That is the aim of OpenXR.

Yes, do you hear that, Pimax? :slight_smile:

With what? Does anything support the benefit of a VR-3 with its super-resolution fovea screen? Jesus, do you pay the subscription? Why not an Aero?

And it seems it’s almost different for each game.

You’re mixing up eye tracking and per-eye rendering.

Eye tracking retrieves the current “eye gaze”, a functionality of the headset. Vendors each do this differently from one another unless they adopt OpenXR, which gives them a common interface. This is what we were talking about.

Per-eye frame rendering (which isn’t even a real term, I just made that up) is the fact that a game makes several rendering passes for each frame that ends up displayed on screen. Some of these passes are used to construct the image ending up in your left eye, some for the right eye, and some for completely different things (eg: rendering instruments or other virtual screens inside the game). Every game engine works differently, by design. There is no standard to “hint” at what a game is rendering.
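To illustrate the lack of any standard hint, here is a toy sketch (entirely my own illustration; the pass names and ordering are invented, and no real engine or graphics API exposes anything like this) of the kind of pass list one frame might produce:

```python
# Toy model of one frame's render passes. From the outside (e.g. a
# Direct3D or Vulkan hook), all of these look like plain
# render-to-texture calls; nothing says which one feeds which eye.
def render_frame():
    passes = []
    passes.append(("shadow_map", None))       # not tied to either eye
    passes.append(("cockpit_screen", None))   # virtual screen inside the world
    for eye in ("left", "right"):
        passes.append(("scene", eye))         # per-eye geometry pass
    passes.append(("post_process", "both"))   # may touch both eyes at once
    return passes
```

Only the engine itself knows the second element of each tuple; an external tool sees just five anonymous render-to-texture operations.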

In the very old days of GPUs, you had to render to a specific memory region (the framebuffer) for something to be displayed on screen. So it was actually easy to detect “hey, this is going to the screen”. But now everything is “rendered to texture”, and whether these textures end up being applied inside the 3D world, inside your desktop window (for 2D apps), or sent out to your VR headset is not something that Direct3D or Vulkan specify.
In the early days of VR, games would submit a single texture with both eyes rendered directly to it, which also made things a little simpler, because you could assume that the left side is the left eye and the right side is the right eye.
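With that old single side-by-side texture, the guess really was trivial; a hypothetical one-line sketch (the function name is mine, not from any SDK):

```python
def eye_for_pixel(x: int, texture_width: int) -> str:
    """Old side-by-side layout: the left half of the submitted texture
    is the left eye, the right half is the right eye."""
    return "left" if x < texture_width // 2 else "right"
```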

But now it’s a jungle. Games render individual eyes into separate textures, and perform multiple passes out of order, all to make things more efficient. So when we try to guess where a given texture ends up, we have to either find a common logic (eg: “the game will always render exactly one pass to the left eye, followed by one pass to the right eye” — HINT: in real life, this never happens) or write per-game logic.
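The “common logic” option would amount to something like this hypothetical classifier, which bakes in the strict left/right alternation that, as noted above, never holds in real engines:

```python
def classify_by_alternation(num_passes: int) -> list:
    """Common-logic guess: passes strictly alternate left, right, left...
    Real engines never behave this regularly, so any shadow, UI, or
    mirror pass interleaved with the eye passes gets mislabeled."""
    return ["L" if i % 2 == 0 else "R" for i in range(num_passes)]
```

For example, `classify_by_alternation(4)` yields `["L", "R", "L", "R"]`, which is wrong the moment pass 0 is actually a shadow map rather than a left-eye pass.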


That’s so much more complex than I ever knew. I, for example, being very much a layperson, see the monitor when I play VR in MSFS, see that there’s a left and a right eye, and (incorrectly) assume that it’s rendering it as one big pass with both eyes included. Thanks for the insight. I can see why @fholger may have decided not to make things game-specific, but rather tried to allow other users to guess-and-test. That was a very nice idea. Pimax’s canted displays and the fact that FFR works irrespective of the FSR position left me realizing this is a TON more complex than I thought.


Yes, MSFS renders both eyes separately, then copies each of them onto what you see in the window.

As for the guess-and-test in vrperfkit, unfortunately this has a low chance of working, IMO, because the number of passes depends on the content being rendered. Take the example of MSFS: the number of passes the game performs depends on your settings and where you are in the game. It might even depend on the type of plane you are using, and on whether you are in the menus or not.

So sure, one minute the pattern might be LLLRRRLR, but the next minute the game decides to render one extra pass first, and all of a sudden the sequence becomes XLLLRRRLR (X for “do nothing”), or maybe even LLLLRRRRLR. So this can only work for very simple engines.
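A toy demonstration of why such a fixed positional pattern is fragile (the pattern and function name are just illustrative, not vrperfkit's actual logic):

```python
PATTERN = "LLLRRRLR"  # eye tags recorded for one observed frame layout

def tag_passes(num_passes: int) -> str:
    """Tag render passes positionally from the recorded pattern,
    emitting '?' for any pass beyond the pattern's length."""
    return "".join(PATTERN[i] if i < len(PATTERN) else "?"
                   for i in range(num_passes))

# This matches only while the engine keeps the exact same layout.
# If one extra pass is rendered first (real sequence XLLLRRRLR),
# positional tagging shifts every label by one and all tags are wrong.
```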


When I got the VR-3, there wasn’t an Aero yet by a long shot. In fact, I believe I was one of, if not the, very first non-B2B users of the VR-3.

See this:

https://community.openmr.ai/t/twack3rs-varjo-vr-3-impressions/35834

Of course I would have gotten an Aero instead had it been available.


So basically fholger said “good luck” mwhahah

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.