StarVR One is available now for enterprise customers!

But the problem we are facing is not that we need to observe the particular “projector” from an infinite number of places at the same time, but that we observe it from one place (let’s forget for simplicity that we have two eyes), and it looks different depending on where that place is.

If I relax the requirement of what you describe to one single spot, I basically get a thing which already exists (and, as I mentioned earlier, was used when filming The Mandalorian). The only drawback is that instead of a headset you need to “wear” a stage around you :slight_smile:.

You can either have the 4D light source emit different light depending on the direction it is viewed from, or simply change the light statically, based on the observer’s position. If you extend the latter into the space around you, you get the Mandalorian stage; if you put it as close to the observer as it gets, you get the VR headset. Both are far easier to achieve than the light field projector and produce the same results.

In other words, if the scene is artificially generated, you can create exactly the same “light fields” as Google was trying to capture for real scenes, this time however with arbitrary precision and range. And then play it back in the headset the same way Google played theirs.

3 Likes

I’m going to have to leave that for later in the week - I should have been asleep hours ago, and that looks like a looong bit of that most cumbersomely overcomplicated of tongues: Patent-ese… Hmm, I can’t even see the figures, just scrolling through. :stuck_out_tongue:

Is this patent for tech used in the Glyph, or more recent - like the alleged lightfield prototype?

I do notice that right at the top it says “curved mirror and partially transparent plate”, which reinforces previous impressions that the “retinal projection” label is nothing but so much marketing guff: So their projection screen is a mirror, instead of a matte or retroreflective surface - well yippety-doo-dah. :stuck_out_tongue:

Not that I mean them or their stuff any disrespect - just the up-propping, un-straightforward, purple prose (like in this very sentence :P). :7

If this solution allows you to focus naturally on different things, I highly doubt it’s entirely a software solution, because, you know, physics. -Besides: If it’s all software - what do they need the special glasses for? :stuck_out_tongue:

If the software can render lightfields cheaply, that is absolutely super! Also absolutely necessary… But it can’t magically defy matters of real-world optics.

What they could do with a custom combining lens, which you do say they have, could be, e.g., to have it designed in such a segmented way that it takes partitions of the screen (just a grid of pictures taken from slightly offset positions) and projects them onto the mirror (visor) so that they overlap 1:1 on it, each emanating from its own position. That would produce a lightfield: the picture coming from the emitting lens sub-part on the left will reflect off the mirror and travel on to the right, and vice versa. The question is a matter of the geometric solution: how large does the array of lens segments need to be to cover the required range of eventualities? (…which is not that broad, fortunately - the target that is our eyeballs is only so large, and we wouldn’t want to waste resolution on light that shoots off sideways where we’ll never see it, anyway. :D)
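For a rough feel of that geometry question, here is a back-of-the-envelope sketch. Every number in it (eyebox size, pupil diameter, view spacing) is invented for illustration - none of this comes from any real headset’s spec:

```python
import math

# All numbers here are invented for illustration -- not any real headset's spec.
eyebox_mm = 10.0        # lateral range the pupil may move and still see the image
pupil_mm = 4.0          # typical indoor pupil diameter
view_spacing_mm = 1.0   # desired spacing between adjacent sub-views at the eye

# One lens segment per sub-view; covering the eyebox at this spacing needs:
views_per_axis = math.ceil(eyebox_mm / view_spacing_mm) + 1   # 11

# For a 2-D grid of segments (horizontal x vertical):
total_segments = views_per_axis ** 2                          # 121

# Sub-views landing inside the pupil at once -- these drive the focus cue:
views_in_pupil = int(pupil_mm // view_spacing_mm)             # 4

print(views_per_axis, total_segments, views_in_pupil)
```

The takeaway is that even a modest eyebox at ~1 mm view spacing already implies on the order of a hundred sub-views, which is exactly why nobody wants to waste any of them on light the pupil can never catch.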

No, just that a wide enough range of them needs to be available at the same time, so that our eyes have that whole range to freely choose from, without the hardware needing to monitor exactly what they are doing internally, or at least how they converge. :9

…and it is an extremely tight range, for our purposes - just enough to encompass the minute angular difference between light that comes at us from a fly that sits on a window a metre away, and the light that comes from the tree in the garden, right behind it.

That said: On a wider angle scale, you could have a single screen, with each half outputting imagery for both eyes, and achieve full human stereo overlap.

Welcome to the “CAVE” virtual reality system. :slight_smile:

There are of course multiple different ways to achieve for-all-purposes similar end goals - Oculus’ old Half Dome mechanical varifocal research prototype, Magic Leap’s multifocal product, etc, etc. :7

Being able to display lightfields, either through techniques such as those mentioned in this thread or through holography, is one that is ridiculously density- and performance-heavy, but to my mind very elegant and powerful.

We play back Google’s lightfield captures in our headsets, but we do not see each frame as a lightfield – our view is of something that responds to HMD movement, but not to eyes accommodating or rotating.

You know what… I think the three of us are each arguing slightly different aspects of matters, and all of us erroneously conflating things the others say with things they didn’t necessarily actually touch upon. :7

2 Likes

The different angles are how they achieve the discrete focal planes: A Revealing Leap Into Avegant’s Magical Mixed-Reality World - IEEE Spectrum
But that still does not explain how they are able to project these simultaneously, and in such a way that by merely shifting your focus you can see objects clearly. Perhaps what they are really doing is always pushing all the distinct planes and letting your eyes do the focus/defocus work, just as you would in real life. Since their demos only show a select few objects at a time, this would not be too resource-costly, but again not quite good for VR. As for the reflective lens, they needed that to get good AR; otherwise they would need cameras, and that introduces latency and weight into the system, as well as additional cost vs using a more traditional combiner. However, it is still the same DLP retinal tech, just using different focusing optics. See a good write-up here: Avegant “Light Field” Display – Magic Leap at 1/100th the Investment? - KGOnTech

2 Likes

Umm, no - that is exactly what it does explain.

I think all the talk of “planes” may in part be a term being used to make the general gist of things more readily understandable to the public, but which can unfortunately lead somewhat astray. Maybe try to approach the concept from the opposite direction - forget about the planes.

There are a few things with planes:

The distance my eye is focussing at, at any given moment, is its momentary focus plane, and anything nearer or farther than it will appear out of focus to me.

One can project several discrete overlapping images, filtered for focus at respective distances, offset in depth instead of laterally, like I described above, but that comes with its own complications.

Filtering rendered frames for depth, such as mentioned above, could by all means be part of one way to algorithmically render a lightfield.

These three are distinct concepts, that must not be confused for one another.
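For illustration of the second point, a minimal NumPy sketch of depth-filtering a rendered frame into discrete focal planes. The frame data and the plane distances are made up; the only claim is that binning pixels by nearest plane (spaced in diopters, which roughly matches how the eye focuses) partitions the image, so the planes sum back to the original frame:

```python
import numpy as np

# Toy sketch: split a rendered RGB frame into discrete "focal plane" images
# by binning each pixel's depth into the nearest plane, measured in diopters.
# All numbers are illustrative, not from any actual device.
rng = np.random.default_rng(0)
h, w = 4, 4
rgb = rng.random((h, w, 3))                 # rendered color frame
depth_m = rng.uniform(0.3, 10.0, (h, w))    # per-pixel depth in metres

plane_diopters = np.array([3.0, 1.5, 0.5, 0.1])   # 0.33 m, 0.66 m, 2 m, 10 m

# Assign each pixel to the plane closest in diopters.
pix_d = 1.0 / depth_m
idx = np.abs(pix_d[..., None] - plane_diopters).argmin(axis=-1)

# One masked image per plane; the display would show each at its optical depth.
planes = [np.where((idx == i)[..., None], rgb, 0.0)
          for i in range(len(plane_diopters))]

# Every pixel lands on exactly one plane, so the planes sum back to the frame.
assert np.allclose(sum(planes), rgb)
```

A real system would blend pixels between adjacent planes rather than hard-binning them, precisely to smooth the transitions discussed below.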

EDIT: The Patent 202 figure 1j in the linked Karl Guttag article (which I didn’t read, I’m afraid - just skimmed briefly - I really should get some sleep by now) looks to be Avegant’s take on the same concept as Oculus’ Half Dome research prototype, and I believe it said right under it that this is not what is being used now.

Oh… I have to make this quick mention: Something came up about myopia and glasses, down one or the other of those links… With a lightfield display, one should conceivably be able to compensate in software, for reduced focussing ability of one’s eyes… :7

3 Likes

Yes, when you put it that way it actually does make more sense. They then interpolate between the focus points as needed (probably some kind of frame reprojection? Or instancing?). Again, it would be really neat to be able to use this in VR, since Avegant said they are exploring display tech outside of DLP, which I suppose means they can do this on other hardware as well.

2 Likes

I suspect you get some smooth transitioning intrinsically, from optical imperfection bleed-over between the discrete views… :7

Lightfields gobble up resolution exponentially, so I could very well imagine they are looking at whatever can give them more of it. :7
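A quick bit of arithmetic on why that appetite is so brutal - every sub-view multiplies the whole pixel budget. All numbers here are invented for illustration:

```python
# Invented numbers; the point is only that every sub-view multiplies the budget.
base_w, base_h = 2000, 2000   # per-eye resolution of a conventional panel
views_x, views_y = 8, 8       # sub-views per axis in a hypothetical lightfield

conventional_px = base_w * base_h
lightfield_px = conventional_px * views_x * views_y

print(lightfield_px // conventional_px)  # 64 -> 64x the pixels for the same sharpness
```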

…so is this something they said recently? -I have long been wondering whether they are still in operation. :7

This was said about 6 months or so ago in an interview with CEO Ed Tang, where he did a keynote on what Avegant is researching now. They have gone the Valve route of creating the tech and then giving it out to hardware partners to make.
Plus, they now also have foveated displays listed on their website, which is also interesting for VR applications - again very relevant in the Pimax/StarVR case, since this can vastly reduce GPU cycle consumption and potentially eliminate the need for eye tracking.
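To give a feel for the GPU savings claim, a toy estimate: render a central region at full resolution and the periphery at a fraction of it. The field-of-view, fovea size, and periphery scale below are all made-up illustrative values, not anything Avegant or Pimax has published:

```python
# Toy estimate of foveated-rendering savings. All numbers are made up.
fov_deg = 100.0          # total field of view per axis
fovea_deg = 20.0         # full-resolution central region per axis
periphery_scale = 0.25   # linear resolution scale outside the fovea

fovea_frac = (fovea_deg / fov_deg) ** 2               # area fraction at full res
periphery_cost = (1 - fovea_frac) * periphery_scale ** 2
foveated_cost = fovea_frac + periphery_cost           # normalized vs full res = 1.0

savings = 1 - foveated_cost
print(round(savings, 2))  # 0.9 -> roughly 90% fewer pixels shaded
```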

1 Like

Aha, thanks. :slight_smile:

Hopefully, I should be getting my StarVR One soon. Although it’s for development, I’ll try to make some comparisons to the XR unit.

7 Likes

Awesome!! Looking forward to your review

2 Likes

It’s good to know someone here is getting one. Can’t wait to hear your impressions, VR-TECH. Please test out iRacing and Assetto Corsa with it.

Also, for those of you thinking it’s impossible for us consumers to obtain the headset, think about it for a second… many businesses will be selling these on eBay sooner or later. Heck… VR-TECH can probably get you one :stuck_out_tongue:

My schedule is limited, as I’m working on a Virtual Reality project plus my regular work. I will try some basic testing but may not have too much time to fiddle with making games work. If the headset works with games, great, but it wasn’t purchased with the expectation of being a gaming headset. It will be icing on the cake if it does :slight_smile: That is why I purchased the 8KX. Hope Pimax sends those units soon.

3 Likes

It would be very useful to know which games work right out of the box and which ones require tweaking. Try out the following games when you get it:

Half Life Alyx, Boneworks, The Walking Dead, Asgard’s Wrath (through revive), Pavlov VR, Onward, Skyrim VR, Arizona Sunshine, Swords of Gurrah, Dirt Rally 2.0, and various driving/flight sims. Can spend like a few minutes in each one.

I am jealous!!! :wink: Where do you live? :wink:

4 Likes

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.