No need to make excuses for the G2's bad performance; both the 8KX and the VP2 work miles better than the G2 even at higher res, though the VP2 uses tricks with DSC and the 8KX has tons of resolution. It doesn't matter how it looks on the screen, as the screen is what limits sharpness. Compare it to pancake gaming: you can render at 4K and output to 1080p with Nvidia's tools or some games' built-in settings, and it doesn't matter that 1080p is low, the GPU will still struggle to render the game at 4K. In SteamVR there are more settings, as you know, and the first thing is to make sure max res and other limits aren't in place, as otherwise you won't actually render at high res and performance won't change when the resolution is changed. E.g. the Cosmos has the same res as the Index, or close, but it was one of the worst HMDs I tried in terms of performance, and back then it didn't even have working reprojection; drivers, renderer hacks and other tips are vital for good performance. The G2, apart from its resolution, introduces a lot of bottlenecks in its render pipeline and eats all the resources, while HMDs like the Index, VP1 and Rift S don't. And performance on the Index wasn't fine in the first year either; I didn't use it for quite a long time until they fixed their drivers, colours and vertical columns and implemented the best reprojection algorithm I've seen so far. On other vendors it wasn't enjoyable due to artifacts; Oculus is maybe good as well for PCVR, but I don't use a Rift S so I can't say how it is now. On the CV1, reprojection worked fine.
The G2's res isn't that high, 4320x2160 total, but e.g. when I lower it to 20% SS on the G2, all the GPU graphs are green yet the game still doesn't feel smooth enough. It's the tracking, or the SteamVR implementation, adding a bad experience in terms of smoothness on top of the poor performance WMR has with the G2.
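For context on what that 20% actually means: SteamVR's supersampling percentage scales the total pixel count, so each axis is scaled by the square root of the percentage. A quick sketch (the panel numbers are taken from the post above; in practice the 100% render target is somewhat larger than the panel to compensate for lens distortion):

```python
import math

def render_resolution(base_w, base_h, ss_percent):
    """Per-eye render target size for a given SteamVR supersampling %.

    The SS slider scales total pixel count, so each axis is scaled
    by sqrt(ss_percent / 100).
    """
    scale = math.sqrt(ss_percent / 100.0)
    return round(base_w * scale), round(base_h * scale)

# G2 per-eye panel is 2160x2160 (4320x2160 across both eyes)
print(render_resolution(2160, 2160, 100))  # (2160, 2160)
print(render_resolution(2160, 2160, 20))   # (966, 966) - less than half per axis
```

So 20% SS is rendering well under 1000x1000 per eye, which is why the GPU graphs go green while the panel ends up fed a much softer image.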
Seriously, though: the question for somebody (me) not versed in the OpenXR API (or any API) is just what is in the most basic session with that API. -What is the minimum/standard/recommended amount of stuff™ an application developer has to deal with in order to work with it? Does the standard always give the developer an arbitrarily long array of frustums and render targets, and mandate the rendering of all of them, so that support for such can be had without the developer having to delve into optional, or even not-yet-existing, extras? Or is it still just the one for each eye as standard?
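For what it's worth, core OpenXR handles this via view configurations: the app selects one, then asks the runtime (via xrEnumerateViewConfigurationViews) how many views it must render, and submits one projection image per view. Stereo gives two; Varjo's quad-view setup comes from a vendor extension (XR_VARJO_quad_views), not core. A toy Python model of that negotiation, not real API bindings, just to show the shape of it:

```python
# Toy model of OpenXR view-configuration negotiation (not real bindings).
# The keys mirror the real enum names XR_VIEW_CONFIGURATION_TYPE_PRIMARY_MONO,
# _PRIMARY_STEREO, and _PRIMARY_QUAD_VARJO (the last is a vendor extension).
VIEW_CONFIGS = {
    "PRIMARY_MONO": 1,        # single view (e.g. handheld AR)
    "PRIMARY_STEREO": 2,      # the standard one-render-target-per-eye case
    "PRIMARY_QUAD_VARJO": 4,  # Varjo focus + context displays, via extension
}

def enumerate_views(config_type):
    """Stand-in for xrEnumerateViewConfigurationViews: the runtime tells
    the app how many views (frustum + render target pairs) exist."""
    return [f"view_{i}" for i in range(VIEW_CONFIGS[config_type])]

def render_frame(config_type):
    # The app renders one projection image per enumerated view, whatever
    # the count turns out to be, so the stereo path generalizes.
    return [f"rendered {v}" for v in enumerate_views(config_type)]

print(len(render_frame("PRIMARY_STEREO")))      # 2
print(len(render_frame("PRIMARY_QUAD_VARJO")))  # 4
```

So an app written against the view *array* rather than a hardcoded left/right pair gets the generality cheaply, but anything beyond stereo still requires opting into the relevant extension.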
I feel that things relegated to extensions that need specific implementation in applications are almost as likely to be ignored and fall by the wayside as vendor-custom features without any OpenXR integration, unless they come from one of the biggest of the big elephants in the industry, as a standard feature in all their new products. And anyway, I do not see how they could possibly work retroactively for most things.
Never mind foveated rendering and eye-tracking and such; where do simply HDR and larger colour gamut fall into this, for one… How do current APIs and VR runtimes treat these? -Do they already receive, handle, and output them hitchlessly, just as they do sRGB? It would be ridiculously sad were this not to be basic functionality, but buried in an extension nobody bothers to look at.
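On the sRGB point at least: the app declares its swapchain texture format (e.g. an *_SRGB format) and the compositor handles the encode/decode to linear light, so that much is core behaviour rather than an extension. The conversion itself is just the standard sRGB transfer function (IEC 61966-2-1), sketched here for reference, this is not runtime code:

```python
def srgb_to_linear(c):
    """Standard sRGB EOTF, per channel, input in 0..1."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Inverse transfer function: linear light back to sRGB encoding."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# sRGB mid-grey is far brighter than 50% linear light, which is why
# compositors must know whether a submitted texture is already encoded:
print(round(srgb_to_linear(0.5), 4))  # ~0.214
print(round(linear_to_srgb(srgb_to_linear(0.25)), 6))  # round-trips to 0.25
```

HDR and wide gamut are a different story: they need both a transfer function (e.g. PQ) and gamut metadata the runtime understands, which is exactly the kind of thing that currently lives in extension territory rather than the core path above.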
This is for obvious things that should really be old hat by now; future things are of course less predictable, although for backwards compatibility, I suppose an extension could likely reproject "old style" frustum slices to newer formats with, say, a spherical viewplane with non-uniform pixel density across it, or lightfield cross sections… :7
Got my replacement VR-3 this week. Yesterday I tried it for the first time: no dead pixels, but the headset just kept disconnecting after a few minutes, every time. Fiddled a bit with the USB cable at the headset's end (you can get the cable out, like with most headsets) and that seems to have solved it; it seems a connector wasn't staying in place, and it hasn't disconnected after that. Today I played about an hour without any problems.
Also tried the @NextGenVR FoV mod. I went from about 88 degrees horizontally to about 102-104, which felt like a big increase. However, I really couldn't stand how it felt on my face; the original foam is just SO much more comfortable that for the moment I've switched back to that.
Hmmm, odd. So the mechanism might not work very well. Either way, 71 seems to work really well for me. Will also try setting it to 70 (haven't even bothered yet since 71 works so well). Thanks!
VRsimguy is getting a Varjo to test. He loves the G2 and the Pimax for their resolution and FoV respectively; I think he'll be pretty impressed with the Varjo's display, "bowled over" might be the phrase. Should be a good series of videos.
Varjo sets mine to 67.5, which feels very comfortable, though I remember having it measured a few years back by an optician as 65.
I agree with you that the original foam is more comfy than the 3mm neoprene strips, but I crave FoV, and every mm matters; at least that's what she said.
I don't know if this is the right section for this or not, as I could not find a For Sale section.
If there is one, please let me know and we will move the topic!
For sale is an immaculate Varjo VR-3 with very low hours of use and is in minty condition!
For those who know, this is the ultimate VR headset right now, with insane PPI and unbeatable picture quality.
The unit was lightly used, never sweated in, and comes with the original packaging and the remaining subscription, which is about 8-9 months.
I have flawless 27-0 Heatware feedback.
Please note that this is not a pleb HMD; it requires a computer with serious GPU horsepower, such as an RTX 3070 to 3090.
About Varjo support's performance: I reported a dead pixel around 17 Feb and I'm still waiting for the RMA, which they agreed to within a day or two. I don't have to send back the faulty unit until I receive the new one. Tbh the wait hasn't bothered me, as my dead pixel is almost out of view, so my HMD is still perfectly usable. I can see why my RMA is taking so long: they have a backlog of orders where people who ordered months ago still haven't received their HMDs. But they've just recently cut their lead time to 4-8 weeks, so I expect my replacement HMD soon.
Yeah, it also took a while before they sent out my RMA unit. I also didn't have to return mine before receiving the RMA unit, so they put a lot of faith in the honesty of their customers. But in the end it all went fine: I received my unit, returned the dead-pixel one, and was happy with the whole process, albeit that it took a bit longer than I had hoped.
BTW, it did surprise me that dead pixels were even a thing on a high-end unit like this; you'd expect them to check units a bit better before sending them out. My dead pixels were very obvious, in the middle of the panel.
Sorry to necro, but just wanted to chime in here: can you not try the display flasher tools to see if the pixels are just stuck, and clear them out that way?

@Heliosurge I sort of agree, but only partially, with your VR 2.0 views:

Varifocal and/or lightfield displays: tunable lenses were largely abandoned by Oculus' labs a long time ago, according to my inside source, so I have doubts about this, and so has the cascaded-display idea with LCD switching. The most likely move in this space to save weight while keeping high performance relative to current lenses is custom metamaterial lenses, see here and here; also some AR lens news here. You do not need lightfield displays to properly render a lightfield, see Google's lightfield demo on Steam. See also Avegant's early videos of their lightfield AR headset: it's just a DMD chip with a projection onto a combiner, and the trick is mainly in the software.

Hi res, maybe 8K/eye: more than likely all you really need is 4K, depending on how close you can get the display to the eye. Ideally I would like to see this solved with a projection system and retinal mapping. If Texas Instruments can make large enough DMDs with a tiny bezel, you could approach this in a few ways. One way might be to literally create a projection on a curved screen in front of the user, but this would make for a fairly hefty device; it could possibly be solved with mirrors/pellicles, but that adds complexity. Or you could mount the DMDs in front of the user's entire FoV, again use LED or laser projectors to get your RGB light, and then use eye tracking to dynamically adjust the projection to where the user is looking; but minimizing the size of this setup would probably require a specialized lens, something along the lines of this.

WFoV: absolutely this.

ET with DFR & DDC: this is just a must at this point.

Clarity across the WFoV: depending on the lens tech used, we will likely see this trickle down.