Yes, these are drawings for one eye. [8K = 4096x2657, 4K = 1612x1989, per eye]
The proportions are observed with a small error of +/- 10%.
The dimensions, of course, are not accurate; I visually enlarged the 8K rectangle.
But the proportions are correct within a small margin of error.
Measured with a ruler on the monitor, in centimetres, +/- 0.5-1 cm.
This could be done more accurately if desired.
It is simply meant to show clearly the advantage of the 8K over the 4K in terms of FOV, i.e. the width of the view.
@deletedpimaxrep1 @PimaxVR, I just saw this post over on reddit regarding the DP chipset used in the 8K, and there is some speculation that it uses DP 1.2. Here is a link.
It sounds like the chipset in the HMD is using DisplayPort v1.2, which maxes out at around 1440p 170 Hz (85 Hz per eye), which is why they can't reach 90 Hz on the displays. Even if the GPU uses DisplayPort v1.4, which can send 1440p at 180+ Hz, the chipset in the HMD cannot process it.
Their comment about the V2 being able to show 90 Hz proves that the LCD display is capable of doing it with 2 HDMI inputs, but the DisplayPort chipset in the HMD is too old. The solution is either to use 2 cables, as they did in the V2 HMD, or to update the DisplayPort chipset to v1.4.
If I am understanding correctly, this is the reason why you are struggling to achieve a steady 90 Hz… so why not just upgrade to DP 1.4? If users with a GTX 980 are slightly limited in refresh rate (maybe they only get 80-85 Hz), I think that would be a fair compromise. After all, I'm pretty sure most of us who are backing the 8K have a 10-series graphics card (and anyone who doesn't obviously isn't very serious about virtual reality).
Edit: Looks like the HMD already supports DP1.4, false alarm.
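For anyone who wants to sanity-check the bandwidth argument above, here is a rough Python sketch. It assumes a 4-lane link, 2x 2560x1440 input at 24 bpp, 8b/10b encoding, and ~20% blanking overhead — those assumptions are mine, not Pimax's, and real panel timings will differ.

```python
# Rough back-of-the-envelope: max refresh rate for two 2560x1440 panels
# over a 4-lane DisplayPort link. Assumes 24 bpp, 8b/10b encoding and
# ~20% blanking overhead -- treat the results as ballpark figures only.

LANES = 4
ENCODING_EFFICIENCY = 0.8   # 8b/10b line encoding (HBR/HBR2/HBR3)
BLANKING_OVERHEAD = 1.2     # assumed extra pixel clock spent in blanking

link_rates_gbps = {          # per-lane rates
    "HBR  (DP 1.1)": 2.7,
    "HBR2 (DP 1.2)": 5.4,
    "HBR3 (DP 1.3/1.4)": 8.1,
}

pixels_per_frame = 2 * 2560 * 1440                      # both eyes
bits_per_frame = pixels_per_frame * 24 * BLANKING_OVERHEAD

for name, rate in link_rates_gbps.items():
    usable = LANES * rate * ENCODING_EFFICIENCY         # Gbit/s after encoding
    max_hz = usable * 1e9 / bits_per_frame
    print(f"{name}: ~{usable:.1f} Gbit/s usable -> ~{max_hz:.0f} Hz max")
```

On those assumptions HBR2 tops out in the low 80s Hz for the dual-1440p input, while HBR3 has plenty of headroom for 90 Hz, which lines up with the speculation above.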
You know that the 980 Ti was one of the most powerful cards when the Vive and Rift were released? Money doesn't grow on trees.
I'm not saying it's a bad graphics card, I'm saying the DisplayPort technology is outdated, and it would be a shame if we all suffered because they couldn't get this headset working perfectly at 90 Hz on DP 1.2. Simple as that.
Edit:
Literally the only difference we're talking about is that GTX 980 users get a slightly lesser experience… instead of every person who buys the 8K having a slightly lesser experience. Which is worse? You tell me.
As the video says, it's running a refresh rate test. Does this mean they are supersampling through the Pimax software to get that rendering resolution? Or is this the default resolution?
^ This
If DP 1.4 solves all the problems, and the only downside is that pre-10-series GPUs just run at a slightly lower refresh rate, that seems like a fine solution.
Don't make the rest of us suffer for the sake of legacy compatibility.
Yep, exactly. Again, that reddit post seems to be speculative; but it does sound rational.
Agree, everyone that buys this kind of high-end headset is expected to have the power to drive it anyway.
Had a look at the ANX7530, here's the product information page:
http://www.analogix.com/en/products/dp-mipi-converters/anx7530
So it looks like the HMD does have DP1.4 support already.
Hey Asaku01, this is the computer I just bought to power the Pimax 8K. Would you consider this to be fairly high powered?
It's not the 1080 Ti, but I'm thinking this should be more than enough, right?
MSI Aegis 3 VR7RE-036US desktop: Intel Core i7-7700 (7th gen, 3.60 GHz), 32 GB DDR4, 2 TB HDD + 512 GB SSD, NVIDIA GeForce GTX 1080, Windows 10 Home 64-bit.
@Canuck The 1080 is a strong card, no doubt about it. What's more important, though, is how optimized the software is, and that's up to the dev. People say the 970/980 are enough for the Vive and Oculus, which run at ~2,592,000 pixels. Compare that to the Pimax's 7,372,800… that's roughly 2.8× as many pixels, so roughly 2.8× the juice needed for the same fps in the same conditions.

A 970 has 3.9 TFLOPS, a 980 has 4.7 TFLOPS and a 1080 has 9 TFLOPS, so the 1080 is about 2.3× as powerful as a 970. The 1080 Ti, on the other hand, has 11.3 TFLOPS, which puts it at about 2.9× a 970. Moral of the story: no matter what GPU you buy today, it won't get you better performance than a Rift/Vive plus a GTX 970 gives. But… if a game runs at ~120 fps on a 970, you don't have to worry about reaching 90 with a 1080. Keep in mind you can always turn details down for more performance.

All the numbers were taken directly from Nvidia's declared specs. Your mega ultra volcano RGB titanium (+etc.) custom card may have higher clocks and thus perform better, but keep in mind that the 970s you're comparing to can be custom too (usually up to 8% better performance on the best customs). I can't answer your question directly, as it depends on each application's performance.
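If anyone wants to redo that arithmetic, here is a small Python sketch using the same published figures. The pixel counts are the panels' input resolutions only (they ignore the larger render targets VR actually uses for lens distortion and supersampling), and the TFLOPS values are Nvidia's quoted FP32 numbers.

```python
# Quick sanity check of the ratios quoted above. Pixel counts are panel
# input resolutions; TFLOPS are Nvidia's published FP32 figures.

headsets_px = {
    "Vive/Rift (2x 1080x1200)": 2 * 1080 * 1200,
    "Pimax 8K input (2x 2560x1440)": 2 * 2560 * 1440,
}
gpus_tflops = {"GTX 970": 3.9, "GTX 980": 4.7, "GTX 1080": 9.0, "GTX 1080 Ti": 11.3}

base_px = headsets_px["Vive/Rift (2x 1080x1200)"]
for name, px in headsets_px.items():
    print(f"{name}: {px:,} px ({px / base_px:.2f}x Vive/Rift)")

base_tf = gpus_tflops["GTX 970"]
for name, tf in gpus_tflops.items():
    print(f"{name}: {tf} TFLOPS ({tf / base_tf:.2f}x GTX 970)")
```

So a 1080 Ti roughly keeps pace with the ~2.8× pixel increase over the Vive/Rift, while a 1080 falls a little short — which is the point being made above.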
The devil is in the fine print. The spec sheet says it supports "HBR2.5" at 6.75 Gbit/s per lane, while DP 1.4 (in fact already DP 1.3) is supposed to support HBR3 at 8.1 Gbit/s. It is amazing what you can put in a spec sheet, no?
Now, this HBR2.5 might give a total throughput of 4 × 6.75 Gbit/s = 27 Gbit/s × 0.8 (protocol overhead) = 21.6 Gbit/s. That is still safely above 17.8 Gbit/s (which is my estimate of the Pimax 8K's bandwidth requirement).
But you may wonder whether Analogix has cut some other corners in this chip, or whether the DP 1.4 outputs on graphics cards are even capable of using this HBR2.5 speed (since, according to the wiki, it is not part of the standard), while at the same time HBR2 is not sufficient and HBR3 is not supported.
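To put that calculation in one place, here is a tiny Python sketch of the same arithmetic. The 17.8 Gbit/s requirement is the estimate from the post above, not an official figure, and the 0.8 factor is the same 8b/10b protocol overhead already used there.

```python
# Checking the ANX7530 "HBR2.5" figure against the estimated requirement.
# REQUIRED_GBPS is the poster's estimate for 2x 2560x1440 @ 90 Hz.

LANES = 4
ENCODING_EFFICIENCY = 0.8   # 8b/10b protocol overhead
REQUIRED_GBPS = 17.8

for name, per_lane_gbps in {"HBR2": 5.4, "HBR2.5": 6.75, "HBR3": 8.1}.items():
    usable = LANES * per_lane_gbps * ENCODING_EFFICIENCY
    verdict = "enough" if usable >= REQUIRED_GBPS else "NOT enough"
    print(f"{name}: {usable:.2f} Gbit/s usable -> {verdict}")
```

On those numbers HBR2 falls just short (17.28 Gbit/s), while the nonstandard HBR2.5 rate and HBR3 both clear the estimate, which is exactly why the question of whether graphics cards will actually drive HBR2.5 matters.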
Thanks Asaku01!
I have that system on the way and was going to buy a second computer shortly. I can get a second identical one, or opt for one with all the same specs except that it has the 1080 Ti but only 16 GB of RAM instead of 32.
I'm thinking I'd be better off with the 1080 Ti instead of the extra 16 GB of RAM!
If you're building a gaming PC, go for 16 GB. 32 GB+ is mainly for workstations.
Ah thanks, now you have me thinking I should return the two we bought with the 1080 and 32 GB of RAM and go for the 1080 Ti with 16 GB, haha.
I'm certainly going to consider doing that, since I just found one at an awesome price.
That's what I would do.
Hmm, found out that the 1080 Ti in the MSI system I'm looking at is the Armor version, which has had pretty bad reviews. That must be why the system is such a good price.
Not sure if I want to risk it
Hello,
Maybe you should wait a little? It is possible that Pimax will delay delivery a bit, and Nvidia should launch the consumer version of the Volta series in early 2018, I think?
Prices of Pascal GPUs may go down.
At the Vive/Oculus release:
top card = GTX 980 Ti, going for $700+
GTX Titan X = $999
GTX 970 = $350
The minimum requirement was a GTX 970.
Most people did have a card at that level. But we are talking minimum specs here.
1 year later:
GTX 1080 = $600
GTX 1070 = $450 ==> same power as a GTX 980
GTX 1060 = $350
So in 2018, Volta, maybe:
GTX 2070 = maybe the same power as a GTX 1080, @ $450
but the GTX 2080 = a lot more power
and the GTX 2080 Ti = enough power to drive this Pimax 8K for sure
Now let's be honest: do you think the Vive 2 is going to do 4K @ 90 Hz? And not only that, do you think the Vive is going to be native 4K @ 90 Hz with a 200° FOV? LOL
What card do you think you are going to need for that? I'll tell you: at least a top Volta card, or SLI.
So they will not bring this out; only Pimax is trying to push for it, because Pimax doesn't want to play it safe, they want to push the limits of VR. That's why they develop Brainwarp etc., so people can try to experience something we would otherwise only get in maybe 2-3 years. They know most people cannot run it, but they also know that when GPUs catch up, the people who bought it will be happy. Vive/Oculus, on the other hand, are waiting until GPUs are out that can support their hardware… that is the difference between the companies.
One company plays it safe, the other company is evolving and pushing the tech.
Today you can choose to experience something that you might otherwise have to wait maybe 2-3 years for, before the Vive catches up. It will not be perfect, but it will be pretty neat.
But I don't believe the Vive 2 will be 4K @ 90 Hz with a 200° FOV; maybe 4K @ 90 Hz with a 140° FOV, the reduced FOV to save GPU power. It would still need top cards of the next generation to run it. And not a big percentage of gamers buys cards over $600.
Today only a GTX 1080 Ti ($800) is able to get 60 fps at 4K on a monitor in some games.
Most games run between 40-55 fps at 4K.
All I want to say is: Vive/Oculus is like 1080p,
Pimax is like 4K.
With the GPUs out now, you know what you are going to get… if you want a taste of 200° FOV with higher resolution than the Vive, and a minimum of 75 Hz with 90 Hz as the goal, you buy the Pimax.
If you want a smooth 1080p experience, buy the Vive; with today's GPUs it is very smooth.
If you wanted the Pimax 8K @ 90 Hz absolutely guaranteed, you would have had it if they had lowered the resolution… but that's the whole point of this Kickstarter: the resolution + 200° FOV.
If you want 90 Hz guaranteed, steer away, wait till the consumer version comes out and then buy it, and stop complaining now. If you want to experience the 8K @ 75 Hz with 200° FOV, then absolutely pledge now, get eye tracking/wireless, and have a good GPU so you can at least get a taste of what's possible; it will only get better with better GPUs, I believe.