Will the 12K be wireless? What Does Physics Say?

The 12K purportedly displays 6K x 2
and is advertised as:
“Reality 12K” = 5,760 x 3,240 per eye = 37.32 megapixels total

8K = 7680 x 4320 = 33.18 megapixels, so let’s use 8K as an example for the sake of argument. If we come within 10% of a workable bandwidth, we’ll come back to the real numbers.

Per the bandwidth calculation:
8K, chroma 4:4:4 at 60 Hz = 64 Gbps.
8K, chroma 4:4:4 at 120 Hz = 128 Gbps.
8K, chroma 4:4:4 at 200 Hz = 210 Gbps. (The purported display capability of the Pimax Reality 12K.)
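For reference, the raw-payload arithmetic behind those figures can be sketched like this (a rough sketch: I'm assuming 10-bit-per-channel color, i.e. 30 bpp; the quoted link figures sit a bit higher because they include blanking/overhead):

```python
# Rough uncompressed video payload: width * height * bpp * refresh rate.
# Assumes 30 bpp (10-bit color, chroma 4:4:4); real link rates run roughly
# 5-10% higher once blanking intervals are included.
def raw_gbps(width, height, hz, bpp=30):
    return width * height * bpp * hz / 1e9

for hz in (60, 120, 200):
    # prints ~59.7, ~119.4, and ~199.1 Gbps respectively
    print(f"8K 4:4:4 @ {hz} Hz: {raw_gbps(7680, 4320, hz):.1f} Gbps")
```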

Per the WiGig whitepaper:
WiGig maxes out at 7 Gbps (beam-formed signal, no compression), or about 10 Gbps with compression.


Now you could say: what if we don’t do chroma 4:4:4 (text would be blurry, but I don’t care) and keep a 60 Hz input, then do some sort of frame interpolation and rotational reprojection on the headset itself (on-board motion smoothing)? Keep in mind this hasn’t been done before, but we love Pimax, and they are putting Snapdragon chips in the HMD, so…
EDIT: It appears Pimax may be doing exactly this with “split rendering”, but only with the wireless? According to Kevin Henderson they are not using split rendering, but “hybrid rendering”.
8K chroma 4:2:0 at 60 Hz = 32.08 Gbps

Aha! Then you say: what about DSC compression? If we can do it for video, why not games?
8K chroma 4:2:0 at 60 Hz = 20 Gbps with DSC, right?

Except DSC compression is precisely what I suspect caused the blanking in the original version of the 8KX at 90 Hz, since the data rates are no longer continuous. I believe it also made cable extensions fail. Furthermore, you cannot simply stack compression algorithms. As a quick example, you can’t compress a .bmp to a .jpg, then put it in a .zip file, and expect the product of the two compression ratios as the final result. The second compression offers diminishing returns (none if the algorithm is the same). So DSC lowers our bandwidth, but it reduces the compression WiGig itself can achieve. Moreover, it introduces a latency that grows with resolution. If that latency exceeds ~4.5 ms at 60 Hz, it results in noticeable lag, which may be unacceptable in VR.
EDIT: See reply by SSJ3 below about Foveated Compression Codec

Finally: what about WiGig 2.0, i.e. 802.11ay?
Ah yes, reportedly it will achieve 20–40 Gbit/s with MIMO. But…
802.11ay requires an absolutely clear line-of-sight transmission path, and whether you can manage that with a dongle on a headset is questionable. To achieve MIMO I believe you would need, say, three transceivers around the room (like base stations), and the room would have to be small or enclosed to allow beamforming as well. Certainly this sounds as difficult as a wire (if not more difficult), and it still will not get us into the bandwidth range we really want.

DisplayPort 1.4a can deliver 32.4 Gbps (25.92 Gbps effective)
DisplayPort 2.0 can deliver up to 80 Gbps (77.37 Gbps effective)

All you have to do is look at the cables on the 12K as they connect to the headset: I think there are two. Literally, current DisplayPort bandwidth wasn’t enough for Pimax’s vision. To tie into current GPUs we need to stick with DP1.4, which offers 32.4 Gbps each. DP2.0 would be nice, but then you can’t use any current GPU. (There’s some debate whether 30-series cards could be made to support DP2.0 by the vendors, but current ones don’t.) I’m not sure about the two DisplayPort cables, but merging them back together on the PC software side can present another challenge for Pimax. (The first 4K monitors used two DisplayPort connections to get the bandwidth they needed.)
EDIT: 2xDP1.4 present in the promo on the website.
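As a rough sanity check on the dual-cable math (a sketch only; the 30 bpp color depth, the nominal 3:1 DSC ratio, and the 90 Hz operating point are my assumptions, not Pimax’s published numbers):

```python
# DP1.4 carries 32.4 Gbps raw, ~25.92 Gbps effective after 8b/10b encoding.
EFFECTIVE_DP14 = 25.92  # Gbps per cable

def panel_gbps(width, height, hz, bpp=30):
    # Raw per-eye payload at the "Reality 12K" panel resolution.
    return width * height * bpp * hz / 1e9

both_eyes = 2 * panel_gbps(5760, 3240, 90)  # ~100.8 Gbps raw, both eyes
with_dsc = both_eyes / 3                    # ~33.6 Gbps at a nominal 3:1 DSC
print(f"{both_eyes:.1f} Gbps raw, {with_dsc:.1f} Gbps with DSC "
      f"vs {2 * EFFECTIVE_DP14:.2f} Gbps over 2x DP1.4")
```

On those assumptions, two DP1.4 links plus DSC clear a 90 Hz target comfortably, but the same arithmetic at 200 Hz (~224 Gbps raw, ~75 Gbps even with 3:1 DSC) does not fit, which is consistent with expecting some FOV or refresh trade-off.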

Thus I hang my head wishing “12K” VR could be wireless, but alas:
Wireless games seem forever doomed either to be limited by the GPU inside the headset (which will perform badly, or be hot and heavy) or to be tied to a cable (or two).

See: “PIMAX 12K QLED - THIS IS PIMAX GONE WILD! Everything You Need To Know! Incl. Pimax COO Interview!!” (YouTube)


I might double check your math later, but I wanted to quickly point out two things:

  1. Kevin has previously mentioned that current engineering models indeed have two DP1.4 connectors at the PC end. I think that it would make sense to have that as the default, with an option to upgrade to a DP2.0 cable down the line if you have the GPU to make use of it.

  2. It uses Tobii’s Foveated Transport. See “SIGGRAPH 2021 Talk: A Lightweight Foveation Codec for VR” (YouTube).


Excellent post! Thanks for confirming 2x DP1.4 in use, as I simply couldn’t find where I’d seen it before. The Foveated Transport codec sounds great, but the talk states there’s 40–55 ms of latency, and that was for much smaller source data from an HTC Vive. I would think it could be leveraged for wide FOV (hinted at in the video), since the far outer portion could be compressed further. On the other hand, it demonstrated 6:1 compression, which puts some things in range. I will definitely say Tobii has something special there. This reassures me about the display specs in general, as I was having trouble understanding how even 2x DP1.4 could achieve the stated display, particularly at 200 Hz… I still have a hard time seeing WiGig 1.0 getting us in range.
Way to go, Tobii! I wish the eye-tracking module worked better with my 8KX. Maybe it will also be better with the 12K. This does explain why they are committed to eye tracking; before, I thought it was cute for reducing rendering requirements but woefully unsupported by any games.


Wait! It says the reprojection IS in the headset: split rendering. Is that true currently with an 8KX? I thought that currently Pimax relies on SteamVR for reprojection. I guess I’m fuzzy on whether there is split rendering such that the headset does anything to the rendered frame. Kevin mentioned in his interview that it has “hybrid rendering”.

The 8KX doesn’t do split rendering; it doesn’t have any chipset on the headset, like an XR2, for doing that.

As for the WiGig and wireless etc., I would imagine that it’s not going to be at full resolution, or at least not the largest FOV. There will be tradeoffs for wireless, but the fact that it is modular does leave hope for future iterations and potentially upgraded wireless solutions down the line.

Personally, I will be assuming the 12k is wired for the foreseeable future, except for when in standalone mode (which I will likely only really use for movies).

Your speculation is on point, but I think we’re still missing way too much information.


Not going into what my personal expectations regarding the 12K are, and speaking from a purely technological perspective, it appears as though you are either not aware of, or severely underestimate, the relevance of the announced Qualcomm SoC in the HMD.

Apart from allowing basic standalone functionality, it allows for compute-intensive decompression on the HMD side. Just look at what Meta is able to pump through a bog-standard Wi-Fi connection, and at what resolution and refresh rate, and you can get an idea of what can be transferred when combining a similar approach with foveated transport, DSC, and the bandwidth of an .ad or .ay connection.

It’s doable, the question is if it’s doable for Pimax.


Completely agreed.

I think basically lossless compression should be doable over WiGig for the full FOV (perhaps not at the full 200 Hz, though, but nothing can run that anyway right now, so it’s not a concern).

I think the biggest issue with WiGig is the line-of-sight aspect. It wasn’t a huge issue with the Vive wireless adapter, but the form factor we’ve seen of the Pimax wireless dongle doesn’t make sense for something that needs LoS.


Wait, what? Could you please link to what was shown, I must have missed that?


I’m referring to the video presentation when the 12k was announced.


OP is really looking at this all wrong. The calculations are founded on wired protocols and sending uncompressed raster data. And then it’s assuming sending this over wireless somehow and noting that it won’t work.

Yes, it won’t work. Not that way. And for more reasons than just maximum bandwidth limits.

In fact, the situation for wireless is much worse than OP’s numbers suggest. The maximum bandwidth limits of the protocol are not achievable under real world scenarios. Usually not even close. And wireless suffers continuously from RF interference, which means you get bit errors and drop outs. Because of these issues, the whole nature of the protocols you use in wireless is very different from wired.

Look at actual wireless protocols in devices that exist today for guidance on what the situation might look like for the 12K.

Let’s take the Quest 2’s Oculus Link or virtual desktop. These are generally running around 150 Mbps.

The Quest 2 has a resolution of 1832x1920 per eye. The 12K is speculated to have a resolution of 5760x3240 per eye. That’s about 5.3 times the resolution. So if this is scaled linearly then 150Mbps * 5.3 = 795Mbps.

WiGig maxes out at 7Gbps. ~1Gbps would be a pretty reasonable sustained rate for it to achieve under practical conditions.

So if the 12K implements a similar protocol to what the Quest 2 uses but just with higher resolution, the back-of-the-envelope numbers do work out with WiGig just fine.
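That linear scaling argument works out as follows (back-of-envelope only: it assumes bitrate scales linearly with pixel count and ignores codec-efficiency differences between devices):

```python
# Scale the Quest 2's typical ~150 Mbps stream to the 12K's speculated
# per-eye resolution, assuming bitrate grows linearly with pixel count.
quest2_pixels = 1832 * 1920   # Quest 2, per eye
pimax_pixels = 5760 * 3240    # speculated 12K, per eye
scale = pimax_pixels / quest2_pixels

print(f"resolution ratio: {scale:.1f}x")            # ~5.3x
print(f"scaled bitrate: {150 * scale:.0f} Mbps")    # well under ~1 Gbps WiGig
```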

But in fact the 12K could do much better than this. With eye tracking, it’s possible to implement foveated transport. And it could also implement WiGig 2 rather than WiGig 1. There are a number of ways that Pimax could potentially implement much better wireless performance than the Quest 2 even when considering the difference in resolutions.

Not only does wireless on the 12K theoretically work, it could potentially work really well.


I absolutely missed that they showed something related to wireless functionality.


Right, I carefully iterated that it would have to be beam-formed with a perfect signal to achieve such rates. We weren’t even close, so I didn’t argue about the real-world throughput being far worse.

Guess we’ll see. The math doesn’t bode well, but the transport codec may be key, at least for the wired display? I don’t know, but as I listened again to Kevin, he briefly stated that the 8KX bumped up against bandwidth constraints already, then immediately mentioned Tobii. It seems very clear that was a workaround. We will see. Dang it, where’s KEVIN!!! It’s 6 weeks since CES. Drop a virtual introduction, do an online demo, show us a beta unit. Prove it exists and is entering production soon! I BELIEVE IN PIMAX, but do not leave us wondering until Q4; I will not forgive you for that!

He mentioned that the weight of the 8KX was largely due to heat sinks. I thought it was the lenses.

Interesting discussion… but, obviously, someone hasn’t seen the full video of the introduction. It was actually said there that wireless with large FOV/resolution is not supported.

Just to spread some panic. :grin:


Yeah, that’s what I was trying to prove. It is not a wireless HMD at 12K/200 Hz as advertised. It’s more like the Oculus, in that it can’t possibly be fun wireless unless you’re “Sabing your Beat” with some on-board game that sucks. I just want them to focus on the display and knock off the “platform” business. The PC will be the platform. The cables will be needed. The wireless and Ziggy box will be cute and off to the side, much like eye tracking and hand tracking are today.
Also excited to see the weight may come down with the use of fans instead of large copper heat sinks.


You are severely underestimating several different things, as has already been pointed out by a few others. The math lines up perfectly fine, as Sargon already pointed out, and doesn’t disprove wireless as a concept like you think it does. Even on the cable it won’t be 12K/200 Hz, because Pimax themselves said things like the eye tracking and foveated transport reduce the sacrifice vs. other Pimax HMDs. The 5K Super doesn’t get its full “Large” FOV at 180 Hz; it has to go to “Small”, which is 40 degrees less than “Large”. The 12K should still have its 35 PPD clarity at 200 Hz, but I expect it will need to make some degree of FOV sacrifice at 200 Hz, wired or wireless.

Wireless will work just fine; the cable won’t be required for high-fidelity wireless with WiGig.


Where is it advertised that way? Watch the introduction video, as dstar suggests.


No, you’re right, they don’t advertise that, but there are people who will read “wireless” and think that, and they have said as much on YouTube and this forum. The point of the discussion was to find out how close we could come. Tobii transport changes the whole mathematics, and frankly my whole conclusion.

All I really wanted to show/discuss is that you can’t have anything near raw video, even at WiGig speeds, at 200 Hz and Reality 12K resolution. The Tobii transport codec is a game changer, concededly. What other things did I miss, outside of reduced resolution/FOV or reduced Hz? That’s why I brought it up. Did everyone know about Tobii transport codecs but me? Did everyone know about split or hybrid rendering? Boy, do I feel dumb!

I want the Matrix in a pair of sunglasses too, but if a company says they can provide it, I want two things: to see how the heck they do it, and then to play the games. That’s part of the fun for me! Good discussion, all. Love the insights, even if I feel dumb!