October 15th, 2008

VGA over wireless: fascinating, but is it even truly feasible now?

IOGEAR (I believe it's all capitalized) sent me some spam today, but it's spam that I'm actually semi-interested in. Cuz, hey, it's the geek factor, right?

Apparently they have a new device which connects to a USB port on your computer and wirelessly transmits video to a receiver that connects (via VGA) to a monitor up to 30 feet away. This lets you wirelessly set up extended or mirrored desktops. Which, I think, is pretty cool.

But my second thought was, how is this even possible? The device supports up to 1600x1200 with 32-bit color depth. If you do the math, it doesn't seem to work out:

  1. USB 2.0 tops out at a theoretical 480 megabits per second, which translates to 60 megabytes per second.

  2. A 1600x1200x32 screen -- at 1,920,000 pixels times 4 bytes (32 bits) per pixel -- requires 7,680,000 bytes, or almost 8 megabytes per frame.

  3. At a basic refresh rate of 60 Hertz, you'd need about 460 megabytes per second of bandwidth to push all that data (call it 480 if you round each frame up to 8 megabytes). That's nearly 8 times the bandwidth of USB, to say nothing of whatever wireless protocol they're using. In fact, the spec sheet says the "real throughput" is only 130 megabits per second, which makes it sound like they're using 802.11n. In which case, the bandwidth limitation of USB is insignificant, because the wireless link is the real bottleneck here.

  4. According to my math, at 1600x1200x32 that works out to slightly over 2 frames per second: 130 megabits per second divided by 8 bits per byte is 16.25 megabytes per second, against the roughly-8-megabytes-per-frame figure from point #2 above. And that's a very, very, very poor refresh rate. (The whole chain of arithmetic is spelled out in the little sketch just below this list, for anyone who wants to check it.)
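
For anyone who wants to check my arithmetic, here it is as a few lines of Python. The numbers are just the ones quoted above; none of this reflects anything about how IOGEAR's hardware actually works internally.

  # Back-of-the-envelope math for the wireless-VGA bandwidth question.
  MEGA = 10 ** 6

  usb_bits_per_sec = 480 * MEGA              # USB 2.0 high-speed signaling rate
  usb_bytes_per_sec = usb_bits_per_sec / 8   # = 60 MB/s

  width, height, bytes_per_pixel = 1600, 1200, 4   # 32-bit color = 4 bytes/pixel
  frame_bytes = width * height * bytes_per_pixel   # = 7,680,000 bytes (~7.7 MB)

  refresh_hz = 60
  needed_bytes_per_sec = frame_bytes * refresh_hz  # ~460.8 MB/s for raw frames

  wireless_bits_per_sec = 130 * MEGA                   # the spec sheet's "real throughput"
  wireless_bytes_per_sec = wireless_bits_per_sec / 8   # = 16.25 MB/s

  print("Frame size:          %.2f MB" % (frame_bytes / MEGA))
  print("Needed at 60 Hz:     %.1f MB/s" % (needed_bytes_per_sec / MEGA))
  print("USB 2.0 ceiling:     %.1f MB/s" % (usb_bytes_per_sec / MEGA))
  print("Wireless throughput: %.2f MB/s" % (wireless_bytes_per_sec / MEGA))
  print("Raw frames over the wireless link: %.2f fps"
        % (wireless_bytes_per_sec / frame_bytes))

That last line prints roughly 2.12 frames per second, which is where point #4 comes from.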


So unless I'm doing my math wrong, the bubble of such a wonderful device seems to be burst by the real-life bandwidth limits of the technologies involved. I suppose they could use some kind of delta encoding, sending only the parts of the screen that actually change from one frame to the next, but then whenever you had a lot of movement onscreen you'd start dropping frames like crazy (or getting some truly bizarre artifacting). They could layer lossless compression on top of that too, but that would demand substantial realtime processing power at both the transmitter and receiver ends.
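
Just to make that "only send the changes" idea concrete, here's a toy sketch of tile-based dirty-region detection in Python. The tile size, the function name, and the raw-bytes frame format are all made up for illustration; I have no idea whether IOGEAR's device does anything like this.

  # Toy sketch: chop each frame into tiles and transmit only the tiles
  # that differ from the previous frame. This is a guess at the general
  # technique, not anything from IOGEAR's spec sheet.

  TILE = 64  # tile edge in pixels; an arbitrary choice for the sketch

  def dirty_tiles(prev_frame, curr_frame, width, height, bytes_per_pixel=4):
      """Yield (x, y, tile_bytes) for each tile whose pixels changed.

      Frames are flat bytes objects, row-major, bytes_per_pixel per pixel.
      """
      row_stride = width * bytes_per_pixel
      for ty in range(0, height, TILE):
          for tx in range(0, width, TILE):
              prev_rows, curr_rows = [], []
              for y in range(ty, min(ty + TILE, height)):
                  start = y * row_stride + tx * bytes_per_pixel
                  end = start + min(TILE, width - tx) * bytes_per_pixel
                  prev_rows.append(prev_frame[start:end])
                  curr_rows.append(curr_frame[start:end])
              if prev_rows != curr_rows:
                  yield tx, ty, b"".join(curr_rows)

On a mostly static desktop that sends almost nothing per frame; play full-screen video and every tile is dirty, which is exactly where the dropped frames (or the artifacting) would come from.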

Am I just missing something incredibly obvious? Or is it true that this technology, while very promising, just isn't able to make good on its claims? I've done a little searching, and while I've found the product mentioned by some news outlets, I haven't found any mention of the bandwidth issue I've addressed here.

I'll keep poking around, but I wanted to jot down these off-the-cuff thoughts and see if any tech-savvy LJ-ers can point out a simple error in my math, or something else I'm overlooking.