Please do keep in mind that I wrote this article about two and a half years ago. 802.11b was just hitting the scene, and I was running around a bit like an idiot trying to figure out what the tech was good for.
It has become common knowledge since then that a wireless card's listed rate is the speed at which it sends 802.11 traffic (sometimes referred to as the "radio rate"). There is significant overhead in the 802.11 protocols, accounting for more than half of the data sent. The best benchmarks I've seen for actual user data (including headers) on an 802.11b link show about 5Mbps. I haven't tried it myself, but I'm told that 802.11a and 802.11g, billed at "54Mbps", only push around 20Mbps of usable data. In my experience, running TCP/IP incurs an additional 30% or so of overhead.
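As a back-of-the-envelope check on those figures, here's a quick sketch in Python. The numbers are just the ones quoted above (11Mbps and 54Mbps radio rates, observed 5Mbps and 20Mbps of user data, and my rough 30% TCP/IP figure), not fresh measurements:

```python
# Rough overhead arithmetic using the figures quoted in the text.

radio_rate_b = 11.0    # 802.11b radio rate, Mbit/s
user_data_b = 5.0      # best-case observed user data, Mbit/s
overhead_b = 1 - user_data_b / radio_rate_b
print(f"802.11b link-layer overhead: {overhead_b:.0%}")       # ~55%

radio_rate_ag = 54.0   # 802.11a/g billed rate, Mbit/s
user_data_ag = 20.0    # reported usable data, Mbit/s
overhead_ag = 1 - user_data_ag / radio_rate_ag
print(f"802.11a/g link-layer overhead: {overhead_ag:.0%}")    # ~63%

# An additional ~30% or so lost to TCP/IP on top of that:
tcp_effective_b = user_data_b * (1 - 0.30)
print(f"Effective TCP throughput over 802.11b: {tcp_effective_b:.1f} Mbit/s")
```

Which is where the "more than half" claim comes from: even before TCP/IP enters the picture, the radio rate is already cut by more than 50%.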
At the time, I was more interested in the impact that a microwave oven had on 802.11b than in generating actual benchmarks. I thought that if I made my measurements the same way, I could learn something from the difference.
There are, of course, much better tools for measuring the throughput of a link than ssh (ttcp, for one).
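For a sense of what a ttcp-style measurement does in principle, here's a minimal toy sketch in Python (not ttcp itself, and the buffer sizes are arbitrary): it times a bulk TCP transfer between a sender and a receiver over loopback and reports the goodput.

```python
# Toy ttcp-style throughput measurement over loopback.
import socket
import threading
import time

PAYLOAD = b"x" * 65536   # 64 KiB per send
NUM_BUFS = 64            # 4 MiB total

result = {}

def receiver(listener, result):
    """Accept one connection, drain it, and record bytes and Mbit/s."""
    conn, _ = listener.accept()
    total = 0
    start = time.perf_counter()
    while True:
        data = conn.recv(65536)
        if not data:
            break
        total += len(data)
    elapsed = time.perf_counter() - start
    conn.close()
    result["bytes"] = total
    result["mbps"] = total * 8 / elapsed / 1e6

listener = socket.socket()
listener.bind(("127.0.0.1", 0))   # ephemeral port
listener.listen(1)
port = listener.getsockname()[1]

t = threading.Thread(target=receiver, args=(listener, result))
t.start()

sender = socket.create_connection(("127.0.0.1", port))
for _ in range(NUM_BUFS):
    sender.sendall(PAYLOAD)
sender.close()                    # EOF ends the receiver's loop
t.join()
listener.close()

print(f"{result['bytes']} bytes received, {result['mbps']:.1f} Mbit/s")
```

Over loopback this mostly measures the local TCP stack, of course; the point of ttcp is running the sender and receiver on opposite ends of the link you actually care about.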
But this experiment is flawed in more important ways. Probably my biggest mistake was attempting this on only one channel, over only a couple of runs. It would be much more meaningful to have tried many runs on each channel, against a variety of devices.
Now, a couple of years later, it is readily apparent to me that ovens are the least of your worries on 2.4GHz (since even at their worst, they turn off after a couple of minutes). Panasonic 2.4GHz phones seem to be the worst of the lot in my experience. X-10 cameras are right up there, too.