Related link: http://www.apple.com/isight/
Although I do have some EE and CompE up in the cranium, I’ve typically gravitated toward computer science and software engineering. That said, I must admit I’ve been dabbling with hardware quite a bit lately; sometimes, it’s just downright painful.
You see, in my world of software bliss, I can just assume the world away. I don’t have to deal with real-world forces like friction, slippage, changing voltages as batteries drain, or even the noise in images taken from a webcam. Hardware, however, interacts with “real” stuff. Other than that amorphous cloud called the internet, it’s not the nice, clean abstraction of reality that I’ve grown so used to during development.
A lot of my recent hardware hacking has dealt with the iSight. Fortunately, the QuickTime SequenceGrabber takes care of most of the broad strokes, but there are still the inherent real-world pains of interfacing with a webcam that must be dealt with. Classic annoyances I’ve encountered include getting no response from the camera at all and getting back bogus images that are all black or distorted. But those issues are easy to deal with.
Earlier today, however, I dealt with something that was quite a bit more subtle.
The code I’m working on depends on a fresh image from the camera about every two seconds; any sooner than that just messes things up. That’s simple enough, but for some reason my code was behaving bizarrely and unpredictably. After a long frenzy of troubleshooting on multiple levels, I finally determined that my iSight (or something between it and my code) was buffering a couple of extra, stale images.
Apparently my desired frame rate was way too slow for a state-of-the-art webcam that touts 30 frames per second. Who would’ve thought? While that synopsis might sound simple, it took a lot of debugging to figure out, and my software tools weren’t much help, since the “problem” wasn’t in my code.
Anyway, the fix was easy: instead of snapping a single image whenever I want it, I snap about five and use the last one. That gives me a fresh pic and there’s no real performance penalty either. I say all of that just to say that working with hardware is…well, hard.
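For anyone curious, the workaround above can be sketched in a few lines. This is a minimal illustration, not my actual SequenceGrabber code: the `BufferedCamera` class is a made-up stand-in that simulates a driver holding a couple of stale frames, and `fresh_frame` is the hypothetical “snap several, keep the last” flush.

```python
class BufferedCamera:
    """Toy stand-in for a real capture API: the first few grabs
    return stale frames the driver had buffered, after which
    every grab returns a fresh one."""

    def __init__(self, stale_frames):
        self._queue = list(stale_frames)

    def grab(self):
        # Drain stale buffered frames first, then deliver fresh ones.
        if self._queue:
            return self._queue.pop(0)
        return "fresh"


def fresh_frame(camera, flush_count=5):
    """Grab `flush_count` frames and return only the last one,
    flushing any stale images the camera had buffered."""
    frame = None
    for _ in range(flush_count):
        frame = camera.grab()
    return frame


# Grabbing just one frame hands you a stale image; grabbing five
# and keeping the last flushes the buffer and yields a fresh one.
cam = BufferedCamera(["stale-1", "stale-2"])
print(fresh_frame(cam))  # fresh
```

Five is just a safety margin: the buffer only held a couple of frames, and at 30 fps the extra grabs cost a fraction of a second, which is nothing against a two-second cycle.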
What’s the most subtle hardware/software time sink you’ve ever encountered during a development cycle? How long did it take to figure out?