I read an article on Slashdot the other day about a newly released open-source application. I read a few of the comments and found this one (slightly paraphrased):
    $ apt-get fooapp
    You have searched for packages named fooapp in all distributions.... Can't find that package.

Sorry, I'm not interested.
The commenter's point was that if he can’t install the software from a pre-built package, it isn’t worth trying at all.
Unlike a few years ago, when every Linux user ran configure by hand, the speed and convenience of installing packages have put the compiler on the back burner. Packages aren’t a bad thing, but it reflects poorly on an administrator’s skill set to shun the development tools that are available for every Unix environment.
I’m not saying that packages should be avoided. I build them myself for software that I have compiled and tested manually. Once packaged, they’re pushed out during the remote installation process; from then on, I am the distributor for those applications in the infrastructure, and the primary support contact. I’ll also concede that it makes little sense to recompile Gnome or KDE when my OS vendor provides a pre-built package, along with support and regular upgrades. But I am wary of installing packages that aren’t built or approved by the software’s author.
When I interview an administrator, I usually throw in a couple of programming questions, such as “How do you determine which shared libraries a program requires?” or “What steps would you take to compile the Apache webserver?” I don’t expect candidates to be full-fledged C programmers, but I think it’s important to know how to build software. The candidates who have demonstrated these skills have also been more proficient at debugging, tracing system calls, and identifying performance problems ahead of time.
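For the shared-library question, one standard answer is ldd, which resolves and prints the shared objects a dynamically linked binary needs. /bin/ls here is just a convenient stand-in for “some program”:

```shell
# ldd resolves each shared-library dependency of a dynamically
# linked executable and prints where it was found on this system.
# On a typical glibc-based Linux box, /bin/ls will list entries
# such as libc.so.6 along with their resolved paths.
ldd /bin/ls
```

A related trick is reading the raw DT_NEEDED entries straight from the ELF headers (for example with objdump -p), which shows what the binary asks for without resolving anything. As for the Apache question, the classic outline is to unpack the source tree, run ./configure with the desired options, then make and make install.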