October 2006 Archives
One of the biggest benefits of a conference is meeting new people with surprising ideas, and on the last day of Kosmopolis in Barcelona I grabbed a couple of those marvelous opportunities. I had a brief talk, followed up by email, with the founder of Dispatx, a site dedicated to collaborative art production and to exploring the processes behind collaborative art. I also struck up a conversation with an author who gave me a whole different perspective on the subject matter I was talking about at the conference: the influence of interactive web technologies on journalism.
I stumbled across a message on the Ubuntu Devel mailing list this weekend which I found pretty disturbing. Here is the body of the message. This mail thread is commenting on a recent Slashdot article on the pain of upgrading from Dapper to Edgy. The thing that disturbed me was this comment:
Although it is very difficult to diagnose problems from blog and forum posts (hence the analysis below is probably wrong, incomplete and unhelpful) I think a large number of problems fall into the following categories:
* Using apt-get dist-upgrade rather than upgrade-manager
- Could this be reduced by emphasising, in the release notes, on ubuntu.com and in the support channels, the correct way to upgrade?
- Could apt be patched to give clearer warnings that dist-upgrading could break your system, and recommend that the user run upgrade manager instead? In fact, just run update-manager when the user tries this, whilst simultaneously taking their pony away from them.
The replies in the rest of the thread never refuted the claim that upgrade-manager (I assume he means update-manager) is the “right” way to upgrade. However, at least one person replied and stated what I was thinking, namely that it’s absurd to officially discourage (or prevent!) Ubuntu users from upgrading their systems via `apt-get dist-upgrade`. I’m still digging around to see if update-manager is the official means of upgrading a system.
If some of the pain that I experienced came from doing a dist-upgrade rather than using update-manager, then someone needs to do a better job of making the community aware of how they should be upgrading their systems. I didn’t run Automatix or EasyUbuntu. I didn’t have Beryl/Compiz installed and definitely didn’t have the Beryl repositories in sources.list. And I didn’t have any binary video drivers installed which I had downloaded from the vendor. Those are the other factors usually blamed for a painful upgrade, and none of them applied to me, so it must just be my use of `apt-get dist-upgrade`. This was a pretty vanilla install I was upgrading from; I would have expected it to go more smoothly.
If you are upgrading from Dapper to Edgy, beware. I’ve read way too many tales of pain and woe in the Slashdot article referenced above to discount possible problems with this upgrade. You may want to try running update-manager with the “-c” flag (which checks whether a new distribution release is available). Actually, first back up your data. Then upgrade. Or just install from scratch.
This week on the Perl 6 mailing lists
“IMHO, @Larry got overly precise in the above S02 quote: s[More precisely] = "Usually"”
– Jonathan Lang, in ‘where constraints as roles’ (was: ‘how typish are roles’)
Eurobuddy Matt was doing a presentation when there was an API change at GoogleMaps. He’s a quick fella, but I’ve been caught with my pants down.
New presentation rule: Don’t depend on APIs not changing.
Fredrik Lundh just posted an excellent overview of the “with” statement. If you haven’t been watching, the “with” statement is new in Python 2.5.
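Fredrik’s overview covers the details; as a quick illustration of what the statement buys you, here is a minimal hand-rolled context manager (written for a modern Python; in 2.5 itself you also needed `from __future__ import with_statement`). The point is that `__exit__` runs no matter how the block terminates, even via an exception.

```python
# Minimal illustration of the "with" statement: __enter__ runs on entry,
# __exit__ runs on exit -- even if the block raises.

class Recorder(object):
    def __enter__(self):
        self.events = ["enter"]
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.events.append("exit")
        return False  # don't suppress the exception

r = Recorder()
try:
    with r as rec:
        rec.events.append("body")
        raise ValueError("boom")  # simulate a failure mid-block
except ValueError:
    pass

print(r.events)  # ['enter', 'body', 'exit'] -- __exit__ ran despite the error
```

The same guarantee is what makes `with open(path) as f:` close the file reliably, which is the use case Fredrik’s post leads with.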
Frustration 1: When attempting to `apt-get dist-upgrade`, one package (mzscheme) refused to uninstall properly. The /var/lib/dpkg/info/mzscheme.prerm failed to stop the little scheme webserver (I guess it’s a documentation server or what-not), so I had to delete that prerm script and it uninstalled fine.
Frustration 2: xorg didn’t like “ati” as my video driver. X failed to start and upon examining the X logs, it seemed to not like that particular driver. When I changed “ati” to “fglrx” in my xorg.conf, X started right up.
Frustration 3: When attempting to install from the desktop CD, I couldn’t get past partitioning my drive. This is the same laptop that I’m using right now: Windows XP on sda1, Ubuntu / on sda2, swap on sda3, and /home on sda4. I wanted to keep that layout and just format sda2 and sda3, but the installer kept giving me a “No root file system” error. Why is this not working? I guess I’ll try the alternate install CD tomorrow. I’m sure someone will point out something I’m doing that’s just plain stupid, but I’m not seeing it at the moment.
Frustration 4: I attempted an install on my son’s laptop (an antiquated Compaq Presario 1200) this evening, and it seemed like it would go well, until the install reached 64% and just sat there. Twice. I guess I’ll be trying the alternate install CD on that laptop as well.
I’ll update more later.
When a new OS distribution is released, it is often best to do a clean install and start from scratch, right? Well, apparently not.
This past weekend, I shredded my hard drive (via shred). It literally took a day.
I downloaded the DVD ISO of Fedora Core 6 at work the morning of the release without much trouble, and burned it to a DVD. When I went home, I popped the DVD into my machine and, to my dismay, it could not be read! So now what do I do?
I went to my half-broken iBook to download the CD ISOs for FC6. It took about 45 minutes to download the ISO of the first disc, and another 10 minutes to burn it onto a CD. I put the first disc into my desktop and followed the steps. Installing the general and developer workstation stuff required only the first three CDs. Okay, better than I thought.
About two hours later, I made the second and third FC6 CDs, and finished installing FC6 onto my desktop. On first boot, I saw a cpuspeed error. Worst of all, I had no networking! I rebooted my machine with an Ubuntu live CD, and there was no problem with networking. Rebooted with FC6: no networking at all. I turned off a number of services, including SELinux, and still no networking.
Alas, I decided to reinstall FC5 (overwriting FC6), which I still had the DVD for. After successful installation of FC5, I turned off all the services I deemed unnecessary, including cpuspeed. Then I ran yum, and there were problems with the repositories due to the release of FC6.
The next morning (yesterday), I re-burned the FC6 CDs. I put in the first CD and chose the upgrade option. What surprised me was that although only the first three CDs were necessary for a clean install, all five CDs were necessary for the upgrade.
The upgrade took a long time: over an hour and a half, with 30 minutes spent just checking dependencies.
This time around, everything works. No cpuspeed error at startup (because I had turned off the service in FC5). Most importantly, networking worked. But that annoying system beep! I quickly turned off the system beep under Preferences > Sound.
I noticed a slight performance improvement. Size-wise, the install is not much different from FC5 (3 to 5 GB required for the general and developer workstation stuff).
But I do have to say, in look and feel this is absolutely the prettiest Fedora Core.
Despite the one-day delay in getting hands-on with FC6, the good thing was that I didn’t have to go through the yum hell that a lot of people were chattering about.
I performed an update of the system via yum without any problems.
Next, I used yum to install all the additional packages I needed. All the packages were installed successfully.
Then I downloaded and installed some further packages, including John the Ripper and VMware. Everything installed successfully except VMware: /usr/bin/vmware-config.pl failed. Using vmware-any-any (http://ftp.cvut.cz/vmware/) didn’t help. I read that config.h is not in the kernel-devel package. Apparently, touching /usr/src/kernels/2.6.18-1.2798.fc6-i686/include/linux/ solves the problem (see http://www.vmware.com/community/thread.jspa?threadID=59513&tstart=0), but it didn’t work for me.
Finally, the goodies, namely Java and Flash. Here is a great post: http://www.linux-noob.com/forums/index.php?showtopic=2533. I decided to be adventurous and installed the media packages as well (e.g. mplayer). That document worked very well, and everything installed in a snap (many thanks).
It was surely a run-around to get Fedora Core 6 installed, but other than that, a lot of the things I have become accustomed to just work. Again, this is absolutely the prettiest Fedora Core.
Ubuntu announced today that Edgy is officially released. Desktop enhancements include:
- Tomboy, an easy-to-use and efficient note-taking tool
- F-Spot, a photo management tool that enables tagging, photo editing and automatic uploading to on-line web management sites such as Flickr
- GNOME 2.16, which in addition to new features such as enhanced power management, makes the GNOME desktop more secure and faster
- Substantially faster startup and shutdown, with eye-catching graphics
- The latest Firefox web browser, version 2.0, which offers inline spell check support in web forms, easy recovery of crashed sessions, built-in phishing detectors, enhanced search engine management with built-in OpenSearch support, and better support for previewing and subscribing to web feeds
- Proactive security features, preventing many common security vulnerabilities even before they are discovered
- Evolution 2.8.0, which brings new features such as vertical message panes
Unfortunately, I didn’t see anything about the inclusion or automatic configuration of Compiz/Beryl. I’m upgrading the laptop that I’m blogging this from as I type via `apt-get dist-upgrade`. I also plan to install from scratch on a second hard drive, a new server I’m setting up, and my son’s laptop. I’ll blog back with results of how the upgrade and fresh installs go.
There seems to be a lot of talk about the new “High Assurance SSL Certificates” to be introduced by VeriSign. IE7 will be the first browser to support this type of certificate. From my understanding, High Assurance SSL Certificates have nothing to do with better encryption, but rather with the vetting process an entity must go through before being granted the certificate.
Ingredients: 10 minutes, Google Co-op.
Fedora Core 6 extends the use of the yum package manager which was also used in Core 4 and 5. In Fedora Core 6, yum has been further integrated into the Anaconda installer, the automated update service has been rewritten, and the speed has been significantly improved.
The most visible change in the installer’s use of yum is that it now permits additional repositories to be specified at installation time. This permits updates and extra packages to be incorporated into the initial installation rather than processed as a separate, later step.
Automated yum updates are now handled by the yum-updatesd service, which can be configured to simply provide notification of available updates (through dbus, which is picked up by a desktop applet, or through e-mail), to download new packages (and, optionally, their dependencies), or to install new updates as they become available. This replaces and extends the previous yum service, making it easier and more convenient to stay on top of updates. The graphical tools pup (package updater) and pirut (a graphical front-end to yum for installing and removing packages) further extend this convenience.
And finally, some parts of yum were rewritten in C, yielding a significant speed-up. Overall, package management in Fedora Core 6 is head-and-shoulders above previous releases and I think it has (finally!) become one of the distribution’s strongest points.
Fedora Core 6 was released today! Congratulations to all of the Fedora and upstream developers. The step to Core 6 is smaller than the step to Core 5 last spring, and the distinguishing qualities of this release are generally those of refinement and polish.
Fedora Core is one of the few Linux distributions that is hardened out-of-the-box: SELinux uses the Linux Security Module (LSM) interface to enforce system-wide policy that restricts the actions of individual processes. For the past few releases, Fedora Core has shipped with a targeted policy that is designed to protect the services and processes most likely to be attacked while leaving the rest of the system relatively unhampered. Fedora Core 6 refines and extends the targeted policy to provide better protection and less interference with untargeted applications. An improved range of policy option switches (booleans) permit the policy to be adjusted without rewriting.
SELinux works well, but since applications are not aware that SELinux is restricting them, error messages can be a bit bewildering. For example, Apache may record in its error log that a particular file does not exist, when it plainly does exist and has the correct permissions; what is happening is that SELinux is blocking Apache’s access to the file. The SELinux activity is recorded in the form of access vector cache (AVC) denial entries in the system log, but these can be time-consuming to interpret.
FC6 includes the first version of a diagnostic tool, setroubleshoot, which watches for AVC denials, notifies the desktop user of their occurrence, and optionally translates the messages into plain language with recommendations for conflict resolution. This is a fantastic tool that will go a long way toward relieving the frustration of administrators and users who are unfamiliar with SELinux (and ultimately, reduce the temptation to turn SELinux off altogether).
Leopold Toetsch: prolific Perl and Parrot contributor, Parrot patchmonster pumpking, “Perl 6 and Parrot Essentials” co-author, and long-term almost-anything-related-to-computers hacker.
Now he’s here, committed to answering our questions and letting us know a bit more about his thoughts, feelings and (of course) Parrot.
Enjoy the interview!
SQLAlchemy announced yesterday that version 0.3.0 was available. I’ve been hearing about and reading references to SQLAlchemy for a while now. I just downloaded and installed SQLAlchemy and have been thumbing through the documentation. It appears to have all the power of, say, SQLObject or Django’s ORM with even greater customization functionality.
From the SQLAlchemy front page:
SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL.
It provides a full suite of well known enterprise-level persistence patterns, designed for efficient and high-performing database access, adapted into a simple and Pythonic domain language.
SQL databases behave less and less like object collections the more size and performance start to matter; object collections behave less and less like tables and rows the more abstraction starts to matter. SQLAlchemy aims to accommodate both of these principles.
SQLAlchemy doesn’t view databases as just collections of tables; it sees them as relational algebra engines. Its object relational mapper enables classes to be mapped against the database in more than one way. SQL constructs don’t select from just tables: you can also select from joins, subqueries, and unions. Thus database relationships and domain object models can be cleanly decoupled from the beginning, allowing both sides to develop to their full potential.
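As a rough sketch of the “select from joins” idea, here is what that looks like with SQLAlchemy’s SQL expression language. Note the hedges: the API below is from a modern SQLAlchemy (1.4+), not the 0.3 syntax being announced here (0.3 spelled this as `select([...])`, among other differences), and the table and column names are invented for illustration.

```python
from sqlalchemy import (Column, ForeignKey, Integer, MetaData, String,
                        Table, create_engine, insert, select)

# In-memory SQLite database for the demonstration.
engine = create_engine("sqlite://")
metadata = MetaData()

users = Table("users", metadata,
              Column("id", Integer, primary_key=True),
              Column("name", String(40)))
addresses = Table("addresses", metadata,
                  Column("id", Integer, primary_key=True),
                  Column("user_id", Integer, ForeignKey("users.id")),
                  Column("email", String(80)))
metadata.create_all(engine)

with engine.connect() as conn:
    conn.execute(insert(users), [{"id": 1, "name": "ed"}])
    conn.execute(insert(addresses),
                 [{"id": 1, "user_id": 1, "email": "ed@example.com"}])

    # The point from the quote: select from a *join*, not just a table.
    j = users.join(addresses)  # ON clause inferred from the foreign key
    rows = conn.execute(
        select(users.c.name, addresses.c.email).select_from(j)
    ).fetchall()

print([tuple(r) for r in rows])  # [('ed', 'ed@example.com')]
```

Because `j` is itself a selectable, the same construct could wrap a subquery or a union instead of a join, which is what lets the mapper map classes against things other than plain tables.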
This will be an interesting project to watch if for no other reason than the projects which are working to integrate with it (such as TurboGears and Django).
This week on the Perl 6 mailing lists
“The | notation is mentioned in S012:1029, by the way. Obviously you still haven’t quite memorized all the synopses. :-)”
– Larry Wall, in ‘class interface of roles’
According to PHP Eats Rails for Breakfast, PHP is more dominant than Python or Ruby because a survey shows that there have been more lines of code written in PHP in the 3000 surveyed projects.
Of course, if a metric as broken as SLOC really suffices for advocacy these days, it will only take a handful of web apps written in assembly, COBOL, and AppleScript to leave PHP far, far behind as the new hotness.
It was flattering to get 25 attendees to a workshop on Web 2.0 in Barcelona. The workshop had been scheduled for 11:00 in the morning on a Saturday, which is early for Catalonians. (Barcelona is a city where you can walk the streets at 11:00 at night and run into a group of 50 people out roller-blading for fun.)
I had been invited to talk about Web 2.0 at Kosmopolis, a unique conference on the arts I will expand on a bit later on in this article. I took advantage of the open-ended subject to speculate wildly about the future of the arts, and wrote an article directly aimed at the conference with the title Characteristics of new media in the Internet age (best accessed through its wiki).
Thanks for the e-mail advertising your products to help me remove viruses from my computer. I’m sure they would be very valuable. However, I think you may have made a slight mistake.
I never sent you this message containing the virus.
I apparently missed the announcement until this morning while listening to a couple of podcasts which mentioned the new beta of Flash 9 for Linux. Apparently, chromatic is less than pleased with the general “Linux” label since this release is only for 32-bit x86 Linux.
Here is a page from Penguin.swf, the blog of an Adobe employee apparently involved with the Linux port. There are links in there to the download, install, and FAQ page.
I installed it and tested it on GooTube. One of the issues with Flash 7 for Linux is the audio/video synchronization problem. I first watched the angry cat video to see if this issue has been resolved in Flash 9. It played fine. It looked and sounded like the video and audio were in sync, but with cat videos, you never know. So, I decided to check out an episode of Hope is Emo to see if there was proper synchronization. So far, it looks good.
For a general definition of blogging: You should be asking for help on puzzling problems. You should be tagging your best pictures on flickr. You should be getting recognition for your genius and hard work, as well as business, money, priceless feedback, and readers who one day may become friends. You should be sharing the things you learned so other people will learn from them and maybe teach you something new. You should be entertaining the rest of us; and we will return the favor.
Here, your audience is unbounded in space and time. Yes, your writing will last forever — get used to it.
Fedora Core 6 will be released on Tuesday. It is the latest and greatest distribution from the Fedora project, hardened out-of-the-box with SELinux and including Xen 3, the latest KDE and Gnome desktops, AIGLX eye candy (for the few cards that support it so far), a handful of new or rewritten management tools, and a much faster version of the Yum package-management system.
Another Fedora-related download is available today: my book, Fedora Linux: A Complete Guide to Red Hat’s Community Distribution, in PDF form from the O’Reilly web store. This provides a convenient way to access the content while on the go (and at a 50% discount from the book’s list price). For those who prefer paper, the print edition will be out before the end of the month.
It’s exciting to see my first book reach fruition. I wrote it as a hands-on guide to the complete Fedora system using the same lab-based approach as Chris Brown’s SUSE Linux, and made it the type of book that I wished I had when I was coming up to speed on Linux. I hope it will prove valuable to a wide range of Fedora users.
Reg Braithwaite comments on the irony of justifying programmer baby-talk with a quote from SICP. Unless you’re already a lambda-calculus guru, Structure and Interpretation of Computer Programs (that’s SICP) will take some work to get through. Yet after demonstrating that you can perform all necessary operations (including fundamentals such as addition and counting) through recursion as well as explaining macros and continuations, the authors still suggest that the primary purpose of source code is to communicate ideas to other programmers.
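The SICP claim sounds abstract, so here is a toy rendering of it in Python rather than the book’s Scheme: addition and counting defined purely by recursion on successor and predecessor operations, Peano-style. This is an illustrative sketch, not production code.

```python
# Peano-style arithmetic: everything reduces to recursion plus a
# successor/predecessor pair.

def succ(n):
    return n + 1

def pred(n):
    return n - 1

def add(a, b):
    # add(a, b): move one unit from b to a until b is exhausted.
    if b == 0:
        return a
    return add(succ(a), pred(b))

def count(items):
    # Length of a list by recursion alone: one successor per element.
    if not items:
        return 0
    return succ(count(items[1:]))

print(add(3, 4))         # 7
print(count([1, 2, 3]))  # 3
```

Once you have internalized that even addition is "just" recursion, the authors' point lands harder: none of this machinery matters if the resulting program fails to communicate its idea to the next reader.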
Perhaps software development may finally begin to improve if we can kill the idea that ignorant and inexperienced novices ought to be able to maintain code that matters.
“Customer Service” has become an oxymoron.
I’ve created su.pport.us as a way for customers to help customers. Computers, gadgets & digital anything were supposed to make our lives easier. Well, that might be the case once the damn things work.
Problem is: they almost never do, and the companies providing them saddle you with the burden of getting them to. And how? By calling script-wielding, underpaid, frayed-nerved phone support, or by endless googling across blogs.
This week on the Perl 6 mailing lists
“The whole point of reserving these namespaces is not to prevent users from misusing them, but to ensure that when we eventually get around to using a particular block name, and those same users start screaming about it, we can mournfully point to the passage in the original spec and silently shake our heads. ;-)”
– Damian Conway, on POD specifications
The YouTube purchase by Google received almost equal portions of praise and derision. The praise was driven by a belief that the purchase (along with Yahoo!’s purchase of Flickr) validated the benefits and power of free-ranging user-generated content. The derision (aside from claims that Google paid too much) was driven by the expectation that Google would drown under a flood of copyright infringement claims.
Maybe it’s time to look again at the premises of the peer-to-peer movement earlier in this decade. Before it became commonplace for sites to allow uploads of anything from communities–even commentary on articles–people were expected to keep control of their content. All kinds of collaborative filtering and search tools were tried out in the mission of helping people find each other’s content.
OK, this is going to be mostly off-topic for me, but hopefully it’ll get a useful point across.
When I arrived at my house last night after the Python meetup, my wife informed me that our washing machine was no longer working. She said that it wasn’t spinning and was kind of making a grinding sound. Nice. Not exactly the thing you look forward to hearing when you walk into your home at almost 10PM. I listened to it and validated that my wife’s auditory assessment was indeed correct.
She found the owner’s manual for the washing machine and I flipped to the troubleshooting section. The manual provided little help beyond making sure the machine was plugged in and the lid was closed. Since it was late, I went to bed with the intentions of waking up early to see if I could figure out what was wrong with it.
First thing this morning, I found an appliance repair website that had a decent troubleshooting section accompanied with detailed instructions for getting at the part which may be faulty. I started dismantling the washing machine per the instructions on the website. Each step was spot on. In 20 minutes, I had sufficiently dismantled the washing machine and located the faulty part. For anyone interested, the coupler opposite the motor was split in half. This explained the grinding sound and the failure to turn the tub or the agitator.
The point is that good documentation can be invaluable. This experience translates directly to having good books on hand for programming, sysadmin, and troubleshooting tasks. In that vein, I can highly recommend Safari. There have been times when I just needed to look something up quickly, didn’t want to buy the book, and didn’t have time to go to the bookstore, and Safari came through nicely.
I know that this is mostly off-topic and fairly obvious. But I think it’s an experience (and an application of the experience) worth sharing.
Reviewing software for security bugs is a highly recommended best practice. There are various techniques for doing source code reviews, one of them being “static code analysis”, which (in most cases) involves the use of a ‘grepping’ (pattern matching) tool along with a database of patterns that indicate potential security flaws. Static code analysis has disadvantages: a high rate of false positives, and an inability to detect logic errors that may lead to security bugs. That said, static code analysis tools can be used to perform a quick first pass on the source code to detect bugs that are easily identified by a grepping technique (“low-hanging fruit”). Some of the free static code analyzers for security are Flawfinder, RATS, and SWAAT.
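To make the "grepping" idea concrete, here is a toy pattern-matching scanner in Python. The pattern list is a tiny invented sample (real tools like Flawfinder ship databases of hundreds of ranked rules), and the example deliberately exhibits the false-positive problem mentioned above: it flags `gets` even when it only appears inside a comment.

```python
# A toy "grepping" static analyzer: scan C source line by line against a
# small database of patterns for risky calls.
import re

# (pattern, why it is suspicious) -- an illustrative sample, not a real rule set
RISKY_CALLS = [
    (r"\bgets\s*\(",    "gets() cannot bound its input (buffer overflow)"),
    (r"\bstrcpy\s*\(",  "strcpy() does no bounds checking"),
    (r"\bsprintf\s*\(", "sprintf() can overflow its buffer"),
]

def scan(source):
    """Return (line_number, reason) for every line matching a risky pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), 1):
        for pattern, reason in RISKY_CALLS:
            if re.search(pattern, line):
                findings.append((lineno, reason))
    return findings

c_code = '''\
#include <stdio.h>
int main(void) {
    char buf[16];
    gets(buf);            /* flagged: genuine low-hanging fruit */
    /* "never call gets()" -- flagged too: a false positive */
    return 0;
}
'''

for lineno, reason in scan(c_code):
    print(lineno, reason)
```

Line 4 is a real finding; line 5 shows why grepping alone produces noise: the tool has no idea it is looking at a comment, let alone at the program's logic.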
I’m a little slow again on happenings in tech-land, but about a month ago the open source project Compiz saw an amicably intentioned fork. The new project is named Beryl. Quinn Storm had been working on and managing a set of patches to Compiz for a while, which seemed to become the Ubuntu forums community standard for managing compositing eye candy. Quinn stated reasons for the fork on September 15, 2006, and on the 18th, Quinn and others formally announced Beryl. Everything that I’ve read seems fairly friendly from Beryl toward Compiz. I’m not even going to comment on whether this is a well-founded split or not; I really don’t know. Solerman Kaplon brought up an interesting question as a follow-up to Quinn’s reasons for the fork: wasn’t the main Compiz branch indeed attempting to prepare the code base to easily and properly accommodate Quinn’s (and other community members’) changes?
Like I said, I don’t know if this was a “founded” fork or not (whatever that means; all one really needs for a founded reason to fork an open source project is desire and ability). I do know that I personally hate seeing forks like this. The Compiz folks have apparently put a lot of effort into bringing the project up to where it is. If Beryl becomes the community choice, I’d hate to see the Compiz folks’ future work and talent go unutilized. If the community chooses Compiz, I’d hate to see Quinn’s excellent additions discarded.
Regardless of the current “winner” of the compositing market, these projects demonstrate the intense community interest in getting eye candy with an amazing wow factor onto the Linux desktop. If you haven’t seen a demonstration of Compiz/Beryl, just take a peek at this video. Personally, I find this more appealing than anything I’ve seen on Vista or Mac. Yes, this is an opinion. Yes, you may think differently. No, I won’t argue with you.
So, what does the future hold for eye candy on the Linux desktop? The momentum behind Compiz/Beryl and Xgl/AIGLX is putting Linux on par with the visual effects of any other platform, and I think we can only expect more. Mac has clearly been driving aesthetic appeal and visual effects on the desktop for a while; Linux can now at least contend. And with the talent that has been emerging in this area in the open source realm, Linux may even begin taking a lead role in what is expected of desktop appearance in general. Microsoft and Apple can ignore the visual effects of the Linux desktop for a while, but it won’t be long before the aesthetic appeal of desktop Linux begins to turn the heads of average end users. I’m interested to see how they answer.
And then there’s the issue of the usability of the Linux desktop. And that’s a matter for another day.
I just love challenge-response mail confirmation systems. “Hi, I get a lot of spam and someone sent me an e-mail pretending to be you. Would you mind filtering my spam for me? It only takes a minute, and if someone’s forging your address on spam, I’ll totally let it through if you simply respond to this message!”
Where do you think interactive, digitized arts are going? How will they change our relations to each other and the social settings we live in? Read my article, and then fix and enhance it in the wiki version that’s at O’Reilly Media’s recently unveiled WikiContent community site. I plan to leave the original article up for curiosity’s sake and to update the wiki with the help of sympathetic readers everywhere.
This week on the Perl 6 mailing lists
“When I first read ‘Warnock applies’ on things in p6 summaries a year or so ago, I thought it was some really energetic programmer who went around and applied patches as soon as people posed a question.”
– Carl Mäsak, on Warnock’s Dilemma
While listening to Perlcast’s interview with Pragmatic Andy Hunt, Andy said “Bugs tend to clump together.” I’ve said that many times myself — it seems to be true — but I never asked why.
I blogged yesterday about Python and Haskell making you a worse programmer. A reader of that entry identifying (him|her)self as Reedo graciously posted a link to this Mark Dominus blog post, which talks about patterns driving the evolution of programming languages. The title of this post (“Design Patterns are Signs of Weakness in Programming Languages”) comes almost directly from Mark’s blog. Here is his summary:
Patterns are signs of weakness in programming languages.
When we identify and document one, that should not be the end of the story. Rather, we should have the long-term goal of trying to understand how to improve the language so that the pattern becomes invisible or unnecessary.
This was a fantastic article, well worth reading. Nothing in it failed my “smell test” on an initial read. The only thing that struck me as almost not right is that Rails and Subway aren’t programming languages (he calls them programming systems, which is an acceptable title for them), so the codification and integration of MVC that he talks about doesn’t quite fit, since it’s not taking place in the language proper. Or perhaps, in sufficiently high-level languages such as Python and Ruby, one proper place for such codification and integration is in libraries and frameworks?
I’m still mulling over the whole thing. But it’s an interesting thought that if you find yourself repeatedly using a pattern, your language is broken in that regard.
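For one concrete rendering of that thought, compare the classic Iterator pattern written out by hand with the same behavior once the language absorbs the pattern (Python generators). The countdown example below is my own invention for illustration, not one of Dominus's:

```python
# The Iterator pattern, spelled out by hand: the bookkeeping that a
# pattern-less language forces you to repeat for every traversal.
class CountdownIterator(object):
    def __init__(self, n):
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        self.n -= 1
        return self.n + 1

# The same thing after the language absorbs the pattern: a generator.
# The "pattern" has become invisible -- it is just control flow.
def countdown(n):
    while n > 0:
        yield n
        n -= 1

print(list(CountdownIterator(3)))  # [3, 2, 1]
print(list(countdown(3)))          # [3, 2, 1]
```

Both behave identically, but the generator version has nothing left to document as a "pattern", which is exactly Dominus's long-term goal for language design.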
While so much of the technical weblog universe is going through towels mopping up its drool over the rose-tinted (“how many more thousand developers will it take to get a fourth project out of beta?”) Google Software Development Process, Aaron Korver wrote an insightful piece a couple of weeks ago on the potential downsides of agile development.
If you still think XP and agile mean that some stinky fat man is chained up behind you, getting greasy potato chip crumbs all over your jumpsuit while leaping up gleefully to point out a missing semicolon once every twenty minutes or so, you really ought to read Korver’s thoughts. Any measurable improvement in productivity will take work, application, and dedication. A sober consideration of the potential benefits and drawbacks can only help.
I stumbled across this blog post the other day on, once again, Ned Batchelder’s blog. It’s an interesting rant about how learning Python and Haskell has proved more frustrating than liberating, because the author isn’t able to use Python or Haskell at work. When faced with a task at work which has to be done in C#, the author states that he thinks in Python (or Haskell) and then has to translate it to C#.
The post came across as more of a “Why I’m More Frustrated by Knowing Python and Haskell” than “Why Python and Haskell Make You a Worse Programmer”. The frustration that this author describes is real. I’ve felt it many days when firing up Visual Studio to work on my at-work C# project. I would definitely prefer to code in Python than C#. And if I knew Haskell, I’d probably be saying the same about Haskell.
While most of what the author of the cited post was getting at was tongue-in-cheek rant, he alluded to a potentially legitimate productivity penalty for knowing Python/Haskell. Many times, he would code up something in Python or Haskell to prove to himself that he could, in fact, accomplish the needed task more quickly and simply in his preferred language. Even given the time “wasted” in such an exercise, I don’t consider his broader perspective necessarily a productivity hindrance. It would be extremely difficult to prove that writing a throwaway implementation in the non-target language, followed by a second implementation in the target language, really costs more in the end than writing it only once in the target language. Is the second implementation more maintainable as a result of having written the throwaway? Did “practicing” with the non-target language help solve some specific problem more quickly later down the road? Does “playing” with the non-target language increase a programmer’s mental acuteness and clarity and allow easier, quicker thinking when coding in the target language? I’m not asserting that this is the case; I’m just speculating. But I’m speculating that it’s likely the case. Specifically, I’m speculating that a broader perspective and regular exercise with a variety of technologies is beneficial in the long run.
Ned, in his typically Neddish manner, provided a number of excellent suggestions to alleviate said C#-frustration. Definitely check out Ned’s blog (linked to above). He doesn’t say, “Try to forget Python and Haskell.” He seems to be saying to embrace them, but sometimes work is, well, work. My conclusion is that if you know a language like Python or Haskell (or Perl or Ruby) and have to work on a language like C# or Java or C++, you’re probably going to be frustrated. But keep yourself well rounded. Keep working in your language of choice when you can, whether at work or home. Try to do things idiomatically within the limits of the language you’re working in. Don’t try to force fit ideas from your favorite language into a less favorite language. And read Ned’s blog. (Did I say that already? :-) He has some more tips on lowering the frustration level.
Luis Villa revisited criticisms of OpenOffice.org after two years, and recently posted What OpenOffice.org is Still Doing Wrong. His thesis is that Firefox, rising from the ashes of Mozilla (okay, maybe not phrased that strongly, but I like the image), represents a better strategy for making usable and easily adoptable software.
Of course, feature-for-feature compatibility seems like an obvious goal, but reducing the technical debt of a project almost always pays off.
(paraphrased as earnestly as I can remember)
He had me. I don’t know much about multi-threading. What is the real difference, if any? Is my industry part of a massive farce?
This week on the Perl 6 mailing lists
“And here I thought you were a responsible, law-abiding citizen… :P “
– Jonathan Lang, commenting on Larry Wall’s `self.HOW does Hash`