September 2002 Archives
Maybe it’s nostalgia. The games I remember from junior high and high school (Impossible Mission, Wizardry, Ultima, Echelon, the Gold Box series) still seem pretty compelling. I never finished Sam & Max, for one, and I’d replay Day of the Tentacle if I could figure out where that CD went. I never played Elite, though the idea really appeals to me.
I always wanted a version of X-Wing that would acknowledge that I’d destroyed the Star Destroyer in a previous mission….
Maybe it’s too difficult to reimplement some of the software we converts miss. Not everything was built on a virtual machine that can be reverse engineered and ported. Still, I think there’s room for a silver age. Given what a Richard Garriott or a David W. Bradley had and what they produced, why shouldn’t an enterprising game hacker or two come up with something that satisfies that nostalgia but uses modern tools?
Falcon’s Eye, Tux Racer, Civil… What’s new in Open Source game development?
Related link: http://www.kroupware.org/concept-1.0.1/index.html
The German Federal Agency for IT Security has commissioned an open-source groupware client and server to provide email, contact management, appointment tracking, and task lists.
What are appropriate roles for government in software purchasing and software licensing endorsement? On one level, funding development of custom software (open-source or not) may make economic sense independent of ideology. The cost of the development can be weighed against the cost of purchasing the required number of seats of an existing commercial product. (With support and upgrade costs factored into the mix for both.)
But does a government’s responsibility go further than just a cost-benefit analysis? When a government is paying for software, that money is (usually) coming from its citizens. Should those citizens expect to get something more for their dollars or Deutschmarks than just the ability of government administrators to process words and crunch numbers? They should expect the government-funded software to be available for public use under open-source licensing schemes.
In the United States, drugs that result from federally funded pharmaceutical research must be licensed to the Federal Government (for use in public assistance programs like Medicare) at more favorable terms than what’s available to private companies. Something similar can happen for software research as well.
Commercial developers argue that this is a government subsidy for open-source developers. Perhaps, if intellectual property laws can be construed as a subsidy for closed-source software developers. Open source software, and especially government funded open-source development, isn’t appropriate in all circumstances. But, if you believe that it’s the government’s business to create and nurture resources for the common benefit of its citizens, perhaps funding infrastructural open-source software products should be one of those resources.
What should a government’s role be in open-source software development?
Related link: http://www.infosync.no/news/2002/n/2248.html
Witness the humorous result of a tech writer unintentionally getting the wrong name for the right technology, right around the fourth paragraph of this article. Please pardon the possibly shameless self-promotion implicit in my pointing out the error — I guess I’m just amazed that we seem to have made that much of a splash.
Related link: http://dialspace.dial.pipex.com/town/estate/aax20/green/
Some geeks brought together two of my favorite topics — wireless and sustainability — at this year’s Big Green Gathering in Somerset, UK, with an environmentally-friendly wireless network. Definitely a cute proof-of-concept, and the organizers might just have hit on a way to get all those overweight computer geeks into shape: If you want the laptop to stay powered up, keep pedalling!
It started off, as most threads do, as a fairly innocuous question; I was doing some research for a book I’m working on (more on that soon…) and needed to know about one aspect of Perl 6 regular expression syntax.
It quickly headed off at a tangent, until it was about tuple syntax. Perl has typically been very good at doing what you mean, but that can cause some confusion. Someone noticed that when lists return their length in numeric context, weird things can happen:
(7,8,9) == 3   # true
(7,8)   == 2   # true
(7)     == 1   # false
()      == 0   # true?
Chip Salzenberg quickly pointed out what was wrong with this: Perl 6 was subconsciously appropriating Python’s tuple syntax without its cleanliness of explicit interfaces.
Someone else naturally suggested that adding even more magic would fix the problem. I found myself longing for @foo.length to do the right thing and for 3 + @foo to be a syntax error.
That’s basically when I realised the pressures of a language designer: lists are the second most fundamental data structure in the language after the scalar, and we’re already proposing adding inscrutable levels of complexity, context-sensitivity and magic to them. This has to be curbed.
But what I was proposing as a solution wasn’t Perl; it was Python. Perl has mutable types and context-sensitivity, and that’s all there is to it; take that away and you’ve got something that isn’t Perl. All these things, however magical and funny they may seem, have been part of Perl since time immemorial - much like when analysing human languages, it’s only when we stop to formalise them that we see the edge cases and the places that the language “doesn’t work”.
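To see the contrast concretely, here is a quick sketch - purely illustrative, and not something from the actual thread - of the explicit-interface style in Python, where a list never silently becomes its length:

# Illustrative Python only, not from the perl6-language thread.
foo = [7, 8, 9]
print(len(foo) == 3)   # True: the length is something you ask for
print(foo == 3)        # False: a list never compares equal to an integer
# foo + 3 raises a TypeError rather than quietly doing arithmetic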
Larry’s job is to walk the tightrope between formalisation and approachability, between clarity and ease of use, between regularity and informality. It’s a hard job, and I’m sure he’s thankful for his able assistants; but restructuring and redefining a language that’s prided itself on its unstructured and undefinable nature isn’t the easiest job in the world. No wonder, you might say, the design is taking so long.
How do you think Perl 6 can maintain the approachability of previous versions while being more regular and formalised?
Is it even possible?
Related link: http://www.wired.com/wired/archive/10.10/wireless.html
Negroponte adopts a bold view of technological and business evolution in telecom. As with any prediction that heralds major changes, this prediction is hard to evaluate objectively. But if you listen to the experts in telecom and read the research of the past few years, you find a consensus building among people who are taking a long view of communications, and it has a lot in common with Negroponte’s vision. What’s not in this article is the valuable role fiber to the home can play.
What could bring together the Electronic Frontier Foundation, the MIT
Media Lab, the American Civil Liberties Union, and a school
principals’ association? Answer: a press conference I attended today
on Internet filtering and the Children’s Internet Protection Act.
Very few people showed up, which was demoralizing but not surprising.
Putatively, there was no news hook to draw the journalists. But we
did have a new study by the EFF and the Online Policy Group that found:
- Schools that implement Internet blocking software with the least restrictive settings will block between 0.5% and 5% of search results based on state-mandated curriculum topics.
- Schools that implement Internet blocking software with the most restrictive settings will block up to 70% of search results based on state-mandated curriculum topics.
If that’s not a scandal to rock our educational establishment, what
is? I proposed that somebody organize a student boycott on this
issue, using a recent protest against standardized testing as an example.
Last year, in Massachusetts, a number of high school students
organized themselves and refused to take the mandatory
tests. (Several in my home town of Arlington were suspended for this
act of civil disobedience.) I don’t want to argue the potential good
or ill of standardized tests here, but just point out that the protest
got media coverage and attention from school boards.
In the same way, students could refuse to turn in a homework
assignment and cite lack of access to the necessary materials. Those
with Internet access in the home could point to the discrimination
created against those who rely on school or public library computers.
But this pie-in-the-sky suggestion shows how resistant the
moralist-industrial complex is to protests. Those of us who understand
what filters do and what is wrong with them almost always have
uncensored access and can treat the issue as academic (in many senses
of the word). I myself, after writing about censorware
over a period of six years, stopped because I felt I had
written everything worth saying on the subject.
(For those who want something that goes beyond the usual diatribes,
try my articles
Why I Do Not Install Filters On My Children’s Computer and
Nazis, Neos, and Other Nasties On the Net.)
Few of the people directly harmed by the censorship are like Bennett
Haselton, who started the Peacefire
anti-censorship site as a high-school student.
Students don’t know what they’re missing, and to ice the cake,
many censorware products block sites like Peacefire that offer the
truth about those censorware products.
Similarly, librarians have no clout with anyone in power, and teachers
have more pressing issues on which to expend their limited political
capital. Thus, Internet filtering resembles welfare cuts and police
dragnets in housing projects, in that the people who suffer from them
either don’t know how to protest or face nothing but further danger if they do.
A teacher and a principal came to speak at today’s press conference,
but they offered only modest critiques of filtering from the
standpoint of its operation. It was good for us all to come together,
because those of us with technical or civil liberties backgrounds
could offer the teachers other angles and hopefully lead them to
express stronger opposition to the filters on principled grounds.
And it was also good to let the teachers explain their concerns to
us–for instance, the fear that some elementary student will go home
someday and tell her parents she saw something inappropriate on a
school computer. Teachers can easily be driven to hide (ineffectively)
behind censorware by the fear of having some rabid control freaks like
the Parents Rights Coalition descend on them and tear apart all the
good work the schools try to do to foster open discussion.
One of the best spokesmen concerning censorware is Seth Finkelstein,
who won the 2001 EFF Pioneer Award
for deciphering several filtering programs. Seth is a crackerjack
programmer who ought to be earning six figures somewhere. But the
modest publicity he got for the EFF Award did not translate into job
prospects, and he can’t publish much of his research because he’ll be sued
by censorware companies angry at having their operations revealed.
We discussed the World Wide Web Consortium’s
PICS specification, which has stagnated after a flurry
of commercial interest, but which is still very much on
the minds of European governments, who hope to turn it
into a mechanism for localized government control over content.
Seth wisely avoids talking to people about their values. Instead, he
talks about the architectural implications of censorware. He
criticizes the use of the term “filter.” Its associations of
cleanliness and protectiveness hide the architectural truth, which is
this: no one can design a system that “protects” kiddies from erotic
content without providing China with a system that “protects” its citizens from
news, and providing Saudi Arabia with a system that “protects” its
women from sites about women’s rights.
He also points out that the Internet is young and has been in the
public eye for only four or five years. It took the government much
longer to regulate radio and other new technologies. We have no idea
what regulation may ultimately settle in place for the Internet–but
we’d better be debating it.
Thus, censorware is part of a much larger issue of architecture and
control, the issue about which Lawrence Lessig has been urging us all
to get active. That’s why the journalists should come to the next
press conference about censorware. And perhaps why I still haven’t
written everything about it worth saying.
Related link: http://www.osdn.com/bcg/
At OSCON 2002, I popped my head into the Boston Consulting/OSDN Developer Survey talk, partly to see my buddy Jeff Bates and partly because I’m curious as to the survey results. I’m trying not to fool myself that my readers are the same people, or that my developer peers all fall into the same groups.
Still, it’s interesting to see a few strong trends!
- Development is seen as a creative endeavor (slide 10). That’s probably one good reason it’s a hobby…
- When it’s not a hobby, there are efficacious reasons for development. (slide 13)
- Either way, the challenge of coding and the learning opportunities it affords are powerful motivators. (slide 13)
- Hobbyists and learners make up a big percentage of the responses. (slides 14 and 15)
- The pursuit of knowledge is an important goal. (slide 16)
- Professionals and students make up most of those surveyed. (slide 24)
- There are often internal motivations to continue developing. (slides 32 and 33)
Why do I find this so interesting? Maybe the biggest benefit is being able to identify four reasonably distinct sub-audiences. If there’s reason to believe that the audience here is similar (and that’s just an assumption!), it means there are good reasons to think about my responsibilities to each group.
Of course, this is a small survey, primarily of developers, and it had some degree of self-selection. Still, it’s useful data, and it’s given me quite a lot to think about for nearly two months.
What kinds of conclusions do you reach from this survey?
June and July were bad months for free software. First the Apache
chunked encoding vulnerability, and just when we’d finished
patching that, we got the OpenSSH hole
(http://www.openssh.com/txt/preauth.adv). Both
of these were pretty scary - the first made every single webserver
potentially exploitable, and the second made every remotely managed
machine vulnerable, too.
But we survived that, only to be hit just days later with the next set of
problems. Would it ever end? Well, there was a brief respite, but
then, at the end of July, we had the OpenSSL buffer overflows.
All of these were pretty agonising, but it seems we got through it
mostly unscathed, by releasing patches widely as soon as possible. Of
course, this is painful for users and vendors alike, having to
scramble to patch systems before exploits become available. I know
that pain only too well: at The
Bunker we had to use every available sysadmin for days on end to
fix the problems, which seemed to be arriving before we’d had time to
catch our breath from the previous one.
But I also know the pain suffered by the discoverer of such problems,
so I thought I’d tell you a bit about that. First, I was involved in the
Apache chunked encoding problem. That was pretty straightforward,
because the vulnerability was released without any consultation with
the Apache Software Foundation, a move I consider most ill-advised,
but it did at least simplify our options: we had to get a patch out as
fast as possible. Even so, we thought we could take a little bit of
time to produce a fix, since all we were looking at was a denial of
service attack, and, let’s face it, Apache doesn’t need bugs to suffer
denial of service - all this did was make it a little cheaper for the
attacker to consume your resources.
That is, until Gobbles came out with the exploit
for the problem. Now, this really is the worst possible position to be
in. Not only is there an exploitable problem, but the first you know
of it is when you see the exploit code. Then we really had to
scramble. First we had to figure out how the exploit worked. I figured
that out by attacking myself and running Apache under gdb. I have to
say that the attack was rather marvellously cunning, and for a while I
forgot the urgency of the problem while I unravelled its inner
workings (http://online.securityfocus.com/archive/1/278270/2002-06-18/2002-06-24/0).
Having worked that out, we were in a position to finally
fix the problem, and also, perhaps more importantly, more generically
prevent the problem from occurring again through a different
route. Once we had done that, it was just a matter of writing the advisory (http://httpd.apache.org/info/security_bulletin_20020617.txt),
releasing the patches, and posting the advisory to the usual places.
The OpenSSL problems were a rather different story. I found these whilst
working on a security review of OpenSSL (http://www.openssl.org/)
commissioned by DARPA and
the USAF. OpenSSL is a rather large and messy piece of code that I
had, until DARPA funded it, hesitated to do a security review of,
partly because it was a big job, but also partly because I was
sure I was going to find stuff. And sure enough, I found
problems (yes, I know this flies in the face of conventional wisdom -
many eyes may be a good thing, but most of those eyes are not trained
observers, and the ones that are do not necessarily have the time or
energy to check the code in the detail that is required). Not as many
as I expected, but then, I haven’t finished yet (and perhaps I never
will, it does seem to be a never-ending process). Having found some
problems, which were definitely exploitable, I was then faced with an
agonising decision: release them, and run the risk that I would find
more, and force the world to go through the process of upgrading
again, or sit on them until I’d finished, and run the risk that
someone else would discover them and exploit them?
In fact, I dithered on this question for at least a month - then
one of the problems I’d found was fixed in the development version
without even being noted as a security fix, and another was reported
as a bug. I decided life was getting too dangerous, so I chose to
release the advisory, complete or not. Now, you might think that not
being under huge time pressure is a good thing, but in some ways it is
not. The first problem came because various other members of the team
thought I should involve various other security alerting
mechanisms. For example, CERT or a
mailing list operated by most of the free OS vendors. But there’s a
problem with this: CERT’s process is slow and cumbersome and I was
already nervous about delay. Vendor security lists are also dangerous
because you can’t really be sure who is reading them and what their
real interests are. And, more deeply, I have to wonder why vendors
should have the benefit of early notification, when it is my view that
they should arrange things so that their users could use my patches as
easily as I can? I build almost everything from original source, so
patches tend to be very easy to use. RPMs and ports make this harder,
and vendors who release no source at all clearly completely screw up
their customers. Why should I help people who are getting in the way
of the people who matter (i.e. the users of the software)?
Then, to make matters worse, one of the more serious problems was
reported independently to the OpenSSL team by CERT, who had been
alerted separately. Then there was Defcon: I was going, and there
was no way I was delaying release of the patches until after
Defcon. So, the day before I got on a plane, I finally released the
advisory. And the rest is history.
So, what’s the point of all this? Well, the point is this: it
was a complete waste of time. I needn’t have agonised over CERT or
delay or any of the rest of it. Because half the world didn’t do a
damn thing about the fact they were vulnerable, and because of that,
as of yesterday, a worm (http://online.securityfocus.com/archive/1/291782/2002-09-08/2002-09-14/0)
is spreading through the ‘net like wildfire.
Why do I bother?
Your comments are welcome.
How do we fix this problem?
A university student from Australia asked me to comment on Internet
censorship, with a focus on the impact of broadband. Of all the
countries normally counted in the “Western, democratic”
civilization-mush, Australia has been the most intrusive in passing
laws requiring Internet sites to register with a government body,
conform to decency laws, etc. Specifically, the correspondent asked:
With the introduction of broadband and the increased capabilities it
provides, will current Australian censorship laws be of any noticeable impact?
My answer follows.
I would like to start with a story that is not quite
on-topic, but which may provide an interesting perspective
and which you can share in your report if you like.
It became front-page news in the Boston newspapers last
Thursday when the well-known and very opinionated chancellor
of Boston University, John Silber, decided without
discussion to revoke the charter of a Gay/Straight Alliance
at a high school that was operating on the campus. He said
that a group discussing sexuality has no place in a school
with students as young as eighth grade.
As a parent of a high school student in this academy, I
attended an informational session a couple nights ago where
we learned that:
- Members of the Gay/Straight Alliance were still meeting, were still using school facilities, and were still being advised by faculty members, all under the approval of Dr. Silber. The only difference now was that the group didn’t exist officially (which rules out certain formal events and school funding, but these are minor effects).
- Like most schools, the high school has mandatory sex education for ninth graders, which Dr. Silber knows about.
- The high school’s headmaster and faculty are organizing school meetings to discuss many social and political issues, including the roles of gay/lesbian/bisexual people (the subject of the banned Gay/Straight Alliance).
So what has changed? A few formal restrictions. The whole
thing does not seem to merit the media hoopla that
Dr. Silber caused. We do not know why he did it, but it
might well seem to the casual observer that Dr. Silber
revoked the charter deliberately to cause media hoopla. And
that he chose an action that would have minimal actual
impact on students or the school. One could speculate
further that he used the small area over which he holds
dominion in order to publicize his conservative views, which
cover the supposed use of inappropriate sexual images.
A lot of speculation there. But one could apply similar
lessons to your first question. Any “noticeable impact”? No,
I don’t think the laws will have any impact. The various
laws passed in the U.S. along these lines (from the
Communications Decency Act onward) would be only slightly
more effective, had they been upheld in court. Thus, one can
reasonably treat the laws as mere declarations that the
supporters don’t like what’s happening to society.
But what is their alternative? If there is no talk about
sexual functions, does that mean women should silently live
with date rape, that gays should furtively meet in
bathrooms, that teenagers should have unprotected sex? If
there is no public airing of racist views, won’t the
proponents simply continue to recruit by word of mouth,
prettify their message with euphemisms, and exploit their
sense of persecution to bolster their forces? (See my article
Nazis, Neos, and Other Nasties On the Net for a fuller explanation.) These are the
consequences of a ban, but would-be moralists refuse to
accept their role in them.
There should also be no illusion that censorship will stop
with new media like the Internet. Right-wingers just pick on
it because the businesses depending on it have less clout
than other media. The movie and music industry, to their
credit, have proven very dedicated and resolute during
recent years in rejecting regulation of their content. But
they are in the cross-hairs of people who want to rein them in.
The Communications Decency Act was passed in 1996, and
that’s already a different era in terms of Internet use.
Censorship of the Internet becomes more difficult as it
reaches more people in every company and becomes more a
fixture in their lives. And that is where your question
about broadband becomes relevant.
The issues of what content passes over the Internet remain
pretty much the same in a broadband era, but in the new era
people keep their connections on all the time, use the
Internet for more communications, and download larger files.
In other words, whatever they do that you like, and whatever
they do that you don’t like, will both increase.
This might lead to a small increase in downloading erotic
pictures, but the biggest impact we’ve seen so far is to
make the unauthorized exchange of copyrighted material
(music, movies, and software) an even bigger social issue
than the exchange of socially offensive material. And both
Congress and the courts have succeeded in imposing
restrictions on Internet use in this area, where they have
not done so where erotic content and hate speech are
involved–a rather perverse result in itself, as Prof.
Lawrence Lessig has often pointed out.
Broadband doesn’t allow anything new that wasn’t done
before. Nor does it make it easier for people to elude
detection, whether they’re sharing music files or child
pornography. But it simply makes more people interested in
exchanging copyrighted material, so it brings these
possibilities to the attention of the companies affected and
creates in them a sense of panic that leads to legal
action. Ironically, then, the forces most successful at
holding back censorship when it affects their content are
also most rabid in demanding censorship when it affects their copyrights.
What lies behind attempts at Internet censorship in democratic countries?
I’ve spent a good portion of the last week down at the Alameda County Computer Resource Center, working with activists from the Independent Media Center and the Ruckus Society to assemble 250-plus PCs, install Linux on them, and ship them to activists and community centers in Ecuador. A similar concurrent effort has also been underway in Portland at FreeGeek. The motivation for this project comes primarily from our connections with Indymedia groups in South America, who frequently find themselves short of the tools they need to provide independent coverage of local and national news stories, like the upcoming FTAA summit in Quito. But, in a broader way, we are trying to find working models that allow us to turn the Bay Area’s leftover junk — in this case, old PCs, monitors, peripherals, and so on — into valuable resources for people in less economically advantaged regions.
Enter the ACCRC, a non-profit computer recycling organization based in a large warehouse in urban Oakland. With ten full-time staffers and countless volunteers, they’ve been collecting leftover computer hardware from all over the Bay Area for several years. Volunteers, often unskilled, are then given the opportunity to learn how to assemble working machines and how to get SuSE Linux installed on them. Finally, the refurbished Linux boxes are given away to deserving individuals, schools, and other organizations across the US and the world, instead of being left to fill up California’s landfills and pollute local groundwater with heavy metals. Pretty neat, huh? Their 28,000-square-foot warehouse is a computer archaeologist’s daydream, and one look at the place had us fairly salivating to get started. The ACCRC has been gracious enough to let us use their facilities and have our pick of their hardware bounty, for which we are immeasurably grateful.
Of course, none of this would be possible without Linux. The freedom of Linux — both in the libre and gratis senses — enables us to build a custom system that will run easily on old machines that won’t even install the latest version of Windows anymore. (And just how much would 250-plus licenses for the latest version of Windows cost, exactly?) Starting with a modified Debian distribution developed by the fine folks at FreeGeek — Portland’s answer to the ACCRC, if you will — we’ve put together a lightweight and easy-to-use desktop environment. IceWM was selected as the window manager for its clean, simple interface, and its familiar taskbar and program menu. The undeservedly little-known ROX desktop was chosen for its super-lightweight desktop management and file browser. Finally, KDE provides the web browser, the word processor, the e-mail client, the PPP connection manager, and a whole bunch of other goodies. Of course, there’s Spanish language support for everything. A candidate machine is then booted via a floppy whose sole purpose is to mount a root filesystem over NFS, a few simple configuration options are selected, and, within half an hour, we have a finished Linux box ready to ship to Ecuador. Go, Linux!
So, the PCs, when they’re all ready, will get packed into a shipping container and sent to Quito. There, Indymedia volunteers and tech activists will distribute those machines to Indymedia offices, and then to schools and community centers in the surrounding countryside. We are also hoping to build a public wireless network in Quito, to improve community access to high-speed bandwidth. If you’re curious to learn more, you can check out the Indymedia article and video interview, as well as the Kuro5hin coverage. If you’re in the Bay Area or Portland, and are keen to help, please please send me an email with some contact info — we could really use the help!
We hope to have things wrapped up sometime next week. The project certainly hasn’t been without its stumbling blocks. Progress can be slow and sometimes frustrating, as anyone who’s tried to make old hardware and Linux and NFS go together knows — hell, even just one of those three can cause an old hacker indigestion under the wrong circumstances — and sometimes the network conks out altogether for one reason or another, making everything come to a grinding halt. At times like those, though, we remind ourselves of the project’s goals, and carry on: That we may empower activists to communicate with each other, and tell their community’s stories to the world. That we may enable communities and their organizers to gain access to and share information on how they can better the lives of those communities and their members. And, maybe most importantly, that we might be able to provide some people with the tools they need to help others. It seems unfortunate that one’s l33t tech skills so rarely offer the opportunity for work this satisfying.
Related link: http://www.oreilly.com/~andyo/ar/sept11_one_year.html
An online newspaper has just published a summary I wrote of changes in computing and
Internet policy springing from the September 11 attacks.
Topics include the failed GOVNET initiative, the atmosphere in
businesses, activities taken by the government to promote better
computer security, the prominence of software bugs, the resurgence of
surveillance in the U.S. and Europe, data retention laws, issues of
802.11 wireless networks, and chilling effects on the adoption of new technologies.
The URL above is the article’s permanent location.
I’d been writing the weblog now for a little while and I was still trying to get my legs under me. I wasn’t sure yet if I was making a difference - didn’t know if I was approaching it the right way. I decided to check in with my mentor Lester Stallman Raymond Bangs. It was late; I was sure he’d be home and up. He was probably blaring distortion music and hacking 500 words an hour on his favorite new obsession - Free the Mouse!
“I read some of your stuff,” he said. “It sucks.”
“Is it really that bad?” I was crushed.
“Listen Kid, it’s not you. Don’t take it personal. Everybody’s stuff sucks. It’s just that with the major corporations and governments trying to control everything now - not to mention the hits that privacy is taking - it ALL sucks. Hey, did you check out those open source links I sent you?”
“You mean the one that generates mp3’s that simulate
amplifier feedback? It blew out the speakers on my desktop.”
“Mine too!”, he said, “Ain’t it great!”
I had the TV on as a distraction. They were talking about September 11th. I had my music cranked and the TV on mute. Closed captioning was on and the words were scrolling across the top of the screen, “Ypsterday the mayor toured grnd zerro with reporters” - closed captioning always garbled the words - “wev trained and we’ll tran somm more”. I sighed.
“Sometimes it just seems that writing about this stuff - computers and software - doesn’t really matter much”, I complained, “It just doesn’t seem that important with everything that’s going on”.
“That’s just what they want you to think!” Here he goes on one of his tirades, I thought. It sounded like he dropped the phone for a minute; I could hear him cursing.
Finally he continued, “If you think this stuff isn’t important then you need to learn something. All this Internet and open source stuff - it’s critical. It couldn’t be more important. This stuff is giving poor countries an opportunity for the first time to get their hands on top quality applications. This stuff is real software.
“And what do they do with it? They’re building sites to let them communicate with the rest of the world. They’re publishing their national newspapers so readers all over the world can see their point of view. They’re building applications in their own languages to help drive productivity, create economic growth and keep services revenue in their own countries where it will create good jobs.
(The TV was showing more reruns of Sep 11th. “Osamma blad binng flew asd afghanismm”, the closed captioning went on, “the fimere a polimen were heros that day”.)
“All this stuff adds up to be huge,” he continued. “It’s driving communications to make things smaller. It’s driving us closer together and giving us something in common. It’s something we can share and collaborate on when everything else is driving us apart.”
His voice got quieter. “Listen up, I’m not going to kid you. There’s a lot of messed up stuff going on in the world. It’s a dangerous place with pitfalls everywhere. But don’t underestimate how important you are - how important we all are. This stuff is critical.”
He paused. “And besides”, he said, “ALL your stuff didn’t suck. Some of it was actually OK. Good even.”
“Thanks,” I said. “See you later.” I hung up.
Maybe he was right, I thought. I hoped so.
Related link: http://use.perl.org/~rafael/journal/7622
It’s always nice to hear that you’ve helped to inspire something good somewhere else… and it’s all the more amusing because I’ve started to use Subversion myself!
Related link: http://news.com.com/2100-1033-956911.html
To me, fear-mongering quasi-news reports are worse than spam. That’s why I had to publicize this item, provocatively titled “Drive-by spam hits wireless LANs.” A reader thinks, “Oh no, the ante’s being raised again in the wireless security problem,” or (if you’re as perverse as I am), “Gee, there’s room for innovation in every dumb thing people do.”
But actually read the report–what is this supposed vulnerability in wireless? Spammers are just exploiting the old trick of finding a mail server that’s an open relay. Is that news?
And is the quoted speaker, Adrian Wright, telling us that driving by a wireless network is easier than probing IP addresses from the comfort of one’s apartment? Perhaps if companies attach internal mail servers (with private IP addresses) to their wireless LANs, and are less cautious with their configuration than with public servers, but that’s a lot of ifs.
I have neither a persecution complex nor a paranoid personality, but I think somebody’s down on wireless (an extremely promising technology) and is throwing any mud they can find at it–and sometimes mixing their own mud out of some pretty damn dry clay.
What else (besides the well-publicized holes) do we have to worry about with wireless security?
Related link: http://xml.apache.org/axis/
Just posted to the axis-user and axis-dev lists:
Subject: ANNOUNCEMENT: Axis 1.0 RC1 has been released!
The Apache Axis team has released Axis 1.0 RC1, available at http://xml.apache.org/axis/
This release of Axis is a major milestone toward releasing a 1.0 version in the (very) short term. Currently the team is targeting a 1.0 release date of September 13, 2002.
Axis 1.0 RC1:
* Is a flexible, extensible Web Services framework for Java developers.
* Has a complete implementation of Sun’s JAX-RPC and SAAJ specifications.
* Has successfully passed the JAX-RPC and SAAJ TCK test suites.
* Is easy to use (including “instant deployment” by dropping a Java source file into a webapp)
* Supports bi-directional WSDL<->Java generation, both via command line tools and automatically in the runtime
* Contains support for the new version of the DIME specification for attachments
* Contains preliminary SOAP 1.2 support
To download the release, you can visit http://xml.apache.org/axis/.
Please check it out, and send feedback to firstname.lastname@example.org or email@example.com.
–The Axis Team
In the past two years, most of my work has been done in XFree86, with several terminal windows holding bash or vim sessions. I also like a virtual desktop setup with a web browser, terminals, and my mailer in separate viewports. (This is so important, I even hacked Fluxbox to lay things out the way I like them.)
My GNU/Linux desktop works the way I want. That’s not a problem.
As I’m doing some travelling (and my buddy Rael wants to work together in wireless downtown Portland), I’ve borrowed a graphite iBook from the ever-pleasant Allison. It’s an older, slightly slower model, limited to 800×600, but it’s an opportunity to get used to Mac OS X and wireless networking.
Having secure X11 clients forwarding from my desktop to other rooms is very handy, and I’ve enjoyed that. My work habits are such that I haven’t adapted to the OS X side of things yet.
I miss my virtual desktops. Though Space fits the bill somewhat, it doesn’t have edge flipping. (I looked at the source code, but don’t have it figured out yet.) The screen size is a bit too small for four readable terminal windows, but upgrading to a newer iBook will alleviate that.
I like the idea of having a nice laptop with decent hardware support and Unixy goodness, and I know Gentoo and Yellow Dog Linux run here. Maybe I should dual boot. That would give me free software on a laptop, with decent hardware and the option to continue working with Mac OS X. (It’s also non-x86 hardware, which has advantages for testing my software’s portability.)
Of course, there are also x86 and compatible laptops, though I’m less than thrilled with my experiences with cheap and weird hardware. I’m also highly unlikely to enjoy paying a Windows tax — I won’t use MS Windows. I’m just not productive there. I do hear nice things about some of the Vaio line, and other Linux distributions do run there…
I have at least a couple of weeks to decide.
Have strong feelings one way or the other? I’d like your advice.
Related link: http://news.com.com/2100-1033-956409.html
Communications chipmaker Conexant Systems issued a brief announcement that should make life easier for home and small-office netters. Wi-Fi chips will be added to cable modem and DSL equipment. That should make it easier for people to use a single system on a high-speed Internet connection as a hub for the whole family’s or office’s computers.
There’s just one hitch. Many cable companies prohibit home networks in their terms of service. I believe some ISPs providing DSL do so, too.
The concern of the cable companies is that a lot of people sharing a connection in one home or office will place a strain on bandwidth. Whether this would happen in practice is open to debate, but in theory it’s a problem with cable modem networks (which are basically LANs serving a whole neighborhood).
More likely, both cable and DSL providers will need to deal with increased bandwidth use by adding fatter connections to their backbone providers. This raises the cost of providing access to end-users.
Wireless could make things even worse, because a lot of people will leave their networks open to users on the street, by design or out of laziness.
So the terms of service are not just being mean. There are real economic issues here. (Not that the Conexant Systems announcement makes any mention of them!) The question is whether the draw of home and small-office networking will prove so strong that providers relax their restrictions in order to gain customers.
The Conexant Systems offering provides another incentive for them to do so, and makes it a tantalizing possibility.
Related link: http://news.com.com/2100-1040-956357.html
In a desperate bid to combat illicit gambling, a major industrialized nation bans all electronic gaming of any kind, including computer games and video game consoles. Sound far-fetched? No doubt that’s what Greek citizens have been thinking since the enactment of Law 3037 in late July. Read on for a cautionary tale of just how far a government can go off half-cocked in an attempt to benevolently legislate the use of technology. And don’t think it can’t happen here.
Web services seem like they’ve been around too long already. They’re like a great new Summer song that’s getting old now that Fall’s on the way. Or like your favorite Uncle after he’s been at the house a month and you realize it’s him eating all your favorite breakfast cereal.
OK, so it’s not that bad. But from all you read about them, you would think that they’ve been around forever. And they have if you consider the life span of the average new technology. But I think this is different. Really different. Here’s why.
Web services are actually delivering value. This isn’t just a marginal, incremental improvement - it’s a real change.
By defining a contract between the client and the server that’s easy to understand and based on technology-agnostic XML, Web services have found a killer app: making it easy to integrate two systems.
Sure - you can do a lot more than just integrate two systems using Web services. But this is a killer app and it’s real.
Here are a couple easy examples:
Exposing business systems to your customers. Every company has back-end systems that they run their business on. And just about every company has web or extranet applications that their customers use. How do you bridge the two? Put a web service between them and it’s easy. It no longer matters that one department uses Java and the other uses VB. Everyone speaks XML.
Simplify passing data to suppliers. In the last two years I’ve worked on two e-commerce systems that needed to send orders to fulfillment companies. In one situation we connected via MQ Series; in the other the supplier wrote an application we connected to using RMI. We could have saved a huge number of hours if we had just written a web service and defined everything in XML. (And I could have spent more time re-reading old Seinfeld scripts.)
It’s worth repeating - connecting two discrete systems together is the Web Service Killer App.
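To make the “everyone speaks XML” point concrete, here is a rough sketch of the client side of such an integration. The endpoint, namespace, and operation names are invented for illustration, and the sketch happens to be Python; the point is simply that the entire contract is an XML document travelling over HTTP, so neither side cares whether the other is written in Java, VB, or anything else.

import urllib.request

# Hypothetical endpoint and operation, made up purely for illustration.
ENDPOINT = "http://erp.example.com/services/OrderService"

soap_request = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <placeOrder xmlns="http://example.com/orders">
      <sku>ABC-123</sku>
      <quantity>2</quantity>
    </placeOrder>
  </soap:Body>
</soap:Envelope>"""

# The whole contract is XML over HTTP: build the request, set two headers,
# and post it. The server could be J2EE, .NET, or anything else.
req = urllib.request.Request(
    ENDPOINT,
    data=soap_request.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": '"placeOrder"'},
)

with urllib.request.urlopen(req) as response:
    print(response.read().decode("utf-8"))   # the reply is just more XML

In practice you would generate this plumbing from the service’s WSDL rather than hand-roll envelopes, which is exactly the kind of tooling Axis, .NET, and the J2EE vendors are providing.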
But are we really just beginning? You bet. Most corporate IT departments are just getting started with the technology.
- Apache Axis is just beginning its adoption.
- The EJB 2.1 spec isn’t even final yet and will turn every J2EE server into a Web Service server.
- .NET is moving, but most recruiters will tell you that .NET projects have only been staffing in earnest for 6 months or less.
And looking over the current crop of JSRs listed with the Java Community Process, it’s clear that the whole next generation of Java and J2EE will be focused on incorporating web services deep into Java. And this won’t even hit production apps for a year.
So (as my kids ask every Summer when we’re driving on vacation), “are we there yet?”
Nope. Not by a long shot.