Related link: http://www.devx.com/opensource/Article/20111
DevX’s Executive Editor A. Russell Jones suggests that governments avoid jumping on the Open Source bandwagon because Open Source software, by its very openness, is more vulnerable to exploitation. This attitude reflects a deep misunderstanding both of how security works and of how Open Source projects work.
His argument rests on three ideas:
- Someone who is part of a project can place an exploit within the code: “the security breach will be placed into the open source software from inside, by someone working on the project.”
- While there is sufficient scrutiny on major projects to prevent this kind of exploit, since Open Source permits anyone to create their own distribution, a smaller, less scrutinized spin-off can easily have this kind of exploit: “distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing. As anyone can create and market a distribution, it’s not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart.”
- The extensive peer review to which Open Source code is subjected doesn’t suffice to uncover such exploits, because these exploits can be withheld from the publicly available source code: “the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won’t make the corrupted version public.”
Jones’ conclusion: governments simply cannot afford to take the risk of using Open Source, even given the benefits its flexibility provides, because of these security risks: “To limit their vulnerability, governments can’t afford to give everyone a choice, nor can they afford to provide access to the source code for their software.”
There are deep misconceptions here about how Open Source projects work, and about what the differences are between Open Source and proprietary software. I want to start, though, by pointing out a much more important and much more disturbing theme.
Too often people assume that secrecy equals security. Nothing could be further from the truth. Today’s strong cryptography is based on the assumption that an “adversary” will know both that something is encrypted, and what the encryption scheme is. The notion that hiding the means of encryption will somehow make the data in question more secure has been obsolete since World War II. Strong crypto assumes, rather, that even though the encryption algorithm is a matter of public knowledge, the data in question will remain encrypted and secure.
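A toy sketch may make this concrete. The one-time pad below is a minimal illustration, not production crypto, and the names are hypothetical; the point is that the algorithm is completely public, and all of the security rests in the secrecy and randomness of the key:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR. The algorithm is public knowledge;
    only the key is secret."""
    if len(key) != len(data):
        raise ValueError("one-time pad needs a key as long as the message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # secret, random, used only once

ciphertext = xor_cipher(message, key)

# Decryption uses the very same public algorithm -- knowing it
# gains an adversary nothing without the key.
assert xor_cipher(ciphertext, key) == message
```

Publishing `xor_cipher` costs the scheme nothing; publishing `key` destroys it. That is exactly the asymmetry the secrecy-equals-security mindset gets backwards.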
Open Source software is based on a similar notion of security. Hiding source code is a poor foundation for security, because even a powerful and highly proprietary company can’t guarantee that source code won’t leak out. Instead, security should be based on a worst-case scenario: assume your “adversary” has access to the source code.
Starting from worst-case assumptions is just plain common sense. Any other security plan is simply madness. Open Source software inherently takes this approach to security.
The best example of this would be the NetBSD operating system, which is not only completely Open Source, but also has a security auditing procedure that would be the envy of any corporation.
Setting aside this fundamental philosophical difference about security, is Open Source really any more vulnerable than proprietary software in the ways that Jones describes?
Jones worries that someone from within the project could include code allowing some kind of exploit. How is this different from proprietary software?
Programmers, whether working on Open source or proprietary code, routinely include surprises in their code. Most of these are harmless; they’re called Easter eggs. With proprietary code, there’s little chance these will be uncovered once the binary ships. With Open Source, constant peer review will bring this code to light.
When these “surprises” — intentional or otherwise — do turn out to be security risks, Open Source appears to have a better track record at prompt and effective correction. Open Source projects respond with security fixes within days to weeks. Microsoft has taken six months to respond to a major security hole in Windows (http://www.silicon.com/software/security/0,39024655,39118331,00.htm), and has a number of known but unresolved security issues with Windows.
Jones acknowledges that the Open Source community can provide an effective resource for identifying and addressing vulnerabilities. His claim is that because the source code is freely redistributable, a company or organization with questionable intentions could distribute a “hacked” version to an unwitting customer.
It’s hard to see the force of this argument, or what it has to do with Open Source. If a government agency, or any other software licensee, chooses an unknown company of unverified reputation as its software supplier, all for the sake of saving money, then it takes a risk. This is true whether the software in question is proprietary or Open Source. There are any number of well-known and reliable companies that supply and support Open Source software, and indeed for many application areas the vendor choices are more numerous — and hence more competitive — in the Open Source arena than in the world of proprietary software.
Jones closes his argument by questioning whether the Open Source community can do an effective job of policing itself, asking, in effect, “who’s watching the watchers?” Again, one could ask this question all the more acutely of a company like Microsoft, which allows no outside or independent audit of its source code.
More importantly, one could ask this question of Jones’ own company, DevX.
Journalism is a difficult profession, demanding a rigorous editorial line between “church and state”. In this era of increasing media consolidation, it’s important to consider who any given media company’s investors and advertisers are. Before buying into the editorial line coming from any media company, one should first think about potential conflicts of interest between the company’s revenue sources and its editorial stance. In other words, buyer beware.
DevX draws revenue from two sources: online advertising, and the creation of microsites for technology companies, where it regurgitates corporate content under the guise of a nominally independent site. This is not exactly a business model to inspire confidence in editorial independence, particularly given that proprietary software companies, including Microsoft, figure prominently in DevX’s revenue.
I wouldn’t want to directly accuse Jones of bias based on conflict of interest. DevX’s business relationships, however, do suggest that an argument like Jones’ should be held to a higher standard of evidence and rational argument than he has offered us so far.
Has the openness of Open Source software created an added security risk at your company, or has it enhanced security at your company?