If open source has a mantra, it must surely be, “Free as in speech, not free as in beer.” Software, we claim, should be unencumbered, allowing consumers of the software to run, copy, distribute, study, change, and improve the code as they see fit — in other words, free, as in liberty, to exploit the code.
But “free as in speech” isn’t just a metaphor for open source. It’s also one of its best practices. Most open source projects operate as a community, where freewheeling, democratic, and even contentious debate improves both the code and the community. All are welcome. “Citizenship” is attained via contribution. Leadership is awarded by merit.
Better yet, the evolution of the community (the people and the product) plays out in public, typically in a forum, mailing list, or newsgroup, and, of course, in each project’s source code tree. Thanks to the wonders of the Internet, the government of an open source project is uniquely chronicled in real-time for all to see and critique.
As this issue of Linux Magazine goes to press, the kernel development community is focusing some of that invaluable discordant debate on how to manage kernel security risks. The quandary: announce exploits or keep them a secret? Broadcast vulnerabilities to the good and bad guys, or maintain the status quo until a fix is available?
To date, Linux kernel developers have favored immediate disclosure — followed almost immediately by repair. Indeed, most open source projects react just as speedily. However, the danger of the kernel team’s “honesty is the best policy” approach is that nothing is immediate in the real world. Even if Linus Torvalds or Andrew Morton fixes an exploit within hours, the new kernel takes time to propagate to distributions, vendors, and eventually, users. In the interim, the bug is out of the proverbial bag.
Torvalds has already weighed in on the matter, saying, “Kernel bugs should be fixed as soon as humanly possible, and any delay is basically just about making excuses. And that means that as many people as possible should know about the problem as early as possible, because any closed list — or even just anybody sending a message to me personally — just increases the risk of the thing getting lost and delayed for the wrong reasons.”
And if you have any doubt, just look at the debacle that is Microsoft Windows security. At one point last year, Microsoft wanted to quell publication of flaws discovered by security researchers and users until the company could analyze the issues and respond. (That makes me laugh. It reminds me of the final scene of Raiders of the Lost Ark. Who’s working on the exploit, Mr. Gates? Top men.)
Moreover, following any other methodology would grossly undermine the open source process. Yes, open source can be chaotic. Yes, it is hard to capture and reproduce the results of the Linux kernel team. Yes, loosing an exploit does carry some risk. But any other choice bifurcates bugs into first- and second-class problems. And with that come first- and second-class citizens.
Infinitely practical is the maxim Eric S. Raymond famously dubbed Linus’s Law: “Given enough eyeballs, all bugs are shallow.” If “Free as in speech, not free as in beer” is open source’s Constitution, then Linus’s Law is its First Amendment.
And in the case of kernel exploits, Linus’s Law should be followed to the letter.
Martin Streicher is the Editor-in-Chief of Linux Magazine. He can be reached at