Should open source licenses fight evil?

Open source has come under fire in recent years, with companies like MongoDB hoping to shift its very definition to include proprietary software. But it’s the more recent, and more well-intentioned, salvo that could do the most damage.

Last Thursday, in response to Chef’s willingness to do business with US Immigration and Customs Enforcement (ICE), one open source developer pulled his software from Chef, causing customer systems to go down. By Friday, Chef had not only fixed the outage but also reversed its policy of doing business with ICE.

You can’t blame a software developer for wanting their code to be used only for good. But some would go further, advocating for open source software licenses that would prevent the software from being used immorally or unethically, however that might be defined.

It’s these latter licenses, like the Hippocratic License, that mean well but do harm to the very open source software they hope to help.


First, define some harm

The Hippocratic License is clear in its aims: “This license is derived from the MIT License, as amended to limit the impact of the unethical use of open source software.” In other words, it’s an extension of one of open source’s most permissive licenses. The problem, however, is that it isn’t. Open, that is:

The software may not be used by individuals, corporations, governments, or other groups for systems or activities that actively and knowingly endanger, harm, or otherwise threaten the physical, mental, economic, or general well-being of underprivileged individuals or groups.

Seems clear, right? Don’t use my software to help ICE separate families at the border! One problem with this is that some feel that ICE is doing good work (yes, really). My point is not that one group is right or wrong, but rather that it’s a subjective question.

And that problem of subjectivity doesn’t stop with ICE.

For example, I’d put pornography companies on the list of those barred from my software, as I’ve seen individuals and families destroyed by pornography addictions. As strongly as I feel about keeping my software from being used to support pornography, however, there will be others who feel just as strongly on the other side. We would both feel we are in the right. That works for Twitter, but it is impossible for a license like this to parse.

Of course, the license only applies to “the physical, mental, economic, or general well-being of underprivileged individuals or groups,” but why? If something is evil and causes harm, why limit the prohibition to one class of victims? And could we legally do so? “Underprivileged” is an arbitrary distinction without any formal legal definition (whereas, for example, a “suspect class” is reasonably well defined in the law), and it is subject to all sorts of misapplication and misunderstanding.

It’s well-intentioned. It’s just wrong.