Imagine you sell hammers for a living. And imagine a man well-known to be a violent thug is pursuing his next victim. He pops into your store. “One standard claw hammer, please. Sixteen ounce. Wooden handle.” You’re curious. “What’s the hammer for?” He hedges. “Well, you know…just building stuff. Can you hurry, please? No need to wrap it. And I’ll pay cash.” What should you do? After all, it’s just a hammer, and you’re in the business of selling those. Yes, he’s a murderous thug. But it’s just a hammer. You know, for building things. At least, that’s what he said he’d use it for. Surely he wouldn’t lie about that. And even if he is lying, well, you’re only supplying a hammer. You’re just trying to make a living. You’re not responsible for what he does with the hammer. Right?
Now, just change “hammer” to “complex Internet monitoring technology,” and change “murderous thug” to “authoritarian government with an abysmal human-rights record,” in the above example, and you see roughly the moral question faced by a number of high-tech companies today. It’s for this reason that Netsweeper, based in Guelph, Ont., has been uncomfortably in the spotlight lately. Netsweeper sells web-filtering and mobile-device filtering solutions, the kind of stuff that’s useful to a school trying to limit children’s access to pornography, to Internet service providers looking to help customers limit their exposure to computer viruses—and very likely to totalitarian regimes looking to stifle opposition. According to Citizen Lab, a censorship watchdog at the University of Toronto, Netsweeper’s customers have allegedly included telecom firms in a number of countries—including Qatar, Yemen and the United Arab Emirates—that have troubled records in terms of human rights and political freedoms. Citizen Lab also reports that Netsweeper’s technology has been used to block access to websites on subjects like human rights and homosexuality, and to sites dedicated to opposition points of view.
Perhaps not surprisingly, Netsweeper is not eager to talk about this. When approached by Canadian Business, Netsweeper’s director of sales and marketing, Scott O’Neil, declined to comment. Other media outlets that have reported on the company of late have had the same experience. Given the company’s silence, it’s hard to guess just what justification they might give for providing services that are likely to be used for repressive purposes. But the candidate explanations are obvious.
To start, the company might point out that they’re not doing anything illegal. And that’s true, but it’s also less than compelling. There are lots of behaviours—both personal and corporate—that are legal but clearly unethical. Any company that thinks its obligations are exhausted by adherence to the law is effectively begging for further regulation. The company might also point out that they’re merely selling the technology, not dictating how it is used. But given that Netsweeper’s products allow customized keyword filtering, it’s hard to imagine they don’t know roughly what political ends their technology might be used for. Finally, the company might point out that their product isn’t the only game in town, and that if they don’t meet a repressive regime’s web-filtering needs, then surely some other company will. But while that argument might hold at least a bit of water if put forth by a hammer salesman—after all, there are thousands of places to buy a hammer—it is considerably less plausible when put forth by a company that is one of relatively few makers of a specialized product.
So a company has little available by way of justification for helping regimes engage in repression of their own peaceful citizens. No, to the extent that there’s even an ethically interesting dilemma here, it isn’t rooted in the sale of Internet-filtering technology to brutal dictatorships. The real dilemmas have to do not with the fact that there are brutal dictatorships in the world, but with the fact that the governments of the world are spread along a spectrum, from fair and democratic to violent and totalitarian. Add to that the fact that the behaviours governments seek to monitor and act upon vary enormously too. At one end of that spectrum are peaceful pro-democracy rallies. At the other end is terrorism. In the middle are hockey riots and clashes between idealistic protesters and cops in riot gear. Recall that in the wake of the London riots of early August, Research In Motion faced pressure both to assist British police in their investigations and to resist doing so. There are genuinely tough questions here, and not just from a public relations point of view.
There’s a lot of talk these days about corporate social responsibility, and plenty of debate over just what that phrase means. But whatever else a socially responsible company does, it certainly owes society two things. One is an attempt to make sure that its presence among us actually does some good—that it’s not just meeting some demand, but actually contributing to human well-being. The world of commerce is a rough-and-tumble world, one where we tolerate aggression and competitiveness that would, in other areas of life, be considered beyond the pale. We tolerate such behaviours because, on the whole, commerce makes life better. It’s not just good for shareholders, but good for the community. Any business that loses sight of that risks undermining the moral justification for its own existence.
The other thing a socially responsible company owes is accountability, a willingness to stand up and explain its own behaviour.
A company that won’t explain itself, that won’t even try to tell its community why it believes its practices are justified, is effectively a rogue company—in roughly the same sense that totalitarian regimes, disdainful of the standards of the international community, are sometimes called rogue nations.