Remarks

No, Silicon Valley, You Don’t ‘Have to’ Do Business With Nazis

Refusing to spread the messages of racist bullies is the least a company can do. It doesn’t need to be cloaked in paeans to fiduciary duty.

One of the strange and disconcerting byproducts of the past couple of weeks, beginning with a white supremacist march that looked like something out of 1930s Germany, is that a group normally seen as our most craven and cynical is enjoying a victory lap of sorts. For the conventional wisdom of the moment, I refer you to this New York Times feature about “the moral voice of CEOs,” which devotes several thousand words to praising the executives of America’s large corporations for, basically, saying that Nazis are bad.

In case you missed it, the action began last week, when Merck Chief Executive Officer Ken Frazier resigned from President Trump’s manufacturing council over Trump’s failure to unambiguously condemn an act of domestic terrorism. Frazier’s resignation prompted a dozen other business leaders to follow him out the door, causing the world to momentarily forget that Merck charges some cancer patients more than $150,000 a year and has been defending a nine-figure gender-discrimination lawsuit. “The CEOs had found their voice,” the Times tells us.

Credit where credit is due, I suppose, but it’s probably worth asking whether big companies should do more. That’s the question currently racking Silicon Valley, where there are deep-seated divisions about how far technology companies should go to silence hate groups. After Google, GoDaddy, and several other domain registrars dropped the Daily Stormer—a publication that is for American Nazis what Inspire magazine is for Al-Qaeda members—many smart, serious people worried that this was a mistake.

“Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the internet,” Cloudflare CEO Matthew Prince wrote in a memo to employees after canceling the Nazi publication’s access to his company’s service, which protects against denial-of-service attacks. “No one should have that power.”

For years, Cloudflare, like many tech companies, has resisted calls to stop doing business with questionable groups on the grounds that such business was legal and upheld the principle of free speech. Last week, Prince seemed to reverse course, though he remains conflicted. In a blog post published after the memo was leaked, he said canceling Cloudflare’s protection of the Daily Stormer’s website made it more vulnerable to the “vigilante justice” of hacker attacks, and that such an environment should worry anyone who cares about free speech and due process.

Prince has been rightly praised for his transparency, but I don’t find his argument compelling. Leaving aside that the Daily Stormer’s value isn’t a matter of opinion—the Daily Stormer is vile, full stop—I’d argue that personal judgments are the best way to stop the spread of violent extremism.

Like many who’ve expressed concern about the implications of the Daily Stormer’s blackballing, Prince seems to think that if some regulatory body or industry group decided to stop hate speech, the problem would vanish overnight. But hate speech doesn’t work that way. In Germany, where Nazi symbols and propaganda are banned, there are still Nazi rallies.

And, of course, Prince’s “vigilante justice” didn’t banish the Stormer from the internet. The site’s material is currently available on Gab, which is sort of like Twitter for people who think Twitter is too nice. (That’s funny if you know anything about Twitter.) By Saturday, upstart BitMitigate had surfaced to offer the Stormer an alternative to Cloudflare’s protection. BitMitigate is run by a 20-year-old who told ProPublica that he’d done it, in part, for the publicity.

One of the downsides of American capitalism is that we have to put up with opportunists who are okay with taking money from Nazis. One of the upsides is that nobody is forced to take money from Nazis. Until Congress decides otherwise, Cloudflare, GoDaddy, Google, and others are private companies, not utilities. If you own a bar, you don’t have to grit your teeth while a Nazi spouts off; you can toss him out. If you own Google, same deal. (Before being ousted from the Trump administration, Steve Bannon supposedly floated the idea of regulating Google and other tech giants as utilities. That seems less likely now.)

Of course, deciding which customers to do business with and which to shun can be hard in the sense that it requires executives to make moral judgments. It’s true, as Will Oremus pointed out last week in a Slate column, that it’s probably only a matter of time before companies like Google are pressured to cut ties with such groups as Black Lives Matter. Maybe in the face of pressure the companies will do something morally odious, like deciding that advocating for the rights of people of color is just as objectionable as Nazism. 

But that will be rightly seen as a scandal, just as it should be seen as a scandal when huge companies with lots of resources adopt a pose of neutrality, falling back on their own terms of service, or code of conduct, or shareholder value, or saying “we follow the law,” when what is right has nothing to do with any of that. 

What was genuinely beautiful about Ken Frazier’s break with the Trump council—to me, anyway—was that Frazier didn’t bother justifying what he did on any grounds other than his own conscience. “I feel a responsibility,” Frazier wrote, “to take a stand against intolerance and extremism.” We need more of that, not less. Generally, the best way to combat intolerance is to act like a human being.

    Max Chafkin
    Bloomberg Businessweek Columnist