TORONTO/SAN FRANCISCO, August 16, 2017: The internet domain registration of the neo-Nazi website Daily Stormer was revoked twice in less than 24 hours in the wake of the weekend violence in Charlottesville, Virginia, part of a broad move by the tech industry in recent months to take a stronger hand in policing online hate speech and incitements to violence.
GoDaddy, which manages internet names and registrations, disclosed late Sunday via Twitter that it had given Daily Stormer 24 hours to move its domain to another provider, saying the site had violated GoDaddy's terms of service.
The white supremacist website helped organize the weekend rally in Charlottesville where a 32-year-old woman was killed and 19 people were injured when a man plowed a car into a crowd of counter-protesters demonstrating against the white nationalist gathering.
After GoDaddy revoked Daily Stormer's registration, the website turned to Alphabet's Google Domains. The Daily Stormer domain was registered with Google shortly before 8 a.m. Monday PDT (1500 GMT) and the company announced plans to revoke it at 10:56 a.m., according to a person familiar with the revocation.
As of late Monday, the site was still running on a Google-registered domain. Google issued a statement but did not say when the site would be taken down.
Caught in the middle
Internet companies have increasingly found themselves in the crosshairs over hate speech and other volatile social issues, with politicians and others calling on them to do more to police their networks while civil libertarians worry about the firms suppressing free speech.
Twitter, Facebook, Google's YouTube and other platforms have ramped up efforts to combat the social media efforts of Islamic militant groups, largely in response to pressure from European governments. Now they are facing similar pressures in the United States over white supremacist and neo-Nazi content.
Facebook confirmed Monday that it took down the event page that was used to promote and organize the "Unite the Right" rally in Charlottesville.
Facebook allows people to organize peaceful protests or rallies, but the social network said it would remove such pages when a threat of real-world harm and affiliation with hate organizations become clear.
"Facebook does not allow hate speech or praise of terrorist acts or hate crimes, and we are actively removing any posts that glorify the horrendous act committed in Charlottesville," the company said in a statement.
Several companies acted
Several other companies also took action. Canadian internet company Tucows stopped hiding the domain registration information of Andrew Anglin, the founder of Daily Stormer. Tucows, which was previously providing the website with services masking Anglin's phone number and email address, said Daily Stormer had breached its terms of service.
"They are inciting violence," said Michael Goldstein, vice president for sales and marketing at Tucows, a Toronto-based company. "It's a dangerous site and people should know who it is coming from."
Anglin did not respond to a request for comment.
Discord, a 70-person San Francisco company that allows video gamers to communicate across the internet, did not mince words in its decision to shut down the server of Altright.com, an alt-right news website, and the accounts of other white nationalists.
"We will continue to take action against white supremacy, Nazi ideology, and all forms of hate," the company said in a tweet Monday. Altright.com did not respond to a request for comment.
Meanwhile, Twilio Chief Executive Jeff Lawson tweeted Sunday that the company would update its use policy to prohibit hate speech. Twilio's services allow companies and organizations, such as political groups or campaigns, to send text messages to their communities.
Arbiters of acceptable speech
Internet companies, which enjoy broad protections under U.S. law for the activities of people using their services, have mostly tried to avoid being arbiters of what is acceptable speech.
But the ground is now shifting, said one executive at a major Silicon Valley firm. Twitter, for one, has moved sharply against harassment and hate speech after enduring years of criticism for not doing enough.
Facebook is beefing up its content monitoring teams. Google is pushing hard on new technology to help it monitor and delete YouTube videos that celebrate violence.
All this comes as an influential bloc of senators, including Republican Senator Rob Portman and Democratic Senator Richard Blumenthal, is pushing legislation that would make it easier to penalize operators of websites that facilitate online sex trafficking of women and children.
That measure, despite the noncontroversial nature of its espoused goal, was met with swift and coordinated opposition from tech firms and internet freedom groups, who fear that being legally liable for the postings of users would be a devastating blow to the internet industry. (VOA)