Congress to quiz Facebook and Google on white nationalism and political bias
Congressional lawmakers plan to grill Facebook and Google this week over the ways they police their platforms, including their efforts to stop online hate speech from spurring real-world violence, offering the latest sign that tech giants face a global regulatory reckoning for their business practices.
For years, Silicon Valley has struggled to strike the right balance between allowing users to express themselves and preventing the viral spread of objectionable posts, photos, and videos. But a series of recent high-profile incidents has prompted renewed debate in Washington over how to hold tech companies accountable for the content they allow or block.
That tension will be on display beginning Tuesday, when House Democratic lawmakers plan to explore the spread of white nationalism on social media. The hearing grows out of concern that Facebook, Google, and other tech giants have become digital incubators for some of the deadliest racially motivated attacks around the world, including the violence at a neo-Nazi rally in Charlottesville, Va., in 2017, the shooting at a synagogue in Pittsburgh last year, and the attacks on two mosques in Christchurch, New Zealand, last month.
“They clearly have been conduits for that kind of hate speech,” said Rep. Jerry Nadler (D., N.Y.), the chairman of the House Judiciary Committee, which is holding Tuesday’s hearing.
For more than two decades, federal law has shielded social media sites from being held liable for the content posted by their users, a form of protection that Silicon Valley's chief advocates stress is essential to the industry's runaway economic success. Other countries, however, have responded to the rise of extremism online by unveiling a wave of new proposals and laws targeting harmful content. The United Kingdom introduced a sweeping blueprint for new fines Sunday, and Australia passed a new content-moderation law earlier this month.
That flurry of activity has some U.S. lawmakers beginning to wonder whether they should follow suit.
“I think regulation should absolutely be examined,” Rep. Karen Bass (D., Calif.), the chairwoman of the Congressional Black Caucus, said in an interview.
Facebook and Twitter declined to comment for this story. Google declined to discuss the issue but said it would work with lawmakers to address their concerns. Each company has invested heavily in additional staff and increasingly sophisticated technology to combat a wide array of abusive content, from the spread of false news to the rise of online hate speech. And each has pointed to recent major strides in taking down harmful posts, tweets, and videos before they’re widely seen by users.
“I think they’re doing well, but there’s always still work to be done,” said Michael Beckerman, the president of the Internet Association, a Washington lobbying group that counts Facebook, Google, and Twitter as members.
But the industry’s continued struggles became apparent last month, when videos from the shootings at two mosques in Christchurch, New Zealand, proliferated for days on social media. The attacker live-streamed the massacre on Facebook, while users on anonymous, lesser-known forums such as 8chan schemed to upload new copies across the internet, evading the tech giants’ sophisticated detection systems.
Even before the New Zealand massacre, though, civil rights organizations long had warned that Facebook and other social media sites had become conduits for hate speech.
The Anti-Defamation League, which is set to testify at the House's hearing on Tuesday, estimated that 4.2 million anti-Semitic tweets were sent in 2017. Researchers have long faulted YouTube for features that suggest new videos to users based on what they've previously watched. And Facebook faced criticism for inconsistent policies regarding race: It previously prohibited posts about white supremacy but allowed content about white nationalism and separatism. Under siege from civil rights groups, Facebook widened its ban in April. The change came more than a year after self-professed white supremacists used the social networking site to organize a rally that turned violent, and deadly, in Charlottesville.
Civil rights advocates pointed to those high-profile failures at major social media sites as evidence that Congress needed to act. Kristen Clarke, the president of the Lawyers' Committee for Civil Rights Under Law, said she planned to tell the committee Tuesday that social networks remain a "key venue that many violent white supremacists turn to, to organize and recruit new members, and to incite violence across the country."
"I think this is a problem that is widespread and systemic, and it's in large part because we largely relied on these companies to self-police," she said.
House Democrats said the Tuesday hearing is their first in a series exploring white nationalism, an issue that Nadler said Republicans repeatedly refused to explore when they ran the House. Instead, GOP leaders on the chamber’s Judiciary Committee focused much of their attention on allegations that Facebook, Google, and Twitter had been too heavy-handed in taking down content online, suppressing conservative users, news, and views.
Republicans led by Sen. Ted Cruz of Texas plan to revive those allegations with a hearing of their own Wednesday. While there is no evidence that tech giants systematically target right-leaning users, top GOP officials — including President Donald Trump — have regularly claimed that Silicon Valley has limited their online reach. Democrats, meanwhile, have derided the hearing: Sen. Sheldon Whitehouse (D., R.I.) laughed when asked about it last week. (“Just note that I laughed,” he said.)
Trump repeatedly has threatened regulation in response, at one point in March accusing Facebook, Google, and Twitter of harboring a “hatred they have for a certain group of people that happen to be in power, that happen to have won the election.” He said at the time that government had to “do something about it,” a call to action that has been echoed by congressional Republicans who also want to hold tech companies liable for the decisions they make about what to allow online.
“The whole area of social media is pretty much unregulated,” said Sen. Lindsey Graham (R., S.C.), the leader of his chamber’s Judiciary Committee. Appearing at a conference in Washington last week, Graham raised the potential that tech giants make decisions about the content they allow or remove based on “liberal bias in California.”
In raising the prospect of new regulation, lawmakers have cast fresh doubt on Silicon Valley’s legal holy grail. A key provision of a 1996 law, known as Section 230, spares websites from being held liable for the content posted by their users. Members of Congress who seek to hold tech companies accountable for hate speech, or penalize them for decisions believed to be motivated by political bias, could strike at the heart of a federal law the industry has lobbied intensely to protect. Congress last chipped away at Section 230 in 2018 as part of an effort to crack down on sexual exploitation online.
The Internet Association warned that, absent those legal protections, companies would either bar most user-generated content or stop moderating it altogether, for fear of legal liability. “Changing [Section] 230 is not the answer, because that gets you in a worse place, no matter what side you’re on,” Beckerman said.
Nadler, the leader of the House Judiciary Committee, expressed unease with upending that law. For now, he said, the goal is to "see what happens by just pressuring them first." But he didn't rule out regulation if tech giants don't improve their practices.
“We have the First Amendment, and we’re very reluctant to pass speech laws,” he said. “But there’s a problem, and we have to deal with it.”