“If they posted sufficient content that violated our threshold, that page would come down,” said Bickert. “That threshold varies, depending on the severity of different types of violations.”
Bickert apologized to right-wing personalities Diamond and Silk for deeming them “dangerous.” Bickert also acknowledged Facebook may have made other mistakes, such as deleting content from groups like Black Lives Matter.
The bigger issue, said San Jose congresswoman Zoe Lofgren, is that algorithms are playing into what users want to see.
“The net result is that Americans have been isolated into bubbles. The next effect is that we’ve ended up in echo chambers. That has allowed the American public to be exploited by our enemies,” said Lofgren.
Critics have argued the hearings are a waste of time. Eric Goldman, law professor at Santa Clara University, said that decisions by social media companies on whether to filter content are fundamentally a First Amendment protected activity.
“There’s substantial limits on the House Judiciary Committee or any other regulator to tell the social media companies what they should be filtering or can’t be filtering,” said Goldman. “That’s not really the purview of the legislature.”
Goldman points to Section 230 of the Communications Decency Act, which shields tech companies from liability for content posted by users on their platforms.
Facebook, Twitter and Google declined an invite to the first hearing in April. This time around, Bickert, Juniper Downs, head of public policy and government relations for Google-owned YouTube, and Nick Pickles, Twitter’s senior strategist on public policy, attended.
Pickles echoed Bickert’s remarks in his testimony, telling the committee that Twitter doesn’t censor political views but that the social media site has made mistakes — for instance, blocking a Senate campaign announcement ad for Tennessee Republican Rep. Marsha Blackburn.
“Every day we have to make tough calls. We do not always get them right,” said Pickles. “When we make a mistake, we acknowledge them, and we strive to learn from them.”
Berin Szoka, president of policy think tank TechFreedom, testified at the first hearing. He said he hopes that this time around legislators gain an understanding of the complexities of policing content, so that the conversation can ultimately steer toward solutions.
“It’s about educating the members as to how their content moderation works and how hard it is to hit moving targets. How much they have to constantly adjust what they do to stay ahead of people who are trying to manipulate and game their systems.”