4 Takeaways From the Senate Child Safety Hearing With YouTube, Snapchat and TikTok

Jennifer Stout, left, vice president of global public policy at Snapchat parent Snap Inc., and Michael Beckerman, vice president and head of public policy at TikTok, testify before a Senate panel on Tuesday, Oct. 26, 2021.  (Samuel Corum/Getty Images)

Lawmakers in the Senate hammered representatives from Snapchat, TikTok and YouTube on Tuesday in a combative hearing about whether the tech giants do enough to keep children safe online.

It marked the first time Snapchat and TikTok have landed in the hot seat in Washington, D.C., and for nearly four hours lawmakers pressed the officials about how the apps have been misused to promote bullying, worsen eating disorders and help teens buy dangerous drugs or engage in reckless behavior.

The hearing was convened by the Senate Commerce Subcommittee on Consumer Protection, Product Safety and Data Security — the same panel that brought Facebook whistleblower Frances Haugen to testify earlier this month about the thousands of pages of internal company documents she has shared with Congress, regulators and the press. Haugen says the documents show how the social network places profits over public safety.

Haugen’s disclosures about Facebook underscored the potential harms of the platform: its ability to amplify misinformation and how Facebook’s own research showed that Instagram can worsen mental health and body-image issues for young people.

Given how enormously popular Snapchat, TikTok and YouTube are with teens, lawmakers expressed deep worry about the platforms having the ability to hurt users’ self-image and contribute to other mental health issues.

In his opening remarks, Sen. Richard Blumenthal, D-Conn., said social media firms claiming they are distinct from Facebook is not going to cut it.

“Being different from Facebook is not a defense,” said Blumenthal, who leads the subcommittee. “That bar is in the gutter. It’s not a defense to say that you are different.”

Sen. Ed Markey, D-Mass., offered an even blunter assessment.

“The problem is clear: Big Tech preys on children and teens to make more money,” Markey said. “Now is the time for the legislative solutions to these problems.”

Here are four takeaways from the hearing.

1. Lawmakers say that until incentives change, social media will be a ‘race to the bottom.’

For the youngest users, Blumenthal said, social media companies have a perverse incentive to keep eyeballs glued on their apps, regardless of what kind of content is eventually served up.

“What we want is not a race to the bottom, but a race to the top,” Blumenthal said.

For teens and other young people, apps optimized for engagement can become addictive and steer users toward content that is not age-appropriate or is harmful, the lawmakers said.

Sen. Cynthia Lummis, R-Wyo., asked the company officials whether platforms are designed to keep people engaged as long as possible.

Michael Beckerman, TikTok’s vice president and head of public policy, was evasive, saying the viral video app sees itself as a form of entertainment, no different from television or movies. Still, the app has a responsibility to give parents time-management and “take a break” tools, he said.

For TikTok, Beckerman said, “overall engagement” is more important than how much time is spent on the app.

Jennifer Stout, the vice president of global public policy at Snapchat parent Snap Inc., said time on the app is “one of many metrics” the company studies.

And Leslie Miller, YouTube’s vice president of government affairs and public policy, like the other officials, would not directly answer the question of whether the video-streaming service defines success by how long people spend watching videos.

“We do look at, for example, if a video was watched through its entirety,” Miller said. “We look at those data points.”

2. Snapchat says it will continue to fight abuse of its app, including cracking down on drug dealing.

Stout told lawmakers that Snapchat is “an antidote to social media,” highlighting how “very little” of its content is sorted by algorithms.

“Snapchat’s architecture was intentionally designed to empower people to express a full range of experiences and emotions with their real friends, not just the pretty and perfect moments,” Stout said.

But lawmakers zeroed in on the ways in which Snapchat has led to harm.

One feature that drew particular attention from lawmakers was Snapchat’s now-disabled “speed filter,” which critics say encouraged teens to drive at excessive speeds. The feature has been connected to a number of deadly or near-fatal car crashes. The company’s decision to eliminate the feature in June was first reported by NPR.

Sen. Amy Klobuchar, D-Minn., noted other cases where young people obtained drugs through Snapchat, including one young man who died after purchasing the painkiller Percocet laced with fentanyl on the app.

Snap has stepped up detection measures to root out drug dealing on the platform and launched an education campaign to steer users away from those peddling drugs on the app.

“We are absolutely determined to remove drug dealers from Snapchat,” Stout said.

Officials from all three companies were asked about instances where the platforms were found to have fed young users material about sex, self-harm or content that worsens body-image issues.

In response, the officials sidestepped the particular examples and instead said generally that such content would violate their companies' rules and be removed.

“We prohibit content that promotes or glorifies such things as eating disorders, but we also realize that users come and share their stories about these experiences,” said Miller.

At YouTube, Miller said, experts help develop content moderation policies. More than 90% of content that violates its community guidelines is detected through its artificial intelligence, according to Miller.

3. TikTok’s ties to China were in the spotlight.

TikTok, which has more than 1 billion monthly active users around the globe, was grilled about an issue that first landed it in hot water during the Trump administration: its ties to China.

TikTok is a U.S. business that is a subsidiary of ByteDance, a Beijing-based tech giant.

Officials at TikTok have long said that Americans’ data is primarily stored in the U.S. and safeguarded from the Chinese authorities.

Lawmakers, led by Sen. Ted Cruz, R-Texas, questioned those safeguards, and pushed Beckerman on whether U.S. user data is shared with ByteDance, accusing TikTok of being cozy with Chinese authorities.

Cruz asked Beckerman whether TikTok’s privacy policy permits ByteDance unfettered access to Americans’ personal information.

Beckerman did not directly answer the question, pointing out that TikTok does not exist in China. The app’s Chinese counterpart is known as Douyin.

“That does not give this committee any confidence that TikTok is doing anything other than participating in Chinese propaganda and espionage on American children,” said Cruz.

“That is not accurate,” Beckerman shot back.

China-based ByteDance engineers do have access to U.S. user data, but can only gain such access with permission from an American security team, a top TikTok security official said last year in a sworn statement as part of the company’s legal battle with the Trump administration.

“We do not share information with the Chinese government,” Beckerman told senators on Tuesday.

4. The companies refused to commit on legislative proposals.

Even though there is bipartisan support in Washington to regulate the tech industry, Democrats and Republicans differ in diagnosing the problem, and sometimes have opposing solutions.

Nonetheless, senators discussed a range of legislative proposals, including an update to Section 230, the law that shields online platforms from liability for content posted by their users.

In addition, bills that would protect the online privacy rights of children, ban ads targeting young users and eliminate core features of social media, including “like” buttons, autoplay and push alerts, were also put to company officials.

Most of the time, however, lawmakers could not pin down the positions of the tech company representatives on various proposals.

After the officials refused to offer clear answers on whether they supported a law that would regulate how tech companies can collect personal data from teenagers, Markey became frustrated.

“This is just what drives us crazy, ‘We want to talk, want to talk, want to talk.’ This bill’s been out there for years, and you still don’t have a view on it,” Markey said.

Blumenthal, too, grew impatient with the answers from the company representatives on specific pieces of legislation that would impose greater restrictions on the tech industry.

Markey said it wasn’t enough to simply support the goals of the legislation, as the officials said they did. “That’s meaningless if you don’t support the legislation,” he said.

Editor’s note: Google, which owns YouTube, is among NPR’s recent financial supporters.

Copyright 2021 NPR. To see more, visit npr.org.
