
Incorrectly Deleted From Facebook? Getting Back On Might Take Connections


Four smiling young women wear tank tops that say "#Banned"; a pink banner at right reads "Thank you Rachael!"
From left to right: MaryKathryn Kopp, Hannah Mae Sturgis, Hallie Griffin, and Kaitlin Paige Longoria. All star in ‘Hitler's Tasters.’ The play's page has been deleted from Facebook more than once, but in the most recent case, it took a KQED reporter inquiring with Meta PR to get a human to review a software-prompted deletion. (Courtesy of Zach Griffin)

Belligerent nation states, exes bent on revenge porn, hucksters selling fake medical cures: there are a lot of scary threats Meta (a.k.a. Facebook) is trying to counter with a combination of artificial intelligence and human content moderators. But the software is regularly deleting the accounts of innocents, who quickly discover they don’t merit human review unless they’re considered VIPs by the company.

Consider the recent case of Los Angeles-based playwright Michelle Kholos Brooks. A few years ago, she came across an article like this one, about Margot Wölk, one of the young women forced to taste Adolf Hitler’s food before he ate it. In 2013, at the age of 95, Wölk shared her story with the German magazine Der Spiegel. “I wrote a play around that,” Brooks explains, “putting young women in a room, waiting to die at every meal.”

A former journalist and a Jewish American, Brooks wants to bring history to life for modern audiences, she said, “Because for young people today, World War II is in the rearview.”

Hitler’s Tasters has been performed in New York; Chicago; Venice, California; and the Fringe Festival in Edinburgh. In April 2022, it returns to New York. Critics and audiences alike have responded positively to this dark comedy about an awful topic.

Poster shows a photo of Adolf Hitler smiling at four girls
The cover visual for the Facebook page of ‘Hitler’s Tasters,’ now that it’s back up. (Courtesy of Cody Butcher)

“Sometimes people are not sure if it’s OK to laugh,” Brooks acknowledges. “You know, a lot of it gets very dark. But we encourage it.”


But the Facebook pages belonging to the play, to Brooks, to all the actors and even to the director were suddenly deleted in mid-November, with a generic alert informing them they had violated the company’s “community guidelines.” Years of photos, videos, followers and contacts: gone.

Mistakes happen

“In the past, we have had the opportunity to say, ‘Hey, you got this wrong.’ And this time, it was just a sweeping removal out of nowhere,” says Hallie Griffin, an actor in Hitler’s Tasters, and also its social media maven. Yes, the play’s page has been deleted before, from Instagram, and restored before, once a human was put on the case.

Many—dare I say, most—humans living today in North America and beyond will have heard of Adolf Hitler, even if they know nothing about the man other than that he started a world war in the mid 20th century, and launched a genocide commonly known as the Holocaust.

His name does come up in a lot of hate speech on social media, which explains why a software filter might be triggered by the word “Hitler.” But most humans reviewing the use of the word in context can quickly differentiate between an attempt to stoke anti-Semitism and an artistic treatment of a historical figure and his impact on the world around him.

Four young women in character wearing in smock-like dresses.
A promotional image for the play ‘Hitler’s Tasters,’ reflecting the play’s dark humor. (Courtesy of Zach Griffin)

You can request a review of an account deletion (what Facebook calls a “cross check”), and Brooks did, receiving a reply within 30 minutes.

Brooks read out some of that response for me: “Your account has been permanently disabled for not following the community standards. Unfortunately, we won’t be able to activate it for any reason. This will be our last message regarding your account.”

Hitler’s Tasters and the cast members don’t have huge followings on the various social media accounts, though their presence on the platforms has helped push ticket sales for performances. Initially, Brooks and her fellow thespians were shocked and upset. Their Instagram accounts were not deleted (this time). They thought it might even be a plus to focus promotion around the fact they were banned.

A remarkably common problem

Others caught in a similar pickle have been less sanguine. According to BuzzFeed News, Meta’s algorithmic intractability has spurred the creation of a black market, populated by scam artists and possibly Meta employees promising to restore deleted accounts. Scam artists take people’s money and run. But BuzzFeed says some accounts have been restored and even verified, which sounds like something only an employee could help facilitate.

The Struggle to Moderate Content

Brooks heard that some people who knew people inside the company could get customer support involved. “I vaguely know a woman who works at Microsoft,” Brooks recalls. “A member of her team moved over to Facebook recently. She explained our situation to him and he said he might be able to help. The reason I know this woman is that she once, once, babysat my kid, about 14 years ago, when we were visiting Seattle.”

Multiple news reports detail how arbitrary decisions made by artificial intelligence software rarely get a human review, even though Facebook, by its own account, has 40,000 people working on safety and security, and even though there’s manifest evidence the algorithms still allow, and even amplify, toxic content.

Back in September, Meta announced on its blog that Facebook would ask its Oversight Board for guidance. In October, the board found deficiencies in the appeals process. In November, Meta acknowledged the report, defended the program, and promised to continue exploring ways to further ensure “that we minimize our enforcement mistakes that have the greatest impact.”

This month, the board announced it’s taking public comment before Jan. 14, 2022. But the board’s policies are not binding on the company.

“Facebook wants you to believe that the problems are unsolvable. They want you to believe in false choices,” said Facebook whistleblower Frances Haugen, testifying before the Senate Commerce Committee in October.

Haugen argued something akin to an old saw in Silicon Valley about persistent software problems: If it’s not a bug, it’s probably a feature. “They want you to believe that you must choose between divisive and extreme content, or losing one of the most important values our country was founded upon, free speech.”

There’s a happy ending to this story

Two days after I inquired with Facebook, the pages associated with Hitler’s Tasters went back up. There was no notice to those affected, but everything was restored. A Meta company spokesperson admitted to me the accounts were “incorrectly removed,” apologized for the mistaken deletions, and promised steps have been taken to prevent a recurrence. Naturally, I’m pleased, but is this any way to run a social media platform?


In related news, Meta recently moved its artificial intelligence group to the Reality Labs unit developing augmented and virtual reality products, according to The Information. The tech industry news site noted the shift means Meta’s AI team, “central to Meta’s efforts to detect harmful content on Facebook,” will now shift its primary focus to developing the metaverse—the virtual immersive world that is CEO Mark Zuckerberg’s latest obsession.
