
Black and Brown Gig Workers Report Lower Ratings — But Companies Make Bias Hard to Track


The driver rating screen in an Uber app is seen February 12, 2016, in Washington, D.C. (Brendan Smialowski/AFP via Getty Images)

For almost a decade, on-demand services like Lyft, DoorDash and Instacart have allowed customers to rate and tip workers with just the tap of a finger. Yet in all that time, we still have no idea how much these systems allow customer bias to hurt Black and brown workers.

While rating workers on an app is new, it’s just the latest system companies have devised to let customers influence pay or promotions. If their ratings aren’t perfect, workers can lose income or even their jobs, and there’s mounting evidence these systems allow bias to hurt workers of color.

How Restaurants Paved the Way

Long before apps entered the picture, tipping established a way for consumer bias to impact worker pay. Michael Lynn, a professor of consumer behavior at Cornell University, said he has long suspected bias influenced tips.

“I believe that we have an implicit bias against people of color in this country, and I believe those implicit biases are likely to impact tipping,” Lynn said. But over and over again, he’s come up against a major hurdle in proving it.

“I’ve asked a lot of different companies to give me data,” Lynn said. “But there’s no interest on their part in finding out because it doesn’t benefit them. If there is racism, that puts them in a bind, and it’s worse when there is racism and they know about it.”

That said, Lynn was able to get some data to run a small study on waiters in 2008. He found that customers did indeed give workers of color lower tips.


Similar Findings in Other Industries

Studies have consistently shown that Black, brown and immigrant taxi drivers get lower tips, and that health care, management and sales professionals of color receive more negative customer feedback.

The trend is the same in online reviews: Black and brown professors get worse student evaluations, and Black and brown freelancers on Fiverr and TaskRabbit get lower reviews, which means fewer jobs and less money.

Customers may not even realize they’re treating Black and brown workers differently, giving them lower tips, leaving less positive feedback or rating them lower on an app. The cause of all this, according to researchers, is implicit or unconscious bias.


To fully understand the extent of the problem, researchers would need to examine large data sets and compare how different groups of workers are tipped and rated. Race is just one trait to evaluate for bias; the same problem could exist for gender, age, ability or any other category that regularly faces discrimination.

But while many companies are resistant to collecting or sharing data along these lines, on-demand service apps like DoorDash and Uber don’t even have to gather worker demographic data in the first place.

That’s because of the way they classify their workers: app companies call workers contractors instead of employees. Contractor status also shields these on-demand app service companies from liability if it is found that customers are discriminating against any of the workers.

The fact that app rating systems are problematic is far from news. Labor advocates have been warning about it since the apps launched. In 2016, Data & Society published an entire study titled “Discriminating Tastes: Customer Ratings as Vehicles for Bias,” co-authored by Alex Rosenblat, who now works for Uber as the Head of Marketplace Policy, Fairness and Research.

What the Companies Say

KQED reached out to several major app companies: DoorDash, Lyft, Uber and TaskRabbit. Only Lyft and DoorDash responded.

DoorDash PR representatives talked about protocols for kicking overtly racist customers off the app, but didn’t mention anything about a system for detecting or addressing implicit bias.

A representative from Lyft said the company had commissioned a study to understand the extent of the problem and found no evidence of implicit bias in ratings. But it’s not possible to confirm that assessment, as the company has not made the study’s results or methodology public.

Lyft, like many other app companies, also does not gather demographic information on workers. Without that data, researchers say, it’s impossible to know with certainty how much a rating or tip system allows implicit bias to hurt workers. The companies are in the dark, and so are the workers.

Ashley Salas would love to know what led to the less-than-perfect ratings she received while delivering for Instacart in San Francisco. After those, she said, everything changed.

“It got really, really hard. I went from making $200 a day to struggling to make $100,” Salas said.

That’s $100 a day before expenses like gas and wear and tear on her car, while she went to school for radiology and took care of her newborn baby.

Did she get low ratings because she did something wrong? Were the customers just grumpy? Or did they react negatively to who she is? Salas is part Pacific Islander, part Native American.

“It’s kind of a bummer,” Salas said. “I would have wished to know why so I could improve myself.”

Frustrated, Salas reached out to Gig Workers Rising, an advocacy group for app workers. Lead organizer Lauren Casey said she has heard this same story again and again from workers of color.

Casey said, “Their performance at work is held to a different standard and in turn they receive worse ratings.”

A representative from Instacart said the company has policies to deal with overt racism, but, like other app companies, it has no mechanism for detecting implicit bias, let alone addressing it.

No Data, No Context

Stanford University law professor Richard Ford said the app rating system has magnified the problem of implicit bias, making it easier for customers to hurt workers and harder for workers to prove it is happening.

“You don’t have context, and you don’t have the interpersonal reactions that might give you some clue that the ratings were based on race,” Ford said.

All you have is a number, and given our society’s increasing fetishization of data, Ford said a number without context can be very dangerous. “The difference in today’s environment is that it looks more objective. You’re getting, you know, a numerical rating. How could you argue with the numbers?”

Even if the ratings are high, it doesn’t mean they are fair. It’s possible that a person of a different race, sex or origin had to work harder to get good ratings.

UC Hastings labor law professor Veena Dubal has interviewed more than 100 Lyft and Uber drivers. She said Black and brown drivers often talk about having to perform to make white customers happy.

“There’s a lot of emotional labor and a lot of emotional performance that goes into ensuring that you’re not getting poor ratings, because otherwise you’re going to get fired. It’s almost that you have to play into the racial sensibilities of consumers,” Dubal said.

Some restaurants pool tips so any negative impacts from implicit bias are shared by the whole staff. App companies could adjust tips and ratings for Black and brown drivers to compensate for bias, but that means first figuring out how much lower they are on average.
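That first step is a straightforward calculation. What follows is a minimal, hypothetical sketch in Python of how an average rating gap between groups might be estimated, assuming the demographic data that, as this article notes, the companies don’t currently collect. The group labels, field names and numbers are all invented for illustration.

```python
# Hypothetical sketch: estimating the average rating gap between worker
# groups. Assumes demographic data that app companies do not currently
# collect; all values below are invented for illustration.

from statistics import mean

# Each record pairs a worker's demographic group with a customer rating.
ratings = [
    {"group": "A", "rating": 4.9},
    {"group": "A", "rating": 5.0},
    {"group": "B", "rating": 4.6},
    {"group": "B", "rating": 4.8},
]

def average_by_group(records):
    """Return the mean rating for each demographic group."""
    groups = {}
    for r in records:
        groups.setdefault(r["group"], []).append(r["rating"])
    return {g: mean(vals) for g, vals in groups.items()}

averages = average_by_group(ratings)
gap = averages["A"] - averages["B"]
print(f"Average ratings by group: {averages}")
print(f"Estimated rating gap: {gap:.2f} stars")
```

A real audit would be more involved than this sketch: researchers would need large samples and would have to control for confounding factors like region, shift time and job type before attributing any measured gap to bias.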

Thanks to Proposition 22, app companies face no legal pressure to gather the necessary demographic data. Without data, individual workers are left to interpret their own experience, isolated and unprotected.

