
From Credit Scores to Job Applications: California's Reparations Task Force Looks to Algorithms


The California Reparations Task Force members listen to public comment during a virtual meeting on Jan. 28, 2022. (Beth LaBerge/KQED)

The tech industry is intricately linked to California’s identity and economy, bringing in an estimated $520 billion per year, according to the Computing Technology Industry Association, a nonprofit trade association specializing in IT.

Roughly 2 million Californians have jobs because of the state’s tech economy.

The lack of Black representation in technology jobs has long been a point of contention for racial equity advocates, and claims of racial bias and discrimination within the industry remain persistent. In recent years, technology experts have raised alarms about racism being baked into the algorithms that determine creditworthiness, health treatments, job application status and more.

Discrimination in technology was featured at the January meetings of California’s Reparations Task Force, a nine-member body assigned the daunting job of conducting public meetings to study and develop reparation proposals for Black Californians. The task force, established by AB 3121, held meetings on Jan. 27 and 28 focused on public health and eligibility, in addition to technology.

As part of its scope of work, the task force is examining discriminatory practices in the public and private sectors, such as redlining and predatory lending, from 1868 to the present.

The expert technology panel included Yeshimabeit Milner, co-founder of Data for Black Lives, an organization that seeks to use data science to improve the lives of Black people; Vinhcent Le and Debra Gore-Mann from The Greenlining Institute, an Oakland-based public policy and advocacy group focused on the economic empowerment of people of color; and Safiya U. Noble, a MacArthur Foundation Fellow and professor of gender studies and African American studies at UCLA.

There are many ways the technology industry perpetuates the wealth gap, the experts told the task force. But, as Le and Gore-Mann said, technology may also be harnessed as a reinvestment tool.

Before testifying about racism and discrimination in the tech industry, Noble, the author of “Algorithms of Oppression,” shared an anecdote about her family.

“I’m a descendant of enslaved Black Americans,” she said. “My paternal grandmother was born into sharecropping, and she and my grandfather bought many of our relatives out of this de facto system of slavery in Mississippi.”

Noble spoke about how algorithms used in criminal justice, resource allocation and surveillance can often lead to disparate outcomes for Black people.


“Each of these types of carceral technology are unfairly pointed at vulnerable Black people,” she said. “California is central to the origin story of the tech industry, and so it must be understood as a powerful element in California history.”

“Technology often allows systems to operate at a much faster speed and at a much higher scale, while also making the ability to intervene and question more difficult.”

Milner explained some of the nuances of algorithms to the task force and the audience watching from their computer screens. A simple definition for algorithms, Milner said, is that they are “a set of step-by-step instructions to solve a problem.”

“A recipe is an algorithm, a list of instructions … the result is based on what we define from the beginning of the recipe as a success,” she added. “Data has been weaponized against Black communities, because the bullets, police dogs and fire hoses of the past have become the predictive policing, data-driven voter suppression and facial recognition of the present.”
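To make the recipe analogy concrete, here is a minimal Python sketch of that idea. The function, inputs and threshold below are illustrative assumptions, not drawn from any real system: the "recipe" runs step by step, and the outcome hinges entirely on how success is defined at the start.

def score_applicant(income, debt, success_threshold=0.6):
    # Step 1: turn the inputs into a single number (the "dish").
    ratio = debt / income if income > 0 else 1.0
    score = 1.0 - ratio
    # Step 2: compare against the success criterion the designer chose
    # up front. Change the threshold -- the definition of "success" --
    # and the same applicant gets a different outcome.
    return score >= success_threshold

print(score_applicant(income=50_000, debt=15_000))                         # True
print(score_applicant(income=50_000, debt=15_000, success_threshold=0.8))  # False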

There’s a lack of transparency, accountability and clarity about what exactly goes into the algorithms used to determine these outcomes.

“While it is in violation of federal law to deny people housing, employment and education based on race, you can’t sue an algorithm,” Milner said. “You don’t even need to use race as a variable. The legacy of slavery [has] made ZIP codes proxies for race.”
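Milner’s point about proxies can be illustrated with a small, entirely hypothetical sketch. The ZIP codes, risk figures and loan function below are invented: a system that never sees race can still sort people by it when an input like ZIP code tracks the segregated geography that redlining created.

# Fictional per-ZIP "risk" figures of the kind a lender might buy
# from a data broker. All numbers and labels are invented.
zip_risk = {
    "90001": 0.72,  # stand-in for a historically redlined neighborhood
    "90002": 0.18,  # stand-in for a historically favored neighborhood
}

def approve_loan(zip_code, max_risk=0.5):
    # Race never appears as a variable, yet the decision splits along
    # neighborhood lines that redlining drew decades ago.
    return zip_risk.get(zip_code, 1.0) <= max_risk

print(approve_loan("90001"))  # False
print(approve_loan("90002"))  # True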

Le and Gore-Mann pointed out how racism in the tech industry can be traced back to redlining.

“Redlining lives on in the data that determines which neighborhoods get good access to the internet or new services and technologies,” Le said. “This is why the reparations work is so important. It gives us a chance to change the data points that give rise to data-driven discrimination.”

Without proper oversight, systems can embed the history of discrimination and redlining into decisions, Le added.

“We need race-aware data so we can test these systems that control access to economic opportunity for bias and discrimination,” he said. “The housing discrimination and the redlining of the past is still quite well and alive today. Our anti-discrimination laws have not kept pace with these technologies.”

One solution Gore-Mann and Le pointed to is a mapping tool designed with social equity in mind.

CalEnviroScreen uses equity indicators like poverty, unemployment and exposure to pollution to identify marginalized communities that are eligible for increased investment from the state’s Greenhouse Gas Reduction Fund. With the help of this algorithm, over $4.5 billion in state funding has gone toward investments in Black and Latino communities.
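As a rough illustration of how such a screening tool works, the sketch below combines indicator percentiles into a single score and flags the highest-scoring census tracts for investment. This is not CalEnviroScreen’s actual methodology; the indicator names, tract data and cutoff are assumptions.

def equity_score(tract):
    # Average several 0-100 indicator percentiles into one score.
    indicators = ("poverty", "unemployment", "pollution_exposure")
    return sum(tract[k] for k in indicators) / len(indicators)

def flag_for_investment(tracts, cutoff=75.0):
    # Return the tracts whose combined score meets the cutoff.
    return [t["id"] for t in tracts if equity_score(t) >= cutoff]

tracts = [
    {"id": "tract-A", "poverty": 92, "unemployment": 85, "pollution_exposure": 88},
    {"id": "tract-B", "poverty": 20, "unemployment": 15, "pollution_exposure": 30},
]
print(flag_for_investment(tracts))  # ['tract-A']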

The program offers a potential template for undoing past discrimination and driving more investment into communities to improve economic opportunities.

“The reparations solutions will not close the gap if we’re continually allowing technology to widen the gap,” Gore-Mann said. “We see our praxis as the intersection between that economic equity, racial equity and building wealth.

“The data goes both ways — to help us analyze and create good policies and remove disparate impacts, but also to hold companies accountable.”

