How Safiya Umoja Noble Exposes Biases in Tech Platforms


Image Description: Illustration of a woman lifting up a pie chart. Abstract geometric shapes float in the background.


Professor, author, and social scientist Safiya Umoja Noble uncovers algorithmic biases and explores how to remove them to create more ethical search platforms.

Mar 16, 2021

The social scientist is ‘in pursuit of socially responsible information and technology.’


Image Description: Photo of Safiya Umoja Noble standing at a podium, mid-speech, gesturing with her hand. She is wearing a black jacket.

When seeking answers from a search engine, most users expect to receive credible, unbiased information. That may be a reasonable assumption when searching for directions to the local library or for information about a nearby restaurant, but searches for certain terms, “beautiful,” for example, have historically generated biased results.

This matters because information that’s deemed reliable can shape the opinions of millions.

In recent years, media literacy has taken on greater importance because of the extraordinary power wielded by large tech firms and growing concerns about the spread of misinformation. Among those raising awareness of the inherent biases in tech algorithms is Safiya Umoja Noble, the author of three books on search engines, including the best seller “Algorithms of Oppression: How Search Engines Reinforce Racism.”

Noble is an associate professor in the Department of Information Studies at the University of California, Los Angeles (UCLA), and the co-founder and co-director of the UCLA Center for Critical Internet Inquiry. She is also a research associate at the Oxford Internet Institute at the University of Oxford and a board member of the nonprofit Cyber Civil Rights Initiative.

Noble became interested in how search engines feed us information when her own searches produced racist and sexist results.

That experience inspired her to collaborate with computer scientists, conduct research, and educate others on ways to build more ethical and credible algorithms. In the simplest terms, Noble found that search engines often have built-in biases that mirror those of public life.

“The technologies are already existing in systems of structural oppression, structural racism, global, social, political and economic inequality and the technologies are often picking up and amplifying and deepening those existing systems,” Noble said in a 2020 TIME interview with Prince Harry and Meghan Markle, the Duke and Duchess of Sussex.

Here’s how it all started for Noble: In 2009, she performed a Google search for “Black girls” and was presented with a link to a pornographic website and a variety of sexualized images, an eye-opening and deeply unsettling experience she recounted in her 2014 TEDx Talk, “How biased are our algorithms?”

Later, she sought to raise awareness by writing about tech biases for several women’s magazines, including Bitch Magazine. Although the feminist publication was the first to respond, it still needed some persuading. To drive the point home, Noble typed “feminist magazine” into Google; Bitch Magazine was nowhere to be found in the first few pages of results. Sexist content filled the void instead. That finally convinced the magazine.

While the results for “Black girls” eventually improved, searches for “Latinas” and “Asian girls” continued to deliver pornographic images and links. Meanwhile, a search for the word “beautiful” served up a variety of images of white women with fair, unblemished skin.

“Search engines are amazing tools for us, but we don’t see the ethical dilemmas, the biases, the ways money influences the kinds of results we get,” she said in the TEDx Talk. “Instead of being bombarded by problematic ideas and images, what if we imagined the search differently?”

Noble continued, “In my imagine-engine, you have to choose your biases; it’s visible to us. You want racism? Check the box. You want sexism? Check the box. But you have to opt into that rather than it being the default. That’s a different way of imagining looking for information, and a much more powerful one.”

Noble calls on tech firms to collaborate with her and other scholars who are demanding change.

“We know public policy and regulation is crucial to these conversations about how business is conducted in the United States and abroad,” said Noble in the TIME100 Talks series. “Many governments have to contend with consumer harm and safety around products...We can also look at human and civil rights models of interventions at the level of governments and civic society organizations. I don’t think the tech industry can regulate itself. It’s a little like the fox guarding the hen house.”

Written by Christina Claus

Christina Claus is a former Senior Inbound Content Developer at Founding Partner, Morey Creative Studios. She is currently the Marketing Manager at the Association of Executive Search and Leadership Consultants.
