Math may be just math, but algorithms are written by people
And people have biases. Just ask Safiya Noble.
In 2018, Safiya Noble exposed a culture of racism and sexism in the way online discoverability works in her book “Algorithms of Oppression: How Search Engines Reinforce Racism”.
The gist of that work is something already well known to readers of this website: racism, sexism and other biases are baked into algorithms. Examples range from H.R. software that screens out women and candidates of color, to Facebook’s advertising platform, which allegedly enabled landlords to exclude women, people with disabilities, people of color, and other underrepresented communities, to Google’s photo application, which in the past automatically tagged African Americans as apes and animals.
Today such facts are recognized as facts (albeit still too often ignored). But it took Noble seven years to see her ideas accepted, and published.
She first encountered the problem, in fact, in 2011, when she googled “Black girls” to find activities for her daughter and young nieces, only to get pages of racialized pornography.
But a year later, when she was a Ph.D. student, her professors told her things like, ‘This research isn’t real. It’s impossible for algorithms to discriminate because algorithms are just math and math can’t be racist’.
The point is that pure math is, indeed, “just math”. But actual algorithms are formulas written by humans, who can insert their own biases into them, or unintentionally skew them by feeding them data that is intrinsically biased.
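To see how neutral math can still produce biased outcomes, here is a minimal, purely illustrative sketch (all names and data are hypothetical, not from any real system): a toy “screening” model that learns nothing but the hire rates in past decisions, and so faithfully reproduces whatever bias those decisions contained.

```python
# Illustrative sketch only: a toy screening model "trained" on
# hypothetical, biased historical decisions. The math (frequency
# counting) is neutral; the bias comes entirely from the data.
from collections import defaultdict

# Hypothetical past hiring records: (group, qualified, hired).
# All candidates are equally qualified, but past decisions
# favored group "A" over group "B".
history = [
    ("A", True, True), ("A", True, True), ("A", True, True),
    ("B", True, False), ("B", True, False), ("B", True, True),
]

# "Training": estimate the historical hire rate per group.
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for group, _qualified, hired in history:
    counts[group][0] += int(hired)
    counts[group][1] += 1

def screen(group):
    """Recommend 'hire' when the group's historical hire rate exceeds 0.5."""
    hired, total = counts[group]
    return hired / total > 0.5

# Identical candidates get different outcomes, purely because
# the training data encodes past discrimination.
print(screen("A"))  # True  (group favored in the historical data)
print(screen("B"))  # False (group disadvantaged in the historical data)
```

Nothing in this code checks for qualifications at all; it simply extrapolates the past, which is exactly how “just math” ends up discriminating.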
The story of how Safiya Noble fought to get this truth accepted was recently told by Vogue, and it is well worth reading in full.
You may also:
- Follow this blog by subscribing to my newsletter
- Directly support, via donations or other means, this blog or my other work
- Read my free ebooks and other publications