Ten examples of algorithmic discrimination

(Paywall-free popularization like this is what I do for a living. To support me, see the end of this post.)

Little horror stories that everybody should know.

This post is a short list of ten real examples, from different studies, of how algorithms embedded in search engines or social media platforms can amplify or perpetuate prejudice and discrimination. The examples are copied as-is; their sources and full explanations are in the four articles linked after the list.

1: Despite regulations, insurance companies charge minority communities higher premiums than white communities, even when the risks are the same.

2: When Amazon decides to provide Amazon Prime free same-day shipping to its “best” customers, it’s effectively redlining - reinforcing the unfairness of the past in new and increasingly algorithmic ways.

3: A few years ago, Amazon tried to use AI to build a resume-screening tool, training it on resumes the company had collected over a decade. But since those resumes tended to come from men, the system learned to discriminate against women (a toy sketch of this failure mode follows the list).

4: A husband and wife both applied for an Apple Card, and got wildly different credit limits.

5: Companies that provide criminal background checks can place ads in Google search results. One study showed that when a search was performed on a name that was “racially associated” with the black community, the results were much more likely to be accompanied by an ad suggesting that the person had a criminal record, regardless of whether they actually did. If an employer searched the name of a prospective hire, only to be confronted with ads suggesting that the person had a prior arrest, you can imagine how that could affect the applicant’s career prospects.

6: Google’s online advertising system showed an ad for high-income jobs to men much more often than to women, a study by Carnegie Mellon University researchers found.

7: According to the US Federal Trade Commission, online advertisers are able to target people who live in low-income neighborhoods with high-interest loans.

8: A Google Images search for “C.E.O.” produced images of women only 11 percent of the time, even though 27 percent of United States chief executives are women.

9: The autocomplete feature of search engines like Google and Bing learns and evolves based on what people do online [and this obviously perpetuates existing biases]. A Google search for “Are transgender,” for instance, suggested the autocompletion “Are transgenders going to hell” (see the second sketch after this list).

10: Google showed an ad for a career coaching service advertising “$200k+” executive positions 1,852 times to men and 318 times to women.
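
To see how example 3 happens, here is a minimal sketch of the failure mode: a bag-of-words scorer “trained” on a historically skewed pile of accepted resumes. To be clear, this is not Amazon’s actual system; every resume and the scoring rule below are invented purely for illustration.

```python
# Toy resume scorer (invented data, NOT Amazon's real system).
# Because most historically accepted resumes came from men, tokens
# that mostly appear in women's resumes end up in the rejected pile,
# and the model learns to penalize them.
from collections import Counter

accepted = [  # historical hires: mostly men
    "captain chess club, java developer",
    "led men's rowing team, python engineer",
    "java developer, hackathon winner",
]
rejected = [
    "women's chess club captain, java developer",
    "women's coding society, python engineer",
]

def token_counts(docs):
    counts = Counter()
    for doc in docs:
        counts.update(doc.replace(",", " ").split())
    return counts

acc, rej = token_counts(accepted), token_counts(rejected)

def score(resume):
    """Naive score: accepted-pile frequency minus rejected-pile frequency."""
    return sum(acc[t] - rej[t] for t in resume.replace(",", " ").split())

# Two equally qualified candidates; one resume contains "women's".
print(score("chess club captain, java developer"))          # 2
print(score("women's chess club captain, java developer"))  # 0
```

The word “women’s” says nothing about a candidate’s skills, yet the model penalizes it, which is exactly the behavior reported about Amazon’s tool.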
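
Example 9 works the same way at the level of whole queries: autocomplete is, at its heart, popularity ranking over past searches. The sketch below uses an invented query log and a deliberately naive ranking rule, not Google’s or Bing’s real algorithms.

```python
# Toy frequency-based autocomplete (invented query log; real search
# engines use many more signals, but popularity is central to all of them).
from collections import Counter

query_log = [
    "are transgender people protected by law",
    "are transgenders going to hell",
    "are transgenders going to hell",
    "are transgenders going to hell",
    "are transgender rights human rights",
]

def suggest(prefix):
    """Return past queries matching the prefix, most frequent first."""
    matches = Counter(q for q in query_log if q.startswith(prefix.lower()))
    return [query for query, _ in matches.most_common()]

print(suggest("are transgender"))
# The prejudiced query ranks first simply because it was typed most often:
# the system mirrors and amplifies whatever its users already do.
```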

The common thread

Fairness and accuracy are not necessarily the same thing.

The very idea of building a “fair” system is essentially nonsensical. The reason, again quoting the articles linked below, is that those systems try to answer social questions that don’t necessarily have an objective answer. For instance, algorithms that claim to predict a person’s recidivism don’t ultimately address the ethical question of whether someone deserves parole. That is fundamentally a question of what our values are, and of what the purpose of the criminal justice system is.
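
A toy calculation makes that split between fairness and accuracy concrete. The numbers below are invented, only loosely echoing ProPublica’s published analysis of the COMPAS recidivism tool: a risk score that is exactly as accurate for two groups can still wrongly flag harmless people in one group far more often than in the other.

```python
# Invented data for illustration - not real COMPAS records.
# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("A", True, True), ("A", True, False), ("A", True, False),
    ("A", False, False), ("A", False, True), ("A", False, False),
    ("B", True, True), ("B", True, True), ("B", False, False),
    ("B", False, True), ("B", False, True), ("B", False, True),
]

def accuracy(group):
    rows = [r for r in records if r[0] == group]
    return sum(predicted == actual for _, predicted, actual in rows) / len(rows)

def false_positive_rate(group):
    """Share of people who did NOT reoffend but were flagged high-risk."""
    negatives = [r for r in records if r[0] == group and not r[2]]
    return sum(r[1] for r in negatives) / len(negatives)

for g in ("A", "B"):
    print(g, "accuracy:", accuracy(g), "FPR:", false_positive_rate(g))
# A accuracy: 0.5 FPR: 0.5
# B accuracy: 0.5 FPR: 0.0
```

Which of those error rates matters more is a value judgment, not a statistical one, and that is exactly the point.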

Sources

Who writes this, why, and how to help

I am Marco Fioretti, tech writer and aspiring polymath doing human-digital research and popularization.
I do it because YOUR civil rights and the quality of YOUR life depend more every year on how software is used AROUND you.

To this end, I have already shared more than a million words on this blog, without any paywall or user tracking, and am sharing the next million through a newsletter, also without any paywall.

The more direct support I get, the more I can keep informing, for free, parents, teachers, decision makers, and everybody else who should know more about stuff like this. You can support me with a paid subscription to my newsletter, with donations via PayPal (mfioretti@nexaima.net) or LiberaPay, or in any of the other ways listed here. THANKS for your support!