Algorithms behaving badly: 2020 edition

The racism problem

A lot of problems with algorithmic decision-making come down to bias, but some instances are more explicit than others. The Markup reported that Google’s ad portal connects the keywords “Black girls,” “Asian girls,” and “Latina girls” (but not “White girls”) to porn. (Google blocked the automated suggestions after The Markup reached out to the company. In the meantime, Google’s search algorithm briefly sent our story to the first page of search results for the word “porn.”)

Sometimes the consequences of such bias can be severe.

Some medical algorithms are racially biased—deliberately. A paper in the New England Journal of Medicine identified 13 examples of race “corrections” integrated into tools used by doctors to determine who receives certain medical interventions, like heart surgery, antibiotics for urinary tract infections, and screenings for breast cancer. The tools assume patients of different races are at different risks for certain diseases—assumptions not always well-grounded in science, according to the researchers. The result: a Black man who needed a kidney transplant was deemed ineligible, as Consumer Reports reported, among other disasters.

A related issue emerged in a lawsuit against the National Football League: Black players allege it’s much harder to receive compensation for concussion-related dementia because of the way the league evaluates neurocognitive function. Essentially, they say, the league assumes Black players inherently have lower cognitive function than White players and weighs their eligibility for payouts accordingly.

Algorithms that make renters’ and lower-income people’s lives more difficult

If you’ve ever rented a home (and chances are you have, given that renting has skyrocketed since the 2008 financial crisis), a landlord has likely run you through a tenant screening service. Whatever results the background check algorithms spit out generally constitute the difference between getting to rent the home in question and getting denied—and, The Markup found, those reports are often faulty. The computer-generated reports confuse identities, misconstrue minor run-ins with law enforcement as criminal records, and misreport evictions. And what little oversight exists typically comes too late for the wrongfully denied.

Similarly, MIT Technology Review reported, lawyers who work with low-income people are finding themselves butting up against inscrutable, unaccountable algorithms, created by private companies, that do things like decide which children enter foster care, allocate Medicaid services, and determine access to unemployment benefits.

Policing and persecution

There’s an enduring allure to the idea of predicting crimes before they happen, even as police department after police department has discovered problems with data-driven models.

A case in point: the Pasco County Sheriff’s Department, which The Tampa Bay Times found routinely monitored and harassed people it identified as potential criminals. The department “sends deputies to find and interrogate anyone whose name appears” on a list generated from “arrest histories, unspecified intelligence and arbitrary decisions by police analysts,” the newspaper reported. Deputies appeared at people’s homes in the middle of the night to conduct searches and wrote tickets for minor things like missing mailbox numbers. Many of those targeted were minors. The sheriff’s department, in response, said the newspaper was cherry-picking examples and conflating legitimate police tactics with harassment.

Facial recognition software, another policing-related algorithmic tool, led to the wrongful arrest and detention of a Detroit man for a crime he did not commit, The New York Times reported in an article cataloging the technology’s privacy, accuracy, and race problems.

And in a particularly chilling development, The Washington Post reported, the Chinese tech company Huawei has been testing tools that could scan faces in crowds for ethnic features and send “Uighur alarms” to authorities. The Chinese government has detained members of the Muslim minority group en masse in prison camps—persecution that appears to be expanding. Huawei USA spokesperson Glenn Schloss told the Post the tool “is simply a test, and it has not seen real-world application.”

Workplace surveillance

Big employers are turning to algorithms to help monitor their workers. This year, Microsoft apologized after it enabled a Microsoft 365 feature that allowed managers to monitor and analyze their workers’ “productivity.” The productivity score factored in things like an individual’s participation in group chats and the number of emails sent.

Meanwhile, Business Insider reported that Whole Foods uses heat maps, which weigh factors like the number of employee complaints and the local unemployment rate, to predict which stores might see unionization attempts. Whole Foods is owned by Amazon, which has an elaborate apparatus for monitoring worker behavior.

Revenge of the students

Anyone searching for inspiration in the fight against an algorithm-dominated tomorrow might look to students in the United Kingdom, who took to the streets after the education system decided to use an algorithm to give grades based on past performance during the pandemic. Or these kids, who discovered their tests were being graded by an algorithm—and then promptly figured out how to exploit it by essentially mashing up a bunch of keywords.

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Story by The Markup