Here are the biased algorithms the UK government uses to make high-level decisions
Algorithms all over
In the UK alone, several systems are being or have recently been used to make important decisions that determine the choices, opportunities, and legal position of certain sections of the public.
At the start of August, the Home Office agreed to scrap its visa “streaming tool,” designed to sort visa applications into risk categories (red, amber, green) indicating how much further scrutiny was needed. This followed a legal challenge from the campaign group Foxglove and the Joint Council for the Welfare of Immigrants charity, claiming that the algorithm discriminated on the basis of nationality. Before this case could reach court, Home Secretary Priti Patel pledged to halt the use of the algorithm and to commit to a substantive redesign.
The Metropolitan Police Service’s “gangs matrix” is a database used to record suspected gang members and undertake automated risk assessments. It informs police interventions including stop and search, and arrest. A number of concerns have been raised regarding its potentially discriminatory impact, its inclusion of potential victims of gang violence, and its failure to comply with data protection law.
Many councils in England use algorithms to check benefit entitlements and detect welfare fraud. Dr. Joanna Redden of Cardiff University’s Data Justice Lab has found that a number of authorities have halted such algorithm use after encountering problems with errors and bias. But also, significantly, she told the Guardian there had been “a failure to consult with the public and particularly with those who will be most affected by the use of these automated and predictive systems before implementing them.”
This follows an important warning from Philip Alston, the UN special rapporteur for extreme poverty, that the UK risks “stumbling zombie-like into a digital welfare dystopia.” He argued that too often technology is being used to reduce people’s benefits, set up intrusive surveillance, and generate profits for private companies.
The UK government has also proposed a new algorithm for assessing how many new houses English local authority areas should plan to build. The effect of this system remains to be seen, though the model seems to suggest more houses should be built in southern rural areas, rather than in the more-expected urban areas, particularly northern cities. This raises serious questions of fair resource distribution.
Why does this matter?
The use of algorithmic systems by public authorities to make decisions that have a significant impact on our lives points to a number of crucial trends in government. As well as increasing the speed and scale at which decisions can be made, algorithmic systems also change the way those decisions are made and the forms of public scrutiny that are possible.
This points to a shift in the government’s perspective on, and expectations for, accountability. Algorithmic systems are opaque and complex “black boxes” that enable powerful political decisions to be made based on mathematical calculations, in ways not always clearly tied to legal requirements.
This summer alone, there have been at least three high-profile legal challenges to the use of algorithmic systems by public authorities, relating to the A-level and visa streaming systems, as well as the government’s COVID-19 test and trace tool. Similarly, South Wales Police’s use of facial recognition software was declared unlawful by the Court of Appeal.
While the purpose and nature of each of these systems are different, they share common features. Each system has been implemented without adequate oversight or clarity regarding its lawfulness.
Failure of public authorities to ensure that algorithmic systems are accountable is at worst a deliberate attempt to hinder democratic processes by shielding algorithmic systems from public scrutiny. And at best, it represents a highly negligent attitude towards the responsibility of the government to adhere to the rule of law, to provide transparency, and to ensure fairness and the protection of human rights.
With this in mind, it is important that we demand accountability from the government as it increases its use of algorithms, so that we retain democratic control over the direction of our society, and ourselves.
This article is republished from The Conversation by Adam Harkens, Research Associate, Birmingham Law School, University of Birmingham, under a Creative Commons license. Read the original article.