Bots at the Gate

Summary:

This report focuses on the impacts of automated decision-making in Canada’s immigration and refugee system from a human rights perspective. It highlights how the use of algorithmic and automated technologies to replace or augment administrative decision-making in this context threatens to create a laboratory for high-risk experiments within an already highly discretionary system. Vulnerable and under-resourced communities such as non-citizens often have access to less robust human rights protections and fewer resources with which to defend those rights. Adopting these technologies in an irresponsible manner may only serve to exacerbate these disparities.

The use of these technologies is not merely speculative: the Canadian government has already been experimenting with their adoption in the immigration context since at least 2014. For example, the federal government has been in the process of developing a system of “predictive analytics” to automate certain activities currently conducted by immigration officials and to support the evaluation of some immigrant and visitor applications. The government has also quietly sought input from the private sector related to a 2018 pilot project for an “Artificial Intelligence Solution” in immigration decision-making and assessments, including in Humanitarian and Compassionate applications and Pre-Removal Risk Assessments. These two applications are often used as a last resort by vulnerable people fleeing violence and war to remain in Canada.

The ramifications of using automated decision-making in the immigration and refugee space are far-reaching. Hundreds of thousands of people enter Canada every year through a variety of applications for temporary and permanent status. Many come from war-torn countries seeking protection from violence and persecution. The nuanced and complex nature of many refugee and immigration claims may be lost on these technologies, leading to serious breaches of internationally and domestically protected human rights, in the form of bias, discrimination, privacy violations, and denials of due process and procedural fairness, among others.

These systems will have life-and-death ramifications for ordinary people, many of whom are fleeing for their lives.

Author(s): Molnar, Petra and Lex Gill

Publisher or Journal: International Human Rights Program (Faculty of Law, University of Toronto) and the Citizen Lab (Munk School of Global Affairs and Public Policy, University of Toronto)

Year of Publication: 2018

Document Type: PDF

Link: https://citizenlab.ca/wp-content/uploads/2018/09/IHRP-Automated-Systems-Report-Web-V2.pdf
