Data Collection & Mobility Control


Studies on “bordering” are largely concerned with more visible technologies, such as fences, radars or drones.

But the use of more recent technologies is being generalised under the pretext of protecting exiles, providing them with a service or even reinforcing their autonomy. These supposed protections are a corollary of the control of frontiers and, at the level of the EU and its member states, these technologies function as devices of “social sorting” and for the differential attribution of rights.

In numerous camps around the world, asylum-seekers receive a monthly financial payment via a prepaid card, which is presented by the UNHCR and states as a means of increasing recipients’ autonomy. But these cards also facilitate surveillance (by making it possible to trace cash withdrawals and transactions) and control (recipients can only buy items considered useful by the UNHCR, and only from approved vendors).
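To make the dual character of these cards concrete, the sketch below illustrates, under purely hypothetical names and rules (APPROVED_VENDORS, attempt_purchase, and so on, none of which describe any real card programme), how such a scheme can simultaneously restrict purchases to approved vendors and categories and log every transaction for the issuer.

```python
"""
Minimal sketch of the two properties attributed to the prepaid cards above:
every transaction is logged (traceability) and purchases are only accepted
from approved vendors for approved item categories (control). All names and
rules are hypothetical, for illustration only.
"""

from datetime import datetime, timezone

APPROVED_VENDORS = {"camp-supermarket-01"}      # hypothetical allowlist
APPROVED_CATEGORIES = {"food", "hygiene"}       # hypothetical allowlist

transaction_log: list[dict] = []                # record visible to the issuer


def attempt_purchase(card_id: str, vendor_id: str, category: str,
                     amount: float) -> bool:
    """Accept the purchase only if vendor and category are approved;
    log the attempt either way, so the issuer can trace all activity."""
    approved = vendor_id in APPROVED_VENDORS and category in APPROVED_CATEGORIES
    transaction_log.append({
        "card": card_id,
        "vendor": vendor_id,
        "category": category,
        "amount": amount,
        "approved": approved,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return approved


if __name__ == "__main__":
    print(attempt_purchase("CARD-001", "camp-supermarket-01", "food", 12.5))  # accepted
    print(attempt_purchase("CARD-001", "outside-shop-99", "clothing", 30.0))  # refused
    print(len(transaction_log), "transactions recorded, approved or not")
```

The point of the sketch is that refused purchases and accepted ones alike leave an identity-linked trace, which is precisely what makes the same instrument serve both assistance and surveillance.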

Equally, the UNHCR’s strategy of “digital inclusion” and “digital identity” for refugees rests on the idea of increasing their autonomy and participation in economic and social life, and combatting identity fraud. In Jordan, since 2016, an iris scanner designed by IrisGuard has been used to identify asylum-seekers in Zaatari camp. Implemented to “protect the identity” of refugees and guarantee them a “civil status”, it also contributes to controlling them. The system communicates automatically with UNHCR’s registration database to confirm the identity of the beneficiary, checks their account balance through Jordan Ahli Bank and Middle East Payment Services, and then confirms the purchase.
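The purchase-confirmation sequence described above (iris match against the registration database, balance check with the bank and payment service, then confirmation) can be pictured schematically. The sketch below is purely illustrative: the class and function names (RegistrationDatabase, PaymentAccounts, confirm_purchase, etc.) are hypothetical stand-ins, not the actual IrisGuard, UNHCR or bank interfaces; it only mirrors the sequence the text describes.

```python
"""
Illustrative sketch of the purchase-confirmation sequence described above.
All names are hypothetical stand-ins, not real IrisGuard/UNHCR/bank APIs.
"""

from dataclasses import dataclass
from typing import Optional


@dataclass
class Beneficiary:
    iris_template: str   # stand-in for a stored biometric template
    case_id: str         # stand-in for a registration / case number


class RegistrationDatabase:
    """Hypothetical stand-in for the UNHCR registration database."""

    def __init__(self) -> None:
        self._by_iris: dict[str, Beneficiary] = {}

    def enrol(self, beneficiary: Beneficiary) -> None:
        self._by_iris[beneficiary.iris_template] = beneficiary

    def identify(self, iris_template: str) -> Optional[Beneficiary]:
        # Step 1: the scan is matched against registered beneficiaries.
        return self._by_iris.get(iris_template)


class PaymentAccounts:
    """Hypothetical stand-in for the bank / payment-service accounts."""

    def __init__(self) -> None:
        self._balances: dict[str, float] = {}

    def credit(self, case_id: str, amount: float) -> None:
        self._balances[case_id] = self._balances.get(case_id, 0.0) + amount

    def debit_if_sufficient(self, case_id: str, amount: float) -> bool:
        # Step 2: the balance is checked before any purchase is confirmed.
        if self._balances.get(case_id, 0.0) >= amount:
            self._balances[case_id] -= amount
            return True
        return False


def confirm_purchase(registry: RegistrationDatabase, accounts: PaymentAccounts,
                     iris_template: str, amount: float) -> str:
    """Step 3: confirm the purchase only if identity and balance both check out."""
    beneficiary = registry.identify(iris_template)
    if beneficiary is None:
        return "identity not confirmed - purchase refused"
    if not accounts.debit_if_sufficient(beneficiary.case_id, amount):
        return "insufficient balance - purchase refused"
    return f"purchase confirmed for case {beneficiary.case_id}"


if __name__ == "__main__":
    registry = RegistrationDatabase()
    accounts = PaymentAccounts()
    registry.enrol(Beneficiary(iris_template="IRIS-SAMPLE-001", case_id="CASE-12345"))
    accounts.credit("CASE-12345", 50.0)
    print(confirm_purchase(registry, accounts, "IRIS-SAMPLE-001", 20.0))
    print(confirm_purchase(registry, accounts, "IRIS-UNKNOWN", 20.0))
```

Even in this simplified form, every confirmed purchase necessarily produces an identity-linked record, which is what allows a system presented as protective to function at the same time as an instrument of control.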

Applications such as WhatsApp, Viber, Skype and Facebook are not, at first glance, perceived as technologies of control. They are, however, tools for extracting data, and they are deployed not only by border guards, the police or the asylum authorities, but also, increasingly, by the UNHCR and the IOM. Surveillance and control through these technologies are imperceptible, principally because the circuits of data extraction remain largely unknown.

While civil society as a whole has very little knowledge of the risks these applications raise for the protection of privacy and personal data, exiles are even more severely affected: on the one hand, because they are even less able than others to give any real consent to the collection or use of their data; on the other, because their fate is closely tied to the use of these digital tools. Thus, when applications fail, when there is no connection, or when calls do not go through, exiles are all the more likely to be denied their rights, including humanitarian aid.

The ways in which digital technologies can be used have been limited by privacy regulations. Thus, when the European Asylum Support Office (EASO) attempted to use social networks to monitor migratory routes, it was forced to back down. However, it is not yet possible to hold private agencies or actors (such as Microsoft, Accenture, Leonardo, etc.) accountable in the field of migration, in particular regarding how and why they use technologies at the border, in refugee camps and in detention centres.

The European Pact on Asylum and Migration foresees the establishment of an independent control mechanism to guarantee respect for fundamental rights during “screening” procedures at the borders. But it says nothing of responsibility for the extraction of data, nor about the technologies deployed at borders, whether during control procedures or after them, leaving many questions unanswered.


Full title: Data and New Technologies, the Hidden Face of Mobility Control
Author: Unspecified
Publisher: Migreurop
Year: 2020
Media type: Unspecified
Link: http://migreurop.org/IMG/pdf/note_12_en.pdf
Topics: Border and Surveillance Technology & Industry, European Agencies (Frontex, GIZ & Co), Perspectives on Migration
Regions: All Regions, West Africa
