Refugees and digital risk

This post is inspired by an invitation from Andreas Hackl to a workshop organised by him and Margie Cheesman, "The risks of working online: refugees in the internet economy", supported by the UNHCR.

The digital society has various uses for refugees beyond the everyday ones. Digital traces might be used to support an asylum claim. Money transfers, distance communications and work opportunities become especially salient, and especially vulnerable. Trust technology could be used to distribute aid and track support needs.

A number of risks are also apparent. Hostile state and non-state actors can exploit surveillance opportunities to target the person seeking asylum and their friends, family and loved ones. Digital verification can close down the usefully ambiguous spaces and conditions that refugees sometimes have to rely on; the UK Home Office's hostile environment policy is an example here. The move away from cash payments opens some opportunities but also increases surveillance and tracing threats. Refugees often have to choose between adherence to the law and survival, and they may be particularly vulnerable to cyberthreats and cybercrime.

There are many contexts in which refugees might face threats and barriers to digital inclusion. Work with Alex Wafer of Wits University in South Africa has looked at how migrants use digital labour in informal economies. Many are forced to rely on unreliable and insecure methods of payment, and to use platforms like Uber as jury-rigged trust technologies. This is in the context of significant unemployment and political racism directed at them.

So let’s sort out some principles and see how we can think about how to create resilience:

Continuity and capacity: there are many domains where continuity is disrupted and where digital means can help support it. I work in education so my mind immediately went to how refugees can be integrated into education and credentialing systems. There are loads of places where that might break down and impose extra costs. How can language qualifications be recognised? How do credentials maintain validity for employment or study? Are there challenges posed by different academic cultures? Various capacities are limited by the digital divide – different kinds of literacy might come into play alongside access.

Connectivity and convergence: the mobile internet is a vital resource, and existing and new infrastructures can be investigated for the support they provide, from Starlink to cafe wifi. Various risks arise from this, such as the need to share insecure connections or devices. Some risks are baked into the design: iOS devices, for example, do not normally allow multiple user accounts, making privacy harder to maintain on a shared device. Such simple features, or their absence, have cumulative effects in terms of risk. We should also not lose sight of the way in which existing risks or disadvantages are sharpened or mitigated; for example, gender and ethnic vulnerabilities play out in very different ways in precarious situations. Refugees are a financial resource subject to a process of data-extractive capitalism, technological/system experimentation and convergence (Madianou, 2019). That has effects on their autonomy and agency in the face of powerful humanitarian bodies and state agencies.

A counterpoint is that several of these principles might be inherently risky, or at least in tension, in the lives of refugees, especially that of continuity. Strategic opacity might be a useful goal, and this is where the possibilities of hidden hosting and communications, and of decentralised payments, could come into play. Generally, however, for each technical solution proposed there is also the corresponding problem of how to maintain it within a viable community. Nothing is automatic, and crypto et al. do not work without a significant investment of time and labour. We might therefore want to look first not directly at how refugees' trajectories can be supported, but at how supportive networks can be brought into being and maintained using some of these security and trust technologies. The real questions then are the big ones: what principles inform our software and hardware design, and how can design be community-led and ethically protective?

Questions remain about the affordances of particular combinations of technology and the way in which particular technological solutions are likely to evolve (Cheesman, 2022b). Something we should consider in future is the effect of the fracturing internet and the emergence of distinct and sometimes hostile digital infrastructures. An ethnography of infrastructure approach could work here (Star, 1999). Refugees must move across and within these infrastructures, interleave different platforms, and balance the need for opacity with the need to maintain a digital identity and keep contact with home and receiving communities. Cheesman (2022a) frames these challenges and tensions in dimensions of subjectivities, timescapes and materialities. We should always be aware of the uses of tech for social closure and exclusion, and of the political risk facing refugees from new xenophobic movements, as we see in South Africa and elsewhere at the moment.

Cheesman M (2022a) Infrastructure Justice and Humanitarianism: Blockchain's Promises in Practice. University of Oxford.
Cheesman M (2022b) Self-Sovereignty for Refugees? The Contested Horizons of Digital Identity. Geopolitics 27(1): 134–159. DOI: 10.1080/14650045.2020.1823836.
Star SL (1999) The ethnography of infrastructure. American Behavioral Scientist 43(3): 377–391.
Madianou M (2019) The Biometric Assemblage: Surveillance, Experimentation, Profit, and the Measuring of Refugee Bodies. Television & New Media 20(6). SAGE Publications: 581–599. DOI: 10.1177/1527476419857682.
Weitzberg K, Cheesman M, Martin A, et al. (2021) Between surveillance and recognition: Rethinking digital identity in aid. Big Data & Society 8(1). SAGE Publications Ltd: 20539517211006744. DOI: 10.1177/20539517211006744.
