After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are reviving these systems. From lie-detection tools tested at the border to programs that verify documents and transcribe interviews, a wide range of solutions is being deployed in asylum applications. This article explores how these solutions have reshaped the ways asylum procedures are conducted. It reveals how asylum seekers are transformed into compelled yet hindered techno-users: they are required to conform to a series of techno-bureaucratic steps and to keep up with capricious minor changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their right to protection.
It also illustrates how these technologies are embedded in refugee governance: they complete the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering them from engaging with the channels of protection. The article further argues that analyses of securitization and victimization should be combined with an understanding of the disciplinary mechanisms of technologies, in which migrants are turned into data-generating subjects who are self-disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and local knowledge, the article argues that these technologies have an inherent obstructiveness. There is a double effect: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that makes them vulnerable to illegitimate decisions by non-governmental actors, and to ill-informed and unreliable narratives about their situations. Moreover, these technologies pose new risks of ‘machine errors’ that may result in erroneous or discriminatory outcomes.