The use of artificial intelligence to sort asylum applicants worries NGOs

In the United Kingdom, migrant rights defenders are concerned about the handling of asylum requests. According to several NGOs, the British government is relying on artificial intelligence to help decide on applications.

From our correspondent in London

The activist group Privacy International has raised concerns about one piece of software in particular, called IPIC, for "Identify and Prioritise Immigration Cases." Its aim is to speed up decision-making on migrants and asylum seekers liable to be deported. How? By analyzing dozens of data points, such as state of health, ethnicity, or criminal record, according to the NGO. The program digests all this information and issues a recommendation on the individual: whether or not they should be deported.

The British government is said to have started using IPIC around 2019-2020, as it faced a flood of asylum applications. In this context, technology is often presented as the solution to ever-lengthening delays that weigh on public finances.

Discriminatory bias

Rights organizations see several problems, starting with the confidentiality of the data collected, particularly health data. Above all, they denounce a lack of transparency: Privacy International took more than a year to obtain information, even though access to it is a right enshrined in law in the United Kingdom. The NGO is also concerned about how this information is used. One health-related example: the former Conservative Home Secretary, James Cleverly, claimed that some asylum seekers feigned suicidal thoughts to sway decisions in their favor. The question then arises: can the algorithm be adjusted to match the government's positions?

Rights groups, and not only Privacy International, are particularly outraged that such consequential decisions, affecting the lives of more than 40,000 people, depend so heavily on technology. Studies have already shown that analytical tools reproduce discriminatory biases similar to those that exist in society. In addition, errors have been noted, particularly concerning people's identities.

The difficulty of not following the recommendations of the IPIC software

Officially, the algorithm issues a recommendation meant to facilitate the work of immigration officers. In practice, those officers face a very high volume of cases and do not always have the means to examine them in depth. It is possible to reject a recommendation, but the software then requires a detailed justification, so it is much easier for ministry employees to simply follow it. Asylum seekers, however, are not always informed that the technology was used in their case, which limits their options for appeal. The government has so far declined to comment.
