After seeing fighters destroy his home, Jean thought he had found safety in Britain. Instead, officials told him he was too tall to be 16 and placed him among hundreds of adult asylum seekers, with no further support.
Alone and exhausted when he arrived in 2012, Jean said border officials recorded his age as 26, ten years older than he actually was. Jean is a pseudonym; he asked that his home country in central Africa not be named, to protect his privacy.
“I look ten years older because I am taller. That was the reason they gave,” Jean, whose age was officially corrected years later following an appeal, told the Thomson Reuters Foundation.
“They don’t believe you when you come to share your story. I was in a terrible place. I desperately needed assistance. One officer’s decision changed my entire life.”
That critical first decision, the initial age assessment once made by border officials, will now fall to artificial intelligence, but organisations warn the technology could entrench bias and repeat the kind of mistake made in Jean’s case.
In July, Britain said it would deploy facial age estimation technology in 2026 to help identify migrants who claim to be under 18, particularly those crossing from France in small boats.
Prime Minister Keir Starmer is under pressure to reduce migration as nationalist Nigel Farage’s anti-immigrant Reform UK party surges in polls.
More than 35,000 people have crossed the English Channel in small boats this year, a 33% increase on the same period in 2024.
Rights groups argue the technology is dehumanising and that estimating a person’s age is a delicate task best left to trained professionals.
They fear that, with AI in use, more children arriving without official documents, or with forged ones, will be wrongly placed in adult asylum hotels that lack the necessary safeguards and support.
“Determining the ages of migrants is a complex process that shouldn’t be subject to shortcuts,” said Luke Geoghegan, head of policy and research at the British Association of Social Workers.
“This should never be compromised for perceived quicker results through artificial intelligence,” he wrote in an email.
Unaccompanied child migrants can receive support from local authorities, including social workers, legal aid and education, charities say.
The Home Office, Britain’s interior ministry, says facial age estimation technology is a cost-effective way to stop adults posing as children to exploit the asylum system.
“Strong age assessments for migrants are necessary to maintain border security,” a spokeswoman said.
“This technology will not be used in isolation, but rather in combination with a variety of methods used by certified assessors.”
Digital solutions?
As the number of people fleeing poverty, conflict, natural disasters, and other unrest reaches all-time highs, states are increasingly managing migration with digital solutions.
In April, Britain announced it would use AI to speed up asylum decisions by providing caseworkers with summaries of key interviews and country-specific guidance.
Also in July, Britain partnered with OpenAI, the company behind ChatGPT, to explore potential uses of AI in areas such as security, education technology, law and the military.
“The asylum system must not be the testing ground for what are currently deeply flawed AI tools operating with minimal transparency and safeguards,” said Sile Reynolds, head of asylum advocacy at the nonprofit Freedom from Torture.
Using such technology could be harmful, said Anna Bacciarelli, a senior AI researcher at Human Rights Watch.
“Besides exposing vulnerable children and young people to a dehumanising process that may risk their privacy and other human rights, there is also uncertainty about whether facial age estimation actually works.”
Prejudices
Digital rights organisations have questioned London police’s use of facial recognition technology at protests and events such as the Notting Hill Carnival, saying it disproportionately targets certain racial groups and harvests sensitive biometric data.
“There will always be concerns about sensitive data, biometric data in particular, being taken from vulnerable people and then sought by the government and used against them,” said Tim Squirrell, head of strategy at Foxglove, a British tech rights group.
“It is also completely unaccountable,” he said. “The machine says you’re 19. What comes next? How can you question that? It is practically impossible to comprehend how it has been trained.”
Scientists warn that AI may reinforce prejudices against specific communities because it is trained on historical data.
Child asylum seekers have been told they are too tall or too hairy to be younger than 18, according to the Greater Manchester Immigration Aid Unit (GMIAU), which helps migrants.
“Children are not being treated appropriately,” said Rivka Shaw, a policy officer at GMIAU. “I think racism and adultification are related to the way they’re being treated as subjects of immigration control.”
Inaccurate assessment
Jean, now 30, said the wrong age assessment left him isolated and contemplating suicide.
“I was terrified. My mind was all over the place. I just wanted to end my life,” said Jean, who was granted asylum in 2018.
About half of all migrants whose ages were reassessed in 2024, roughly 680 people, turned out to be children who had been wrongly placed in adult hotels, according to the Helen Bamber Foundation, a charity that compiled the data through Freedom of Information requests.
“A child entering adult accommodation is essentially placed in a shared room with a bunch of strangers where there are no additional safeguarding checks,” said Kamena Dorling, the charity’s head of policy.
The Independent Chief Inspector of Borders and Immigration, a watchdog that scrutinises Home Office policy, urged the government in a July report to involve qualified child experts in age assessments.
“Child protection professionals should make age decisions,” Dorling said.
“Now, every problem we have with human decision-making also affects AI decision-making.” — Reuters