Biometric aid data and the Taliban
How long is data stored? Who has copies? Can it be deleted? What about informed consent? Re-examining aid data security after the Taliban’s rise.
By Irwin Loy for The New Humanitarian
From ethical dilemmas on data security to worst-case scenarios unfolding in real time — the Taliban’s rise to power in Afghanistan is spurring urgent concern about the safety of data that aid groups have collected over 20 years.
Data protection experts warn that aid groups must quickly review and safeguard sensitive information on Afghans who have received emergency relief and other services. Humanitarian agencies are among those that have tracked, stored, and shared data linked to millions of Afghans — including precise biometric data like fingerprints or iris scans.
Crucially, some of this data has been collected by the now-deposed Afghan government — raising concern that the Taliban have inherited databases and technology that could be used to identify people linked to previous regimes or international forces, or members of persecuted groups who have received aid.
“The Taliban have been given the keys to the server room, so to speak,” said Karl Steinacker, a former official with the UN’s refugee agency, UNHCR. He now advises civil society organisations on digital identity.
The New Humanitarian spoke with Steinacker and Katja Lindskov Jacobsen, a senior researcher at the University of Copenhagen’s Centre for Military Studies, to unpack the issues. In the interview, parts of which are excerpted below, they discussed the potential risks, why aid groups collect so much data in the first place, and the right to be forgotten.
It’s unclear exactly how much data aid agencies have collected and shared over the years, or what the Taliban have access to now, which underscores the need for a swift review, Steinacker said.
But aid groups or international donors have had their hands in an enormous range of data through two decades of programming: registration for millions who received food aid or mobile cash transfers; digitised government identity cards linked to biometric data; or iris scans for refugees in neighbouring Pakistan, for example.
UNHCR did not respond to a request for comment. Other agencies, including the UN migration agency, IOM, and the World Food Programme, said they were not able to respond to questions before publication.
The Taliban have promised an amnesty, and a spokesperson said there is no “hit list”. But rights groups already report reprisal killings and threats.
In early August, the Taliban seized US military biometric devices that might help uncover people who worked with international forces, The Intercept reported. And when Kabul fell, Taliban soldiers searched for files at the national intelligence agency and the communications ministry, The New York Times reported.
Today’s risks should underscore wider data privacy questions for the entire aid sector, which has often embraced the benefits of digitised records and biometrics while overlooking the dangers, according to researchers who study data security in aid settings.
How long is data stored? Who else has access? Is there adequate consent from people receiving aid — often newly displaced with few other options? Are policies future-proofed to protect against unforeseen risks? The Taliban’s rapid takeover in Afghanistan has brought these and other questions to the forefront again.
In June, a Human Rights Watch investigation detailed how biometric data UNHCR collected from Rohingya refugees was shared with the country they fled, Myanmar.
“With biometrics, the concern is, you can take a new name, but you can’t really take a new iris,” said Jacobsen, whose research often focuses on humanitarian interventions and technology, including biometrics.
Her 2015 study highlighted potential flaws in a first-of-its-kind UNHCR biometrics programme for Afghan refugees in Pakistan. The system used iris scans — stored anonymously — to determine whether returning refugees had already received aid. Jacobsen’s research warned that “false positives” — cases where a person’s iris is erroneously matched in the system — could effectively deny aid to people entitled to it.
It was a programme Steinacker supported as UNHCR’s head of global registration in 2004. Now, both he and Jacobsen are calling for an urgent review of data in Afghanistan, and for a deeper re-evaluation of the use of biometrics across the aid sector.
This conversation has been edited for length and clarity.
The New Humanitarian: What should be the immediate priority for aid agencies when it comes to evaluating data security in Afghanistan? How much — and what kinds — of data are we talking about?
Karl Steinacker: What is important is that the agencies sit together and first assess what data there is, and where it is. Every big organisation would say: “There’s no need to worry. We have data security in place.” But is that so? What about the data in shared databases — a child protection database, let’s say, where you can perhaps trace single mothers, or victims of sexual violence — quite delicate issues.
The other issue is the data that passes through commercial service providers — cash programmes in particular. Were they cash programmes for very specific vulnerable groups who might be targeted by the Taliban, because they were war veterans, or sexual minorities — whatever it is?
But this process has to start somewhere. Someone has to say, “Since we haven’t done what we should have done before we implemented these programmes, let’s now retroactively look at what we have, what can be accessed by the Taliban, and how we can mitigate the problem.” What should have happened is that a data protection impact assessment was done before these programmes started. But we know from experience that no humanitarian agency does these impact assessments.
The New Humanitarian: How much of a debate is there about data collection within humanitarian organisations before programmes begin?
Steinacker: There is no debate, in reality. It’s assumed the more data I have, the better it is for the programme. I can better target, I can better report, I can better ask for funds from donors. There is no questioning whether this could create collateral damage, that it could really backfire.
Katja Lindskov Jacobsen: Much of this is not as new as we would like to think. It’s new that it’s the Taliban.
But UNHCR has, for a long time, had a data-sharing agreement of sorts with the [US Department of] Homeland Security. The idea was to share data on refugees resettled in the US, but a lot more data has been shared. It’s researchers as well. A lot of actors are interested in biometrics that UNHCR and other agencies have collected — not just in Afghanistan. There are donors that are interested, and also host states. I think we have to think about this whole idea of who to give access to this data and who not to, and whether we can really control that.
“Much of this is not as new as we would like to think. It’s new that it’s the Taliban.”
Some humanitarian agencies, like the [International Committee of the Red Cross], have decided not to collect biometrics if they can avoid it. That contrasts with some of the choices UNHCR has made about keeping data forever, really. I think decisions like those have to be revisited as well, given the sensitivity and the question of whether we can really make sure that this data, which is kept forever in enormous databases, is always in safe hands.
The New Humanitarian: It obviously sounds alarming if the Taliban now have access to biometric and other data once held by Afghanistan’s former government. How would you describe the level of concern?
Steinacker: The issue is extremely complex. [It could be] that the [Taliban] use data to identify people they consider traitors, collaborators, and whatnot. That would be my worry with regard to humanitarian data: that they use it, not against everyone who has received assistance, but against specific groups: victims of sexual violence, sexual minorities — this kind of thing.
Jacobsen: For me, the bigger issue is they could use it whenever someone requires assistance from the government, whether that be schools, hospitals, medical assistance of any kind. Services from the government could be linked to a requirement of registering irises or fingerprints, now that they have the devices and the databases to do that.
But even for individuals: having that concern could mean they decide not to go to hospital. We’ve seen that with UNHCR programmes [elsewhere], too: individuals would decide not to register for humanitarian assistance because they didn’t know what was going to happen with their data. Was it being shared with the host government, or with their [home country]?
The fear in the population that they have registered their data and it might be used — I think we have to take that fear seriously. Because it might mean that people don’t access hospitals, for example, if this is where the Taliban decides to introduce biometric registration.
The New Humanitarian: What should humanitarian agencies do? Let’s start with the UNHCR data in Pakistan: biometric data — iris scans — collected from former Afghan refugees who returned from Pakistan over the last two decades.
Steinacker: We would basically say: delete it. First, the people never consented to it being kept forever. Second, if there is a new influx into Pakistan, then it no longer serves any purpose. There is no reason to keep a database with four million anonymous biometric datasets. It makes no sense.
“When should humanitarian and other organisations delete data?”
Jacobsen: It calls into question for how long such data should be kept. For this specific database, it’s even more pertinent to delete it if it technically also cannot be of much use. But I think the question is much, much bigger. When should humanitarian and other organisations delete data?
The New Humanitarian: For the data within Afghanistan, which has been collected by humanitarian agencies and others, what can or should be done with it?
Steinacker: They have to do a kind of housecleaning. They have to find out first what data there is. The amazing thing about working in humanitarian agencies is that over the years, you amass data. And with staff rotation, people who are working there today have no clue what was collected five or 10 years ago. And this data has been collected under different circumstances at different times. So the first thing that has to happen is an inventory of data: what data is there, what is needed and what is not needed, where are the copies… and is this data potentially damaging to people in the databases? So this operations security process has to take place.
Jacobsen: And urgently. Not in a month.
The New Humanitarian: Karl, you’re coming from a position where you were partly responsible for the collection of data at UNHCR, and now you’re advocating for the right to be forgotten. How would you describe how your views have changed?
Steinacker: I was a fervent advocate of biometrics when it was introduced, because I believed that the systems which were used before were extremely degrading and humiliating to people. The way it was done before: it was like, spraying people with invisible ink, putting people behind fences, and whatnot. So I was certainly somebody who supported biometrics.
“Aid agencies have to learn that this kind of data is extremely important. It’s not just statistics.”
But we are all learning. Today, we can see certain impacts and side effects. I’m not totally against biometrics. But aid agencies have to learn that this kind of data is extremely important. It’s not just statistics. There is data linked to a person — through physical features like biometrics, for instance, but also in other ways — that is so important it needs special protection and measures, which have to be reviewed continually because the situation is changing all the time. It’s not enough to pay lip service.
I believe that most agencies are paying lip service to data protection and data security issues. They can show you their manuals, they can show you their instructions, they can always say they follow GDPR rules. But then it is self-policing. None of these agencies submits itself to third-party evaluations or oversight. No beneficiary has access to their own data. So it’s just — it’s a mess. And these are the moments when we can see it.