“An opaque world, increasingly consolidated in few hands, dealing in the data of the world's most vulnerable and providing fertile ground to greedy data brokers and intermediaries.”

That’s how Giulio Coppi, senior humanitarian officer at Access Now, describes the landscape surrounding the digital transformation of humanitarian responses. He mapped out the relationships between private technology companies and international humanitarian organizations in a new report published on Thursday, which he said raises “ample reason for alarm.”

In a conversation with Devex, he further described the situation as “a cynical space where the lines between humanitarian and military digital systems get increasingly blurred, as data is funneled through the same platforms and anonymity loses any meaning.”

But that hasn’t been the overarching narrative of the digitalization of humanitarian responses, which has often been hailed as a way to better access difficult-to-reach, conflict-affected communities in a sector long criticized as conservative and inefficient. In reality, technology is complex and, like a supply chain, requires many different players — and “often-exploitive” emerging technologies are being used experimentally, without transparency, on populations living in volatile contexts, according to the report.

The barter

People living in crisis are often mined for their information. When a person arrives at a displacement camp, they may be put in a situation where they must barter their personal data to access lifesaving aid such as food. The World Food Programme’s SCOPE, for example, is one of the largest data management systems in the humanitarian sector and holds the personal data of more than 63 million people. Individuals are not just providing their data to one group — there are often many different groups within any camp asking for details in order to provide support, ranging from tents and hygiene products to mental health counseling.
The information collected could include biometrics, phone numbers, identification cards, ethnic and religious identity, financial information, geolocation, dietary habits, family size, history of gender-based violence, birth and death certificates, and medical data.

While each group might collect only one piece of a person’s data, when multiple datasets are linked — including with datasets from third-party groups — they can create a clearer picture of a person’s identity, known as a “mosaic effect,” and compromise the anonymity of the data.

Humanitarian groups say data collection helps to improve coordination, efficiency, transparency, and accountability. It is often driven by donors, who request disaggregated data to judge whether the objectives of an intervention were met and to communicate to the public why they’re providing funding to these crises. It also aims to ensure people don’t fall through the cracks.

But it also raises questions about how carefully information is managed. An aid worker may collect that data and upload it to a cloud server that many different groups can access. “The most vulnerable, those forcibly displaced, are led through a gauntlet of digital services, with their data passing through many hands, nonprofit and for-profit alike,” the report said.

Most worryingly, this data could find itself in the wrong hands. In January 2022, a cyberattack targeted the servers of a company that the International Committee of the Red Cross used to store data on over 500,000 missing people in humanitarian crises. Last year, a cyberattack targeted the online database containing the personal information of thousands of the Norwegian Refugee Council’s project participants.

And beyond cyberattacks, data is a commercial asset in the global economy. Some data brokers buy data, mix it, and sell it — for example, to organizations working in artificial intelligence whose work relies on large datasets.
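The mosaic effect described above can be sketched in a few lines of Python. The two datasets, their field names, and all records below are entirely invented for illustration; the point is only that two datasets that look anonymous on their own can re-identify someone once joined on shared quasi-identifiers:

```python
# Illustrative sketch of the "mosaic effect": linking two "anonymous"
# datasets on shared quasi-identifiers can re-identify a person.
# All records and field names here are invented for illustration.

# Hypothetical food-distribution list: no names or phone numbers.
food_aid = [
    {"camp_block": "B3", "family_size": 7, "ethnic_group": "X", "ration_id": "R-104"},
    {"camp_block": "B3", "family_size": 4, "ethnic_group": "Y", "ration_id": "R-221"},
    {"camp_block": "C1", "family_size": 7, "ethnic_group": "X", "ration_id": "R-377"},
]

# Hypothetical SIM-registration extract: no ration IDs.
sim_registry = [
    {"camp_block": "B3", "family_size": 7, "ethnic_group": "X", "phone": "+000-555-0101"},
    {"camp_block": "C1", "family_size": 3, "ethnic_group": "Z", "phone": "+000-555-0199"},
]

# Fields that appear in both datasets but name nobody directly.
QUASI_IDENTIFIERS = ("camp_block", "family_size", "ethnic_group")

def link(records_a, records_b, keys):
    """Inner-join two datasets on shared quasi-identifier fields."""
    matches = []
    for a in records_a:
        for b in records_b:
            if all(a[k] == b[k] for k in keys):
                matches.append({**a, **b})
    return matches

mosaic = link(food_aid, sim_registry, QUASI_IDENTIFIERS)
# The single unique match ties a ration ID to a phone number -- the
# combined record is far more identifying than either source alone.
print(mosaic)
```

The same logic scales to real broker-held datasets: the rarer the combination of attributes, the more likely a join produces exactly one match.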
“Humanitarian data might be more financially valuable than it is operationally useful, and AI companies might need humanitarian and development data more than aid actors actually need their AI prototypes,” Coppi told Devex.

For example, a person from rural Afghanistan or Yemen who has been largely off the grid for most of their life may have sought-after personal data. “The selling of that data is valuable — it has a very high commercial value because these are what we define as ‘rare datasets,’” he said.

That data can be linked to a person’s ethnic group or place of origin — which could then be sold, for example, to another country’s homeland security department and affect their ability to migrate. “It increases the chances, for example, that you're flagged for secondary inspection or that you're denied a visa altogether. Sometimes it's not even because it's you, but it's because your profile corresponds to the data that was harvested on your ethnic group, or minority, or on your community,” he said.

The role and implications of data brokers for vulnerable populations still need to be fully investigated — and there is concern that humanitarians themselves could become data brokers, Coppi said. “Our research highlights that the humanitarian data ecosystem makes no exception as data seeps and leaks through open data practices, unsecure data pipelines, and untrustworthy providers,” he said, noting that retaining the anonymity of this data is near impossible.

Technology companies are also often providing services to military or law enforcement agencies in the same locales where they operate in humanitarian responses, without safeguards in place.
The data could be funneled through systems without clearly defined boundaries and possibly “exposed and deconstructed to train, feed, or support the same military operations that are the direct cause of their suffering,” the report said.

This alarm is not new to the aid sector. In 2018, an ICRC report found that “the humanitarian sector’s growing use of digital and mobile technologies creates records that can be accessed and misused by third parties, potentially putting people receiving humanitarian aid at risk.”

The ‘big fishes’

The technology companies are also consolidated, creating de facto control over data, the Access Now report said. The majority of international humanitarian organizations — such as ICRC, most U.N. agencies, Mercy Corps, and NRC — use Microsoft Azure, which is now a “semi-monopoly and hosting the world’s most sensitive datasets, despite its blemished track record on security,” according to the report. Last year, a major cyberattack breached the platform, prompting a U.S. senator to say the company has “negligent cybersecurity practices.”

This consolidation could reduce the ability of aid groups to bargain for more protective conditions. And companies may include clauses that allow them to access aid groups’ data to improve their services. For example, a company could run algorithmic training — harvesting the data without it ever leaving the database — while still extracting results. “That is technically not a breach,” Coppi said. “But the final result is that some crumble, some information, some of that metadata, can then flow into other information systems.”

And while a few big organizations say they’ve taken precautions to protect data, they aren’t transparent about how they’re doing so, he said. There’s also variance in attitudes toward data protection across the sector.
“Some humanitarian organizations barely set up a data protection unit, but others showed more constructive and progressive attitudes,” according to the report. Positive examples include the UNHCR Digital Innovation Fund Project, which works to strengthen human rights due diligence, and Oxfam’s efforts to pioneer a more critical approach to data protection.

But even for large organizations, it’s “basically impossible to be data protection compliant because some of that data is collected by subgrantees,” Coppi said.

None of the humanitarian groups examined in the mapping had run due diligence on vendors, partners, suppliers, and subprocessors, nor had they looked for conflicts of interest, including the support of military activities in the same country where they were involved in a humanitarian response. The mapping also found that humanitarian technology partnerships often dodge regulatory data protection frameworks.

Coppi said that all NGOs exposed to the European Union's General Data Protection Regulation, or comparable legislation, "very likely have at least an embryonic form of data protection unit.” Almost all of those he researched are developing their processes and internal capabilities, and it's very likely that almost none of them is “fully, entirely, absolutely compliant with data protection norms and regulations across all their countries of operation," he said.

The consolidation of both the humanitarian groups involved in responses and the technology companies they work with also serves as a barrier to localization, according to Coppi. When humanitarian responses involve massive data collection and processing, small organizations aren’t well suited to play a leading role; they can’t afford, for example, to have a data protection officer. “We are designing a digital system that is safe only for big fishes, and is not safe for small organizations,” he said.
Data minimization

While asking for consent from individuals living in crisis has long been seen as a justification for gathering data, Coppi said ideas around this are now shifting. “There cannot be meaningful consent when your life depends on giving away your data,” he said.

And so humanitarian groups now say they’re collecting data because their mandate is protecting people — creating a need to know who those people are, where they live, and how to follow up with them. “The argument is shifting and we see there is an evolution — which is good. But we are still not having the fundamental discussion about: Why do you need the data in the first place?” he said.

Coppi said his organization advocates for a “radical approach to data minimization,” in which data is collected only when there is a strong justification for it, and only with adequate guardrails. Collecting data is necessary, for example, for an aid group handing out cash transfers that must comply with anti-terrorism laws. “When it comes to food, for example, that’s not the case,” Coppi said. “If the argument is public interest, then basically everybody that is not doing protection should not be collecting data — but they still do,” he added.

Humanitarians should not be influenced by external forces to collect or share more data than is required, or to store it for longer than necessary, Jill Capotosto wrote in 2021 for ICRC. They should also work to avoid situations where datasets create a mosaic effect, which can be used to further identify people and lead to targeting.

There have been efforts to create sector-wide tools to better manage data, such as the IASC operational guidance on data responsibility in humanitarian action. The Collaborative Cash Delivery Network has also been working to develop data sharing and interoperability standards. Additionally, the Humanitarian Data Centre is working to develop data collection standards with an eye on ethics.
A clear set of guidelines that might also inform ethical standards is the Principled Framework for Responsible Data Sharing Between Humanitarian Organizations and Donors, led by ICRC, the Humanitarian and Development Consortium, and the Swiss Federal Department of Foreign Affairs, which has data minimization as a core principle.

But even so, there is more work to be done, and there isn’t much of a push to ensure more accountability. “Nobody's asking them to do it,” Coppi said. “And because nobody asks, they don't feel compelled to explain.”

Big organizations also tend to treat data governance as an afterthought, he said. "It's a sector that jumps the gun, in a way. It deploys technologies, tests them, sees if they crash and burn — or not — and then adjusts, pretty much to see what they can do in order to ‘Do no harm,’” he said. “The priorities are inverted.”