Opinion: The WFP and Palantir controversy should be a wake-up call for humanitarian community

Refugees from Syria at a cash distribution by the World Food Programme and ECHO. Photo by: Peter Biro / ECHO / CC BY-NC-ND

The World Food Programme’s Feb. 5 announcement of a $45 million partnership with the algorithmic intelligence firm Palantir has sparked a firestorm of criticism. The partnership raises serious issues that must be addressed by WFP — which appears to be taking the right steps toward transparency. But this controversy is bigger than just one private sector agreement by one agency — and it should serve as a challenge and an opportunity for the entire humanitarian sector.

Humanitarians have become increasingly reliant on digital data, and on third-party partnerships to collect and process it to create operational impact. But the ecosystem for doing this responsibly is missing: the policies, procedures, and capacities that ensure core principles, human rights standards, and data regulations govern these partnerships.

This controversy is an opportunity for collective action to help humanitarians better protect vulnerable people, by building the robust partnerships the sector needs to serve more effectively.

Humanitarians can do three things, in short order, to ensure that third-party data-related partnerships do not harm the physical safety, human rights, and individual and group privacy of the world’s most vulnerable people. These actions will require more than robust engagement by a diverse set of actors — they will also require essential cultural changes in how humanitarians approach data. Donors and humanitarian leaders must embrace the fact that being fit for purpose in a digital world requires being open to partnerships and rigorously ethical.

First, humanitarians must recognize and mitigate the potential harms they can cause through the use of digital technologies in humanitarian work. To do so, humanitarians must come together to formally acknowledge that the Humanitarian Principles, the Humanitarian Charter, and the Universal Declaration of Human Rights require them to respect the rights of vulnerable people to information, and to agency over their own data.

Additionally, the sector must commission new commentaries on how international humanitarian law applies to armed actors engaged in cyber and information warfare, as well as how civilians and humanitarians must be protected from digital threats.

Second, adapting to a digital world requires collective regulation to enact these principles, coupled with the independent review of digital services. Agencies must improve the procedures that govern our digital partnerships. This includes scrutiny of end-user license agreements; may extend to developing status of data agreements modeled on the status of forces agreements that are used in peacekeeping; and could extend to monitoring mechanisms that pay attention to the challenge of digital protection.

Third, mistakes will be made. There will be risks that become harms. A sector-wide, transparent and independent critical data incident management process is urgently needed to manage and mitigate problems, and to help improve digital guidelines as we learn. Doing no harm is not possible unless we know what the harm is and could be. Developing critical incident mechanisms for data will also help develop a common theory of harm.

Humanitarians have taken some first steps to adapt to a digital world. However, the deep need for digital standards, investments in capacity, and powerful accountability mechanisms remains — but it can be addressed in a way that continues to allow new partnerships and protects the people we serve. Investments in digital capacity, competencies, and capabilities must not be seen as merely investing in the humanitarian system. Rather, they are necessary steps toward seizing the opportunity of our digital world, and toward meeting our moral obligation to recognize the reality that data is people.

The views in this opinion piece do not necessarily reflect Devex's editorial views.

About the authors

  • Nathaniel Raymond

    Nathaniel A. Raymond is a lecturer in global affairs at the Jackson Institute for Global Affairs at Yale University. He is formerly the founding director of the Signal Program on Human Security and Technology at the Harvard Humanitarian Initiative and he currently co-chairs the World Economic Forum's Group Data and Human Rights Working Group.
  • Laura Walker McDonald

    Laura Walker McDonald is director of innovation at the Global Alliance for Humanitarian Innovation. She was formerly the CEO of SIMLab, and before that, FrontlineSMS.
  • Rahul Chandran

Rahul Chandran is the executive director of the Global Alliance for Humanitarian Innovation, launched at the World Humanitarian Summit to increase the impact of humanitarian innovation. His previous experience spans humanitarian, development, and peacekeeping work, leading major reform efforts for the United Nations, and extensive work in and on Afghanistan. He has worked in headquarters and the field for the United Nations, the World Bank, UN-OCHA, UNDP, the Peacebuilding Support Office, and the Executive Office of the Secretary-General. Prior to this, he was the deputy director of the Center on International Cooperation, and has also worked in civil rights and for a number of technology start-ups.