'Be prepared': What one NGO learned from a data security breach
Last year, a digital payments system used by Catholic Relief Services in West Africa was hacked, exposing beneficiaries' data. The NGO's Chief Information Officer Karl Lowe spoke to Devex about what they learned from the experience, and what advice he'd offer to others.
By Jessica Abrahams // 01 August 2018

LONDON — It was just before Thanksgiving when Karl Lowe, chief information officer at Catholic Relief Services, got word of bad news. The RedRose system it uses for digital payments in West Africa had been hacked, exposing personal, geographic, and photographic data about its beneficiaries.

The vulnerability began with a password. “If you look at any breach, that is the easiest way to get into someone’s system,” said Lowe. Mautinoa Technologies, a company working on similar projects to RedRose, had been investigating its competition when a staff member stumbled across an old password and user ID for one of the CRS systems, enabling them to gain access. The fault “was squarely on our shoulders,” Lowe acknowledged. Mautinoa said it also revealed systemic weaknesses in RedRose’s security, which both RedRose and CRS deny.

Fortunately, the data was not made public, but the breach still required a swift response to close the security gap. “I was on the phone and doing work all of Thanksgiving,” said Lowe. “Obviously what we did immediately was do a very quick removal of those credentials and looked at the overall environment of that RedRose instance and removed anything that looked even slightly suspicious, and reset everyone’s password.” When staff returned to work on Monday, everyone had to reauthenticate their credentials.

“Immediately thereafter we looked at all of the other RedRose systems that we have” — about 30 in total, which are mainly used for cash and voucher distributions — “and we did the same thing,” he said. “We hardened the passwords, we reset them, and anyone who had elevated privileges in the environment had to go through a separate step of [reauthentication].” All its RedRose systems have now been upgraded to a higher level of security, he added.
Many aid organizations have moved quickly to embrace data that allows them to make their programs more effective and efficient in delivering support to those who need it most. More data than ever before is being collected about beneficiaries and supporters, creating huge opportunities for aid work — but the risks have also begun to emerge. The challenge is particularly acute in humanitarian contexts, where time and needs are of the essence.

Over the past three months, Devex’s Data Guardians series has been exploring some of the issues affecting aid organizations as they work to protect their beneficiaries’ data, and the debates and practicalities around what more can be done.

In the end, Lowe agreed, the RedRose breach was a learning experience that helped to drive the organization’s work on data security forward. He spoke to Devex about what they learned, and what advice he would offer to other organizations facing similar challenges. The conversation has been edited for length and clarity.

What have you taken away from the experience in the long term?

Be prepared. We are going to be targets — whether this person was doing it for research, or whether it could be some state [trying to gain access] — we are going to be a target at some point, so we recognize that. Having a playbook in place is going to help us move forward, so we’ve already developed a plan about how to move forward, and who we need to communicate with.

Communication was key. We did lots of communication, both internally and externally. Transparency is something we were super [focused on], so [when the breach came to light] we let the donors know; we work with the Catholic church, so we let the church know what was going on. We let our partners on the ground know what was going on, and we kept them informed as we learned more.

I think the other thing we take away from this is that we have work to do.
We need to harden our environments and we need to improve the way that we keep data safe. We’ve got some other actions that we’ve taken in order to do that.

What actions are you taking, and what do you see as the biggest challenge in that process of improvement?

The biggest challenge is not necessarily the technology — it’s the people and processes associated with the technology, and then driving a culture of using data in an appropriate way. Across the world, you’re going to see that being the biggest issue ...

In terms of what we’re doing, we’re rearchitecting the way that we administer all of these decentralized systems. Obviously, we’re a very large organization and we have many, many systems around the world, but we’re really thinking about security first. We’ve done that with all of the RedRose systems, but we [are now working through it with others] ... hardening them, making sure that proper IT hygiene is put into all of them from a security standpoint.

We’re also doing a privacy impact assessment on all the systems going forward, and in some cases on the systems going backward. Probably the most important thing that we’ve done is create a cross-functional data protection group. It’s not just the IT department; it’s across all of our departments, really looking at developing this culture of responsible data use. We’ve created eight principles of data use that we’re going to be deploying across the environment, committing to:

1. Respect and protect an individual’s personal data as an extension of their human dignity.
2. Balance the right to be counted and heard with the right to privacy and security.
3. Weigh the benefits and risks of using digital tools, platforms, and data.
4. Open data for the common good only after minimizing the risks.
5. Prioritize local ownership and control of data for planning and decision making.
6. Work to educate, inform, and engage our constituents in responsible data approaches.
7. Take a preferential option for protecting and securing the data of the poor.
8. Responsibly steward the data that is provided to us by our constituents.

Finally, we’re staffing up our security department and having people in place who understand security from both the IT side and the non-IT side.

What kind of information are beneficiaries given when they sign up to use the system in terms of data security and privacy?

In this case, the system was set up a couple of years ago and we didn’t have that kind of process in place. [But] we have recently institutionalized it as part of how we improve, so we are now doing privacy impact assessments; we are doing “opt in” and telling people “here’s what we’re going to be doing with the data, here’s why we need to collect this kind of data.”

It’s difficult, though, because if beneficiaries say “well, we don’t want you to collect that data,” I’m not sure we’re going to be able to deliver the benefits that we’re trying to deliver to them. So we do give folks now, as we go forward with our programs, the ability to understand what we’re going to do with the data [and we haven’t had any pushback so far].

Do you have any insights into how we balance those two things — the need for data privacy against the need to collect data in order to provide services?

The first thing we do is the privacy impact assessment. The way to think about that is it asks “why” five times. Why do you need to capture this level of data? You get a response. Then you ask “why” again, and again.
We need to start by thinking: Do we really need to capture personally identifiable data on the individuals that we’re helping — and if we don’t, then let’s not capture it. If we don’t think we should but the donor’s asking us to, we need to step up and have a discussion with the donor and say, “we don’t think we ought to be doing this and here’s why.” We think that many of the donors will understand that and perhaps move in our direction ...

I do think that the sector as a whole is moving in the right direction here. There are going to be laws and policies — the General Data Protection Regulation being one of them [the European Union data protection regulation that is generally regarded as among the most stringent worldwide] — that we have to adhere to. Our view at CRS, and it’s probably the view of the rest of the sector as well, is let’s take the GDPR rules that are out there and deploy them everywhere and use that as the gold standard in terms of managing data privacy and data protection.

Do you have any advice about how to start driving a culture of responsible data use within your organization?

We need to have policies in place, simple policies, so that people know what they can do and what they can’t do — but from those policies, we need to move to a practice and a culture of data use.

I liken that to seatbelts. When I first started driving, seatbelts were for sitting on: They were mandated in cars but people didn’t use them. So how did people move from 20 percent utilization of seatbelts to 80 percent? It’s a number of different elements, but it has some policy to it — there are laws; “click it or ticket.” Then there are practices. You start as a child, you’re always in a seatbelt ... Where it gets to culture is that it just feels weird to get into a car and not put a seatbelt on. The first thing I do is look for the seatbelt. So that’s how you get to the culture — you really have to be that prescriptive and that diligent about how we do that.
The policy needs to be very simple; the training needs to be very simple; and the training is multipronged. So we’re teaching our staff how to recognize a phish when one comes in; we’re teaching people “here are five things you do with data” and “here are five things you don’t do with data.” We’re trying to be very simple in moving from a policy world to a cultural world.

[Paul Eagle, vice president for marketing and communications at CRS, added that the organization has been encouraging a “take your data to the meeting” approach, where “we can really look together at how to solve these problems” and “get more friendly with it.”]

How does this effort to change the culture work when you’re such an international organization with a very dispersed staff, working with lots of partner organizations?

Obviously we can’t stand on a chair and say, “hey, we’re all going in this direction.” And we have partners we’re working with — oftentimes, though, the partners don’t have the same access to the data as we do. But as we pass more and more responsibility onto local partners, we need to make sure that they have the capacity and capability to be able to do this.

It’s really a three-pronged approach in terms of how you work through this [policy, practice, and culture]. The rise in the use of seatbelts happened over a 30-year period. We don’t have 30 years to do this. But one of the secret sauces that I think we have in the humanitarian sector is this whole idea that data is really an extension of the human being. We have this respect and protection of individuals in our programs, and if we think of data as an extension of the individual we need to be as careful with the data as we are with the human being. There are [hundreds of] data breaches in the for-profit world every month.
And they have a lot more money than us, but we have a leg up on [them] in that this is our mission. If we can tie those two things together, I think we have something that can move a lot faster than seatbelts.

Read more stories in Devex’s Data Guardians series, from expert advice on how the new GDPR regulation impacts NGOs, to insights on why your NGO needs in-house data security expertise.
Jessica Abrahams is a former editor of Devex Pro. She helped to oversee news, features, data analysis, events, and newsletters for Devex Pro members. Before that, she served as deputy news editor and as an associate editor, with a particular focus on Europe. She has also worked as a writer, researcher, and editor for Prospect magazine, The Telegraph, and Bloomberg News, among other outlets. Based in London, Jessica holds graduate degrees in journalism from City University London and in international relations from Institut Barcelona d’Estudis Internacionals.