Solving the data conundrum: How to leverage tech and 'big data' for impact

By Malia Politzer 11 July 2016

Participants of the Hackathon challenge, aimed at utilizing technology to find creative solutions to humanitarian issues at the World Humanitarian Summit in Istanbul, Turkey. Photo by: World Humanitarian Summit / CC BY-ND

In recent years, both the volume of data and the number of data sources have increased at unprecedented rates: A staggering 90 percent of the data in existence today was created in the last two years, according to technology and innovation giant IBM.

From combing phone subscription records to estimate population density and poverty levels, to analyzing tweets to predict a pending food crisis, emerging technology and the availability of “big data” sources offer global development and humanitarian aid organizations new ways to optimize both their effectiveness and reach.

How to best harness and utilize technology and data to achieve development goals, however, is not always clear. So how can aid actors leverage technology and data sources to maximize their impact?

Start by asking the right questions

International organizations today have many different tools to choose from when it comes to both data collection and analysis, ranging from smartphone applications, to data visualization software, to customizable algorithms. With so many ways to collect and store data, a plethora of readily available data sources, and new data-oriented technological gadgets emerging daily, it can become easy to drown in too much information. That’s why the first step to harnessing the power of data is learning to ask the right questions and to define clear goals.

“When we start to work with organizations, the first thing is to assess what it is that they actually need to know in order to function properly on a Monday morning,” said Dr. Ernest Darkoh, co-founder and director of BroadReach, a data-driven analytics company that specializes in designing and optimizing health care systems in more than 20 African countries.

According to Darkoh, refining these questions is critical, as they become the key drivers of the data collection and analysis process.

For example, for a district health manager overseeing 400 clinics in rural Africa, the initial question might be, “How am I performing?” which would be refined to, “In which areas of health am I failing the most, as of one month ago, and which specific clinics are contributing the most to my nonperformance?” The answer to the second question, if available instantly at the manager’s fingertips, is a much more helpful point of departure from which to make decisions.

Create easy-to-use data platforms to solve key questions

Once the questions are clear, BroadReach works with leaders and managers across governments, donors, private sector and nongovernmental organizations to create an easy-to-use “insight dashboard,” which provides the key optics and answers to make better, more informed and timely decisions. BroadReach then provides electronic best practice toolkits to guide field staff, so that their implementation of any resultant programs and solutions is of consistent high quality.

“One of the big problems many organizations struggle with is incomplete and fragmented data sets and knowledge,” explained Darkoh. “Our job is to pull all that relevant information together, analyze it into useful ‘at your fingertips’ insights, and make it as easy to use as possible for real-life operators in the trenches, who may not have a background in technology and statistics.”

Darkoh believes that people don’t actually need more data, but more answers — and they need them quickly. BroadReach provides this by integrating diverse data sets such as finance, human resources, supply chain, and epidemiology with clinical information from hospitals and health care providers, patients and communities into easy-to-use formats on their touch pads, smartphones and computers. This enables leaders and managers to instantly compare key metrics across multiple facilities and districts, and identify which interventions and clinics are the most and least successful and why.
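The BroadReach platform itself is proprietary, so the sketch below is only an illustration of the kind of join-and-rank logic such a dashboard performs. The clinic names, coverage figures and staffing numbers are all invented for the example.

```python
# Hypothetical sketch: combine two separate per-clinic data sets
# (clinical coverage and staffing) and surface the underperformers.
# All names and figures below are invented for illustration.

clinical = {  # clinic_id -> immunization coverage (fraction of target)
    "clinic_a": 0.91,
    "clinic_b": 0.42,
    "clinic_c": 0.78,
}
staffing = {  # clinic_id -> number of trained vaccinators on staff
    "clinic_a": 4,
    "clinic_b": 1,
    "clinic_c": 3,
}

def flag_underperformers(clinical, staffing, threshold=0.8):
    """Join the two data sets and return clinics below the coverage
    threshold, worst first, alongside a possible staffing explanation."""
    flagged = []
    for clinic, coverage in clinical.items():
        if coverage < threshold:
            flagged.append({
                "clinic": clinic,
                "coverage": coverage,
                "vaccinators": staffing.get(clinic, 0),
            })
    return sorted(flagged, key=lambda row: row["coverage"])

for row in flag_underperformers(clinical, staffing):
    print(row["clinic"], row["coverage"], row["vaccinators"])
```

The point of the sketch is the shape of the answer, not the data: a manager sees not just *that* a clinic is failing, but a candidate *why* (here, a single vaccinator on staff) in the same view.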

Take, for example, the district health manager responsible for 400 clinics who wants to know which clinics are underperforming, in what areas they are underperforming and why.

“If immunization is a concern, first we might need data from the local census that shows where diseases are breaking out, and if there’s a clinic in that area,” said Darkoh. “Then we need to know: Who is working there? What is their training? What resources do they have available? With the BroadReach platform, all of that information becomes instantly available.”

And if the manager then needs to run an immunization campaign, Darkoh explained, they have immediate access to best practice electronic workflow toolkits that provide their team with step-by-step guidance for conducting a successful campaign.

Adopt context-appropriate, low-tech tools for data collection

For development and humanitarian professionals whose primary work is in areas that often lack regular access to electricity, internet, or even cellphone coverage, relying on “high tech” data collection methods is often impractical.

That’s why when the United States Agency for International Development and the Philippines’ Department of Environment and Natural Resources partnered to develop the Biodiversity and Watersheds Improved for Stronger Economy and Ecosystem Resilience program, known as B+WISER, to address threats to the country’s natural forests and biodiversity, they deliberately kept things simple.

For nearly 20 years, the DENR had been monitoring forest threats and degradation by sending forest rangers armed with notebooks and cameras into the forest to collect data manually. Once there, they would interview people in the community, take photo documentation, and record their observations by hand.  

Although the data they collected was potentially useful, most of it never got digitized, leading to a paperwork bottleneck of years’ worth of data.

Their solution was to develop the LAWIN Forest and Biodiversity Protection System, which includes an electronic data collection and analysis system. First, they significantly narrowed the number of parameters collected by forest rangers to four key indicators of forest biodiversity and health.

“Before, people were collecting a large volume of data that wasn’t necessarily used,” explained Dr. Felix Gaschick, a biodiversity and forest specialist with Chemonics International, who also works for B+WISER and helped to design the LAWIN system. “One of the first steps was to lower the amount of data collected to parameters that really inform people about the condition of the forest.”

Next, they created patrol plans to ensure systematic monitoring of the forest, and replaced notebooks with tablets and smartphones installed with a preexisting application called CyberTracker, which allowed rangers to geo-tag photos and observations of forest conditions and threats.

They then adopted open-source software called the Spatial Monitoring and Reporting Tool, compatible with CyberTracker, to analyze the data.

“We knew that we needed a solution that was open-sourced, so it would be freely accessible to the people in the Philippines, and cost-effective to maintain,” said Gaschick, explaining that information collected on the tablet would then be automatically deposited in the SMART system, making it available for analysis almost instantly. Among other results, it allows DENR personnel timely access to information that enables them to respond to observed threats and see trends in forest health.
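Neither CyberTracker’s record format nor SMART’s analytics are reproduced here; the sketch below uses an invented schema simply to illustrate the pipeline the article describes: rangers log geo-tagged observations in the field, and an analysis step then tallies the threats so managers can respond.

```python
# Illustrative sketch of a geo-tagged field observation and a simple
# threat tally. The schema and category names are invented; they are
# not CyberTracker's or SMART's actual data model.
from dataclasses import dataclass

@dataclass
class Observation:
    lat: float
    lon: float
    category: str      # e.g. "illegal_logging", "wildlife_sighting"
    photo_file: str    # photo taken by the ranger, geo-tagged on capture

def threats_by_category(observations, threat_categories):
    """Count how often each threat category appears in a patrol log."""
    counts = {}
    for obs in observations:
        if obs.category in threat_categories:
            counts[obs.category] = counts.get(obs.category, 0) + 1
    return counts

# A small invented patrol log from one ranger's tablet.
patrol_log = [
    Observation(16.41, 120.59, "illegal_logging", "img001.jpg"),
    Observation(16.42, 120.60, "wildlife_sighting", "img002.jpg"),
    Observation(16.43, 120.61, "illegal_logging", "img003.jpg"),
]
print(threats_by_category(patrol_log, {"illegal_logging", "charcoal_kiln"}))
```

Because each record carries coordinates, the same log can be mapped as easily as it is counted, which is what makes near-instant response to observed threats possible.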

Although the LAWIN system is still relatively new, some of the forest areas where it is implemented are already showing signs of improvement. By the end of 2017, the B+WISER Program is expected to improve the conditions of 500,000 hectares of forest and help reduce greenhouse gas emissions by 5.2 million metric tons.

Tap ‘big data’ sources to achieve development goals

“Big data” is a term that’s been bandied about a great deal in recent years; however, few people truly understand what it means.

“Big data isn’t really big data,” said Emmanuel Letouzé, director of Data-Pop Alliance, a global coalition on big data and development created by the Harvard Humanitarian Initiative, the MIT Media Lab, and the Overseas Development Institute. “People think that big data means data sets too big to feed into a computer. It’s not. Big data can be better described as an ecosystem, defined by three C’s — crumbs, capacity and community.”

According to Letouzé, the “big data” ecosystem is one in which tech-minded individuals, institutions and organizations use various tools, including software, algorithms and statistical analysis, to leverage the digital “crumbs” left behind by the data usage patterns of people all over the world (for example, by mining tweets, cellphone usage patterns, and so on).

There are three key ways that big data can be used for development, disaster relief, and humanitarian response: The first is descriptive — for example, using data from satellite imagery to identify flooded areas, or identifying areas in need from crisis maps. The second is predictive — such as using cellphone activity before, during and after a disaster to infer real-time population distribution. Finally, big data can be “discursive,” and used to create dialogue.
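The predictive use of digital “crumbs” can be made concrete with a toy example along the lines of the tweet-mining mentioned earlier: a rising share of food-stress mentions in public messages can serve as an early-warning proxy. The keywords and messages below are entirely invented.

```python
# Toy illustration of a proxy indicator built from digital "crumbs":
# the fraction of public messages mentioning food-stress terms.
# Keywords and example messages are invented for illustration.

FOOD_TERMS = {"rice", "price", "hunger", "drought"}

def food_signal(messages):
    """Fraction of messages mentioning at least one food-stress term."""
    if not messages:
        return 0.0
    hits = sum(
        1 for m in messages
        if FOOD_TERMS & set(m.lower().split())
    )
    return hits / len(messages)

week1 = ["nice weather today", "rice price doubled again", "traffic is bad"]
week2 = ["rice too expensive", "drought killed the maize", "hunger in the village"]
print(food_signal(week1), food_signal(week2))  # the signal rises week over week
```

Real systems of this kind rely on far more careful statistics than keyword matching, but the principle is the same: the crumbs people leave behind move before official statistics do.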

Making big data available to development and humanitarian organizations in a way that is useful, however, requires partnerships with the private companies that control the data — such as cellphone companies and tech giants Facebook and Google. Also, concerns about who has access to big data, how it is being used, and how user privacy will be protected, need to be discussed in a much more open way.

“In order to bring this to scale and make it sustainable, we need to build long-term relationships that are much more transparent, and where people can weigh in on how data will be used,” said Letouzé.

Share data to maximize impact across sectors

Three years ago, Sarah Telford was the head of the reporting unit at the United Nations Office for the Coordination of Humanitarian Affairs, where she was tasked with improving the quality of reports created in the field by more than 30 different offices, and making them more analytical. She kept running into the same problems.

“The data was difficult to get, and it was always in different formats,” she said. “One person would have it on Dropbox, while another person would have data on their thumb drive, or in an email.”

She quickly realized that she wasn’t the only one struggling with this problem: Many organizations were duplicating data because they didn’t know it already existed, while others used outdated data when fresher sources were available. To solve this problem, she helped to develop the Humanitarian Data Exchange, an open platform in which entities such as the U.N., NGOs and governments can upload and download data sets in a standardized, usable format.
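The standardization problem Telford describes can be sketched in a few lines: the same fact arrives from different organizations in different shapes, and mapping each source onto one shared schema is what makes the data sets comparable. The schema, field names and records below are invented, not HDX’s actual format.

```python
# Illustrative sketch of data standardization: map records from
# differently formatted sources onto one common schema.
# Field names and records are invented for illustration.

COMMON_SCHEMA = ("location", "people_affected", "date")

def normalize(record, field_map):
    """Rename a source record's fields onto the common schema,
    dropping anything the schema doesn't define."""
    renamed = {field_map.get(k, k): v for k, v in record.items()}
    return {field: renamed.get(field) for field in COMMON_SCHEMA}

# Two organizations report on the same location in different formats.
ngo_row = {"loc": "Kakuma", "affected": 1200, "report_date": "2016-05-01"}
un_row = {"Location": "Kakuma", "PeopleAffected": 1350, "Date": "2016-06-01"}

rows = [
    normalize(ngo_row, {"loc": "location", "affected": "people_affected",
                        "report_date": "date"}),
    normalize(un_row, {"Location": "location", "PeopleAffected": "people_affected",
                       "Date": "date"}),
]
print(rows)
```

Once every upload lands in the same shape, deduplication and freshness checks — the two failure modes Telford observed — become simple comparisons rather than detective work.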

“In a crisis situation, having timely access to useful, accurate data is critical, and it isn’t always easily available,” said Telford. “By creating a platform where everyone can easily upload data, we hope to be able to make disaster relief more efficient.”

Since launching two years ago, HDX has grown to more than 180,000 users and 4,000 uploaded data sets from more than 250 locations around the world — from Ecuador, to Kenya’s Kakuma refugee camp, to Nepal. HDX has also piloted emergency “hot spots,” featuring blended data sets on key international crises. The Ebola crisis page, for example, includes an interactive map of the worst-affected countries, a graph of week-by-week spread and cumulative deaths, and 62 distinct data sets.

The Nepal earthquake page includes an interactive map of the numbers and locations of internally displaced persons, and more than 87 data sets, including information related to global food prices, countrywide road networks, health infrastructure, and various surveys on community perception of the earthquake’s effect.

A screencap of the Lake Chad humanitarian crisis interactive data map developed on the Humanitarian Data Exchange.

There’s also a page on El Nino, which, apart from an interactive map that shows which countries are “highly affected,” includes up-to-date data sets ranging from drought response in Haiti in 2016, to the World Food Program and the Food and Agriculture Organization overview of countries affected by El Nino, to an active global archive of large flood events. Most recently, they’ve developed an interactive data map drawing on data sets uploaded by users that focuses on the Lake Chad humanitarian crisis.

By centralizing and sharing data in one place in a standardized, easily downloadable format, Telford believes that all humanitarian organizations will benefit.

“We all work better with good data,” she said. “By sharing this knowledge, other organizations can also benefit.”

How can new or existing technologies and “big data” sources be leveraged to maximize impact? Have your say by leaving a comment below.

With potential to change the trajectory of crises, such as famines or the spread of diseases, the innovative use of data will drive a new era for global development. Throughout this monthlong Data Driven discussion, Devex and partners — the Agence Française de Développement, BroadReach, Chemonics and Johnson & Johnson — will explore how the data revolution is changing our approach to achieving development outcomes and reshaping the future of our industry. Help us drive the conversation forward by tagging #DataDriven and @devex.

About the author

Malia Politzer

Malia Politzer is an award-winning long-form journalist who specializes in international development, human rights issues and investigative reporting. She recently completed a fellowship from the Institute of Current World Affairs in India and Spain. For three years, she worked as a feature writer at Mint, India’s second-largest financial newspaper, where she wrote about international development, strategic philanthropy and impact investing. She holds an M.S. in journalism from Columbia University Graduate School of Journalism, where she was a Stabile Fellow for Investigative Journalism, and a B.A. from Hampshire College.
