SAN FRANCISCO — As Facebook tries to recover from the backlash of several high-profile instances of its data being misused, the company is looking for ways to leverage its data for good while still preserving the privacy of its users.
At the U.N. General Assembly, the social media giant made a five-year commitment to use data to help partners advance progress on the Sustainable Development Goals — and it has homed in on gender data as the place to start.
“We mapped projects related to SDGs in the company, then got a sense for which SDGs are we currently working hard on, which are we missing out on, then turned to the future,” said Anna Lerner Nesbitt, program manager of global impact for data and artificial intelligence at Facebook, at a convening hosted by Data2X last week. “Based on what we’re doing now and where the world needs to be in 2030, where are our unique superpowers?”
Traditional data sources have failed to capture key aspects of the lives of girls and women — insights that are critical for effective policymaking, according to Data2X, an initiative of the United Nations Foundation that has been exploring how big data can fill some of these gender data gaps.
Now, Facebook is exploring whether there is new and relevant data it could collect on gender, and it is turning to global development experts for input on how to shape its strategy. Development professionals, meanwhile, remain wary of the risks of partnering with Facebook.
Answering questions about girls and women
During Global Goals Week in New York City, Facebook organized a roundtable conversation in an effort to learn how Facebook data might help answer key questions about girls and women.
“Today’s conversation is an opportunity for Facebook to sit in a room with all these experts and start to understand: What are the questions people are trying to answer, and how might we be able to help answer these questions?” said Marcy Scott Lynn, who leads Facebook’s partnerships around the SDGs.
She sat at a table set with nametags for Sheryl Sandberg, chief operating officer for Facebook, and leaders from organizations including UN Women, the Bill & Melinda Gates Foundation, and the World Bank Group.
Internally, Facebook refers to its work on the SDGs as “project 17.” The company is relying on this consultative process as it determines how to leverage its data to advance progress on the goals. Facebook has hosted four roundtables on gender data so far, with plans for another in San Francisco this December, and it will reveal its strategy early next year.
But Nesbitt of Facebook shared early details on the company’s plans at the Data2X event.
The company is considering what data it already has access to that it could analyze in different ways — for example, by gender-disaggregating the displacement maps it shares with Disaster Maps partners.
Facebook is also exploring whether there is new and relevant data it could collect on gender — as it has done with the “Future of Business” survey in partnership with the World Bank and the Organization for Economic Cooperation and Development — but acknowledges the challenges of doing so on sensitive issues such as gender-based violence.
Finally, the company is also considering how it might combine open data sources in order to create predictive models that would generate new insights, as it has done with its Population Density Maps.
The risks of partnership
While users do not pay for Facebook, they do offer up their personal data, often without realizing just how valuable that information is.
Data analytics tools can infer highly detailed characteristics of the platform's 2.8 billion users, which are used to shape their experience on Facebook, as well as on Instagram and WhatsApp.
As Facebook expands its international reach, users have no choice but to access these platforms on terms dictated by the company, according to a recent report by Amnesty International. Facebook’s business model, which is based on the “extraction and analysis of data,” poses a threat to human rights, the report says.
Development organizations need to carry out proper due diligence when they engage with Facebook in ways that might present risks of privacy violations, said Joe Westby, a researcher at the international advocacy organization who authored the report.
“When we’re looking at disaggregating data along gender lines, you have to be cognizant of ways data could be misused in ways that could lead to undermining their [people’s] rights,” he said. “There are ways to use the data in a way that mitigates against discriminatory harms.”
At the Data2X event, a member of the audience asked why organizations should work with Facebook “after it’s proven over and over again how very misaligned its incentives are.”
Nesbitt encouraged them to look up the privacy methodologies Facebook has in place for its Data for Good partnerships, which is the same infrastructure the company will use for its work with partners on gender data.
In a follow-up interview with Devex, Nesbitt said the company does consider the “unintended consequences” that might result from data getting into the wrong hands, and it works across teams and with external experts to avoid this.
Matching priorities and capabilities
Facebook held a three-hour meeting with Data2X experts in November, where the social media giant “sort of let us under the hood of what they’re starting to do and the kinds of data they have,” said Bapu Vaitla, a Data2X fellow whose research focuses on big data and gender.
Vaitla said there are certainly risks to partnering, but it is unclear what those risks are and whether they outweigh the potential benefits.
“The central question is: Can you reidentify individuals?” he said. “The reality is that there’s an arms race going on where people are developing ever more sophisticated ways of identifying individuals, and there are different ways of aggregating and reorganizing information that makes it harder to do that.”
Vaitla said he hopes to see more narrowly defined priority areas, “then a safely conducted data-to-policy experiment,” in which partners can learn something new about girls and women — while protecting their identities — and implement an effective policy that would not have been possible without those insights.
“We have our priorities, they have their capacities, and hopefully in the future we’ll find a way they feel comfortable sharing their data in some anonymized and aggregated form,” he said.