This article was originally published by Devex and is republished with permission.
SAN FRANCISCO — As Facebook responds to a public relations nightmare — the fallout from news that a political consulting firm violated its rules for third-party apps — organizations that have worked with the social media giant to use its data for good are wondering what the implications may be for their partnerships.
Groups that have worked with the company on everything from spreading internet access to raising funds to communicating with beneficiaries are asking what lessons this scandal holds for them when it comes to privacy and security. Some now worry that partnering with Facebook could pose a risk to their reputation, or have real concerns about the protection of sensitive information on the populations they serve.
Others remain as committed as ever to these partnerships — with varying levels of confidence as to whether Facebook feels the same. But what is clear to all of them is that this is a defining moment, demonstrating the growing opportunities and risks of leveraging data for good.
This week, Facebook announced it would increase data protections for its 2.2 billion users after Cambridge Analytica obtained the personal information of up to 87 million users. Starting next week, the company will show people a link at the top of their News Feed so they can see which apps they use, and what information they share with them.
During a call with reporters Wednesday, Facebook Chief Executive Officer Mark Zuckerberg said the company is rethinking every aspect of every relationship it has, and taking a broader view of its responsibility to protect the privacy of its users. Facebook now realizes it is not enough to simply have rules, but it must also ensure that everyone in its ecosystem is following them, he said. Zuckerberg announced a number of changes to Facebook's services, designed to restrict the data that developers can access about users, and said the company will continue to invest in data protections.
“This is a multiyear effort,” he said. “This is a big shift for us to take a lot more responsibility for how each of the tools are used.”
Facebook learned about the data breach in 2015, but did not inform its users before the story broke on March 17. The company has since faced a backlash, including a #deletefacebook movement, and a loss of $100 billion in market value. The incident follows several other scandals involving the company, including revelations that Russia bought ads to interfere with the 2016 U.S. presidential election, and recent claims that the platform has been used to spread hate messages in Myanmar, which the company says it has been trying to remove from the site.
As Facebook works to make changes internally, the question is not only what the impact will be on its data for good partnerships, but also whether this will drive a more serious conversation on responsible data sharing between the private and public sector.
Data for good
Before the Cambridge Analytica news broke, Chaya Nayak, public policy research manager at Facebook, gave a presentation at an event in San Francisco called “How Facebook Builds Impact Through Data.”
“Part of the Data for Good program that we’re building at Facebook is focused on: How do we actually get Facebook data in the hands of academics, NGOs, and the broader development community so they can take the data and do good in the world?” she said at the event organized by the Center for Effective Global Action and World Bank.
She talked about population density maps, which began as a joint effort between Facebook, the World Bank, and Columbia University, and combine census data with buildings identified from satellite imagery to create what she described as a higher resolution view of the world.
Nayak also talked about the disaster maps initiative, which Facebook launched in June after working closely with UNICEF, the International Federation of the Red Cross and Red Crescent Societies, and the World Food Programme, offering them aggregated and anonymized location data to improve their situational awareness when responding to natural disasters. Nayak said Facebook was starting to expand access to the datasets to NGOs and academics, seeing the value not only in disaster response but also other areas such as infrastructure and economic development.
Another emerging priority for the Data for Good team has been to leverage data from Facebook posts, using algorithms to surface topics and insights about what people are discussing. Nayak stood before a picture of a man kissing his infant daughter born with microcephaly, part of a UNICEF campaign to engage men in the fight against Zika which was based on an insight from Facebook that 58 percent of posts about Zika in Brazil came from men.
Beyond providing data, Facebook also supports groups using the platform and its products to communicate with affected populations, said Nathaniel Raymond, who directs the Harvard Humanitarian Initiative Signal Program, which specializes in information during crises. But he also talked about the risks that personally identifiable, demographically identifiable, and action-based information can pose when it comes to the exploitation of vulnerable populations.
As part of her remarks, Nayak acknowledged concerns about privacy, saying she often gets questions from partners on how it is embedded in the process of sharing data and insights.
“Facebook data has been a black box to a lot of academics and researchers because we were really scared of sharing data in a way that would not preserve the privacy of our users,” she said. “What we’ve been doing through the Data for Good program is really focusing on how we can aggregate our datasets in some way in order to get away from user-identifiable information.”
One of the lesser-known features Facebook has rolled out in recent months is an effort to increase blood donations, which started in India and has since expanded to Bangladesh and Pakistan. The person or organization in need of blood will not be able to see any information about the donor unless he or she explicitly provides it, said Hema Budaraju, product manager for health at Facebook, announcing the release. More than 7 million people have signed up as blood donors, leading teams at Facebook to consider other markets, and to ask whether they might leverage similar tools to drive better health outcomes in other areas.
“It’s been quite a sobering week for all of us,” Budaraju told Devex last week.
Across the company, there is what she described as “a heightened awareness” of the potential for abuse of data, which is particularly sensitive when it comes to something like blood type. The key now is to reassure partners that the company prioritizes the security, privacy, and safety of the community.
“We are always in touch with our partners, and people want to know what’s going on,” she said. “Our role is to build safe and supported communities. I take my role seriously and our team takes our role seriously and we would like to demonstrate that.”
Nayak told Devex that the company gets a lot of inbound requests for partnership, especially when it comes to disaster response. When it launched the disaster maps initiative, Toby Wicks, a data strategist at UNICEF, said he saw demand-driven partnerships as critical to disaster response given limited resources and a rapidly changing landscape. Nayak said the company sees its work on data for good as core to its mission to make the world more open and connected, and is increasingly devoting resources to these efforts.
“For us, it’s about figuring out what the insight is that you lack that you need to get to,” she said. “Then we ask if we can build something that gets to that insight, but in a privacy preserving way.”
Stepping up or backing down?
In December 2017, Facebook quietly shut down its Audience Insights API, which allowed advertisers and agencies to tap into aggregated and anonymized user information such as age, interests, and location. The news about Russia buying Facebook ads to influence the United States presidential election broke while the API was still in beta, and it led the social media giant to make changes: Facebook shuttered the API and directed marketers to a separate Audience Insights tool, following what some have described as a pattern of being reactive rather than proactive.
When things like this happen, Facebook releases statements saying it will test new ways to provide valuable insights, while also protecting the data of its users. Past mistakes have not led to quite the same level of fallout as the Cambridge Analytica news. Still, the many NGOs leveraging Facebook data for social good say they hope the company, and other organizations, will step up rather than back down on data for good efforts.
“What a shame it would be for a bad actor like Cambridge Analytica to do harm not just one time, but to actually harm the entire system, by effectively ruining it for the good that could be done and that is being done.” — Drew Bernard, co-founder and CEO of ActionSprout
“Don’t throw the baby out with the bathwater,” said Drew Bernard, co-founder and CEO of ActionSprout, which helps social good organizations leverage Facebook for their cause.
He hopes this moment of public scrutiny does not lead Facebook to limit the data it shares with the nonprofit sector, where he believes it can have a powerful and beneficial impact.
“For us, the key is: How do we make it comfortable for Facebook to share valuable data-informed insights? And it’s hard,” Bernard said. “Facebook is and should be very cautious about what they share. It has to make sense to them and the entire Facebook community. The reward has to outweigh the risk. What a shame it would be for a bad actor like Cambridge Analytica to do harm not just one time, but to actually harm the entire system, by effectively ruining it for the good that could be done and that is being done.”
While he sees the threat this moment poses to data for good partnerships, he has also seen Facebook's commitment to the impact these insights can have, he said, and he does not expect the work to slow down once the dust settles.
“Facebook is a learning organization, and they have made no secret about the fact that they like to fail fast, but they are also an organization that is maniacal about correcting and learning from failures,” said Frank Schott, managing director of global programs for NetHope, a coalition of NGOs that work to improve IT connectivity in disaster-stricken areas. “The whole agenda around data-driven decision-making in the humanitarian sector is not going away, so we are just going to need to all be better about the protocols for data sharing and data usage.”
As recently as five years ago, these data for good partnerships worked like this: NetHope would call up Google or Facebook and ask how an emergency had affected connectivity in a particular geography.
“It is not as efficient to have someone at Google or Facebook reading through interpretations of the data on a phone call. It’s better to have access and do your own visualizations with considerably more precision. We see great promise for the use of other data to inform our work and we don’t need the personally identifiable information to do that work,” Schott said.
Following Hurricane Maria in Puerto Rico, NetHope leveraged anonymized and aggregated Facebook data to see which parts of the island had connectivity before the storm but had gone dark after it, which informed the efforts to get the island back online.
“What we signed was quite prescriptive and definitely caused us to have internal discussions about how we live up to the letter of the agreement,” Schott said.
Once organizations sign a data sharing agreement, they are given secure access to Facebook’s visualization tool, which hosts and visualizes its datasets. Facebook can control and administer access. And organizations can use the tool to identify insights, or download visualizations of the data, a feature Facebook can turn on or off depending on the data that is being shared.
“It would be the worst thing to do right now to have a nonprofit equivalent of the #deletefacebook movement, because we have important voices to express to them as to what their core product — connectivity and information — can do in terms of playing a constructive role in society,” said Andrew Schroeder, director of research and analysis at Direct Relief, a medical relief nonprofit.
Direct Relief uses Facebook in a number of ways, including sending geographically targeted messages to crisis-affected communities. Like many nonprofits, the organization also uses Facebook advertising to reach donors, and it wants to make sure donor information is not used for purposes those donors did not fully understand or agree to. Schroeder said he wants nonprofits to dial up their dialogue with Facebook and expand on data for good partnerships, but in a way that demands more transparency.
“Our partners have asked us questions about where the data comes from and what assurances we get and we told them all of the agreements we make in order to access this data,” he said. “We frame it in terms of, ‘We don’t know all the answers, but we believe that this is still valuable going forward, and that the only way to get there is to engage our partners within Facebook and give them a sense of here’s what we think and what we’re seeing and where we have concerns.'”
Move fast and fix things
While there are a growing number of data collaboratives between Silicon Valley technology companies and organizations working in global health, international development, and humanitarian response, there can be a clash of cultures. To a certain extent, Facebook has maintained its hacker, “move fast and break things” mindset. When one side of a partnership is far more cautious than the other, it can get in the way.
Yet in these emerging data for good partnerships, neither side is being careful enough, Raymond of the Harvard Humanitarian Initiative said.
“We need to centralize how the standard is set across our sector. We get our house in order. That requires minimum technical and ethical standards and we don’t have them. It’s not about Facebook, it’s about us.”— Nathaniel Raymond, director of the Harvard Humanitarian Initiative Signal Program
“In medical ethics, there is something called ‘never scenarios’ — scenarios that should never occur, like operating on the wrong leg. For Facebook, it’s clear now that several ‘never scenarios’ have happened, and that there was not a duty-of-care concept within Facebook that clearly articulated the ‘never scenarios.’ They relied on the hacker ethos, when they needed to literally rely on ethos, which was ethics,” he told Devex.
He said neither the global development community nor Silicon Valley can “do no harm” until it knows the harm, and called for both sides to come together to figure out the way forward in a way that prioritizes the protection of people.
“There has to be an end to the silos by which these partnerships happen,” he added. “We need to centralize how the standard is set across our sector. We get our house in order. That requires minimum technical and ethical standards and we don’t have them. It’s not about Facebook, it’s about us.”
Forums such as the Global Partnership for Sustainable Development Data, which bring technology companies and humanitarian organizations together, can help build connections and trust. But the two sides have yet to figure out how to fully deliver on the promise of responsible data sharing, in part because of rapidly changing contexts.
The World Food Programme is one of several United Nations agencies and NGOs that have been working to establish guidelines and best practices in responsible data use.
“Data collection and analysis is essential for WFP to fulfil its global public service mandate in a way that is informed, responsible, and accountable,” a spokesperson told Devex. “Data partnerships are a new frontier. There can be real value if data and expertise can be put to work to generate insights into how to achieve the SDGs and modernize humanitarian response. In today’s fast-changing digital world, we aim to maximize benefits and minimize risks by working with partners to refine policies and practices as new challenges are understood.”
One U.N. report on big data to achieve the Sustainable Development Goals set out guidance for data privacy, data protection, and data ethics.
But there are also concerns that neither side fully understands the dangers involved with sharing the data of vulnerable populations.
“The humanitarian sector is now beyond hypotheticals when it comes to things like data breaches, the harmful effects of data experimentation, and the weaponization of information,” said Joseph Guay, an associate at The Policy Lab, and an expert in humanitarian innovation. Guay is also behind a new initiative called Do No Digital Harm. “The requisite knowledge and expertise found in the digital security domain is sorely lacking in the field of humanitarian protection.”
The sector struggles to bring risk mitigation, data protection, and digital security into the center of its work, he said, explaining that the new initiative will provide support to humanitarian partners in areas where they are likely to struggle.
Simply making sure that personally identifiable information is anonymized is not enough, he added. He talked about the “mosaic effect,” which allows for highly granular information to be drawn from layering multiple datasets, even if the information might seem disparate or has been anonymized. Because not all anonymization can prevent reidentification, efforts with the best of intentions could cause harm, he said.
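The mosaic effect Guay describes can be sketched with a small toy example. All records, field names, and datasets below are hypothetical, invented purely to illustrate the mechanism: a health dataset with names removed can still be re-identified by joining it against a second, public dataset on shared quasi-identifiers such as district, birth year, and gender.

```python
# Toy illustration of the "mosaic effect": two datasets that each look
# harmless can, when layered, re-identify an individual.
# All names and records here are hypothetical.

# An "anonymized" health dataset: direct identifiers removed, but
# quasi-identifiers (district, birth year, gender) retained.
health_records = [
    {"district": "North", "birth_year": 1984, "gender": "F", "blood_type": "O-"},
    {"district": "North", "birth_year": 1991, "gender": "M", "blood_type": "A+"},
    {"district": "South", "birth_year": 1984, "gender": "F", "blood_type": "B+"},
]

# A separate, publicly available dataset (e.g. a volunteer roster) with names.
public_roster = [
    {"name": "A. Silva", "district": "North", "birth_year": 1984, "gender": "F"},
    {"name": "B. Costa", "district": "South", "birth_year": 1991, "gender": "M"},
]

def reidentify(records, roster):
    """Join the two datasets on their shared quasi-identifiers."""
    keys = ("district", "birth_year", "gender")
    matches = []
    for person in roster:
        hits = [r for r in records if all(r[k] == person[k] for k in keys)]
        if len(hits) == 1:  # a unique match defeats the anonymization
            matches.append((person["name"], hits[0]["blood_type"]))
    return matches

print(reidentify(health_records, public_roster))
# One individual's blood type is recoverable despite the "anonymized" data.
```

In this sketch the first roster entry matches exactly one health record, linking a name to a blood type even though the health dataset contained no names, which is why anonymization alone, without attention to quasi-identifiers and auxiliary datasets, cannot guarantee that re-identification is impossible.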
Facebook told Devex its partners in the data for good space are unlikely to be impacted by the new data restrictions, since privacy is already paramount in these proactive data partnerships. Still, the company and its partners seem to agree this is an important moment for reflection, and not just for Facebook.
“This is a defining moment,” Schroeder of Direct Relief said. “The tools have gotten better. It seems as though there is more opportunity. But at the same time, as we learn more along with everyone else, there may be more risk too. And we, in the nonprofit community, may not have taken all of that into account.”