People think and say all sorts of things that they would never actually do. One of the biggest challenges in countering violent extremism (CVE) is not only figuring out which people hold radical views, but also who is most likely to join and act on behalf of violent extremist organizations. Determining who is likely to become violent is key to designing and evaluating more targeted interventions, but it has proven to be extremely difficult.
There are few recognized tools for assessing perceptions and beliefs, such as whether community sentiment about violent extremist organizations is more or less favorable, or which narratives and counternarratives resonate with vulnerable populations.
Program designers and monitoring and evaluation staff often rely on perception surveying to assess the attitudinal changes that CVE programs try to achieve, but this method has limitations. Security and logistical challenges to collecting perception data in a conflict-affected community can make it difficult to obtain a representative sample while keeping enumerators and respondents safe. And given the sensitivity of the subject matter, respondents may be reluctant to express their actual beliefs to an outsider (that is, social desirability bias can affect data reliability).
The rise of smartphone technology and social media uptake among the burgeoning youth populations of many conflict-affected countries presents a new opportunity to understand what people believe from a safer distance, lessening the associated risks and data defects. Seeing an opportunity in the growing mass of online public data, the marketing industry has pioneered tools to “scrape” and aggregate the data to help companies paint a clearer picture of consumer behavior and perceptions of brands and products.
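As a rough illustration of what this kind of aggregation involves, the following Python sketch tallies favorable and unfavorable mentions across a handful of invented posts. The posts, keyword lists and `classify` function are hypothetical stand-ins for the far richer models commercial scraping and brand-sentiment tools actually use:

```python
from collections import Counter

# Hypothetical sample of public posts (in practice these would be
# collected from social media APIs or scraped web pages).
posts = [
    "The new product launch was a disappointment",
    "Great service, the brand really listens to customers",
    "Another disappointment from this brand",
]

# Toy keyword-based sentiment markers, invented for illustration.
POSITIVE = {"great", "good", "listens"}
NEGATIVE = {"disappointment", "bad", "ignores"}

def classify(post: str) -> str:
    """Label a post by counting positive vs. negative marker words."""
    words = set(post.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "favorable"
    if neg > pos:
        return "unfavorable"
    return "neutral"

# Aggregate per-post labels into an overall sentiment picture.
sentiment = Counter(classify(p) for p in posts)
print(sentiment)  # Counter({'unfavorable': 2, 'favorable': 1})
```

Real systems replace the keyword lists with trained language models and the hand-typed posts with millions of scraped records, but the aggregation step, collapsing many individual expressions into a picture of overall sentiment, works on the same principle.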
These developments present a critical question for CVE programs: Could similar tools be developed that would analyze online public data to identify who is being influenced by which extremist narratives and influences, learn which messages go viral, and distinguish groups and individuals who simply hold radical views from those who support or carry out violence?
Using data to track radicalization
Seeking to answer this question, researchers at Arizona State University’s Center for the Study of Religion and Conflict, Cornell University’s Social Dynamics Laboratory, and Carnegie Mellon’s Center for Computational Analysis of Social and Organizational Systems have been developing a wide variety of data analytics tools. ASU’s LookingGlass tool, for example, maps networks of perception, belief, and influence online. ASU and Chemonics International are now piloting the tool on a CVE program in Libya.
Drawing on methods from the humanities and the social and computational sciences, LookingGlass retrieves, categorizes, and analyzes vast amounts of data from across the internet to map the spread of extremist and counter-extremist influence online. The tool displays what people think about their political situation, governments and extremist groups, and tracks changes in these perceptions over time and in response to events. It also lets users visualize how groups emerge, interact, coalesce, and fragment in relation to emerging issues and events, and evaluates “information cascades” to assess what causes extremist messages to go viral on social media and what causes them to die out.
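To give a concrete sense of what measuring an “information cascade” can involve, here is a minimal Python sketch that reconstructs a reshare tree from invented records and reports how far a message spread. The data and the `cascade_stats` function are hypothetical illustrations, not LookingGlass’s actual cascade analysis, which is far more sophisticated:

```python
from collections import defaultdict, deque

# Hypothetical reshare records: (sharer, source) pairs meaning
# `sharer` re-posted content they saw from `source`.
reshares = [
    ("b", "a"), ("c", "a"), ("d", "b"), ("e", "d"), ("f", "c"),
]

# Build the reshare tree: who each account's reshares came from.
children = defaultdict(list)
for sharer, source in reshares:
    children[source].append(sharer)

def cascade_stats(root):
    """Breadth-first walk of the reshare tree rooted at `root`,
    returning (size, depth): accounts reached and hops from origin."""
    size, depth = 0, 0
    queue = deque([(root, 0)])
    while queue:
        node, d = queue.popleft()
        size += 1
        depth = max(depth, d)
        for child in children[node]:
            queue.append((child, d + 1))
    return size, depth

print(cascade_stats("a"))  # (6, 3): 6 accounts reached, 3 hops deep
```

A message that dies out produces a shallow, small cascade; a viral one produces a large, deep tree. Comparing these shapes across messages is one way analysts distinguish narratives that resonate from those that fizzle.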
For CVE planners, LookingGlass can map social movements in relation to specific countries and regions. Indonesia, for example, has been the site of numerous violent movements and events. In this relatively young democracy, a complex political environment encompasses numerous groups seeking radical change across a wide spectrum of social and political issues.
The ASU team built a real-time, contextual and automated system that could distinguish the violent from the nonviolent among these groups and detect if and when groups or factions begin to move toward or away from violence, or when new groups emerge.
It does this in Indonesia by collecting and analyzing large amounts of multilingual text from Twitter, websites, blogs, news sites, speeches, images and videos to discover hotly debated issues and the key topics that distinguish opposing camps. It automatically classifies group positions within a social network, their level of radicalism and their advocacy for violence, based on an innovative methodology that uses groups’ own discourse patterns to position them in relation to each other. It can rapidly detect and display radical and counter-radical hot spots, overcoming language barriers and connecting the dots to visualize networks, narratives, and activities.
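A toy version of this idea, positioning groups by the similarity of their discourse and scoring advocacy of violence with a small keyword lexicon, might look like the sketch below. All texts, terms and group names are invented for illustration; the real system learns such patterns from large multilingual corpora rather than from a hand-picked word list:

```python
import math
from collections import Counter

# Hypothetical snippets of each group's public discourse.
group_texts = {
    "group_a": "armed struggle is the only path to change",
    "group_b": "peaceful reform and dialogue will bring change",
    "group_c": "armed struggle until final victory",
}

# Tiny invented lexicon marking advocacy of violence, a stand-in
# for the learned discourse patterns described above.
VIOLENT_TERMS = {"armed", "struggle", "victory"}

def vectorize(text):
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two word-count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = lambda c: math.sqrt(sum(n * n for n in c.values()))
    return dot / (norm(u) * norm(v))

vecs = {g: vectorize(t) for g, t in group_texts.items()}

# Position groups relative to each other by discourse similarity.
sim_ab = cosine(vecs["group_a"], vecs["group_b"])
sim_ac = cosine(vecs["group_a"], vecs["group_c"])
print(sim_ac > sim_ab)  # True: group_a's discourse sits closer to group_c

# Score each group's advocacy of violence as the share of its
# words that fall in the violent-terms lexicon.
scores = {
    g: sum(n for w, n in v.items() if w in VIOLENT_TERMS) / sum(v.values())
    for g, v in vecs.items()
}
print(scores)
```

The point of the sketch is the shape of the computation: groups are placed near or far from one another based on how they talk, and their position on a violence spectrum is estimated from their own words, which is what allows changes over time to be detected automatically.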
Pilot and implications for the future of CVE
There are significant potential applications for LookingGlass in CVE programming worldwide. It offers the ability to validate and triangulate multiple sources of data to arrive at a more comprehensive picture of the reach and influence of competing narratives without trying to measure actual extremist activity. It also allows us to mitigate the observer effect, common in perception surveying, because expressed perceptions are passively collected.
Though data representativeness will be skewed toward countries and demographics with greater internet and social media uptake, LookingGlass puts the young, tech-savvy populations that are often at greatest risk of extremist recruitment directly within reach.
LookingGlass is being tested in Libya to better understand how extremist narratives and counter-messaging efforts affect local perceptions and popular support of the Government of National Accord versus violent extremist organizations.
The lessons learned from the pilot stand to inform the next generation of CVE programs, which, through the astute application of data science, could offer sophisticated new insights into the drivers of, and solutions to, violent extremism, dramatically increasing CVE program effectiveness.
With potential to change the trajectory of crises, such as famines or the spread of diseases, the innovative use of data will drive a new era for global development. Throughout this monthlong Data Driven discussion, Devex and partners — the Agence Française de Développement, BroadReach, Chemonics and Johnson & Johnson — will explore how the data revolution is changing our approach to achieving development outcomes and reshaping the future of our industry. Help us drive the conversation forward by tagging #DataDriven and @devex.
Michele Piercey is a political transition and countering violent extremism practitioner with experience in Iraq, Tunisia and Afghanistan. She serves as the director of Chemonics’ Conflict and Crisis Practice.
Carolyn Forbes is assistant director of the Center for the Study of Religion and Conflict at Arizona State University, which advances research and education on the dynamics of religion and conflict in global affairs.
Hasan Davulcu is associate professor of computer science in Arizona State University’s Fulton Schools of Engineering. He is an expert in developing novel data mining techniques and tools for structuring and organizing unstructured sources.