Can a new app more safely identify human trafficking victims?

A migrant fisherman on a boat in Thailand. Photo by: International Labour Organization / CC BY-NC-ND

BANGKOK — In northern, central, and southern Thailand, several groups fighting forced labor and sex trafficking are testing a new way to identify potential victims. A mobile app dubbed Apprise seeks to provide a cost-efficient platform for interviewing those who may be trafficking victims, and to disrupt the often biased or unsafe methods by which many groups conduct interviews.

Thailand is a source, destination, and transit country for men, women, and children subjected to forced labor and sex trafficking. Labor brokers recruit from vulnerable communities in Myanmar, Laos, and Cambodia, often promising employment in the construction, manufacturing, or agriculture industries. The fishing and seafood labor force in Thailand, which has come under fire for years for its exploitative nature, is largely composed of migrant workers, many of whom are trapped in debt bondage in Samut Sakhon, Thailand’s leading seafood processing region.

The challenge of combating forced labor and sex trafficking often lies in correctly identifying victims so local groups can offer support — as well as in standardizing questions posed to potential victims in different settings, according to Silvia Mera, program director at The Mekong Club, which is developing Apprise in partnership with United Nations University. The project receives funding from Humanity United and the Freedom Fund.

Often, language barriers prevent direct communication between a Thai frontline responder and a possible victim of exploitation.

“It’s also a problem of trust and fear of reprisal,” Mera explained. “There may be a lack of trust in the person asking the questions or in the authority conducting the investigation — or a fear that the interpreter will misinterpret what’s been asked or what’s been answered.”

The app, which will be free to use and is currently being piloted in Thailand, has been designed with targeted question lists that address forced labor in the fishing industry and in manufacturing settings, as well as lists addressing sex trafficking and forced begging. An eight-question emergency list, meanwhile, provides a quicker way to prescreen a potential victim without focusing on a particular setting.

The power of the app is in the languages, Mera said; the question lists have now been translated into 14 languages, including several dialects. A frontline responder equipped with the app loaded on a smartphone would choose the appropriate questionnaire and hand the phone, with headphones, to the potential victim. Audio questions in the potential victim’s selected language then prompt the person to tap yes or no on the smartphone in response.

Perhaps most importantly, questions can be asked privately, without an exploiter overhearing the conversation, as too often happens, Mera explained.

In its 2015 report on human trafficking, Thailand reported that inspections of 474,334 fishery workers failed to identify a single case of forced labor. More recently, more than 50,000 inspections of fishers did not identify a single instance where labor protection regulations had been violated, according to a 2018 report by Human Rights Watch. This is not reflective of the rampant exploitation in the industry, rights groups say, and can likely be traced, at least in part, to the methods used to interview potential victims.

“Even when they trust the frontline responder and understand the language, oftentimes, in factories, the manager will be standing next to the potential victim, or on a fishing boat someone from the crew will be right there, so this will hinder the chances of someone responding truthfully for fear of reprisal,” Mera said.

In testing the app during a recent research trip, Mera found that participants preferred using the app even when a language barrier was not a problem, because of the enhanced privacy it offered. Her team also tested Apprise with minors who had been in sex trafficking situations, some of whom could not read or write, and found that the children were still able to use the app thanks to the audio prompts.

Removing the interpreter, and even the interviewer, from the process not only eliminates potential bias, whether personal or due to bribes, but could also help standardize the process of identifying victims, Mera said.

Different organizations have their own ways of identifying victims in certain settings, depending on which guidelines they apply, she said. By developing comprehensive, unvarying question lists, The Mekong Club seeks to remove the interviewer’s ability to change the wording of questions and to create a standard way to identify abuse.

“If you’re going to interview 20 workers, those 20 workers would be asked the exact same 20 questions in the exact same order,” Mera said. “And that's a much more balanced way to judge what the situation is.”

Update April 10, 2018: This article has been updated to clarify that Apprise is developed in partnership with United Nations University and receives funding from both Humanity United and the Freedom Fund. It has also been clarified that frontline responders choose the questionnaire, while potential victims choose the appropriate language.

About the author

    Kelli Rogers

    Kelli Rogers is a global development reporter for Devex. Based in Bangkok, she covers disaster and crisis response, innovation, women’s rights, and development trends throughout Asia. Prior to her current post, she covered leadership, careers, and the USAID implementer community from Washington, D.C. Previously, she reported on social and environmental issues from Nairobi, Kenya. Kelli holds a bachelor’s degree in journalism from the University of Missouri, and has since reported from more than 20 countries.