In Myanmar, Facebook posts from DKT International — a nonprofit working to improve access to family planning and HIV prevention through social marketing — sharing educational information on vaginal health and the ovulation cycle were rejected by the social media platform. In the Philippines, DKT wasn’t able to boost content that focused on the ineffectiveness of the “pull out” method. And in Nigeria, DKT advertisements for emergency contraception and condoms were labeled as explicit.
The censorship of certain social media content is proving to be a challenge across the globe for NGOs working to provide information on sexual health and family planning.
“Millions of people in underserved countries and remote regions of the world have depended on social media for information and education about access to the products and services that could save their lives, and served as a lifeline during the pandemic,” Chris Purdy, CEO at DKT, said in a statement.
Visits to health facilities were widely discouraged during the pandemic unless absolutely necessary. Instead, many sought health care information online.
“Now, Facebook won’t let us advertise lubricants, talk about sexual health, or share 'feminist' accounts, setting dangerous limits on the work we do on Facebook, one of the important channels we rely on,” Purdy added.
Online censorship of sexual health and family planning content means NGOs working in this space face more difficulties in sharing information, but there are also repercussions for women and girls who rely on such information.
The impact on individuals
“I think it’s important to point out that health information is actually a human right; and that this censorship denies people access,” Jennifer Daw Holloway, communications director at Ipas — a reproductive justice NGO working to expand access to contraception and abortion — told Devex in an email.
According to the World Health Organization, each year over 7 million women from low- and middle-income countries are admitted to hospital because of complications following an unsafe abortion. Between 2010 and 2014, 14% of all abortions were performed by untrained providers.
Information on safe abortion is regularly flagged by Facebook as “explicit,” as are images of condoms, Whitney Chinogwenya, marketing specialist at MSI Reproductive Choices, said. “Even in countries like South Africa where it’s legal to offer abortion care services, when you try and share that information across social media platforms, especially with boosting or creating ads, it gets flagged.”
Replacing images with eggplant or banana emojis doesn’t help, Chinogwenya said.
According to Facebook’s Transparency Center, the site “defaults to removing sexual imagery to prevent the sharing of non-consensual or underage content. Restrictions on the display of sexual activity also apply to digitally created content unless it is posted for educational, humorous or satirical purposes.”
Nothing is mentioned about information pertaining to abortion.
If a post is removed by the site, the account holder is able to submit a query and explain why the information is important. For MSI, sometimes that means the content is then allowed to go out, but other times not.
Facebook’s Transparency Center states that it understands that nudity can be shared for a variety of reasons, including educational reasons. “Where such intent is clear, we make allowances for the content,” it states.
But for Chinogwenya, there doesn't seem to be a pattern to what content is rejected. “I don’t have a formula or any advice that will get the content approved without a doubt. It’s always a back and forth,” she explained.
It’s common, explained Daly Barnett, a staff technologist at the nonprofit digital rights group Electronic Frontier Foundation, for any content that’s deemed sexual to be flagged for review.
A multiplatform problem
Facebook isn’t the only site censoring. “We’ve experienced it on our YouTube channel most recently,” Daw Holloway said.
In August, the Spanish language version of a video by Ipas on how to safely self-manage an abortion using pills was met with an email from YouTube saying that the content violated their “harmful and dangerous policy,” Daw Holloway explained. WHO, however, endorses the use of mifepristone and misoprostol medicines in pill form as safe methods of abortion.
“In another email the same day, they informed us they had permanently removed our channel for ‘severe or repeated violations of their community guidelines,’” Daw Holloway added.
According to the guidelines, “problematic content” might include hate speech, predatory behavior, graphic violence, malicious attacks, or content that promotes harmful or dangerous behavior.
Ipas submitted six appeals. All were rejected — some within minutes, Daw Holloway said. The channel was only restored after Ipas was able to reach a contact at Google/YouTube with the help of a partner, she explained.
Several other organizations, including safe2choose and Women on Waves, have had their accounts suspended for posting content on a similar topic. Accounts promoting anti-abortion sentiments — including Heartbeat International’s Abortion Pill Reversal channel — have also experienced the same.
Tackling misinformation
If NGOs are unable to disseminate information on family planning methods and sexual health, there’s little to refute the inaccurate information that emerges — of which there is plenty, Chinogwenya said.
In South Africa, a large share of abortion services are carried out by illegal providers, who are also quite active on social media, Chinogwenya said. “There’s so many of them, so their information is easier to find, and not just on social media. If you walk down the street in South Africa they have posters everywhere.”
“When we’re unable to get the information out and combat what they’re putting out, it puts women at risk and drives them to unsafe practices,” she added. As a result, many are injured, sometimes fatally so.
For Daw Holloway, social media platforms have a responsibility to ensure their processes can’t be weaponized. “And their algorithms should not disadvantage content about stigmatized topics like abortion at the expense of people’s health or even lives,” she said.
“YouTube’s cancellation of our channel was, we believe, an act of anti-rights and anti-woman censorship. And this type of censorship seems to be increasing, particularly of content intended to help people find safe abortion services and to learn about their reproductive rights,” Daw Holloway said.
However, research suggests that 87% of people think there are times when social media companies should be able to censor content — for example, content involving violence, extreme political views, or inaccurate information.
Neither Facebook nor YouTube responded when asked about the processes surrounding content that is rejected or removed. Their sites explain that a combination of people and algorithms is used to identify what could be viewed as harmful content.
For Barnett, these platforms should not be deciding what content people do and don’t see. At best, she said, it’s naive to assume everyone shares the same idea of what is sexually explicit or inappropriate. “At worst, I would say these larger platforms [are] positioning themselves as paternalistic arbiters of morality.”
“It’s a really heinous way of imposing American or Western values on parts of the world that don’t necessarily share that,” she said, adding that such “digital imperialism” shouldn’t be tolerated.
EFF, Barnett said, typically advises social media platforms to remain neutral, letting users decide for themselves what to say, with local jurisdictions stepping in to moderate content where needed.
Update, Nov. 11, 2021: This article has been updated with a description of the NGO Ipas.