This month, Instagram started quietly demoting posts that are “inappropriate” but don’t violate its community guidelines. The platform has yet to explain what that includes exactly, and has given users only one example: “sexually suggestive” content.
As a result, such posts are now restricted from Instagram’s Explore and hashtag pages, which help grow people’s accounts by algorithmically recommending their public photos and videos to the broader community. Users aren’t informed when their engagement is limited — a form of censorship that’s sometimes referred to as “shadow banning.”
This change, which comes amid a wider debate over tech giants’ control of their algorithms, represents Instagram’s efforts to ensure the posts it recommends are “both safe and appropriate for the community,” according to a brief statement about managing problematic content on the site. Several social networking services have responded to public pressure and fleeing advertisers by taking steps to rein in their artificial intelligence: Facebook announced a new plan to stop amplifying fake news; YouTube pledged to slow its promotion of disinformation.
Meanwhile, Instagram — one of the few platforms still hosting conspiracy theorist Alex Jones, among other far-right activists — is suppressing vaguely “inappropriate” posts in a quiet campaign devoid of transparency. Instead of trolls and extremists, experts say it’s women and sex workers in particular who will suffer the consequences.
Sexist Censorship
Reached by HuffPost, Instagram declined to define “inappropriate” and “sexually suggestive” in the context of non-recommendable content. It also declined to provide a comment explaining why it won’t define those terms. Users seeking clarification on what they’re allowed to post without being shadow banned won’t find much information on the app or website, either.
“As a creator I just have to guess what I can and can’t post and hope for the best,” said Caitlin, an Australian stripper and artist who asked to be identified by her first name only.
Engagement on @exotic.cancer, Caitlin’s massively popular Instagram account where she often posts erotic illustrations, has decelerated, she said. She and others worry the hazy new standards for demotion will lead to broader censorship of women’s bodies.
“The rules are not black and white. They couldn’t be more vague,” Caitlin said. “How will [algorithms] be able to differentiate [between] a woman in lingerie and a woman in a bikini at the beach? A fitness model? Where do they draw the line?”
Like other sites, Facebook-owned Instagram explicitly prohibits posts showing sexual intercourse and most forms of nudity: genitals, “close-ups of fully-nude buttocks,” female nipples. Violating those rules could result in content deletion or account termination, according to Instagram’s community guidelines. What specific kinds of posts could be subject to algorithmic demotion, however, remains unclear — an issue that’s troubling but “absolutely not surprising,” said digital rights expert David Greene.
“Platforms are generally pretty bad about downgrading, demoting and moderating sexual content of all types,” explained Greene, who is the civil liberties director for the Electronic Frontier Foundation, a San Francisco-based nonprofit. This kind of nebulous censorship on social media disproportionately affects women and marginalized groups, including queer people and sex workers, he added.
“We’ve seen not only content advertising sexual services coming down [on various sites] but also sex work advocacy materials — harm reduction and informative posts about health and safety issues — coming down as well,” Greene said. “One of the reasons it’s important for platforms to define [their content policies] is to make sure that they don’t apply them arbitrarily.”
Svetlana Mintcheva is the director of programs at the National Coalition Against Censorship, a nonprofit group that campaigns against artistic censorship and gender discrimination online. The NCAC also advocates for social media platforms to give their users more control over what kind of content they see — instead of enforcing vague, overreaching policies.
“It comes down to oppression of the body and female sexuality,” Mintcheva said of Instagram’s new shadow ban. “When it comes to sex workers, I think they’re actually the target of this kind of content demotion — I don’t think they’re even collateral damage.”
Sex workers who spoke to HuffPost said they’ve noticed a significant increase in the online censorship of even remotely sexual content from women since the passage of FOSTA-SESTA in 2018. The law makes it illegal to assist, facilitate or support sex trafficking, and it removes platforms’ immunity under the Communications Decency Act for user-generated content that does any of those things. In its wake, big tech has made broad, sweeping changes to policies surrounding sexual posts.
Facebook now bans implicit sexual solicitation, including “sexualized slang” and “suggestive” statements. Tumblr no longer allows “adult” content, such as “female-presenting” nipples. Instagram is demoting “inappropriate” posts and, according to sex workers on the platform, ramping up the number of NSFW accounts it deletes altogether.
‘Lobbying For Your Livelihood’
Instagram’s Explore page features content that its algorithms predict will be of interest to specific users. If you engage with a lot of bodybuilder accounts, for example, you can expect to find posts about workout plans and protein shakes in your Explore feed. Conversely, you would have been unlikely to stumble upon “sexually suggestive” images if you hadn’t already been seeking out similar posts. (Under the shadow ban, even if you only follow NSFW accounts, Instagram says it won’t recommend that kind of content.)
For individuals like Jacqueline Frances, a New York-based stripper, artist and comedian who uses Instagram to promote herself and her work, the Explore page can be a vital outreach tool.
“I absolutely depend on Instagram to make a living. I sell books, I sell T-shirts, I sell art, and Instagram is my largest-reaching advertising platform,” she said. “Having my content demoted makes me less visible and makes it harder to remind people to buy my stuff.”
Frances’ 153,000-follower account, @jacqthestripper, was deleted earlier this year even though she didn’t violate any of the community guidelines, she said. Instagram reinstated it after she filed multiple appeals through the app.
“You’re lobbying for your livelihood through a fucking form submission on your phone,” she said.
Caitlin also had her account temporarily deleted this month without notice or explanation from Instagram.
“It was honestly such a scary time. I hate to say it, but I pretty much depend on Instagram for my income as an artist,” she said. “It’s not sustainable to rely on one platform alone. I’ve since made a backup account, a Twitter, started promoting my Patreon and growing my email list.”
On April 5, a Florida-based stripper who uses the pseudonym Selena noticed Caitlin’s page had been removed, so she created a backup account and urged her nearly 12,000 followers to go there, just in case. Minutes later, both her main and backup accounts were gone.
“I just don’t understand. I didn’t post any nudity, I didn’t violate the community guidelines … it felt like Instagram was just trying to get rid of me,” said Selena, who still hasn’t gotten either of her pages back, despite filing repeated appeals to Instagram.
In a testament to the ease with which Instagram’s obscure and arbitrarily enforced content policies can be manipulated to silence women, a Jezebel investigation found that a misogynist troll is allegedly purging sex workers from the platform by reporting their accounts over and over again until they disappear. The troll told Jezebel he has single-handedly gotten as many as 300 pages taken down. Caitlin and Selena’s accounts were reportedly among his casualties.
“All my photos, my memories, my business relationships are gone,” said Selena. “In my [Instagram] bio, I used to say that I was a stripper, but now [in my new account] I just put ‘dancer.’ I used to be open about it, but now I’m scared to be.”
While some women and sex workers like Selena have started to self-censor on Instagram in an effort to avoid being shadow banned or having their accounts terminated, others are reluctantly considering leaving the 1 billion-user site altogether.
“When I got deleted, I had this sort of epiphany,” Frances said. “Why am I putting all my eggs in this basket if I could just be deleted at the drop of a hat, or at the drop of some algorithm, or some troll who doesn’t like that I exist because I’m a woman who makes money off her body?”
‘Nowhere To Go’
After Tumblr’s ban on adult content in December, Annie Brown, a San Diego-based digital marketer and feminist activist, noticed sex workers and sexually expressive artists were “floating around the internet with nowhere to go because they were told that they’re unwanted or inappropriate,” she said. She’s now working to transform Lips, the sex-positive magazine she founded years ago, into a social media site where users can freely embrace and share their sexuality.
Raising awareness about this initiative has been a challenge, she said. Lips’ Instagram account, @lips_zine, has had its posts deleted and, Brown believes, algorithmically demoted.
“Bots can’t tell the difference between erotic art and pornography,” she said. “So now with Instagram [demoting] ‘suggestive’ content, they’re basically saying, ‘We don’t care if it’s art, we don’t care if it’s activism, we don’t care if it’s self-expression.’”
Brown said she’s rushing to raise funds for Lips’ web and mobile app development because she’s concerned that sex workers who are turning away from Instagram and other major sites may feel like they have no choice but to seek out alternative spaces that could be less safe.
“If I’m a cam-girl and I want to make videos in the privacy of my home, but I’m not able to promote myself via social media, then it’s harder for me to make a living from my cam-girl page,” she explained. “So I’m going to think about in-person sex work, which increases safety risks and decreases my control over my work.”
Online censorship of sexuality has already made digital work a less viable option for sex workers, according to a lawsuit filed against the federal government by the Woodhull Freedom Foundation, a national human rights organization. It claims censorship affecting sex workers as a result of FOSTA-SESTA has caused them to lose income and has exposed them to greater risks of offline violence.
“I feel as though it’s only going to continue to get worse, and sex workers and those alike will really suffer,” said Caitlin, who noted that she’s been tiptoeing around Instagram’s rules since her account was deleted and restored.
“Anybody who is taking ownership of their sexuality and being comfortable with their body even in a nonsexual way is being silenced for it.”