
‘Your content has been removed for violating community guidelines.’
This is the notification Annie Sherlock (owner of Anniemal Art) was prepared to see when her sex art began gaining traction on Instagram. She shared all images with her subjects’ consent, and her captions clearly stated that they were created for artistic purposes. Instagram, however, wasn’t buying it.
For individuals who post to Instagram for fun, content being removed through questionable moderation methods is a minor annoyance. For artists whose livelihoods depend on the platform, it’s another story.
The line between expression and exploitation online is finer than ever, and drawing it becomes increasingly difficult when Instagram’s more than one billion users worldwide must adhere to the same rules and regulations. Not only has Instagram become an essential tool for creators to connect with others and build a community, but the platform has also become home to big money. It’s easier than ever for businesses and individuals to build lucrative careers – albeit at the mercy of the algorithm.
Today, we’ll dive into Instagram’s problematic moderation algorithm, with a specific lens on creatives and sex workers. In seeking to mitigate online harm, is Instagram protecting the lives of some at the cost of the livelihoods of others? As Annie says – ‘Instagram needs to grow up – and let my talent shine!’
Moderation – Since When?
In the early 1990s, internet service providers began implementing content moderation policies to prevent users from accessing illegal or harmful content, such as child pornography or hate speech. Since then, the proliferation of social media platforms has brought new challenges for online moderation, as users can now share and disseminate content quicker than ever before.
There are 95 million photos and videos shared on Instagram every day. This pace requires Instagram to use artificial intelligence alongside human moderators to flag and remove questionable content.
Instagram’s policies commit the platform to removing content that “promotes violence, hate speech, self-harm, or terrorism.” Its community guidelines also prohibit “nudity, sexual content, harassment, and bullying.”
Australia’s Online Safety Act (OSA), passed in early 2021, also applies. It aims to address online harm and harassment by requiring social media platforms like Instagram to “remove harmful content within a reasonable timeframe.” The act focuses primarily on removing harmful content rather than ‘addressing the underlying societal and cultural issues that give rise to online hate and abuse’ (Kerrigan & Kostka, 2021, p. 5). Some critics also argue that the act places too much responsibility on social media platforms to police online content rather than holding individuals accountable for their behaviour online.
Unsurprisingly, Instagram and Facebook have come under fire in recent years for their moderation practices and for the impact of censorship on creators, artists and other marginalised groups.
Many scholars argue that content moderation policies are not merely technical decisions but ‘political and cultural ones that shape the contours of online speech’ (Gillespie, 2018, p. 4). Research emphasises the importance of transparency and accountability in content moderation, arguing that platforms like Instagram must be more transparent about their moderation practices and ‘more accountable to the communities they serve’ (p. 5).
Unfortunately, nude art falls under these prohibitions, as do sex workers who willingly promote sexualised (though not nude) content on their pages. Even mothers who share information on birth and birth education have been banned from posting to Instagram.
These policies are not perfect. They do, however, demonstrate the platform’s commitment to creating a safe and positive environment for its users – and the nuance involved in the seemingly impossible task of preventing online harm.
Gillespie concurs that algorithms “are not foolproof and often result in false positives and negatives that can have significant consequences for creators” (Gillespie, 2018, p. 98).
As we’ll hear – creators and other users often feel that their voices are not being heard. Platforms like Instagram are under intense pressure to balance the interests of different stakeholders, including advertisers, investors, and users, while ensuring the platform remains safe – what Gillespie refers to as ‘conflicting priorities’ (p. 72).
It comes back to the purpose of the platform. As Instagram puts it, they’re “A simple, fun & creative way to capture, edit & share photos, videos & messages with friends & family.” Unfortunately, I think those days are long gone, Zuckerberg.
A Platform For Expression – But Not That Type
Instagram’s content moderation policies fall hardest on those whose work sits at the blurred edge of self-expression. Ironically, sex workers, artists and creatives who rely on the platform to showcase their work are often the most heavily moderated – not the users who direct hate speech or abusive behaviour at them.
Many artists have reported having their content removed or censored without explanation; in some cases, their entire accounts have been deleted (Duffy & Hund, 2021, p. 19). Artist Betty Tompkins has had her account suspended several times for sharing her artwork, which often features explicit sexual imagery. Such instances raise concerns about the platform’s bias towards sexually explicit content and its impact on artists’ freedom of expression.
Digital artist Anna McNaught also had her account suspended without warning, and it took several weeks of appealing to have it reinstated. During that time, she lost access to her customer base and was unable to promote her artwork on the platform (Borland, 2020, p. 7).
Research confirms that Instagram’s moderation policies have a disproportionate impact on marginalised artists and creatives. A study by Matsui and Nguyen (2021, p. 13) found that Instagram’s content moderation policies often target individuals who identify as LGBTQIA+ and people of colour. Additionally, individuals who post content related to mental health, body positivity, and sexual wellness also face increased censorship.
Too Rude or Just Nude?
In recent years, Instagram has faced criticism for its inconsistent policies around nudity. While the platform allows images of topless men, it often removes images of topless women, regardless of context. This has led to accusations of sexism and body-shaming, and fuelled the rapid growth of political movements such as ‘Free the Nipple.’
The algorithm also tends to prioritise content that is likely to generate engagement – often content that is harmful, unsolicited or unsubstantiated – which can lead to a homogenisation of what appears in our feeds.
One study found that Instagram’s algorithmic moderation practices tend to favour content that conforms to certain aesthetic and demographic norms, such as images of thin, white, conventionally attractive women.
Numerous sex workers have reported having their accounts deleted or suspended even when they were neither violating Instagram’s Community Guidelines nor receiving hate or bullying (Roberts & Quiroz-Martinez, 2021, p. 326). Perhaps most frustrating is that Instagram’s appeals process is opaque and inconsistent, and many sex workers report that their appeals are denied without explanation (Matsui & Nguyen, 2021, p. 12).
The online harms faced by individuals who rely on Instagram for their livelihoods cannot be overstated. While creators may be participating in a supposedly ‘safe’ online community, it becomes increasingly difficult for them to gain visibility and build a following, harming their ability to earn a living and connect with their audience (Abidin & Ots, 2019, p. 8).
The question is, which is the greater harm? Do we prioritise protecting audiences from exposure to nudity, or protecting creators from being stripped of the opportunity to express and create? The censorship does seem ironic – these creators are the very people whose content keeps the platform alive.
A Brighter Day – Sunroom To The Rescue
Despite the difficulties of being a creative on Instagram, talented people are dreaming up new and innovative ways to jump the hurdles. Enter Sunroom: an app that offers a safe, open home to creators who have been banished from Instagram. As they put it, ‘We are the app where women and women-identifying people make money.’
Sunroom was launched in 2020, built by two ex-dating-app designers – Lucy Mort of Hinge and Michelle Battersby of Bumble. The pair saw a gap in the market for creators who wanted to monetise their online content but were averse to using OnlyFans and had been censored by Instagram.
Sunroom operates on a subscription-based model that enables creators to monetise their content and offer exclusive access to their followers. Unlike Instagram, Sunroom has no content moderation policies, meaning that creators have complete freedom of expression and can share any content they want without fear of censorship. Additionally, Sunroom has features like anti-screenshot capabilities, watermarking, and other measures to prevent hate speech and other forms of abuse.
Sunroom has already gained traction among creators affected by Instagram’s content moderation policies. One such creator is Rachel Cargle, an activist and writer who has been vocal about the censorship she has faced on Instagram. Cargle has praised Sunroom for its commitment to allowing creators to express themselves without fear of censorship and for creating a platform that centres the voices of marginalised communities.
While it remains to be seen whether Sunroom will become a significant player in the social media landscape, its early success suggests a demand for platforms that prioritise creators’ interests over those of advertisers and investors.
Share, Report, Block, Delete – What’s Next?
Instagram remains a crucial platform for self-expression and community-building. For it to stay that way, we must work towards more inclusive and equitable moderation policies that support rather than harm creators, artists and sex workers.
In my view, we require a shift away from the current algorithmic moderation practices that prioritise conventional content and towards a more nuanced and context-specific approach that considers the diversity and complexity of the content posted on the platform. It also requires recognising the value and legitimacy of sex work as work and a commitment to supporting and empowering sex workers and artists rather than stigmatising and marginalising them.
As we have discussed, moderation on platforms like Instagram can be a double-edged sword, as it can protect users from harmful content while also limiting their ability to express themselves freely. Flew argued that the regulation of platforms should be guided by principles of “symmetry,” which means that the same rules and standards should apply to all participants on the platform, regardless of their size, demographics or influence (Flew, 2021, p. 114). He also highlighted the importance of considering “the needs and interests of both creators and users, as well as the broader societal impacts of the platform.”
Overall, the challenge of regulating social media platforms like Instagram is a complex and multifaceted one. It requires balancing the needs and interests of multiple stakeholders while also considering the broader societal impacts of the platform. Moving forward, it will be crucial for regulators and platform operators to take a thoughtful and nuanced approach to moderation and regulation, in order to ensure that these platforms are able to operate in a sustainable, equitable, safe and socially responsible way.
For Annie, it comes back to the intent of the content. Expressing sex and beauty is, as she says – ‘a part of life.’ Much like nudity. Or relationships. Or art. If people are unable to embrace that through Instagram, they will surely find another way to do so.
What are your thoughts on moderation and the prevention of online harm? We would love to hear in the comments below.