Censoring Sex Positive Content and The Internet
Instagram Gets Tough on Censoring Sex Positive Content – The Continuing Effects of FOSTA/SESTA
The History
The issues with Instagram and its strict moderation of sexual content originated in 2018, when then-President Trump signed into law the US bills dubbed FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act). FOSTA/SESTA dealt a significant blow to social media companies by weakening the Section 230 protections of the US Telecommunications Act, which previously shielded them from liability for what was posted on their platforms.
The aim of FOSTA/SESTA was to fight sex trafficking, but in order to avoid federal prosecution, social media platforms including Instagram have implemented policies to control and censor content relating to sex. Strict community guidelines on nudity, sexual activity and solicitation were introduced shortly after the law was passed, enforced through artificial intelligence systems that scan for sexual images, words and emojis, combined with user reports of content violations. Simply posting the word "sex" or using a call to action, "link in bio" for example, has resulted in posts being deleted and accounts being deactivated.
Meta’s official policy says that although it allows discussion of sex worker rights advocacy and sex education, it will not allow content that “facilitates, encourages or coordinates” commercial sex work, in a move that not only conflates legal forms of sex work with sex trafficking but also broadly censors any discussion of sex or sex positive content.
Current State of Play
Although content creators using platforms such as Instagram tread very carefully, those in the business of sex are still subject to heavy-handed judgement when it comes to their content. Even accounts for lingerie brands, sex educators and adult products are being swept up in the conservative approach Instagram is taking to avoid prosecution under the FOSTA/SESTA legislation. There is an appeals process content creators can use to get content reinstated; however, contacting Instagram directly is very difficult. Some creators have managed to reactivate their accounts through a personal connection with someone at the social media mega-company, an advocate on their behalf that is not available to every creator.
The head of Instagram, Adam Mosseri, recently released a lengthy video addressing content suppression and the algorithm Instagram relies on to recommend content to users. Although Mosseri has previously denied that the platform “shadowbans” users, in this statement he does acknowledge that content that violates community guidelines will have its reach limited. Mosseri has also denied that content suppression is used to strong-arm users into paying for advertising; however, options for paid advertising and monetisation are not available to creators whose business is in industries catering to adult content. Furthermore, although Mosseri acknowledges that transparency around what exactly constitutes a violation of community guidelines needs to improve, as does the appeals process, there are currently no solid plans for how this will be progressed.
Since 2018, FOSTA/SESTA has been used successfully only once to prosecute at the US federal level, and its implementation has fragmented the online sex market. Ironically, this has made it increasingly difficult for law enforcement to surveil sex traffickers.
What is on the way?
Although the effects of FOSTA/SESTA will continue to be felt for some time to come, there are other pieces of internet legislation being introduced in the US which, if passed, will have a significant effect on users of social media.
One of these pieces of legislation is the Kids Online Safety Act (KOSA), which has been dubbed one of the worst internet bills of 2023. Proposed by a bipartisan group of US Senators, the bill aims to prevent young people under the age of 16 from accessing material on social media platforms that may impact their health and wellbeing. Aside from the concerns regarding the surveillance of young people's internet usage, the bill would force platforms to adopt a "duty of care" in relation to these users. Such a level of liability will inevitably result in platforms further limiting access to content which is sex positive and focused on education and harm minimisation. This will impact every person who uses social media.