According to an investigation by The Verge, Twitter had discussed building an OnlyFans clone to monetize the adult content that has long proliferated on the platform, but the idea stalled over the company's inability to effectively detect and remove harmful sexual content. A Twitter team assembled this spring to assess whether the company could take such a step concluded that "Twitter cannot accurately detect child sexual exploitation and mass non-consensual nudity." Twitter spokeswoman Katie Rosborough said the team's findings were "part of a discussion that culminated in us suspending our workflow for legitimate reasons."
Twitter discontinued its Adult Content Monetization (ACM) project in May, shortly after it agreed to sell itself to Elon Musk for $44 billion — a deal now up in the air. The company's leadership determined that ACM could not move forward without further health and safety measures.
The investigation, which you can read in full here, details a February 2021 warning from Twitter researchers that the company was not doing enough to detect and remove harmful content such as child sexual abuse material (CSAM). Researchers reportedly told the company that Twitter's primary enforcement system, RedPanda, was "an outdated, unsupported tool" and "one of the most fragile, inefficient and unsupported tools" it had used to date.