People are familiar with generative AI’s ability to create photorealistic images; however, some individuals have turned it toward something much darker: nonconsensual pornography. “Deepfakes,” made by taking images of real people and altering them to depict something sexual, can have serious repercussions for victims’ mental and emotional health; experts note they often retraumatize survivors of sexual assault or abuse and can trigger posttraumatic stress disorder (PTSD).
Although developers warn against it, text-to-image AI models can easily be misused to generate pornographic images of specific people without their consent, from politicians and musicians to a crush from high school, none of whom agreed to have their likeness used in pornographic depictions. Experts note that such images can also be used to frighten or harass the people depicted, or even to coerce them into signing releases permitting their image to appear this way.
Because of this potential for misuse, some online communities have banned these tools outright; Reddit, for example, recently shut down several subreddits dedicated to sharing NSFW generative AI pornography. FurAffinity and similar platforms likewise impose strict guidelines on what kinds of explicit AI-generated imagery may be posted.
FurAffinity allows only one piece of NSFW content every two minutes; however, some technically savvy users have found ways to bypass these restrictions by modifying the code of popular generative AI models without permission, producing nude images, some depicting minors, which in many countries would constitute child sexual abuse material (CSAM).