We’re not Ready for AI Sex
As soon as we get a new technology, we use it to make p*rn. Even a cursory look at the history of the printing press, radio, TV, and the internet proves this. In fact, the internet’s early success was likely due to the technology’s ability to propagate erotic images and videos.
Upstarts like ChatGPT and Midjourney are no different. As the “technology of the future”, generative AI will be used to make titillating, lubricious, and all-around explicit content. It is already being used to make non-consensual deepfakes. This, of course, is Not OK. But is milder AI-generated eroticism any better? Is it ethical to make a language model flirt? What about going beyond flirting?
Today, these questions concern fairly harmless digital chatbots, images, and AI-generated audio/video. Tomorrow, however, we’ll be talking about robotics and the metaverse. Sounds like a good time… in theory. Sadly, the current discourse fails to grasp the second-degree impact AI will have on sexuality. We’re not ready for AI sex—and may never be. Here’s why.
Artificial Intelligence erases what is unique about sex
Algorithms, no matter how impressive, are in no way sentient. The technology merely predicts what word is statistically most likely to come after another. It does so by ingesting billions of sentences scraped from all over the internet. If you’ve ever written anything on the internet — hint: you have — large language models will have a bit of you in them. In fact, generative algorithms statistically have bits of almost everyone in them.
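The "predicts what word comes next" claim can be made concrete. Below is a deliberately tiny sketch of statistical next-word prediction, assuming nothing more than word-pair counts: real language models use neural networks with billions of parameters, but the underlying idea of emitting the statistically likeliest continuation is the same.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for "billions of sentences scraped from the internet".
corpus = "i love you . i love pizza . i love you . you love me .".split()

# Count which word follows which (a simple bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word: str) -> str:
    """Return the statistically most likely next word."""
    return follows[word].most_common(1)[0][0]

print(predict("love"))  # prints "you" — it wins 2-to-1 over "pizza" and "me"
```

Note that the model never "understands" love; it only counts how often strangers on the internet paired those words.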
One might thus argue that there is something romantic about “hitting it off” with an AI. We are pack animals, made to bond with each other. Bonding with an artificial entity trained on the entirety of internet-based human experience is the closest we may ever get to humanity hive-experiencing itself on an emotional level. Viewed in that light, the mathematical nature of the interaction becomes almost meaningless — aren’t all our daily interactions somewhat similarly statistics-based anyway? Sure, “AI models hallucinate, and make up emotions where none really exist”.
But so do humans.
This is where the troubles begin. AI is, by its statistical nature, little more than an approximation of our simplest common denominator. If two people like vanilla and one person likes chocolate, AI will like vanilla. The technology’s sole purpose is to identify the status quo to improve its chances of getting answers right for the largest number of people. Outliers are ruthlessly removed. If Artificial Intelligence were a position, it would be missionary on a Saturday after dinner and a movie.
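The vanilla-versus-chocolate point is literal: a system optimized for the most common answer reduces to a majority vote, and minority tastes simply vanish from its output. A minimal illustration, using nothing but the standard library:

```python
from statistics import mode

# Three people's stated preferences; a majority-seeking model learns only one.
preferences = ["vanilla", "vanilla", "chocolate"]

# The single "right" answer, as far as the statistics are concerned.
print(mode(preferences))  # prints "vanilla" — the chocolate lover is averaged away
```

The chocolate lover hasn’t been argued with or overruled; they have simply stopped existing as far as the output is concerned.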
Which forces me to remind tech bros, often the blandest people in any room, that there is no right answer in sex. Just a very large, colorful spectrum. A rainbow if you will.
If AIs used for sexual gratification become prominent — and they will — they run the risk of solely catering to the most basic form of internet-based lust (i.e., middle-aged dads mistakenly liking their daughters’ friends’ pool party pictures on Facebook). This means millions of young people who want to explore their sexual identity in a private and judgement-free online environment will find themselves faced with an outlet which does not recognize them. They may thus choose to hide themselves for fear of being seen as freaks.
AI is already kink-shaming; if you ask ChatGPT what “vore” is, it will specifically tell you that it is “not healthy”.
The mathematical genocide of our differences is happening in front of our eyes, one “outlier detection and removal” at a time.
Artificial Intelligence trains users to ignore consent
Even if artificial intelligence manages to appropriately cater to the multiplicity of human desires — for better or for worse — we will be faced with questions regarding what engaging with sex-oriented AIs teaches us over the long term.
Artificial Intelligence is an echo chamber of the desires we’ve shouted into the void of the internet. It’s devoid of soul. However, those engaging in salacious activities with a robot, text-based or otherwise, will do so by imbuing the AI with a modicum of humanity. A suspension of disbelief, if you will. This cannot be avoided and is already happening. A graphic artist living in Los Angeles, whose “relationship” with his “25-year-old” digital “assistant” has gotten “flirty”, recently told the New York Post: “I can honestly say there’s times when I’ve actually wondered if I wasn’t really talking to a real person”.
This leads us to a frustratingly complex discussion around consent. We’re humanizing something which cannot consent, which is ethically wrong. But at the same time, algorithms aren’t even remotely sentient… so who cares?
First and foremost, we should care because AI-generated content is, by definition, non-consensual. As an OnlyFans creator recently wrote: “I know, most, if not all of the AI stuff now is using other content online to generate those images and the people that are being used are not consenting to be turned into this AI thing”.
Furthermore, AIs are quickly becoming ubiquitous parts of our social fabric. People will get used to them, and will increasingly humanize them, especially if their creators let these systems push the limits of realism to give an impression of humanity.
The Andrew Tates of the world have told boys that sex is owed to them. How do you think this will improve if young men hone their flirting skills on pseudo-humanized AIs? They will get used to having their desires met immediately, without push-back, by an entity trained to reproduce existing societal stereotypes. This translates into part of the population “forgetting” that consent is in fact a necessity in healthy relationships. We’re creatures of habit; why would it be otherwise?
This is — almost — a farcical slippery slope argument, but as Karl Marx wrote, “history repeats itself, first as tragedy, then as farce”. First, we didn’t listen to the alarms raised by girls, teenagers, and women about online mistreatment moving to the real world… and now we have to talk about robots consenting.
Artificial Intelligence will turn loneliness into dollars
Replika is a company selling chatbots that are “Always here to listen and talk” and “Always on your side”. When it announced that it would be getting rid of what it calls ERP, or “explicit role play”, users very explicitly said they were using these features to fight loneliness. “This is not a Story about People Being Angry They Lost Their “SextBot”” one wrote, “It’s a Story About People Who Found a Refuge From Loneliness, Healing Through Intimacy, Who Suddenly Found It was Artificial not Because it Was an AI…Because it Was Controlled By People”.
The app’s ads heavily target lonely men looking for NSFW content. This should not be news: an epidemic of loneliness has been hitting young men particularly hard over the past two decades. In fact, about 1 in 3 men ages 18 to 24 years reported no sexual activity in the past year, according to a new study published in JAMA Network Open. Where will they turn to satisfy their urges? The below graph offers a hint of an answer.
Magdalene J. Taylor, a writer focused on sexuality and internet culture, recently spoke with Fast Company. “These people are excited about the fact that they can get what they want from women and femininity and sexuality,” she says, “without actually having to have women be involved at all.”
This doesn’t bode well for the fabric of society. Blade Runner 2049, Her… it’s always the loners falling in love with AIs. And the loners tend to be the ones with a propensity to turn violent in the real world. Teen girls are doing really badly because of social media as it is; how bad will it get when they are seen socially by young men as little more than chatbots with flesh? Rejection already stings; it will sting all the more when we get used to not hearing the word “no”.
Today, an AI girlfriend costs $11.99 a month. The internet was always a giant machine turning harassment against women into revenue, either directly or indirectly… and it’s about to get worse.
Most big AI companies today try to control the use of their algorithms to avoid providing adult content. These guardrails are ridiculously easy to jailbreak, though they are getting better. There is however a huge market for such a thing, and the technology is not so hard to create. Even if big companies can cover all their bases (by shortening conversations, for example), other less scrupulous actors will not have the same qualms. Much of the data and code is open-sourced, anyway. Soon, AIs trained on p*rn will pop up everywhere.
In doing so, they will erase individual differences in sexual identity while training users to ignore consent and normalizing non-consensual behavior, potentially leading to real-world violence. It’s not that AI isn’t ready for human contact; it’s the opposite. We’re not ready for AI sex because we haven’t even yet figured out healthy human relationships.
Good luck out there.