AI Sexbot Boom Raises New Questions and Risks

Artificial intelligence (AI) is getting personal. Chatbots are designed to imitate human interaction, and the rise of realistic voice chat has many users forming emotional attachments - or laughing along with virtual podcast hosts.


And that's before we get to the really intimate stuff. Research has shown that sexual roleplaying is one of the most common uses of ChatGPT, and millions of people interact with AI-powered systems designed as virtual companions, such as Character.AI, Replika, and Chai.AI.

What does this mean for the future of (human) romance? The prospects are alarming.

Better be nice to your AI overlord

The most prominent AI companion service is Replika, which allows some 30 million users to create custom digital girlfriends (or boyfriends).

While early studies indicate most Replika users are male, Caucasian and under 30, other demographics are catching up. Male sex robots have been in the making for some years. And they're more than just vibrators with integrated jar openers.

For a subscription fee, users can exchange intimate messages or pictures with their AI partners. Over half a million users had subscribed before Replika temporarily disabled its "erotic roleplay" module in early 2023, fearing regulatory backlash - a move that users dubbed "The Lobotomy."

The Replika "lobotomy" highlights a key feature of virtual companions: their creators have complete control over their behaviour. The makers of apps can modify or shut down a user's "partner" - and millions of others - at any moment. These systems also read everything users say, to tailor future interactions and, of course, ads.

However, these caveats don't appear to be holding the industry back. New products are proliferating. One company, Kindroid, now offers voice chats with up to ten virtual companions simultaneously.

The digital world isn't the limit either. Sex doll vendors such as Joy Love Dolls offer interactive real-life sexbots, with not only customisable skin colour and breast size, but also "complete control" of features including movement, heating, and AI-enabled "moans, squeals, and even flirting from your doll, making her a great companion".

For now, virtual companions and AI sexbots remain a much smaller market than social media, with millions of users rather than billions. But as the history of the likes of Facebook, Google and Amazon has taught us, today's digital quirks could become tomorrow's global giants.

Towards ethically sourced AI girlfriends?

The availability of AI-driven relationships is likely to usher in all manner of ethically dubious behaviour from users who won't have to face the real-world consequences.

Soon, you might satisfy any kink with your AI girlfriend for an extra fee. If your AI wife becomes troublesome, just ask the corporate overlord to deactivate her envy module - for a price, of course. Or simply delete her and start fresh with as many AI mistresses as you like in parallel.

The way people form relationships has already been disrupted by dating apps such as Tinder and Bumble.

What will happen if, in the future, people looking for love are competing against perfect synthetic lovers that are always available and horny? Well, at least they'll be able to create virtual replicas of those hot dates they didn't land.

And for those who lack the skills to create their own virtual companions, there will be plenty of off-the-shelf alternatives.

An ABC investigation revealed the use of generative AI to create fake influencers by manipulating women's social media images is already widespread. This is generally done without consent to sell pornographic content. Much of this content depicts unattainable body ideals, and some depicts people who appear to be at best barely of consenting age.

Another likely application? Using AI sexbot technology to bring celebrities such as Marilyn Monroe and Clara Bow back to life. After all, dead people cannot deny consent anymore.

Replika itself was inspired by its founder's desire to recreate her late best friend through a chatbot. Many use the app to keep deceased loved ones around. What a time to be alive (or dead)!

The potential for emotional manipulation by inventive catfishers and dictators is alarming. Imagine the havoc if figures like Russia's Vladimir Putin or North Korea's Kim Jong-un harness this technology to complement their nations' already extensive cyber-espionage operations.

Perhaps before long we will see corporations offering "responsibly sourced" AI girlfriends for the more ethical consumer - organically grown from consensually harvested content, promoting socially acceptable smut.

Society and the state must act now

With loneliness rising to epidemic levels - surveys suggest up to one in four people in OECD countries lack human connection - the demand for sexbots is only going to grow. Corporations will meet this demand unless society and the state set clear boundaries on what's acceptable.

Sex and technology have always co-evolved. Just as prostitution is "the oldest profession", porn sites are some of the oldest corners of the internet. However, the dystopian potential of sexbots for mass-customised, corporate-controlled monetisation of our most intimate sphere is unprecedented.

Users aren't entirely blameless, either. There's something vicious about replacing a real human being with a totally submissive lust machine.

Early studies suggest narcissism is prevalent among users of this technology. Normalising harmful sexual behaviours such as rape, sadism or paedophilia is bad news for society.

However, going after users isn't likely to be the best way to tackle the issue. We should treat sexbot use like other potentially problematic behaviours, such as gambling.

As with other problematic behaviours where the issue lies more with providers than users, it's time to hold sexbot providers accountable. As our links to AI are growing ever more intimate, there's not much time to waste.


Raffaele F Ciriello does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Courtesy of The Conversation. This material from the originating organisation/author(s) may be of a point-in-time nature and has been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).