ChatGPT’s ‘Adult Mode’ Is Coming in 2026

OpenAI announced the latest version of its flagship AI model, GPT-5.2, on Thursday as part of its ongoing effort to keep pace with competitors. And while GPT-5.2 reportedly performs well in most benchmark tests, there’s one metric OpenAI did not provide details on: how horny is it? It looks like we’ll find that out sometime early next year. Fidji Simo, OpenAI’s CEO of Applications, told reporters that “adult mode” will debut in ChatGPT during the first quarter of 2026.

The porn-brained version of ChatGPT is something that CEO Sam Altman promised earlier this year, after some of the chatbot’s users freaked out about it getting lobotomized by the update to GPT-5. In October, Altman acknowledged that the company had dialed back the chatbot’s personality amid growing concerns about user mental health. Those concerns followed a wrongful death lawsuit filed against OpenAI by the parents of a 16-year-old who had asked ChatGPT, among other things, for advice on how to tie a noose before taking his own life.

The company responded to those concerns with additional parental controls and new efforts to age-gate younger users with a “safer” version of ChatGPT, which estimates the user’s age and filters them into the appropriate experience. That age estimation feature is still key to OpenAI opening the door to “adult” experiences. According to The Verge, Simo told reporters the company is still testing its age verification technology and wants to ensure it can accurately identify teens and not misidentify adults before officially rolling out the split experience.

At this point, the promised “adult mode” is inextricably linked to Altman’s specific promise of allowing ChatGPT to produce “erotica.” But Altman later clarified that the idea is to give adult users more “freedom” in how they interact with the chatbot, including allowing it to develop more of a customized “personality” over the course of conversations with the user.

Adults may be better equipped than kids to remember they are talking to a chatbot and not a real person, but it is still dubious to suggest that giving users the ability to get attached to a robot’s fake personality is in any way good for safety. A study published earlier this year in the Journal of Social and Personal Relationships found that adults who have developed emotional connections with chatbots are significantly more likely to experience higher levels of psychological distress than those who don’t. Other studies have found that people with fewer real-life relationships are more likely to confide in chatbots, and even OpenAI has acknowledged in the past that some users are at risk of becoming emotionally reliant on ChatGPT. Presenting that as simply being a choice that an adult can make at a time when we know so little about this technology feels a bit like passing the buck.


©2025 GIZMODO USA LLC.

All rights reserved.

Source: Gizmodo
