OpenAI is developing a new version of ChatGPT tailored for users under 18, addressing concerns about chatbots' impact on teenagers' mental health. The move follows a lawsuit from the parents of a teenager who took his own life, allegedly in connection with his use of ChatGPT. The new version will let parents link and control their children's accounts, prioritizing safety over privacy.

OpenAI Introduces Youth-Friendly Version of ChatGPT
A new version of ChatGPT, tailored for users under 18, is under development, according to a blog post by OpenAI's CEO. A technical solution will also be introduced to identify young users and prevent them from using the standard version of ChatGPT.
OpenAI's decision to create a version of its popular service specifically for young users likely stems from researchers' concerns about the link between chatbot use and teenagers' mental health.
OpenAI was also recently sued by the parents of a teenager who took his own life, which they claim was linked to his use of ChatGPT. Following the lawsuit, OpenAI promised to take action.
Linked Accounts
"We prioritize safety over privacy and freedom for teenagers. This is new and powerful technology, and we believe young people need significantly better protection," writes Open AI's CEO Sam Altman in a blog post.
The new version of Chat GPT gives parents significant control over their teenagers' experience on the platform. Parents can link their accounts to their teenagers' accounts, allowing them to set various usage restrictions.
"Convincing young users to let their parents link their accounts is probably the biggest hurdle," comments the site Axios.
It is not yet clear when the new youth version of ChatGPT will roll out.