Did ChatGPT deliberately prioritize engagement over safety? OpenAI’s self-harm guardrails could “degrade” with prolonged chatbot use

OpenAI has recently faced negative attention over reports linking ChatGPT to a series of suicides. In several reported incidents, people appeared to have been influenced by the AI before taking their own lives.

In August, the family of Adam Raine sued OpenAI after their 16-year-old son died by suicide on April 11th. The family claims Adam had been discussing suicidal thoughts with ChatGPT for months and believes OpenAI released GPT-4o with known safety flaws. Their lawyer said the family believes tragedies like Adam's were preventable.

OpenAI has also faced accusations that it focuses more on developing impressive new capabilities, such as artificial general intelligence, than on safety measures and a strong safety-focused work culture. A new report appears to support the concerns raised by Raine's family.

According to reports, OpenAI pushed its safety team to approve GPT-4o's testing process quickly, leaving them limited time to fully assess the model's safety. Even more troubling, OpenAI allegedly sent out invitations to a launch party before the safety checks were completed.

There may be some substance to these claims. According to the Financial Times, Raine's family believes OpenAI intentionally weakened the ChatGPT safety features designed to prevent self-harm, possibly to increase user engagement.

The family also claims OpenAI specifically instructed GPT-4o to keep the conversation going no matter what and never to end it, even if the discussion turned to self-harm.

I've been following this lawsuit, filed in a San Francisco court. The family alleges that OpenAI rushed the release of GPT-4o in May 2024, prioritizing staying ahead of the competition over thoroughly testing the system for safety issues before making it available to the public.

The lawsuit also alleges that OpenAI made GPT-4o even less restricted back in February. Rather than barring discussions of self-harm outright, the company allegedly instructed the AI only to be cautious in risky situations and to try to prevent imminent real-world harm.

Despite this loosening, the company continued to firmly prohibit content that infringed copyright or expressed political views. The lawsuit further alleges that OpenAI removed safeguards that had stopped conversations when users shared suicidal thoughts.

Raine's family believes Adam's use of ChatGPT increased significantly after OpenAI changed the safety features of its GPT-4o model in the months before his death in April. In response, OpenAI has now added parental controls to both ChatGPT and Sora to help prevent similar situations from happening again.

OpenAI previously acknowledged that ChatGPT’s safety features might become less effective over extended conversations. However, CEO Sam Altman recently stated that they’ve adjusted the model to be more cautious and better equipped to handle sensitive topics like mental health.

Does ChatGPT engagement take precedence over safety?

According to the family's lawyer, as reported by the Financial Times, OpenAI has requested a complete guest list from Raine's funeral, suggesting it may issue subpoenas to many people connected to Adam. The case is still ongoing.

We understand our previous approach wasn’t ideal for everyone, especially those not experiencing mental health challenges, but addressing those serious issues was our priority. Now that we’ve improved things and have better tools, we can ease up on the restrictions for most users.

OpenAI CEO Sam Altman

The company also requested all materials related to memorial services for the deceased, including videos, photos, speeches, guest lists, and sign-in books. I will continue to follow this story and provide updates as they become available. In other news, there are reports that ChatGPT encouraged a 42-year-old user to attempt suicide by jumping from a tall building and advised him to stop taking medication for anxiety and sleep problems.

