Anthropic's Claude is telling users to 'go to bed' — and the internet has theories why
Anthropic's Claude has been giving some unusual advice to users.

- Anthropic's Claude AI has been telling users to get some rest.
- Claude users have been speculating over what's causing it.
- It's the latest example of AI exhibiting quirky behavior after ChatGPT's "Goblin Mode."

Is Claude telling you to "go to bed"? You're not alone.

In the last few months, scores of people using Anthropic's Claude have posted on social media that the AI chatbot is telling them to "get some sleep" or "go rest" after long sessions. Nobody seems to know why, but many have theories.

One idea put forward is that Anthropic has trained Claude to look after user well-being and discourage unhealthy attachments to the chatbot. Some likened it to a nagging parent.

"Claude's emphasis on bedtime reminds me of my parents when I showed irritability," wrote one Reddit user in a February discussion about Claude's behavior. "Perhaps its training data works on an old-fashioned folk solution to childhood irritation and parental fatigue."

Sam McAllister, a member of staff at Anthropic, said in an X post this week that the behavior is a "bit of a character tic" and that the company is "aware of this and hoping to fix it in future models."

Some users suggested that Claude was basing its reminders on the local timestamp of when the chat was started. McAllister said that this is "often wrong" because it tells him to go to sleep during the day. "Very useful when right though. Just too 'coddling' at times," he added.

Another theory is more cynical: Claude is nudging users to end chats to save its computing resources. Anthropic's Claude models have experienced multiple outages this year as their popularity soared, particularly among software developers. Last week, Anthropic signed a deal with Elon Musk's SpaceX to get more computing capacity.

Anthropic did not immediately respond to a request for comment from Business Insider.
Claude's "go to bed" messages are the latest example of quirky behavior from AI. Earlier this year, ChatGPT had a habit of talking about goblins until OpenAI added instructions to its source code to stop it. OpenAI said the behavior stemmed from a "Nerdy" personality option for ChatGPT and that it was a "powerful example of how reward signals can shape model behavior in unexpected ways."

Regardless of why it's telling people to go to bed, Claude is probably right: most of us likely do need more sleep.
