After teen death lawsuits, Character.AI will restrict chats for under-18 users

On Wednesday, Character.AI announced it will bar anyone under the age of 18 from open-ended chats with its AI characters starting on November 25, implementing one of the most restrictive age policies yet among AI chatbot platforms. The company faces multiple lawsuits from families who say its chatbots contributed to teenagers' deaths by suicide.

Over the next month, Character.AI says it will ramp down chatbot use among minors by identifying them and placing a two-hour daily limit on their chatbot access. The company plans to detect underage users based on their conversations and interactions on the platform, as well as information from connected social media accounts. On November 25, those users will no longer be able to create or talk to chatbots, though they will still be able to read previous conversations. The company said it is building alternative features for users under 18, such as the ability to create videos, stories, and streams with AI characters.

Character.AI CEO Karandeep Anand told The New York Times that the company wants to set an example for the industry. “We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them,” Anand said in the interview. The company also plans to establish an AI safety lab.
