Character.AI retrains chatbot for teen safety following lawsuits
Character.AI, the company behind the popular chatbot platform, announced new teen safety features on Thursday, December 12. For example, the bot will direct users to the US National Suicide Prevention Lifeline when it detects content referencing self-harm or suicide.