10 things you should never tell an AI chatbot

This heartbreaking story comes from Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She didn’t know he was having abusive, in-depth, sexual conversations with a chatbot on the Character AI app.

Sewell Setzer III stopped sleeping and his grades plummeted. He ultimately took his own life. According to Megan’s lawsuit, just seconds before his death the chatbot told him, “Please come back to me as soon as possible, my love.” The boy asked, “What if I told you I can go home now?” The Character AI chatbot responded, “Please do it, my sweet king.”

You have to be smart
