“Will I be OK?” Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says
arstechnica.com
Sam Nelson started using ChatGPT in high school, but his family alleged that the chatbot later became an "illicit drug coach." Credit: via Tech Justice Law, Social Media Victims Law Center
OpenAI is facing another wrongful-death lawsuit after ChatGPT allegedly told a 19-year-old, Sam Nelson, to take a lethal mix of kratom and Xanax.
According to a complaint filed on behalf of Nelson’s parents, Leila Turner-Scott and Angus Scott, Nelson trusted ChatGPT as a tool to “safely” experiment with drugs after using the chatbot for years as a go-to search engine when he was in high school.
The teen regarded ChatGPT as such an authoritative source of information that, when his mom questioned whether the chatbot was always reliable, he swore to her that ChatGPT had access to “everything on the Internet,” so it “had to be right,” the complaint said.
But Nelson’s confidence in ChatGPT ...
Copyright of this story solely belongs to arstechnica.com.

