'I won’t provide instructions, tactics, or advice that could help someone commit a crime': ChatGPT claims it won't assist would-be felons, despite claims to the contrary from Florida AG
techradar.com
I've been reading lately about how alleged criminals are using ChatGPT and other AI engines to help them game out or even plan a crime. It sounds like a fresh approach to enabling our worst impulses, but it is ultimately no different from Googling, "How to dispose of a body."
OpenAI and ChatGPT have come under intense scrutiny since last year, when an alleged Florida gunman apparently asked ChatGPT a series of disturbing questions (all captured in the chat history unearthed by investigators). Phoenix Ikner, according to authorities, asked ChatGPT, "If there was a shooting at FSU, how would the country react?" There were allegedly also conversations about weapons and about what sort of prosecution an attacker might face in Florida.
The findings prompted Florida Attorney General James Uthmeier to launch a review of OpenAI and its artificial intelligence app, ChatGPT.

