RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 2 months ago
OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide (arstechnica.com) · 104 comments
missingno@fedia.io · 2 months ago
If this is what ChatGPT is “supposed to do,” then that’s the problem. A yes-man that will say yes to anything, even suicide, is dangerous.