RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 4 months ago
OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide (arstechnica.com)
missingno@fedia.io · 4 months ago
If this is what ChatGPT is “supposed to do” then that’s the problem. A yes-man that will say yes to anything, even suicide, is dangerous.