basiclemmon98@lemmy.dbzer0.com to Not The Onion@lemmy.world · English · 1 day ago
After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis (arstechnica.com)
43 comments
NutWrench@lemmy.ml · 18 hours ago
An early AI was once asked, “Bob has a headache. What should Bob do?” And the AI replied, “Bob should cut off his own head.”
The point being: AIs will give you logical solutions to your problems, but they won’t always give you practical ones.
krunklom@lemmy.zip · 10 hours ago
Except they won’t always give you logical answers.
RedditRefugee69@lemmynsfw.com · 17 hours ago
Yes, eating one small rock a day is logical.