• 10 Posts
  • 230 Comments
Joined 2 years ago
Cake day: September 21, 2023


  • I think the main thing is that even if they're using the same underlying model (like ChatGPT or Claude), each tool gives it a different prompt. For example, the one you linked seems clearly prompted to give you a humorous, roast-style summary. Just from the Reddit screenshot, I get the impression they gave it a prompt along the lines of "you are an assistant for community moderators who are evaluating what course to take with a user."
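    A rough sketch of what I mean (the payload shape and names here are hypothetical, loosely modeled on common chat-completion APIs, not any real client): the same model, with only the system prompt swapped:

    ```python
    # Hypothetical sketch: two tools wrapping the same model, differing only
    # in the system prompt they prepend to the user's input.

    def build_request(system_prompt: str, user_text: str, model: str = "some-model"):
        """Assemble a chat-style request: same model, different system prompt."""
        return {
            "model": model,
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_text},
            ],
        }

    comment_history = "User's last 50 comments..."

    # A roast-style summarizer vs. a moderation assistant, same underlying model:
    roast = build_request(
        "You are a comedian. Write a humorous roast-style summary of this user.",
        comment_history,
    )
    mod_tool = build_request(
        "You are an assistant for community moderators evaluating what course "
        "to take with a user. Summarize their behavior neutrally.",
        comment_history,
    )

    assert roast["model"] == mod_tool["model"]              # identical model
    assert roast["messages"][0] != mod_tool["messages"][0]  # different system prompts
    ```

    So two products can feel completely different even though every token is generated by the exact same model underneath.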




  • Besides the other commenter highlighting the specific nature of the linked study, I will say I’m generally doing technical queries where if the answer is wrong, it’s apparent because the AI suggestion doesn’t work. Think “how do I change this setting” or “what’s wrong with the syntax in this line of code”. If I try the AI’s advice and it doesn’t work, then I ask again or try something else.

    I would be more concerned about subjects where I have no domain knowledge whatsoever and am not working on a specific application of that knowledge, because then it could be a long while before I realize the response was wrong.