No, I like yours better.
What about semantics?
“Nothing is better than cake.”
“But bread is better than nothing.”
“Does that mean that bread is better than cake?”
Damn. Just when I’d learned to pronounce Eyjafjallajökull.
We donate to Wikipedia once per year.
One of my favourites is “Christ on a bike!” because it’s so hilarious.
Stopped going when they started making shit coffee.
They don’t give a definition of ‘incomplete’ or ‘faulty’? Is that on purpose?
“has a model of how words relate to each other, but does not have a model of the objects to which the words refer.
It engages in predictive logic, but cannot perform syllogistic logic - reasoning to a logical conclusion from a set of propositions that are assumed to be true.”
Is this true of all current LLMs?
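For anyone who hasn't met the term, here's a minimal sketch of what syllogistic reasoning looks like as code: a toy inference loop that derives conclusions from propositions assumed to be true. Purely illustrative, and it says nothing about how any LLM works internally.

```python
# Toy syllogism engine: derive conclusions from premises assumed true.
# Each pair (a, b) reads "every a is a b".
premises = {
    ("human", "mortal"),    # All humans are mortal.
    ("socrates", "human"),  # Socrates is a human.
}

def derive(premises):
    """Transitive closure: from a->b and b->c, conclude a->c."""
    facts = set(premises)
    changed = True
    while changed:
        changed = False
        for a, b in list(facts):
            for c, d in list(facts):
                if b == c and (a, d) not in facts:
                    facts.add((a, d))
                    changed = True
    return facts - premises

print(derive(premises))  # {('socrates', 'mortal')}
```

The conclusion here follows necessarily from the premises; a purely predictive model, by contrast, only estimates which words are likely to come next.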
Thank you for replying. This is the level of info I used to love on Reddit and now love on Lemmy.
Thanks for your reply, I appreciate the correction and the info.
Interesting article.
“Instead of passing a law preventing civilians from carrying weapons of war, they enacted a rule prohibiting spectators from carrying small signs into meetings.”
“Americans, whether they own a gun or don’t, want guns kept out of the hands of dangerous and unstable people. Americans, whether they vote for Republicans or Democrats, don’t want children to be blasted into bits at their school desks. As we have lately learned here in Tennessee, that’s a lot of common ground.”
This is my complaint. It ranks popular videos whose title words are out of order above videos with the words in phrase order, even when I’ve used quote marks as a command to return only results containing the exact phrase.
I also assume that for both Google and YouTube, content they want me to see is being ranked above content I choose. I am the product, not the customer, and to me that’s not acceptable in a search engine.
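For what it’s worth, the behaviour I expect from quoted queries is trivially easy to express in code: a hard filter, not a soft ranking signal. A toy sketch with made-up titles, obviously nothing like Google’s actual pipeline:

```python
# Toy exact-phrase filter: keep only documents containing the quoted phrase.
def exact_phrase_filter(docs, phrase):
    phrase = phrase.lower()
    return [d for d in docs if phrase in d.lower()]

titles = [
    "Behind the Scenes: The Music",     # words present, wrong order
    "Music Behind the Scenes, Part 1",  # exact phrase
]
print(exact_phrase_filter(titles, "music behind the scenes"))
# ['Music Behind the Scenes, Part 1']
```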
The thing that strikes me about LLMs is that they were created to chat. To converse. They’re partly influenced by the Turing test, where the objective is to convince someone you’re human by keeping up a conversation. They weren’t designed to create meaningful or factual content.
People still seem to want to use ChatGPT to create something, and then fix the accuracy as a second step. I say go back to the drawing board and create a tool that analyses statements and tries to build information from trusted linked open data sources.
Discuss :)
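To make the “trusted linked open data” idea concrete, here’s a minimal sketch that grounds a single claim against Wikidata. The SPARQL endpoint URL is real; the particular query (Iceland’s capital) is just an illustration, and a real statement-checking tool would need far more than this:

```python
# Minimal sketch: verify one claim against Wikidata's public SPARQL endpoint.
import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

# Q189 = Iceland, P36 = capital (both are real Wikidata identifiers).
QUERY = """
SELECT ?capitalLabel WHERE {
  wd:Q189 wdt:P36 ?capital .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

resp = requests.get(
    SPARQL_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "statement-checker-sketch/0.1"},  # hypothetical name
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["capitalLabel"]["value"])  # Reykjavík
```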
I search Google for “Music behind the scenes”. Presumably because the first word is “music”, Google gives me four songs with some of the keywords, but not in phrase order. Then it gives me seven YouTube videos, then one website that actually contains the phrase and in fact refers to the videos I’m looking for.
But what it absolutely refused to give me, no matter how hard I tried, was this: https://youtu.be/7r01e_SZ5ic?si=GdOpoP8dBp372yjg
I presume this is because the videos aren’t monetised? Anyway, precision score: 11/30. As for recall, even if I click on ‘Videos’, the five videos that have the exact phrase in the title don’t appear at all.
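For reference, since I’m throwing the terms around loosely, the standard definitions are:

```latex
\[
\text{precision} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{retrieved}|},
\qquad
\text{recall} = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{relevant}|}
\]
```

That is: how many of the returned results were relevant, versus how many of all the relevant results that exist were actually found.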
Absolutely. Do you remember the herp derp extension that would turn them all into ‘herp derp herp derp’ so that you wouldn’t have to read them?
The customer is not always right. Sometimes the customer is a douchebag.
Moderation in all things.
To avoid negative thinking, challenge the thought that a problem is personal, pervasive or permanent (Martin Seligman).
Parenting: Set a good example. Don’t punish. Teach. Tell them what TO do, not what not to do.
Having ideas about the way things ought to be is great, but you can only respond to what is.
Be excellent to each other. Do as you would be done by.
Ah, the corporate enshittification of search.
Umm… correlation vs causation?? Anyone??
Hence the job title ‘prompt engineer’, I guess. If you know about Soylent Green, AI is people!