When I tried it in the past, I kinda didn’t take it seriously because everything was confined to its own instance, but now there’s full-featured global search and proper federation everywhere? Wow, I thought I heard there were some technical obstacles making it very unlikely, but now it’s just there and works great! I asked ChatGPT and it said this feature was added 5 years ago! Really? I’m not sure how I didn’t notice this sooner. Was it really there for so long? With flairs showing the original instance each video comes from and everything?
From what I can tell, running an LLM isn’t really all that energy-intensive; it’s the training that takes loads of energy. And it’s not like regular search engines don’t use loads of energy to index the web in the first place.
And this also ignores the gap between having a question and knowing how to search for the answer. You might not even know where to start. Maybe you can search a vague question, but you’re essentially hoping that somewhere in the first few results is a relevant discussion to get you on the right path. GPT, I find, is more efficient at getting from vague questions to more directed queries.
I find this attitude much more troubling than responsible LLM use. You should not be trusting tertiary sources, no matter how good their track record; you should be checking the sources Wikipedia uses too. You should always be checking your sources.
That’s beyond the scope of my argument, and not really much worse than pasting directly from any tertiary source.