thank you as well
I’ve run several LLMs with Ollama (locally) and I have to say that it was fun, but it is not worth it at all. It does get many answers right, but that doesn’t come close to compensating for the amount of time spent generating bad answers and troubleshooting them. Not to mention the amount of energy the computer uses.
In the end I’d rather spend my time actually learning the thing I’m supposed to solve, or just skim the documentation if I only want the answer.


thanks for sharing that last bit.


It would be nice if you shared which alternatives are best and which Logitech mice gave you that impression.


It uses Hyprland as its window manager, which is a far superior window manager to Microsoft Windows
lmao


thanks for sharing this
Hugo’s documentation is an “if you know, you know” kind of thing. Once you understand its weird semantics and the templating system (which can be very confusing), you are golden.
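The classic example of that confusion, at least for me, is how the dot context gets rebound inside a range block, so you have to reach for $ to get back to the top level. A rough sketch (not from the post, just to illustrate):

```
{{/* inside range, "." is rebound to each page in .Pages */}}
{{ range .Pages }}
  <a href="{{ .RelPermalink }}">{{ .Title }}</a>
  {{/* "$" reaches back to the top-level context, e.g. the site title */}}
  <small>{{ $.Site.Title }}</small>
{{ end }}
```

It looks harmless until you forget the $ and wonder why .Site is suddenly undefined.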
Anyway, glad you joined this FOSS journey.
Alpaca is the GTK client for Ollama, right? I used it for a while to let my family have a go at local LLMs. It was very nice for them, but on my computer it ran significantly slower than they expected, so that’s that.