

There is a cool self-hosted version of Perplexity out there now, called Perplexica. It can be configured to use Ollama (for local inferencing) and your own self-hosted SearXNG instance to do the actual search and collation.
I have been using it for a week and it really works.
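For anyone curious what the wiring looks like, the gist is a config file pointing Perplexica at your local Ollama server and your SearXNG instance. The sketch below is illustrative only — the section and key names are assumptions from memory, so check the sample config in Perplexica's repo for the real ones:

```toml
# Hypothetical sketch of a Perplexica config — key names are illustrative,
# not copied from the project; consult its sample config file.

[API_ENDPOINTS]
# Ollama's default local API port is 11434
OLLAMA = "http://localhost:11434"
# Your self-hosted SearXNG instance handles the actual web search
SEARXNG = "http://localhost:8080"
```

Everything stays on your own hardware: Ollama does the inference, SearXNG does the searching, and Perplexica collates the results.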
If you say so. I’m just trying to be helpful instead of offering scare quotes.