OpenWebUI + Ollama + SearxNG. OpenWebUI can do LLM web search using the engine of your choice (even self-hosted SearxNG!). From there it's easy to set the default prompt to always return the top (10, 20, whatever) raw results so you're not confined to AI summaries. It's not quite duck.ai slick, but I think I can get there with some more tinkering.
You can self host that too ;)
Is there a guide on how to do this on Linux + 16GB Radeon?
I mean, I could write one! I kind of just pieced it together from the guides for the three individual projects.
Edit: the back-of-the-napkin guide below is basically in the OpenWebUI docs already! I use NixOS (btw), but docker/podman should work well.
OpenWebUI + Ollama setup – tl;dr
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

OpenWebUI SearXNG guide – a little more involved, but not difficult.
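For the Linux + Radeon question above, here's a hedged sketch of the two companion containers, assuming the official ROCm-enabled Ollama image and the stock SearxNG image (the host ports and volume names are my own choices, adjust to taste):

```shell
# Ollama with AMD GPU support via ROCm -- the :rocm image needs the
# kernel's GPU devices passed through (/dev/kfd and /dev/dri)
docker run -d \
  --device /dev/kfd --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama --restart always \
  ollama/ollama:rocm

# SearxNG, exposed on host port 8888 (the container listens on 8080)
docker run -d \
  -p 8888:8080 \
  -v searxng:/etc/searxng \
  --name searxng --restart always \
  searxng/searxng:latest

# Note: for OpenWebUI's web-search integration, SearxNG has to allow the
# JSON output format -- add "json" under search -> formats in the
# settings.yml inside the searxng volume, then restart the container.
```

Then, in OpenWebUI's admin settings, point the Ollama connection at `http://host.docker.internal:11434` and the SearxNG query URL at `http://host.docker.internal:8888/search?q=<query>` (both derived from the ports chosen above, so change them together if you remap anything).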
Ohoho? That’s interesting. I don’t have the horsepower to self-host an AI, but that’s good to know!
Thanks for the pointer!!!