… by running your own instance of the free and open-source federated metasearch engine SearXNG on OpenBSD!

  • sunstoned@lemmus.org · 6 hours ago

    You can self-host that too ;)

    OpenWebUI + Ollama + SearXNG. OpenWebUI can do LLM web search using the engine of your choice (even a self-hosted SearXNG!). From there it's easy to set the default prompt to always give you the top (10, 20, whatever) raw results, so you're not confined to the AI-generated answer. It's not quite duck.ai slick, but I think I can get there with some more tinkering.
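    A minimal sketch of what "raw results" can look like, pulled straight from SearXNG itself: this assumes a local instance on port 8888 with the JSON output format enabled in its settings.yml (only HTML output is enabled by default), so the port and hostname here are placeholders for your own setup.

    ```shell
    # Assumption: SearXNG is reachable at localhost:8888 and its
    # settings.yml lists "json" under search -> formats.
    # Returns the search results as JSON instead of a rendered page.
    curl -s 'http://localhost:8888/search?q=self-hosted+search&format=json'
    ```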

      • sunstoned@lemmus.org · edited · 2 hours ago

        I mean, I could write one! I kind of just pieced it together from guides for the three individual components.

        Edit: the back-of-the-napkin guide below is basically in the OpenWebUI docs already! I use NixOS (btw), but docker/podman should work well.

        OpenWebUI + Ollama setup – tl;dr:

        ```shell
        docker run -d -p 3000:8080 \
          --add-host=host.docker.internal:host-gateway \
          -v open-webui:/app/backend/data \
          --name open-webui \
          --restart always \
          ghcr.io/open-webui/open-webui:main
        ```

        OpenWebUI SearXNG guide – a little more involved, but not difficult.
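        The wiring between the two is mostly environment variables on the OpenWebUI container. A sketch under some assumptions: the SearXNG container is named `searxng` and sits on the same Docker network as OpenWebUI, and the variable names follow the OpenWebUI web-search documentation; check the linked guide for the exact spelling in your OpenWebUI version.

        ```shell
        # Assumptions: a "searxng" container on the same Docker network,
        # listening on port 8080, with JSON output enabled in its
        # settings.yml. <query> is a literal placeholder OpenWebUI fills in.
        docker run -d -p 3000:8080 \
          -e ENABLE_RAG_WEB_SEARCH=true \
          -e RAG_WEB_SEARCH_ENGINE=searxng \
          -e SEARXNG_QUERY_URL="http://searxng:8080/search?q=<query>" \
          -v open-webui:/app/backend/data \
          --name open-webui \
          --restart always \
          ghcr.io/open-webui/open-webui:main
        ```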

    • N0x0n@lemmy.ml · 5 hours ago

      Ohoho? That’s interesting. I don’t have the horsepower to self-host an AI, but that’s good to know!

      Thanks for the pointer!!!