Tag: Solutions

  • Managed to Secure my Ollama/Whisper Ubuntu Server

    So I am a novice web administrator running my own server, which hosts apache2, Ollama, and Whisper. I have programs that need to access these services from outside my local network, and I was as shocked as many are to find that there isn't a built-in way to authenticate Ollama.

    I was able to get this working using Caddy. I am running Ubuntu 24.04.1 LTS, x86_64. Thanks to coolaj86 (link to comment), who got me down the right path, although that solution didn't work for me as-is (I am already running an apache2 server and didn't want to use Caddy as my main webserver).

    First, I installed Caddy:

    curl https://webi.sh/caddy | sh

    Then I created a few API keys (I used a website) and got their hashes using

    caddy hash-password
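If you'd rather not type each key in interactively, Caddy's hash-password subcommand can also take the plaintext on the command line; a sketch, where <your-api-key> stands in for one of the generated keys:

```shell
# Non-interactive variant: prints the bcrypt hash for the given plaintext
caddy hash-password --plaintext '<your-api-key>'
```

The hash it prints is what goes in the <hash_N> slots of the Caddyfile below.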

    Finally, I created a Caddyfile (named exactly that):

    http://myserver.net:2800 {
        handle /* {
            basic_auth {
                email1@gmail.com <hash_1>
                email2@gmail.com <hash_2>
                email3@gmail.com <hash_3>
            }
            reverse_proxy :5000
        }
    }
    http://myserver.net:2900 {
        handle /* {
            basic_auth {
                email1@gmail.com <hash_1>
                email2@gmail.com <hash_2>
                email3@gmail.com <hash_3>
            }
            reverse_proxy :11434
        }
    }
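Before starting Caddy, it's worth checking that the file parses; Caddy's validate subcommand does a dry-run load of the config (assuming the Caddyfile is in the current directory):

```shell
# Parses and provisions the config without actually serving traffic
caddy validate --config ./Caddyfile
```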

    Started up Caddy:

    caddy run --config ./Caddyfile &

    And ports 2900 and 2800 were no longer accessible without a password. Ports 11434 and 5000 are closed both on my router and in ufw, so they are not publicly accessible at all. To access Ollama, I had to go through port 2900 and supply a username (my email) and the API key I generated.
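For anyone curious what actually goes over the wire: HTTP Basic auth (which Caddy's basic_auth directive checks) is just a base64-encoded email:key pair in the Authorization header. A minimal stdlib sketch, with made-up placeholder credentials:

```python
import base64

# Hypothetical placeholders for the real email and generated API key
email = "email1@gmail.com"
api_key = "my-generated-key"

# requests and httpx build this header for you when you pass auth=(email, api_key)
token = base64.b64encode(f"{email}:{api_key}".encode()).decode()
header = f"Basic {token}"
print(header)
```

Caddy compares the decoded key against the stored bcrypt hash, so the plaintext key never appears in the Caddyfile.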

    The next step was to update my code to authenticate, which I haven't seen spelled out anywhere, although it's pretty obvious. I am using Python.

    Here is what my Python Whisper request looks like:

    resp = requests.post(url, files=files, data=data, auth=(email, api))

    And here is what my Python Ollama Client call looks like (using the Ollama Python library):

    self.client = ollama.Client(host=url, auth=(email, api))

    I hope this helps! The next step is obviously to send the requests via HTTPS; if anyone has thoughts, I'd love to hear them.
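One thought on the HTTPS step: if myserver.net resolves publicly and Caddy can answer the ACME challenge (which may need coordination with apache2 if it already holds ports 80/443), Caddy will provision certificates automatically once you drop the http:// scheme from the site address. A sketch, assuming a hypothetical subdomain dedicated to Ollama:

```
ollama.myserver.net {
    basic_auth {
        email1@gmail.com <hash_1>
        email2@gmail.com <hash_2>
        email3@gmail.com <hash_3>
    }
    reverse_proxy :11434
}
```

With TLS in place, the Basic auth credentials are no longer sent in (effectively) cleartext, which matters since the API key travels with every request.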