Unable to Access Ollama Models on Open WebUI After Enabling HTTPS: Troubleshooting and Resolution Guide

This guide is a continuation of my previous article, “Step-by-Step Guide to Set Up HTTPS for Dockerized Open WebUI on Linux”. If you followed the guide to enable HTTPS for your Open WebUI and now find yourself unable to access or select Ollama models, this troubleshooting and resolution guide is for you.

After enabling HTTPS, some users have encountered issues where the Open WebUI loses access to the installed models, such as Llama2 and the default Arena model. This guide aims to help you diagnose and resolve these issues, ensuring that your Open WebUI can successfully communicate with the Ollama service and list all available models.

By following the steps outlined in this guide, you will be able to restore full functionality to your Open WebUI, allowing you to access and select the Ollama models seamlessly. Let’s dive into the troubleshooting process and get your system back on track.

After enabling HTTPS on my Open WebUI + Ollama installation, I lost access to the installed models. Specifically, from the Open WebUI, I could no longer access or select models such as Llama2 or the default Arena model. The HTTPS setup involved using an Nginx container as a reverse proxy, forwarding traffic arriving on port 443 to the internal Open WebUI container listening on port 8080.

  • Open WebUI Container: open-webui
  • Nginx Container: nginx-ssl
  • Docker Network: webui-network
  • Ollama Service: Running on localhost (127.0.0.1)
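
For reference, both containers from that setup should appear as running in the output of:

# docker ps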

Upon checking the Open WebUI container logs, the following errors were observed:

# docker logs open-webui

INFO  [open_webui.apps.ollama.main] get_all_models()
ERROR [open_webui.apps.ollama.main] Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]

These errors indicated a connection issue between Open WebUI and the Ollama API, likely due to network isolation caused by the HTTPS changes.

During the HTTPS changes, both the open-webui and nginx-ssl containers were placed on the webui-network, while the Ollama service remained bound to localhost (127.0.0.1) on the host. Because 127.0.0.1 inside the open-webui container refers to the container itself rather than to the host, this network segregation prevented the container from reaching the Ollama service.
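
The isolation can be seen by inspecting the Docker network; only the two containers appear as attached members, while the Ollama process running directly on the host does not:

# docker network inspect webui-network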

To resolve the issue, the Ollama service needed to bind to all interfaces (0.0.0.0) so that it would be reachable on the host interface IP (172.26.16.157) rather than only on localhost (127.0.0.1). The following steps were taken:

1) Configure Ollama to Bind to All Host IPs: A systemd drop-in override for the Ollama service was created to add the OLLAMA_HOST environment variable. This was done by running:

       # systemctl edit ollama.service
    

    and adding:

       [Service]
       Environment="OLLAMA_HOST=0.0.0.0"
    

    After saving (Ctrl + O) and exiting (Ctrl + X) in the nano editor, the systemd configuration was reloaded with:

       # systemctl daemon-reload
    

    and the Ollama service was restarted with:

       # systemctl restart ollama
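
    Optionally, the merged unit definition can be printed to confirm that the drop-in override with the Environment line was picked up:

       # systemctl cat ollama.service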
    

2) Verify Ollama Configuration: The service status was checked using:

       # systemctl status ollama

    and the connection was tested with:

       # curl http://172.26.16.157:11434
       # curl http://127.0.0.1:11434

    which confirmed that Ollama was responding on both localhost and the host interface IP.
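
    To double-check the new binding, the listening socket can also be inspected; with OLLAMA_HOST=0.0.0.0 the local address should show as 0.0.0.0:11434 (or *:11434) rather than 127.0.0.1:11434:

       # ss -ltnp | grep 11434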

3) Update the Open WebUI Docker Container Configuration: The open-webui container was stopped using:

       # docker stop open-webui

    and removed with:

       # docker rm open-webui

    The container was then run again with the updated configuration using:

       # docker run -d --network webui-network -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://172.26.16.157:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
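
    After the new container starts, re-checking its logs is a quick way to confirm that the earlier “Cannot connect to host 127.0.0.1:11434” error no longer appears:

       # docker logs open-webui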
        

4) Configure Open WebUI: Access the Open WebUI GUI and navigate to Settings > Admin Settings > Connections.

5) Update the Ollama API URL: Click the “+” sign to add a new URL.

       Existing/Old URL: http://127.0.0.1:11434
       New URL: http://172.26.16.157:11434

    Click the refresh icon next to the “+” to verify the connection. A successful connection will display a “Server connection verified” notification.

6) Verify Models: Return to the home page of the Open WebUI GUI. The models should now be visible and selectable.
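
As an additional check from the command line, Ollama’s model list can also be queried directly; the /api/tags endpoint returns the locally installed models as JSON:

# curl http://172.26.16.157:11434/api/tags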

By configuring the Ollama service to bind to all network interfaces and updating the Open WebUI container to use the host’s IP address, the connection issue was resolved, and Open WebUI could once again access the models from the Ollama service. This guide documents the resolution process in detail and can be used as a reference for similar issues in the future.


About the Author

Joshua Makuru Nomwesigwa is a seasoned Telecommunications Engineer with vast experience in IP Technologies; he eats, drinks, and dreams IP packets. He is a passionate evangelist of the fourth industrial revolution (4IR), a.k.a. Industry 4.0, and all the technologies that it brings: 5G, Cloud Computing, Big Data, Artificial Intelligence (AI), Machine Learning (ML), Internet of Things (IoT), Quantum Computing, etc. Basically, anything techie, because a normal life is boring.
