How to Fix Ollama Port Not Connecting to Open WebUI Error

Running Large Language Models (LLMs) locally for self-hosted AI has become popular. Ollama is one popular tool that provides an environment for running different LLMs locally. It can be installed on Linux, Mac, and even on Windows.

Ollama is an environment for running LLMs, but it does not offer a graphical chat interface like ChatGPT. For that, you need to install Open WebUI. After connecting Open WebUI to Ollama, you can use it like a local ChatGPT.

If Open WebUI (or another chat application) cannot connect to your Ollama installation, you are hitting a common issue. It usually does not occur when Ollama and Open WebUI are installed on the same machine. But when the two apps run on two different machines on your local network, the connection can fail because, by default, Ollama listens only on localhost (127.0.0.1, port 11434), so its port is not reachable from other devices.
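Before changing anything, you can confirm the symptom from the machine that runs Open WebUI. Below is a minimal bash sketch using the shell's built-in /dev/tcp; the HOST value is a placeholder, so replace 127.0.0.1 with the IP address of the machine running Ollama:

```shell
#!/usr/bin/env bash
# Probe the Ollama API port using bash's built-in /dev/tcp pseudo-device.
# HOST is a placeholder -- replace it with the IP of the machine running Ollama.
HOST=127.0.0.1
PORT=11434  # Ollama's default API port

if (exec 3<>"/dev/tcp/$HOST/$PORT") 2>/dev/null; then
  echo "port $PORT on $HOST is open"
else
  echo "port $PORT on $HOST is closed or unreachable"
fi
```

If the port shows as closed from the remote machine but open on the Ollama machine itself, Ollama is bound to localhost only, which is exactly what the steps below fix.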

How to fix the Ollama not connecting to Open WebUI issue?

Now, let us check how to fix the Open WebUI not connecting to Ollama issue. Once you see a connection error between Open WebUI and Ollama, follow the steps below.

Open a terminal on the device where Ollama is installed and running.

By default, Ollama binds only to 127.0.0.1. To make it listen on all network interfaces so that other devices on your network can reach it, set the OLLAMA_HOST environment variable when starting the server:

OLLAMA_HOST=0.0.0.0 ollama serve



The command above only applies while that terminal session keeps Ollama running in the foreground. If Ollama is installed as a systemd service on Linux, set the variable in the service configuration instead (for example, via sudo systemctl edit ollama.service), then reload and restart the service:

sudo systemctl daemon-reload

sudo systemctl restart ollama

On a Mac, where systemctl is not available, the equivalent is to run launchctl setenv OLLAMA_HOST "0.0.0.0" and then restart the Ollama app.
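For reference, a minimal systemd drop-in for the service would contain the following; this is a sketch that assumes the service is named ollama.service, the default name used by the Linux installer:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

Running systemctl edit opens an override file where this fragment can be saved; the daemon-reload and restart commands above then pick it up.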

After that, load Open WebUI and connect it to Ollama; the connection should now succeed. You can then use Open WebUI with any language model, such as Llama 3, and start chatting with the AI.
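If your Open WebUI instance runs in Docker, you can also point it at the remote Ollama server explicitly through the OLLAMA_BASE_URL environment variable. Below is a sketch, assuming the placeholder address 192.168.1.50 for the machine running Ollama and the official container image published by the Open WebUI project:

```
# Hypothetical example: 192.168.1.50 is a placeholder for your Ollama machine's IP.
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://192.168.1.50:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Open WebUI would then be available on port 3000 of the Docker host and talk to Ollama over the network rather than looking for it on localhost.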
