Deployment Overview of Open WebUI with Ollama on a Server
Prerequisites and Basic Requirements
The deployment requires a Linux server running Ubuntu. The following conditions must be met before proceeding:
- Root privileges or sudo access are required to install system packages and manage services.
- Docker Engine must be installed and running on the server.
- The server must have access to the internet to download container images and model files.
- Port 8080 must be available for the Open WebUI application.
- Port 11434 is used internally by the Ollama service.
- The domain name hostkey.com is configured for SSL certificate management via Certbot.
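A quick pre-flight check for these requirements can be sketched as follows (illustrative only; it assumes the ss tool from iproute2 is available, as on stock Ubuntu, and prints warnings rather than failing):

```shell
# Pre-flight sketch: warn if Docker is missing or a required port is taken.
preflight() {
    # Docker Engine must be installed and running.
    command -v docker >/dev/null 2>&1 || echo "WARN: Docker Engine is not installed"
    # Ports 8080 (Open WebUI) and 11434 (Ollama) must be free.
    for port in 8080 11434; do
        if ss -ltn 2>/dev/null | grep -q ":$port "; then
            echo "WARN: port $port is already in use"
        fi
    done
    echo "pre-flight checks complete"
}
preflight
```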
File and Directory Structure
The application uses specific directories for configuration, data storage, and certificates:
- /root/nginx: Contains the Docker Compose configuration for the Nginx proxy and Certbot.
- /root/nginx/compose.yml: The Docker Compose file defining the Nginx and Certbot services.
- /data/nginx/user_conf.d: Stores custom Nginx configuration files, including the host-specific configuration {{ prefix }}{{ server_id }}.hostkey.in.conf.
- /data/nginx/nginx-certbot.env: Environment file containing settings for the Nginx-Certbot container.
- /etc/letsencrypt: Mount point for SSL certificates managed by Certbot.
- /app/backend/data: Persistent volume location for Open WebUI backend data.
- /etc/systemd/system/ollama.service: Systemd unit file for the Ollama service.
Application Installation Process
The deployment involves installing the Ollama inference engine and the Open WebUI interface.
Ollama Installation
The Ollama service is installed using the official installation script. The following steps are performed:
- The installation script is executed via curl -fsSL https://ollama.com/install.sh | sh.
- The ollama system user is created if it does not already exist.
- The original ollama.service file is backed up to /etc/systemd/system/ollama.service.bak.
- The ollama.service file is updated to include the following environment variables: OLLAMA_HOST=0.0.0.0, OLLAMA_ORIGINS=*, and OLLAMA_FLASH_ATTENTION=1.
- The systemd daemon is reloaded, and the ollama service is restarted and enabled to start on boot.
- The phi4 model is pulled using the command ollama pull phi4.
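After these edits, the [Service] section of the unit file contains Environment lines along the following lines (a sketch of the relevant fragment only, assuming the flash-attention variable is Ollama's OLLAMA_FLASH_ATTENTION; the rest of the file stays as installed):

```ini
# Fragment of /etc/systemd/system/ollama.service (sketch; other lines unchanged)
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="OLLAMA_FLASH_ATTENTION=1"
```

Remember to run systemctl daemon-reload after editing the unit, as noted above.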
Open WebUI Installation
The Open WebUI application is deployed as a Docker container with CUDA support:
- The container image ghcr.io/open-webui/open-webui:cuda is pulled.
- The container is named open-webui.
- The container is configured to restart automatically (--restart always).
- The container exposes port 8080 on the host.
- The container utilizes all available GPUs (--gpus all).
- The environment variable ENV is set to dev.
- The environment variable OLLAMA_BASE_URLS is set to http://host.docker.internal:11434.
- A named volume open-webui is mounted to /app/backend/data for data persistence.
- The host is added to the container's DNS resolution via --add-host=host.docker.internal:host-gateway.
Docker Containers and Their Deployment
Two primary Docker deployments are managed on the server: the Open WebUI container and the Nginx-Certbot stack.
Open WebUI Container
The Open WebUI container is launched using the following command structure:
docker run -d -p 8080:8080 --gpus all \
--add-host=host.docker.internal:host-gateway \
-v open-webui:/app/backend/data \
--name open-webui \
-e ENV='dev' \
-e OLLAMA_BASE_URLS='http://host.docker.internal:11434' \
--restart always ghcr.io/open-webui/open-webui:cuda
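Once the container is up, a quick local check can be sketched as follows (illustrative; meant to be run on the server itself, using Ollama's /api/version endpoint and the Open WebUI root page; it warns rather than fails so it is safe to run at any time):

```shell
# Sketch: confirm both services answer on their local ports.
check_services() {
    curl -fsS http://127.0.0.1:11434/api/version >/dev/null \
        || echo "WARN: Ollama not reachable on 11434"
    curl -fsS -o /dev/null http://127.0.0.1:8080 \
        || echo "WARN: Open WebUI not reachable on 8080"
    echo "local checks complete"
}
check_services
```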
Nginx and Certbot Stack
The Nginx proxy and Certbot are managed via Docker Compose located in /root/nginx/compose.yml. The configuration includes:
- Service name: nginx
- Image: jonasal/nginx-certbot:latest
- Restart policy: unless-stopped
- Network mode: host
- Environment variables: [email protected]; additional variables are loaded from /data/nginx/nginx-certbot.env
- Volumes: nginx_secrets (external) mounted to /etc/letsencrypt, and /data/nginx/user_conf.d mounted to /etc/nginx/user_conf.d
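Put together, the compose.yml described above can be sketched as follows. This is a reconstruction from the listed settings, not the exact file; the email value is redacted in this guide, and CERTBOT_EMAIL is assumed to be the variable name used by the jonasal/nginx-certbot image:

```yaml
# /root/nginx/compose.yml — sketch reconstructed from the settings above
services:
  nginx:
    image: jonasal/nginx-certbot:latest
    restart: unless-stopped
    network_mode: host
    environment:
      - CERTBOT_EMAIL=...   # administrator address (redacted in this guide)
    env_file:
      - /data/nginx/nginx-certbot.env
    volumes:
      - nginx_secrets:/etc/letsencrypt
      - /data/nginx/user_conf.d:/etc/nginx/user_conf.d

volumes:
  nginx_secrets:
    external: true
```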
The stack is started using the command docker compose up -d, executed from the /root/nginx directory.
Proxy Servers
The Nginx proxy is configured to handle traffic for the application and manage SSL certificates.
- The Nginx container uses the jonasal/nginx-certbot:latest image.
- Custom configuration files are stored in /data/nginx/user_conf.d.
- The specific host configuration file {{ prefix }}{{ server_id }}.hostkey.in.conf is modified to route traffic.
- The proxy_pass directive within the location / block is set to http://127.0.0.1:8080 to forward requests to the Open WebUI container.
- SSL certificates are managed automatically by Certbot using the email [email protected].
- The nginx_secrets volume is used to store Let's Encrypt certificates.
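The host configuration file therefore contains a server block along these lines. This is a sketch: the server_name and certificate paths are assumptions following the jonasal/nginx-certbot convention of provisioning certificates under /etc/letsencrypt/live/<name>/, and the actual file name comes from the {{ prefix }}{{ server_id }}.hostkey.in template:

```nginx
# Sketch of a file in /data/nginx/user_conf.d/ — names and cert paths are illustrative
server {
    listen 443 ssl;
    server_name example.hostkey.in;   # actual name follows {{ prefix }}{{ server_id }}.hostkey.in

    # nginx-certbot places issued certificates under /etc/letsencrypt/live/<name>/
    ssl_certificate     /etc/letsencrypt/live/example.hostkey.in/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.hostkey.in/privkey.pem;

    location / {
        # Forward all requests to the Open WebUI container on the host.
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```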
Starting, Stopping, and Updating
Service management commands are used to control the Ollama service and Docker containers.
Ollama Service
- To restart the service: systemctl restart ollama
- To enable the service on boot: systemctl enable ollama
- To reload the systemd daemon after configuration changes: systemctl daemon-reload
Open WebUI Container
- To check the container status: docker inspect -f '{{.State.Status}}' open-webui
- To stop the container: docker stop open-webui
- To start the container: docker start open-webui
- To update the container, the existing container must be removed and a new one created using the docker run command with the latest image tag.
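The update procedure can be sketched as a short script. This is illustrative: the DRY_RUN switch is a convenience added here (defaulting to on) so the sequence can be previewed without touching Docker, and the docker run flags mirror the launch command shown earlier:

```shell
#!/bin/sh
# Sketch: pull the latest image, then recreate the container with the same flags.
# DRY_RUN=1 (the default here) prints the commands instead of executing them.
run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

update_open_webui() {
    run docker pull ghcr.io/open-webui/open-webui:cuda
    run docker stop open-webui
    run docker rm open-webui
    run docker run -d -p 8080:8080 --gpus all \
        --add-host=host.docker.internal:host-gateway \
        -v open-webui:/app/backend/data \
        --name open-webui \
        -e ENV='dev' \
        -e OLLAMA_BASE_URLS='http://host.docker.internal:11434' \
        --restart always ghcr.io/open-webui/open-webui:cuda
}

update_open_webui
```

Running the script with DRY_RUN=0 on the server executes the same sequence for real; the named volume open-webui preserves the backend data across the recreate.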
Nginx and Certbot Stack
- To start the stack: docker compose up -d (executed from /root/nginx)
- To stop the stack: docker compose down (executed from /root/nginx)
- To update the stack, modify the compose.yml file or environment variables and run docker compose up -d again.