Deployment Overview of Open WebUI with Ollama on a Server
Prerequisites and Basic Requirements
The deployment requires a server running the Ubuntu operating system. The following conditions must be met before proceeding:

- Root privileges or sudo access to install system packages and manage services.
- Docker and Docker Compose installed and operational on the host system.
- Internet access to download installation scripts, Docker images, and model weights.
- Port 8080 available for the Open WebUI application.
- Port 11434, used internally by the Ollama service.
- Ports 80 and 443 free for the Nginx proxy and SSL certificate management.
File and Directory Structure
The application utilizes specific directories for configuration, data storage, and certificates:

- `/root/nginx`: Contains the Docker Compose configuration for the Nginx proxy and Certbot.
- `/root/nginx/compose.yml`: The Docker Compose file defining the Nginx service.
- `/data/nginx/user_conf.d`: Directory storing custom Nginx configuration files, including host-specific settings.
- `/data/nginx/nginx-certbot.env`: Environment file containing configuration for the Nginx-Certbot service.
- `/etc/letsencrypt`: Mount point for SSL certificates managed by Certbot.
- `/app/backend/data`: Persistent volume location for Open WebUI backend data.
- `/etc/systemd/system/ollama.service`: Systemd unit file for the Ollama service.
Application Installation Process
The deployment involves installing the Ollama backend, pulling the specific language model, and launching the Open WebUI frontend via Docker.
- Install Ollama: The Ollama service is installed using the official installation script.
- Configure Ollama Service: The `ollama.service` file is modified to allow external connections and enable specific optimizations. The following environment variables are set: `OLLAMA_HOST=0.0.0.0`, `OLLAMA_ORIGINS=*`, `OLLAMA_FLASH_ATTENTION=1`.
- Download Model: The `llama3.3` model is pulled into the local Ollama repository.
- Launch Open WebUI: The Open WebUI container is started with CUDA support, mapping port `8080` and connecting to the local Ollama instance.
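The steps above can be sketched as shell commands. This is a sketch, not a verbatim transcript of the deployment: the install URL is the official Ollama script location, and the unit-file edit assumes the variables go in the `[Service]` section (current Ollama releases spell the flash-attention toggle `OLLAMA_FLASH_ATTENTION`). The container launch command is shown in the next section.

```shell
# 1. Install Ollama via the official installation script
curl -fsSL https://ollama.com/install.sh | sh

# 2. Edit /etc/systemd/system/ollama.service, adding under [Service]:
#      Environment="OLLAMA_HOST=0.0.0.0"
#      Environment="OLLAMA_ORIGINS=*"
#      Environment="OLLAMA_FLASH_ATTENTION=1"
#    then reload systemd and restart the service:
systemctl daemon-reload
systemctl restart ollama

# 3. Pull the model into the local Ollama repository
ollama pull llama3.3
```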
Docker Containers and Their Deployment
Two primary Docker containers are deployed as part of this architecture: Open WebUI and Nginx-Certbot.
Open WebUI Container
The Open WebUI container is launched using the `docker run` command with the following specifications:

- Image: `ghcr.io/open-webui/open-webui:cuda`
- Container Name: `open-webui`
- Port Mapping: Host port 8080 maps to container port 8080.
- GPU Access: The `--gpus all` flag is used to enable GPU acceleration.
- Host Resolution: The `--add-host=host.docker.internal:host-gateway` flag allows the container to reach the host network.
- Volume: A named volume `open-webui` is mounted to `/app/backend/data` for persistent storage.
- Environment Variables:
  - `ENV=dev`
  - `OLLAMA_BASE_URLS=http://host.docker.internal:11434`
- Restart Policy: Set to `always`.
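Assembled from the specifications above, the launch command looks roughly like this (the `-d` detach flag is an assumption; everything else is taken from the list):

```shell
docker run -d \
  --name open-webui \
  --gpus all \
  -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  -e ENV=dev \
  -e OLLAMA_BASE_URLS=http://host.docker.internal:11434 \
  --restart always \
  ghcr.io/open-webui/open-webui:cuda
```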
Nginx-Certbot Container
The Nginx proxy is managed via Docker Compose located in `/root/nginx/compose.yml`:

- Image: `jonasal/nginx-certbot:latest`
- Restart Policy: `unless-stopped`
- Network Mode: `host`
- Environment:
  - `[email protected]`
  - Loads additional variables from `/data/nginx/nginx-certbot.env`
- Volumes:
  - `nginx_secrets` (external) mounted to `/etc/letsencrypt`
  - `/data/nginx/user_conf.d` mounted to `/etc/nginx/user_conf.d`
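A Compose file matching these settings could look as follows. This is a sketch: the `CERTBOT_EMAIL` variable name comes from the `jonasal/nginx-certbot` image's convention, and the actual e-mail address is not reproduced here.

```yaml
# /root/nginx/compose.yml — sketch reconstructed from the settings above
services:
  nginx:
    image: jonasal/nginx-certbot:latest
    restart: unless-stopped
    network_mode: host
    environment:
      # CERTBOT_EMAIL is the variable the nginx-certbot image expects;
      # the address itself is obfuscated in the source document.
      - CERTBOT_EMAIL=admin@example.com
    env_file:
      - /data/nginx/nginx-certbot.env
    volumes:
      - nginx_secrets:/etc/letsencrypt
      - /data/nginx/user_conf.d:/etc/nginx/user_conf.d

volumes:
  nginx_secrets:
    external: true
```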
Proxy Servers
Nginx is configured as a reverse proxy to handle incoming traffic and manage SSL certificates using Certbot.
- Configuration Location: Custom host configurations are stored in `/data/nginx/user_conf.d`.
- Proxy Pass: The Nginx configuration directs traffic to the Open WebUI application running on the host at `http://127.0.0.1:8080`.
- SSL Management: The `nginx-certbot` container automatically handles SSL certificate generation and renewal for the configured domains.
- Deployment: The proxy service is started using the command `docker compose up -d` within the `/root/nginx` directory.
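A host configuration in `/data/nginx/user_conf.d` could look roughly like this. The file name and `chat.example.com` domain are placeholders; the certificate paths follow the `nginx-certbot` image's Let's Encrypt layout, and the WebSocket upgrade headers are included because Open WebUI uses WebSocket connections.

```nginx
# /data/nginx/user_conf.d/open-webui.conf — sketch; domain is a placeholder
server {
    listen 443 ssl;
    server_name chat.example.com;

    # Certificates are issued and renewed by the nginx-certbot container
    ssl_certificate     /etc/letsencrypt/live/chat.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/chat.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        # WebSocket support for the Open WebUI frontend
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```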
Access Rights and Security
Security and access controls are implemented through user management and service configuration:

- Ollama User: A dedicated system user named `ollama` is created to run the Ollama service.
- Firewall: The Nginx proxy listens on the standard HTTP/HTTPS ports, while the Open WebUI application listens on port 8080 and is reached externally only through the proxy.
- Service Isolation: The Open WebUI container runs with specific environment variables restricting its origin access and defining its backend connection.
- File Permissions:
  - The `/root/nginx` directory is owned by root with `0755` permissions.
  - The `compose.yml` file is owned by root with `0644` permissions.
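The stated ownership and permissions can be applied (or verified) with standard tools, sketched here as run by root:

```shell
# Enforce the ownership and modes listed above
chown root:root /root/nginx /root/nginx/compose.yml
chmod 0755 /root/nginx
chmod 0644 /root/nginx/compose.yml

# Verify the result
stat -c '%U:%G %a %n' /root/nginx /root/nginx/compose.yml
```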
Starting, Stopping, and Updating
Service management is handled via systemd for Ollama and Docker commands for the containers.
Ollama Service
- Restart and Enable: After the unit file is modified, the service is reloaded, restarted, and enabled via systemd.
- Backup: A backup of the original service file is maintained at `/etc/systemd/system/ollama.service.bak`.
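The backup, reload, restart, and enable steps can be sketched as follows (the backup is taken before editing the unit file):

```shell
# Preserve the original unit file before modifying it
cp /etc/systemd/system/ollama.service /etc/systemd/system/ollama.service.bak

# Pick up the edited unit, restart the service, and enable it at boot
systemctl daemon-reload
systemctl restart ollama
systemctl enable ollama
```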
Docker Containers
- Start Nginx Proxy: The proxy is brought up with Docker Compose from the `/root/nginx` directory.
- Check Open WebUI Status: The state of the `open-webui` container can be inspected with Docker.
- Update: To update the application, pull the latest Docker images and restart the containers. For Ollama, pull the latest model version using `ollama pull llama3.3`.
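The container-management steps above can be sketched as shell commands; the status and update commands are standard Docker usage rather than commands quoted from the original deployment:

```shell
# Start the Nginx proxy (compose file lives in /root/nginx)
cd /root/nginx && docker compose up -d

# Check Open WebUI container status and recent logs
docker ps --filter name=open-webui
docker logs --tail 50 open-webui

# Update: pull fresh images and recreate the containers
docker pull ghcr.io/open-webui/open-webui:cuda
cd /root/nginx && docker compose pull && docker compose up -d

# Refresh the model in the local Ollama repository
ollama pull llama3.3
```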