# Deployment Overview of Open WebUI with Ollama on the Server
## Prerequisites and Basic Requirements

The following requirements must be met on the server before deployment:

- Operating System: Ubuntu
- Privileges: Root access is required for service configuration and container management
- Domain: The application is configured for the `hostkey.in` zone
- Ports:
    - Port `11434` for the Ollama backend
    - Port `8080` for the Open WebUI internal service
    - Port `443` for external HTTPS access
## FQDN of the Final Panel

The application is accessible via a Fully Qualified Domain Name (FQDN) of the form:

`llama{Server ID}.hostkey.in:443`

Replace `{Server ID}` with the specific server identifier to access the panel.
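As a quick reachability check, the panel URL can be probed from any machine. The server ID `12345` below is purely hypothetical; substitute your own:

```shell
# Hypothetical server ID 12345 — substitute your own.
# -I fetches response headers only; -L follows any redirects.
curl -IL https://llama12345.hostkey.in
```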
## File and Directory Structure

The deployment uses the following directory structure for configuration, data, and certificates:

- `/root/nginx`: Nginx and Certbot Docker Compose configuration.
- `/root/nginx/compose.yml`: The main Docker Compose file for the reverse proxy.
- `/data/nginx/user_conf.d/`: Custom Nginx configuration files for specific domains.
- `/data/nginx/nginx-certbot.env`: Environment variables for the Nginx-Certbot service.
- `/data/nginx/user_conf.d/llama{Server ID}.hostkey.in.conf`: Nginx configuration specific to the application domain.
- `/etc/systemd/system/ollama.service`: Systemd service file for Ollama.
- `/etc/systemd/system/ollama.service.bak`: Backup of the original Ollama service file.
## Application Installation Process

The application stack consists of Ollama running as a native system service and Open WebUI running as a Docker container.

- Ollama installation:
    - Installed via the official shell script: `curl -fsSL https://ollama.com/install.sh | sh`
    - An `ollama` system user is created to manage the service.
    - The `llama3.3` model is downloaded and ready for use.
- Open WebUI deployment:
    - Deployed as a Docker container from the `ghcr.io/open-webui/open-webui:cuda` image.
    - The container is named `open-webui` and is configured to restart automatically.
    - GPU acceleration is enabled via the `--gpus all` flag.
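The steps above can be sketched as two shell snippets. The exact invocation used on the server may differ; the `docker run` flags below are assembled from settings described throughout this document, not copied from the server:

```shell
# 1. Install Ollama via the official script and pull the model (run as root).
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.3

# 2. Start Open WebUI. Flags assembled from the settings in this document:
#    container name, restart policy, GPU access, port mapping, data volume,
#    environment variables, and host resolution for the Ollama backend.
docker run -d \
  --name open-webui \
  --restart always \
  --gpus all \
  --add-host=host.docker.internal:host-gateway \
  -p 8080:8080 \
  -v open-webui:/app/backend/data \
  -e ENV=dev \
  -e OLLAMA_BASE_URLS=http://host.docker.internal:11434 \
  ghcr.io/open-webui/open-webui:cuda
```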
## Access Rights and Security

- Firewall: The system allows incoming traffic on port `443` (HTTPS) for external users. Internal services communicate over ports `8080` and `11434`.
- Users: A dedicated system user named `ollama` runs the Ollama service.
- Service restrictions:
    - Ollama is configured to listen on all network interfaces (`0.0.0.0`).
    - Origin restrictions are disabled (`OLLAMA_ORIGINS=*`) to allow requests from the Open WebUI interface.
    - Flash Attention is enabled (`LLAMA_FLASH_ATTENTION=1`) for performance optimization.
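In systemd terms, these restrictions typically translate into `Environment=` lines in the Ollama unit file. The excerpt below is an illustrative sketch only: the variable values are taken from this document, `OLLAMA_HOST` is the standard Ollama variable for the listen address (an assumption about this setup), and the real unit file contains further directives:

```ini
# /etc/systemd/system/ollama.service — illustrative [Service] excerpt only
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="LLAMA_FLASH_ATTENTION=1"
```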
## Docker Containers and Their Deployment

The deployment uses the following Docker containers:

- Open WebUI:
    - Image: `ghcr.io/open-webui/open-webui:cuda`
    - Port mapping: Host port `8080` maps to container port `8080`.
    - Volumes: A named volume `open-webui` is mounted at `/app/backend/data` for data persistence.
    - Environment variables:
        - `ENV`: set to `dev`
        - `OLLAMA_BASE_URLS`: set to `http://host.docker.internal:11434`
    - Additional flags: `--add-host=host.docker.internal:host-gateway` is used to resolve the host machine from within the container.
- Nginx-Certbot proxy:
    - Image: `jonasal/nginx-certbot:latest`
    - Network mode: `host`
    - Volumes:
        - `nginx_secrets` (external) mounted at `/etc/letsencrypt`
        - `/data/nginx/user_conf.d` mounted at `/etc/nginx/user_conf.d`
    - Environment: uses `[email protected]` for certificate notifications.
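A `compose.yml` matching the proxy description above might look like the following sketch. It is assembled from the settings in this section, not copied from the server; the `env_file` line assumes the notification address lives in `/data/nginx/nginx-certbot.env`, as the file structure section suggests:

```yaml
# /root/nginx/compose.yml — illustrative sketch, not the verbatim file.
services:
  nginx-certbot:
    image: jonasal/nginx-certbot:latest
    network_mode: host
    env_file:
      - /data/nginx/nginx-certbot.env
    volumes:
      - nginx_secrets:/etc/letsencrypt
      - /data/nginx/user_conf.d:/etc/nginx/user_conf.d

volumes:
  nginx_secrets:
    external: true
```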
## Proxy Servers

A reverse proxy is implemented with Nginx, with automatic SSL certificate management via Certbot.

- Software: Nginx-Certbot Docker container.
- SSL/TLS: Enabled via Let's Encrypt certificates stored in the `nginx_secrets` volume.
- Configuration:
    - The proxy configuration is located at `/data/nginx/user_conf.d/llama{Server ID}.hostkey.in.conf`.
    - The `proxy_pass` directive forwards traffic to `http://127.0.0.1:8080`.
    - External requests on port `443` are routed to the internal Open WebUI service.
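An illustrative sketch of such a domain configuration follows. The certificate paths assume the usual Let's Encrypt layout under `/etc/letsencrypt/live/`, and the `proxy_set_header` lines are common additions not confirmed by this document; the file on the server may differ:

```nginx
# /data/nginx/user_conf.d/llama{Server ID}.hostkey.in.conf — sketch only.
server {
    listen 443 ssl;
    server_name llama{Server ID}.hostkey.in;

    # Standard Let's Encrypt certificate layout (assumed).
    ssl_certificate     /etc/letsencrypt/live/llama{Server ID}.hostkey.in/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/llama{Server ID}.hostkey.in/privkey.pem;

    location / {
        # Forward external HTTPS traffic to the internal Open WebUI service.
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```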
## Permission Settings

- Nginx configuration directory:
    - Path: `/root/nginx`
    - Owner: `root`
    - Group: `root`
    - Mode: `0755`
- Docker Compose file:
    - Path: `/root/nginx/compose.yml`
    - Owner: `root`
    - Group: `root`
    - Mode: `0644`
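Applied with the usual tools, the settings above correspond to commands like these. The snippet operates on a scratch copy so it is safe to run anywhere; on the server the same `chmod` calls (plus `chown root:root`, which requires root) target `/root/nginx`:

```shell
# Scratch copy standing in for /root/nginx and its compose.yml.
dir=$(mktemp -d)
touch "$dir/compose.yml"

# Apply the modes from the list above.
chmod 0755 "$dir"
chmod 0644 "$dir/compose.yml"

# Verify: prints the octal modes of the directory and the file.
stat -c '%a' "$dir" "$dir/compose.yml"
```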
## Location of Configuration Files and Data
| Component | Configuration/Path | Description |
|---|---|---|
| Nginx Proxy | /root/nginx/compose.yml | Main Docker Compose definition |
| Nginx Conf | /data/nginx/user_conf.d/llama{Server ID}.hostkey.in.conf | Domain-specific proxy rules |
| SSL Secrets | nginx_secrets (Docker volume) | Let's Encrypt certificates |
| Ollama Service | /etc/systemd/system/ollama.service | Systemd unit file |
| Open WebUI Data | open-webui (Docker volume) | Backend application data |
| Certbot Env | /data/nginx/nginx-certbot.env | Environment variables for proxy |
## Available Ports for Connection

- Port `443`: HTTPS access to the Open WebUI interface via the custom domain.
- Port `8080`: Internal access to the Open WebUI container (the proxy forwards to it on `127.0.0.1`).
- Port `11434`: Internal access to the Ollama API service.
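A quick way to confirm on the server that all three listeners are up (the `ss` filter is standard iproute2 usage; `/api/version` is part of Ollama's public HTTP API):

```shell
# Show listening TCP sockets for the three ports described above.
ss -tln | grep -E ':(443|8080|11434)\b'

# Ollama answers on its API port with a small JSON version object.
curl -s http://127.0.0.1:11434/api/version
```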
## Starting, Stopping, and Updating

- Ollama service management:
    - Restart the service: `systemctl restart ollama`
    - Reload the systemd daemon: `systemctl daemon-reload`
    - The service is enabled to start on boot.
- Docker containers:
    - Open WebUI is configured with `--restart always`, so it restarts automatically after a failure or system reboot.
    - The Nginx-Certbot proxy is managed via `docker compose` in the `/root/nginx` directory.
    - To restart the proxy stack, run `docker compose up -d` from `/root/nginx`.
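Put together, routine management on the server reduces to a handful of commands (run as root; assembled from the items above):

```shell
# Ollama: reload unit changes, then restart the service.
systemctl daemon-reload
systemctl restart ollama

# Open WebUI: restart the container manually if needed
# (it also restarts on its own thanks to --restart always).
docker restart open-webui

# Reverse proxy: (re)create the stack from its compose directory.
cd /root/nginx
docker compose up -d
```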