# Deployment Overview of DeepSeek-R1:70B on Server

## Prerequisites and Basic Requirements

The following system prerequisites must be met before deploying the application:

- **Operating System:** Ubuntu (verified via `apt` package manager usage).
- **Privileges:** Root access (`sudo`) is required for installing system packages, managing systemd services, and configuring Docker.
- **Dependencies:**
    - `curl`
    - `ca-certificates`
- **Docker:** Docker Engine must be installed and running to support containerized services.
- **Hardware:** A GPU capable of CUDA acceleration is required for the `open-webui` container to function with the specified model.
## FQDN of the Final Panel

The application is accessible via the following fully qualified domain name (FQDN) format:

`deepseek<Server ID>.hostkey.in`

The service listens on the standard HTTPS port, 443.
## Application Installation Process

The deployment involves installing the Ollama runtime, pulling the DeepSeek model, and launching the Open WebUI interface.

1. **Install Ollama:** The Ollama runtime is installed using the official installer script.
2. **Pull the DeepSeek Model:** The model `deepseek-r1:70b` is pulled to the local Ollama registry.
3. **Launch Open WebUI:** The web interface is deployed as a Docker container named `open-webui`.
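The first two steps can be sketched as shell commands. These follow the standard Ollama installation flow rather than a command listing from this document, so verify them against the official Ollama documentation:

```shell
# Step 1: install the Ollama runtime via the official installer script
# (standard Ollama install command; assumed, not verbatim from this document)
curl -fsSL https://ollama.com/install.sh | sh

# Step 2: pull the DeepSeek model into the local Ollama registry
# (the 70B model is large; expect a download of several tens of GB)
ollama pull deepseek-r1:70b
```

The Open WebUI container launch (step 3) uses the container settings described in the next section.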
## Docker Containers and Their Deployment

Two primary Docker services are deployed: the proxy stack for SSL termination and the Open WebUI application.

### Open WebUI Container

- **Name:** `open-webui`
- **Image:** `ghcr.io/open-webui/open-webui:cuda`
- **Network Mode:** `host`
- **Restart Policy:** `always`
- **Environment Variables:**
    - `ENV`: `dev`
    - `OLLAMA_BASE_URLS`: `http://127.0.0.1:11434`
- **Volumes:**
    - `open-webui:/app/backend/data` (named volume for backend data persistence)
- **Device Requests:** GPU capabilities enabled.
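A `docker run` invocation matching the settings above might look like the following sketch. The exact provisioning command is not shown in this document, and `--gpus all` is an assumed form of the GPU device request:

```shell
# Sketch of a docker run command reproducing the listed container settings.
# --gpus all is an assumption for "GPU capabilities enabled".
docker run -d \
  --name open-webui \
  --network host \
  --restart always \
  --gpus all \
  -e ENV=dev \
  -e OLLAMA_BASE_URLS=http://127.0.0.1:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:cuda
```

With `--network host` the container shares the host's network namespace, which is why Open WebUI's default port 8080 is reachable at `127.0.0.1:8080` for the Nginx proxy described below.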
### Nginx and Certbot Stack

The proxy and SSL management are handled via a Docker Compose setup.

- **Compose File Location:** `/root/nginx/compose.yml`
- **Service Name:** `nginx`
- **Image:** `jonasal/nginx-certbot:latest`
- **Restart Policy:** `unless-stopped`
- **Network Mode:** `host`
- **Volumes:**
    - `nginx_secrets:/etc/letsencrypt` (external named volume)
    - `/data/nginx/user_conf.d:/etc/nginx/user_conf.d` (bind mount for user configuration)
- **Environment:**
    - `CERTBOT_EMAIL`: `[email protected]`
- **Env File:** `/data/nginx/nginx-certbot.env`
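A plausible reconstruction of `/root/nginx/compose.yml` from the settings above is sketched below; the actual file on the server may differ in detail, and the `CERTBOT_EMAIL` value is not disclosed in this document:

```yaml
# Hypothetical reconstruction of /root/nginx/compose.yml based on the
# settings listed above; verify against the file on the server.
services:
  nginx:
    image: jonasal/nginx-certbot:latest
    restart: unless-stopped
    network_mode: host
    environment:
      # Actual address is redacted in this document
      - CERTBOT_EMAIL=...
    env_file:
      - /data/nginx/nginx-certbot.env
    volumes:
      - nginx_secrets:/etc/letsencrypt
      - /data/nginx/user_conf.d:/etc/nginx/user_conf.d

volumes:
  nginx_secrets:
    external: true
```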
## Proxy Servers

The application uses Nginx with Certbot to manage SSL certificates and handle incoming HTTPS traffic.

- **SSL Provider:** Let's Encrypt via Certbot.
- **Certificate Storage:** `/etc/letsencrypt` (mapped from the `nginx_secrets` Docker volume).
- **Configuration Directory:** `/data/nginx/user_conf.d`.
- **Nginx Vhost Configuration:**
    - **File:** `/data/nginx/user_conf.d/deepseek<Server ID>.hostkey.in.conf`
    - **Function:** Redirects HTTP to HTTPS and proxies requests to the internal Open WebUI service.
    - **Proxy Target:** `http://127.0.0.1:8080`
    - **Header Configuration:** `proxy_set_header Host $host` ensures the original host header is passed to the upstream.
- **ACME Challenge:**
    - **Port:** `80` (HTTP)
    - **File:** `/data/nginx/user_conf.d/deepseek<Server ID>.hostkey.in.http.conf`
    - **Root:** `/var/www/certbot` (inside the Nginx container)
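An illustrative sketch of the vhost file described above follows. The certificate paths and directive layout are assumptions based on the conventions of the `jonasal/nginx-certbot` image, not the verbatim server configuration; `<Server ID>` stands in for the actual server identifier:

```nginx
# Sketch of /data/nginx/user_conf.d/deepseek<Server ID>.hostkey.in.conf
# (assumed layout; certificate paths follow nginx-certbot conventions)
server {
    listen 443 ssl;
    server_name deepseek<Server ID>.hostkey.in;

    ssl_certificate     /etc/letsencrypt/live/deepseek<Server ID>.hostkey.in/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/deepseek<Server ID>.hostkey.in/privkey.pem;

    location / {
        # Forward traffic to the Open WebUI service on the host network
        proxy_pass http://127.0.0.1:8080;
        # Pass the original host header to the upstream
        proxy_set_header Host $host;
    }
}
```

The companion `.http.conf` file would listen on port 80, serve ACME challenges from `/var/www/certbot`, and redirect all other HTTP traffic to HTTPS.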
## Ollama Service Configuration

The Ollama service is managed as a native systemd service, modified to support external connections and specific environment variables.

- **Service File:** `/etc/systemd/system/ollama.service` (original backed up to `.bak`)
- **Override Directory:** `/etc/systemd/system/ollama.service.d/`
- **Override File:** `/etc/systemd/system/ollama.service.d/override.conf`
- **Environment Variables:**
    - `OLLAMA_HOST`: `0.0.0.0`
    - `OLLAMA_ORIGINS`: `*`
    - `LLAMA_FLASH_ATTENTION`: `1`
- **Status:** Enabled and running.
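Given the variables listed, the override file plausibly contains a `[Service]` block like the following sketch (reconstructed, not copied from the server; verify against the actual file):

```ini
# Plausible contents of /etc/systemd/system/ollama.service.d/override.conf,
# reconstructed from the environment variables listed above.
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="LLAMA_FLASH_ATTENTION=1"
```

After editing a systemd override, `systemctl daemon-reload` followed by a service restart is required for the changes to take effect.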
## Location of Configuration Files and Data
The following paths contain critical configuration and data for the deployment:
| Component | Path | Description |
|---|---|---|
| Nginx Configs | /data/nginx/user_conf.d/ | Directory containing Nginx server block configurations. |
| Nginx Env File | /data/nginx/nginx-certbot.env | Environment variables for the Nginx container. |
| Nginx Compose | /root/nginx/compose.yml | Docker Compose file for the proxy stack. |
| Nginx Directory | /root/nginx | Working directory for Docker Compose proxy deployment. |
| Ollama Override | /etc/systemd/system/ollama.service.d/override.conf | Systemd override for Ollama environment settings. |
| Open WebUI Data | open-webui (Volume) | Named Docker volume storing application data. |
## Available Ports for Connection

The system exposes the following ports for communication:

- **Port 443:** HTTPS traffic for the Open WebUI interface via the Nginx proxy.
- **Port 80:** HTTP traffic for ACME challenges and initial SSL redirection.
- **Port 8080:** Internal traffic; Nginx proxies this port to the Open WebUI service.
- **Port 11434:** Ollama API endpoint (internal, bound to `127.0.0.1`).
- **Port 3000:** Defined in group variables as `internal_port`, though the current proxy configuration routes directly to port 8080.
## Starting, Stopping, and Updating

### Ollama Service Management

Manage the Ollama service using systemd commands:
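Typical `systemctl` invocations for the `ollama` unit are shown below (standard systemd usage; the original command listing is not reproduced in this document):

```shell
sudo systemctl start ollama      # start the service
sudo systemctl stop ollama       # stop the service
sudo systemctl restart ollama    # restart after configuration changes
sudo systemctl status ollama     # check current state
sudo systemctl daemon-reload     # reload unit files after editing override.conf
```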
### Docker Container Management

- **Open WebUI:** Manage the container directly with `docker` commands.
- **Nginx Proxy Stack:** Navigate to the compose directory and manage services with Docker Compose:
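The container management can be sketched as follows (standard Docker and Docker Compose usage; assumed, not verbatim from this document):

```shell
# Open WebUI container lifecycle
docker stop open-webui
docker start open-webui
docker restart open-webui

# Updating Open WebUI: pull the new image, remove the old container,
# then re-run the original docker run command for open-webui
docker pull ghcr.io/open-webui/open-webui:cuda
docker rm -f open-webui

# Nginx proxy stack, managed from the compose directory
cd /root/nginx
docker compose up -d     # start (or recreate) the stack
docker compose down      # stop and remove the stack
docker compose pull      # fetch a newer nginx-certbot image
```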
### Nginx Container Management

To reload the Nginx configuration inside the container after changes:
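Assuming the Compose service is named `nginx` as described above, the configuration can be reloaded without restarting the container:

```shell
# Reload Nginx inside the running container; run from the compose
# directory so docker compose resolves the service name.
cd /root/nginx
docker compose exec nginx nginx -s reload
```

A reload re-reads the files in `/etc/nginx/user_conf.d` (the bind mount of `/data/nginx/user_conf.d`) without dropping active connections.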