
Deployment Overview of gpt-oss-20b on Server

Prerequisites and Basic Requirements

Deployment requires a server running a compatible Linux distribution, specifically Ubuntu, which provides the required package management and systemd services. The following prerequisites must be met before proceeding:

  • The server must have root privileges or sudo access.

  • The curl utility must be available on the system.

  • Docker Engine must be installed and running.

  • The ollama service must be installed and configured to listen on all network interfaces.

  • The subdomain gpt-oss<Server ID>.hostkey.in must resolve via public DNS so that Let's Encrypt certificate generation can succeed.

  • Port 8080 (internal) and Port 443 (external) must be available for traffic routing.

FQDN of the Final Panel

The application is accessible via a Fully Qualified Domain Name (FQDN) under the hostkey.in domain. The hostname is formed by appending the Server ID to the gpt-oss prefix.

  • Base Domain: hostkey.in

  • URL Format: gpt-oss<Server ID>.hostkey.in:443

  • Access Protocol: HTTPS
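The pattern above can be sketched as a small shell snippet; the Server ID 12345 is purely illustrative:

```shell
# Build the panel URL from a Server ID (12345 is a hypothetical example).
SERVER_ID=12345
PANEL_URL="https://gpt-oss${SERVER_ID}.hostkey.in:443"
echo "$PANEL_URL"
```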

File and Directory Structure

The application utilizes specific directories for configuration, data storage, and SSL certificates. The key locations are as follows:

  • Nginx Configuration Directory: /root/nginx

  • Nginx Compose File: /root/nginx/compose.yml

  • User Configuration Directory: /data/nginx/user_conf.d

  • Specific User Config File: /data/nginx/user_conf.d/gpt-oss<Server ID>.hostkey.in.conf

  • SSL Certificates Volume: Mounted at /etc/letsencrypt within the Nginx container.

  • Ollama Models Directory: /usr/share/ollama/.ollama/models

  • Open WebUI Data Volume: Mapped to the Docker volume open-webui (physical path typically /var/lib/docker/volumes/open-webui/_data).

Application Installation Process

The installation involves setting up the Ollama backend and the Open WebUI frontend via Docker.

  1. Install Ollama: The Ollama package is installed using the official installation script.

    curl -fsSL https://ollama.com/install.sh | sh
    

  2. Configure the Ollama System Service: The ollama service is modified to listen on all interfaces (0.0.0.0) and to set environment variables for allowed origins and flash attention. The service file is located at /etc/systemd/system/ollama.service. Environment variables set:

    • OLLAMA_HOST=0.0.0.0

    • OLLAMA_ORIGINS=*

    • LLAMA_FLASH_ATTENTION=1

  3. Download the Model: The model gpt-oss:20b is pulled into the local Ollama repository.

    ollama pull gpt-oss:20b
    

  4. Deploy Open WebUI: The Open WebUI container is deployed with GPU support and environment variables that connect it to the local Ollama instance.

    • Image: ghcr.io/open-webui/open-webui:cuda

    • Container Name: open-webui

    • Port Mapping: Maps host port 8080 to container port 8080.
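The service configuration step edits a systemd unit rather than running a command, so here is a sketch of the relevant excerpt of /etc/systemd/system/ollama.service after modification (the full unit also contains [Unit] and [Install] sections installed by the script). Note that current Ollama releases read the flash-attention flag as OLLAMA_FLASH_ATTENTION; the LLAMA_-prefixed spelling is reproduced as documented above.

```ini
# Excerpt of /etc/systemd/system/ollama.service (sketch; verify against the
# actual file on the server).
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="LLAMA_FLASH_ATTENTION=1"
```

After editing the unit file, run systemctl daemon-reload and systemctl restart ollama as described in the management section.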

Docker Containers and Their Deployment

Two primary Docker containers are managed in this deployment: Open WebUI and Nginx Certbot.

Open WebUI Container:

  • Deployment Command:

    docker run -d -p 8080:8080 --gpus all \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      -e ENV='dev' \
      -e OLLAMA_BASE_URLS='http://host.docker.internal:11434' \
      --restart always ghcr.io/open-webui/open-webui:cuda
    

  • Key Parameters:

  • --gpus all: Enables GPU acceleration.

  • -v open-webui:/app/backend/data: Persists application data.

  • -e OLLAMA_BASE_URLS: Points to the Ollama instance via the host gateway.

Nginx and Certbot Container:

  • Deployment Method: Managed via docker compose in the directory /root/nginx.

  • Configuration File: /root/nginx/compose.yml.

  • Image: jonasal/nginx-certbot:latest.

  • Network Mode: host.

  • Volumes:

  • nginx_secrets (external) mapped to /etc/letsencrypt.

  • Host directory /data/nginx/user_conf.d mapped to /etc/nginx/user_conf.d.

  • Environment Variable:

  • [email protected]

Proxy Servers

Access to the application is managed through an Nginx proxy container that handles SSL termination and routing.

  • Proxy Software: Nginx with Certbot for Let's Encrypt SSL certificates.

  • Configuration Location: Custom configuration is added to /data/nginx/user_conf.d/gpt-oss<Server ID>.hostkey.in.conf.

  • Routing Rule: The location block / proxies requests to the internal Open WebUI instance.

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
    

  • External Port: 443 (HTTPS).

  • Internal Port: 8080 (HTTP).
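A fuller sketch of the per-domain config file follows. The certificate paths reflect the nginx-certbot image's convention of placing certificates under /etc/letsencrypt/live/<cert name>/, and the extra proxy headers are an assumption (WebSocket upgrades are typically required for Open WebUI's streaming UI); verify both against the actual file:

```nginx
# Sketch of /data/nginx/user_conf.d/gpt-oss<Server ID>.hostkey.in.conf
server {
    listen 443 ssl;
    server_name gpt-oss<Server ID>.hostkey.in;

    ssl_certificate     /etc/letsencrypt/live/gpt-oss<Server ID>.hostkey.in/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/gpt-oss<Server ID>.hostkey.in/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;
        # Assumption: WebSocket/streaming support for Open WebUI
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```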

Access Rights and Security

  • System User: A dedicated system user ollama is created and utilized by the Ollama service.

  • Firewall: The configuration assumes external traffic is allowed on port 443. Internal traffic on port 8080 is handled locally.

  • SSL/TLS: Secure connections are enforced via the Nginx container using Let's Encrypt certificates managed by Certbot.

  • Restrictions: The Nginx proxy serves only the specific subdomain defined in the user_conf.d configuration file.

Location of Configuration Files and Data

The following table summarizes the critical file locations for the deployment:

Component              | File/Directory Path                                        | Description
-----------------------|------------------------------------------------------------|------------------------------------------------------------
Nginx Config Directory | /root/nginx                                                | Directory containing the Docker Compose setup for the proxy.
Nginx Compose File     | /root/nginx/compose.yml                                    | Definition of the Nginx and Certbot container.
User Nginx Config      | /data/nginx/user_conf.d/gpt-oss<Server ID>.hostkey.in.conf | Specific proxy rules for the application domain.
SSL Secrets            | /etc/letsencrypt (Docker volume)                           | Storage for SSL certificates (mounted volume).
Ollama Service         | /etc/systemd/system/ollama.service                         | Systemd unit file for the Ollama backend.
Ollama Backup          | /etc/systemd/system/ollama.service.bak                     | Backup of the original service file.
Open WebUI Data        | /var/lib/docker/volumes/open-webui/_data                   | Physical location of the Docker volume for app data.
Ollama Models          | /usr/share/ollama/.ollama/models                           | Storage for downloaded AI models.

Available Ports for Connection

The deployment utilizes the following ports for network communication:

  • Port 443:

  • Type: External (HTTPS)

  • Usage: Secure web access via the hostkey.in domain.

  • Service: Nginx Proxy.

  • Port 8080:

  • Type: Internal (HTTP)

  • Usage: Local connection from the Nginx proxy to the Open WebUI container.

  • Service: Open WebUI.

  • Port 11434:

  • Type: Internal (HTTP)

  • Usage: Local connection from the Open WebUI container to the Ollama backend.

  • Service: Ollama.

Starting, Stopping, and Updating

Management of the services is handled through Docker commands for the containers and systemctl for the Ollama service.

  • Restarting the Proxy: The Nginx and Certbot container stack is managed via Docker Compose in the /root/nginx directory.

    cd /root/nginx
    docker compose up -d
    

  • Restarting Open WebUI: The Open WebUI container is configured with --restart always, meaning it will automatically restart on failure or reboot. To manually restart:

    docker restart open-webui
    

  • Managing Ollama Service: The Ollama service is managed via systemd.

  • Reload daemon after configuration changes:

    systemctl daemon-reload
    

  • Restart the service:

    systemctl restart ollama
    

  • Check status:

    systemctl status ollama
    
