Deployment Overview of Qwen3-Coder on Server¶
Prerequisites and Basic Requirements¶
The deployment requires a Linux server running Ubuntu. The following components must be available on the system:
- Root privileges or `sudo` access for system-wide installations.
- Docker Engine installed and running with GPU support enabled.
- Network access to `ollama.com` for downloading the installation script and models.
- Network access to `ghcr.io` for pulling the Open WebUI container image.
- Port `8080` must be open for the Open WebUI application.
- Port `11434` must be accessible internally for the Ollama service.
- Ports `80` and `443` must be open for the Nginx reverse proxy and SSL certificate management.
File and Directory Structure¶
The application utilizes specific directories for configuration, data storage, and certificates:
- `/root/nginx/`: Contains the Docker Compose configuration for the Nginx proxy.
- `/root/nginx/compose.yml`: The Docker Compose file defining the Nginx and Certbot services.
- `/data/nginx/`: Stores Nginx configuration files and environment variables.
- `/data/nginx/user_conf.d/`: Contains custom Nginx server block configurations.
- `/data/nginx/nginx-certbot.env`: Environment file for Nginx and Certbot settings.
- `/etc/systemd/system/ollama.service`: Systemd unit file for the Ollama service.
- `/usr/share/ollama/.ollama/models/`: Default storage location for Ollama models, including `qwen3-coder`.
- `/var/lib/docker/volumes/open-webui/`: Docker volume storing Open WebUI backend data.
Application Installation Process¶
The deployment involves installing the Ollama service, pulling the Qwen3-Coder model, and running the Open WebUI container.
- Install Ollama: The Ollama service is installed using the official installation script.
- Configure Ollama Service: The `ollama.service` unit file is updated to expose the service on all network interfaces and set specific environment variables:
  - `OLLAMA_HOST` is set to `0.0.0.0`.
  - `OLLAMA_ORIGINS` is set to `*`.
  - `OLLAMA_FLASH_ATTENTION` is set to `1`.
- Pull the Model: The `qwen3-coder` model is downloaded and stored locally.
- Deploy Open WebUI: The Open WebUI application is deployed as a Docker container with GPU acceleration.
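Based on the settings listed above, the relevant `[Service]` entries of the unit file might look like the following sketch (the full file contents are not quoted in the original):

```ini
# Excerpt of /etc/systemd/system/ollama.service — only the environment
# entries described above are shown; the rest of the unit is unchanged.
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
Environment="OLLAMA_FLASH_ATTENTION=1"
```

After editing the unit, apply the change with `sudo systemctl daemon-reload && sudo systemctl restart ollama`.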
Docker Containers and Their Deployment¶
Two primary Docker-based components are deployed: the Open WebUI application and the Nginx reverse proxy with Certbot.
Open WebUI Container¶
The Open WebUI container is configured with the following parameters:

- Image: `ghcr.io/open-webui/open-webui:cuda`
- Name: `open-webui`
- Ports: Maps host port `8080` to container port `8080`.
- Volumes: Mounts the `open-webui` named volume to `/app/backend/data`.
- Environment Variables:
  - `ENV`: Set to `dev`.
  - `OLLAMA_BASE_URLS`: Set to `http://host.docker.internal:11434`.
- Restart Policy: Set to `always`.
- GPU Access: Enabled via `--gpus all`.
- Host Resolution: Adds a custom host entry for `host.docker.internal` pointing to the host gateway.
Nginx and Certbot Container¶
The Nginx proxy is managed via Docker Compose located at `/root/nginx/compose.yml`.

- Image: `jonasal/nginx-certbot:latest`
- Network Mode: Uses host networking.
- Volumes:
  - Mounts the external `nginx_secrets` volume to `/etc/letsencrypt`.
  - Mounts `/data/nginx/user_conf.d` to `/etc/nginx/user_conf.d`.
- Environment:
  - `CERTBOT_EMAIL`: Set to [email protected].
  - Loads additional environment variables from `/data/nginx/nginx-certbot.env`.
- Restart Policy: Set to `unless-stopped`.
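Assembled from the parameters above, the Compose file might look like the following sketch (the real `CERTBOT_EMAIL` value is redacted in the original and shown here as a placeholder):

```yaml
# /root/nginx/compose.yml — illustrative sketch reconstructed from the
# parameters above, not quoted from the original file.
services:
  nginx:
    image: jonasal/nginx-certbot:latest
    network_mode: host
    restart: unless-stopped
    environment:
      - CERTBOT_EMAIL=<your-email>   # placeholder; real value redacted
    env_file:
      - /data/nginx/nginx-certbot.env
    volumes:
      - nginx_secrets:/etc/letsencrypt
      - /data/nginx/user_conf.d:/etc/nginx/user_conf.d

volumes:
  nginx_secrets:
    external: true
```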
Proxy Servers¶
The Nginx reverse proxy is configured to handle incoming traffic and manage SSL certificates via Certbot.
- Configuration Location: Custom server configurations are stored in `/data/nginx/user_conf.d/`.
- Proxy Pass: The Nginx configuration includes a `location /` block that forwards traffic to the Open WebUI application running on `http://127.0.0.1:8080`.
- SSL Management: The `nginx-certbot` container automatically handles SSL certificate generation and renewal using Let's Encrypt.
- Deployment: The proxy stack is started with `docker compose up -d`, executed from the `/root/nginx` directory.
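A server block matching this setup might look like the sketch below; the file name, server name, and certificate paths are assumptions for illustration, not taken from the original:

```nginx
# /data/nginx/user_conf.d/webui.conf — illustrative sketch; replace
# example.com with the actual domain served by the proxy.
server {
    listen 443 ssl;
    server_name example.com;

    # Certificates are provisioned by the nginx-certbot container
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        # WebSocket upgrade headers so streaming responses work through the proxy
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```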
Starting, Stopping, and Updating¶
Ollama Service¶
The Ollama service is managed via systemd.
- Restart Service: `sudo systemctl restart ollama`
- Enable Service: `sudo systemctl enable ollama`
Open WebUI Container¶
The Open WebUI container is managed via Docker commands.
- Start/Restart: Since the container is configured with `--restart always`, it will automatically start on boot. To manually restart: `docker restart open-webui`
- Stop: `docker stop open-webui`
- Update: To update the container to the latest version, pull the new image and recreate the container:
```shell
docker pull ghcr.io/open-webui/open-webui:cuda
docker rm -f open-webui
docker run -d -p 8080:8080 --gpus all \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  -e ENV='dev' \
  -e OLLAMA_BASE_URLS='http://host.docker.internal:11434' \
  --restart always \
  ghcr.io/open-webui/open-webui:cuda
```
Nginx Proxy Stack¶
The Nginx and Certbot stack is managed via Docker Compose.
- Start/Restart: `docker compose up -d`, run from the `/root/nginx` directory.
- Stop: `docker compose down`, run from the `/root/nginx` directory.