Deploy TabbyML

An open-source, self-hosted AI coding assistant and alternative to GitHub Copilot.

⭐ 25.0k stars · 📜 Apache License 2.0 · 🔴 Advanced · ⏱ ~20 minutes

What You’ll Get

A fully working TabbyML instance running on your server. Your data stays on your hardware — no third-party access, no usage limits, no surprise invoices.

Prerequisites

  • A server with Docker and Docker Compose installed (setup guide); you can verify both with the check below
  • A domain name pointed to your server (optional but recommended)
  • Basic terminal access (SSH)
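Before going further, it helps to confirm the tooling is actually in place. A minimal check, assuming Docker Engine with the Compose plugin; the dig lookup only applies if you set up a domain, and tabby.example.com is a placeholder:

# Confirm Docker and the Compose plugin are installed
docker --version
docker compose version

# Optional: confirm your domain resolves to this server's public IP
dig +short tabby.example.com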

The Config

Create a directory for TabbyML and add this docker-compose.yml:

# -------------------------------------------------------------------------
# 🚀 Created and distributed by The AltStack
# 🌍 https://thealtstack.com
# -------------------------------------------------------------------------
version: '3.8'

services:
  tabby:
    image: tabbyml/tabby:latest
    container_name: tabby
    restart: unless-stopped
    ports:
      - "8080:8080"
    volumes:
      - tabby-data:/data

volumes:
  tabby-data:
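As written, the container starts Tabby without pinning a model or device. If your server has an NVIDIA GPU and you want to specify those at startup, one possible approach is a Compose override file. Treat this as a sketch: the serve subcommand, the StarCoder-1B model name, and the GPU reservation block are based on Tabby's and Compose's published examples, so check the current Tabby docs for valid model names.

# Sketch only: assumes NVIDIA drivers + nvidia-container-toolkit on the host,
# and that StarCoder-1B is a valid model name in Tabby's registry
cat > docker-compose.override.yml <<'EOF'
services:
  tabby:
    command: serve --model StarCoder-1B --device cuda
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
EOF
docker compose up -d

Compose merges docker-compose.override.yml automatically, so the base file above stays untouched.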

Let’s Ship It

# Create a directory
mkdir -p /opt/tabby && cd /opt/tabby

# Create the docker-compose.yml (paste the config above)
nano docker-compose.yml

# Pull images and start
docker compose up -d

# Watch the logs
docker compose logs -f
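Once the logs show the server listening, a quick sanity check from the same machine (this assumes the default 8080 mapping; the first request may be slow while the model loads):

# Expect an HTTP status line once Tabby is up
curl -sI http://localhost:8080 | head -n 1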

Post-Deployment Checklist

  • Service is accessible on the configured port
  • Admin account created (if applicable)
  • Reverse proxy configured (Caddy guide; see the sketch after this list)
  • SSL/HTTPS working
  • Backup script set up (backup guide; a volume-backup one-liner follows this list)
  • Uptime monitor added (Uptime Kuma)
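For the reverse-proxy item, here is a minimal sketch assuming Caddy was installed from the official packages (config at /etc/caddy/Caddyfile, managed by systemd) and that tabby.example.com is a placeholder for your domain. Caddy obtains the TLS certificate automatically, which also covers the SSL item.

# Minimal Caddyfile (tabby.example.com is a placeholder; adjust to your domain)
sudo tee /etc/caddy/Caddyfile >/dev/null <<'EOF'
tabby.example.com {
    reverse_proxy localhost:8080
}
EOF
sudo systemctl reload caddy

For the backup item, one way to snapshot the named volume to a tarball. The exact volume name depends on your Compose project name, so confirm it with docker volume ls first.

# Find the exact volume name (often <project>_tabby-data)
docker volume ls | grep tabby

# Archive the volume contents to the current directory (adjust the volume name)
docker run --rm -v tabby_tabby-data:/data -v "$(pwd)":/backup alpine \
  tar czf /backup/tabby-data-$(date +%F).tgz -C /data .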

The “I Broke It” Section

Container won’t start?

docker compose logs tabby | tail -50

Port already in use?

# Find what's using the port
lsof -i :PORT_NUMBER
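If another service already owns 8080, the simplest fix is to remap only the host side of the port mapping; the 8081 below is just an example.

# In docker-compose.yml, change the mapping to e.g. "8081:8080",
# then recreate the container
docker compose up -d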

Need to start fresh?

docker compose down -v   # ⚠️ This deletes volumes/data!
docker compose up -d
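If you just want a clean restart without losing downloaded models and data, drop the -v flag:

# Recreate the container but keep the tabby-data volume
docker compose down
docker compose up -d --force-recreate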

Going Further