AI-Powered Deployment
AI Deploy uses a language model to analyze a project and generate a Docker Compose configuration automatically. Paste a URL, review the generated config, and deploy.
Prerequisites
AI Deploy requires an AI provider to be configured in Settings > AI Provider:
| Provider | Configuration |
|---|---|
| OpenAI | API key, model name (e.g., gpt-4o) |
| LiteLLM | Proxy URL, API key, model name |
| Ollama | Server URL (e.g., http://ollama:11434), model name |
Any OpenAI-compatible API endpoint works.
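If you prefer a local model, one option is to run Ollama on the same Docker network as the panel and point the provider URL at it. A minimal sketch, assuming a shared network (the network name `panel-net` is an assumption; the image and port are Ollama's defaults):

```yaml
# Hypothetical compose service for a local Ollama instance.
# The AI Provider URL in Settings would then be http://ollama:11434.
services:
  ollama:
    image: ollama/ollama             # official Ollama image
    volumes:
      - ollama-data:/root/.ollama    # persist downloaded models
    networks:
      - panel-net                    # assumed shared network with the panel

volumes:
  ollama-data:

networks:
  panel-net:
    external: true                   # assumed to already exist
```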
How It Works
Step 1: Provide a URL
Navigate to Deploy > AI Deploy and enter:
| Field | Description |
|---|---|
| URL | GitHub repo URL, documentation link, or project website |
| Service Name | Name for the service (becomes the subdomain) |
| Customer | Which customer owns this service |
Step 2: AI Analysis
The panel sends the URL to the configured AI model, which:
- Fetches and reads the repository README, Dockerfile, compose files, and package manifests
- Identifies the tech stack, dependencies, and required services
- Generates a Docker Compose configuration with:
  - Correct base images and versions
  - Required environment variables with sensible defaults
  - Volume mounts for persistent data
  - Health check endpoints
  - Multi-service setups (e.g., app + database)
- Returns the configuration with an explanation of design decisions
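To illustrate the kind of output to expect, here is a sketch of what a generated configuration might look like for a hypothetical web app with a database. Every name, variable, and version below is invented for illustration; the actual output depends on the project and the model:

```yaml
services:
  app:
    image: ghcr.io/example/app:1.4.2   # hypothetical image, pinned version
    environment:
      DATABASE_URL: postgres://app:app@db:5432/app  # sensible default
      NODE_ENV: production
    depends_on:
      db:
        condition: service_healthy     # wait for the database health check

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app           # replace before deploying
      POSTGRES_DB: app
    volumes:
      - db-data:/var/lib/postgresql/data  # persistent data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U app"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  db-data:
```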
Analysis progress streams to the UI via WebSocket so you can watch the AI's reasoning.
Step 3: Review and Edit
The generated Docker Compose configuration is displayed in an editor. You can:
- Review the AI's explanation of why it chose specific images and settings
- Modify environment variables, image versions, or resource limits
- Add or remove services
- Adjust volume mounts or networking
The editor supports YAML syntax highlighting and validation.
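Typical edits at this stage can be made directly in the YAML. For example, pinning an image tag, adding an environment variable, and setting a resource limit (the values here are arbitrary examples, not recommendations):

```yaml
services:
  app:
    image: ghcr.io/example/app:1.4.2   # pin a version instead of :latest
    environment:
      LOG_LEVEL: warn                  # hypothetical variable added by hand
    deploy:
      resources:
        limits:
          cpus: "0.50"                 # honored by docker compose v2
          memory: 512M
```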
Step 4: Deploy
Click Deploy to execute the generated configuration. From this point, the process is identical to a Compose Deploy:
- Traefik labels are injected for routing
- Images are pulled
- Containers are started
- Health checks are monitored
- SSL certificates are provisioned
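The routing labels are injected by the panel automatically, but for reference, a typical Traefik v2 label set on a service looks roughly like this. The router name, entrypoint, and certificate resolver names are assumptions and will differ per installation:

```yaml
services:
  app:
    labels:
      - traefik.enable=true
      - traefik.http.routers.myapp.rule=Host(`myapp.example.com`)
      - traefik.http.routers.myapp.entrypoints=websecure           # assumed entrypoint name
      - traefik.http.routers.myapp.tls.certresolver=letsencrypt    # assumed resolver name
      - traefik.http.services.myapp.loadbalancer.server.port=8080  # app's internal port
```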
Use Cases
AI Deploy works well for:
- Unknown projects — paste a GitHub URL and let AI figure out how to run it
- Quick prototyping — get a service running without reading deployment docs
- Complex stacks — AI handles multi-service setups (app, database, cache, worker)
- Documentation links — paste a project's install docs and AI extracts the deployment steps
Limitations
- AI-generated configurations should always be reviewed before deploying
- Quality depends on the AI model — larger models produce better results
- Private repositories require the Git Deploy method instead
- Very new or niche projects may not be well-known to the AI model
- The AI does not have access to the repository's actual source code beyond what is publicly readable
Fallback
If the AI-generated configuration does not work:
- Check the container logs for errors
- Edit the generated compose file in the service settings
- Use Git Deploy or Compose Deploy for more control