What this is: a small, private stack. Open WebUI persists chats on-disk. LiteLLM proxies OpenAI-compatible requests to Venice at https://api.venice.ai/api/v1. Perplexica provides a Perplexity-like experience using SearXNG for retrieval. Tailscale gives private access from your devices.
This project is designed to be run in a private, trusted environment. Do not expose this stack to the public internet without careful consideration and additional security measures.
Please read the WARNING.md file for a detailed explanation of the security implications of the default configuration.
- Docker
- Docker Compose
- Tailscale account (optional, but recommended for private access)
- Clone the repository:

  ```
  git clone https://github.com/your-username/openwebui-venice-stack.git
  cd openwebui-venice-stack
  ```

- Create the `.env` file:

  Copy the `.env.example` file to `.env` and fill in the required API keys and tokens.

  ```
  cp .env.example .env
  ```
  You will need to set the following variables:

  - `VENICE_API_KEY`: Your Venice API key.
  - `LITELLM_MASTER_KEY`: A master key to protect your LiteLLM instance. You can generate a random string for this.
  - `TS_AUTHKEY`: Your Tailscale auth key (if you are using Tailscale).
  - `WEBUI_ADMIN_PASSWORD`: A strong password for the Open WebUI admin user.
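  Any long random string works for `LITELLM_MASTER_KEY`. One way to generate one, assuming `openssl` is available on your machine:

  ```shell
  # Print 32 random bytes as 64 hex characters, suitable as a master key.
  openssl rand -hex 32
  ```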
- Start the stack:

  ```
  make up
  ```

  This will pull the latest Docker images and start all the services.
- Check the health of the services:

  ```
  make health
  ```

- Test the stack:

  ```
  make test-all
  ```
- Open WebUI: `http://<host>:3001`
- Perplexica: `http://<host>:3000`
- SearXNG: `http://<host>:8085`
- LiteLLM: `http://<host>:4000/v1`

Replace `<host>` with `localhost` on the host machine itself, or with the host's Tailscale name/IP when connecting from another device.
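As a quick smoke test, you can hit LiteLLM's OpenAI-compatible endpoint directly. This sketch assumes the stack is running locally and `LITELLM_MASTER_KEY` is exported in your shell; the model name here is a placeholder, so substitute one defined in your `configs/litellm_config.yaml`:

```shell
# List the models LiteLLM is serving (requires the master key).
curl -s http://localhost:4000/v1/models \
  -H "Authorization: Bearer ${LITELLM_MASTER_KEY}"

# Send a chat completion through the proxy.
# "venice/example-model" is a placeholder -- use a model from your config.
curl -s http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer ${LITELLM_MASTER_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"model": "venice/example-model", "messages": [{"role": "user", "content": "Hello"}]}'
```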
We strongly recommend using Tailscale to keep the entire stack private. By default, the services are only exposed on the host machine. If you need to access the services from other devices, Tailscale is the easiest and most secure way to do it.
- Open WebUI Authentication: `WEBUI_AUTH` is enabled by default. You must set a strong admin password in the `.env` file.
- LiteLLM Master Key: The `LITELLM_MASTER_KEY` is required to access the LiteLLM service.
- Policy Sidecar: The `policy_sidecar` service provides an additional layer of security by redacting sensitive information from logs and enforcing policies. By default, `POLICY_ALLOW_HEADER_OVERRIDES` is disabled.
The stack is configured through the `docker-compose.yml` file and the configuration files in the `configs` directory.
The following environment variables are used to configure the stack:
- `VENICE_API_KEY`: Your Venice API key.
- `LITELLM_MASTER_KEY`: A master key to protect your LiteLLM instance.
- `TS_AUTHKEY`: Your Tailscale auth key.
- `WEBUI_ADMIN_PASSWORD`: The password for the Open WebUI admin user.
- `POLICY_API_TOKEN`: An optional token to protect the `policy_sidecar` service.
- `POLICY_ALLOW_HEADER_OVERRIDES`: Set to `true` to allow clients to override policy settings via HTTP headers. Defaults to `false`.
- `configs/litellm_config.yaml`: Configures the LiteLLM service.
- `configs/perplexica.config.toml`: Configures the Perplexica service.
- `configs/searxng/settings.yml`: Configures the SearXNG service.
- Add more profiles under `profiles/` (toggle Venice `venice_parameters` like web search, citations, and thinking controls).
- Switch Perplexica's `MODEL_NAME` in `configs/perplexica.config.toml` to any Venice model you prefer.
- If you need HTTPS or public exposure, use the Caddy service and put access control in front of everything.
See `docker-compose.extras.yml` for monitoring, logs, Watchtower, and the RAG API.