MongoDB connection string. Optional in Docker: when omitted, the container starts an embedded MongoDB instance at /data/mongodb. Set this to use an external MongoDB (e.g. mongodb://localhost:27017/archmax).
Public URL of this instance (e.g. https://archmax.example.com). Set this when running behind a reverse proxy. Automatically configures CORS allowed origins and auth callback URL.
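When running behind a reverse proxy, a minimal environment might look like this (the hostname is illustrative; CORS origins and the auth callback URL are then derived automatically):

```shell
# External MongoDB (optional; omit to use the embedded instance)
MONGODB_URI=mongodb://localhost:27017/archmax

# Public URL; also sets CORS allowed origins and the auth callback URL
APP_BASE_URL=https://archmax.example.com
```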
| Variable | Default | Description |
| --- | --- | --- |
| `PORT` | `3000` | API server port. |
| `CORS_ORIGINS` | (derived) | Comma-separated list of allowed origins. Defaults to `APP_BASE_URL` when set, otherwise `http://localhost:5173`. Override to allow additional origins. |
| `AUTH_BASE_URL` | (derived) | Public URL for auth callbacks. Defaults to `APP_BASE_URL` when set, otherwise `http://localhost:PORT`. |
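As one sketch of overriding the derived CORS default, additional origins can be listed alongside the local dev origin (hostnames are illustrative):

```shell
# Allow an extra origin in addition to the default dev origin
CORS_ORIGINS=https://dashboard.example.com,http://localhost:5173
```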
Key used to encrypt database connection passwords and API keys at rest with AES-256-GCM. Generate with `openssl rand -base64 32`. Without it, credentials are stored in plaintext.
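The key can be generated with the command from the description; a base64 string that decodes to 32 bytes is the expected shape for AES-256:

```shell
# Generate a 32-byte (256-bit) key, base64-encoded
KEY="$(openssl rand -base64 32)"
echo "$KEY"
# Sanity check: the decoded key is exactly 32 bytes, as AES-256-GCM requires
echo "$KEY" | openssl base64 -d | wc -c
```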
Root data directory. All persistent data lives under this path: projects/ (semantic models), mongodb/ (embedded DB), .duckdb/ (extension cache).
Inside the Docker image, persistent application data lives under /data/:
| Path | Contents | Persistent? |
| --- | --- | --- |
| `/data/projects/` | Semantic model YAML files | Yes |
| `/data/mongodb/` | Embedded MongoDB data (when using embedded mode) | Yes |
| `/data/.duckdb/` | DuckDB extension cache | Yes |
| `/tmp/redis/` | Embedded Redis data | No (ephemeral) |
A single bind mount (`-v ~/.archmax:/data`) persists the entire data directory to `~/.archmax` on the host, including project files, the DuckDB extension cache, and embedded MongoDB data. When using an external MongoDB via `MONGODB_URI`, the `mongodb/` directory is unused. See the Docker Reference for details.
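Putting it together, a typical invocation might look like the following (the image name `archmax` and the published host port are assumptions for illustration):

```shell
# All persistent data (projects/, mongodb/, .duckdb/) lands in ~/.archmax
docker run -d -p 3000:3000 -v ~/.archmax:/data archmax
```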
OpenAI-compatible API endpoint. Change this when using a provider other than OpenRouter.
| Variable | Default | Description |
| --- | --- | --- |
| `AGENT_API_KEY` | - | Required for agent features. API key for your chosen provider. |
| `AGENT_MODEL` | `anthropic/claude-sonnet-4` | Model identifier (must match your provider's naming convention). |
| `AGENT_TITLE_MODEL` | - | Cheap/fast model used to generate conversation titles. |
Supported providers: OpenRouter (default), OpenAI, Azure OpenAI, Ollama, or any OpenAI-compatible endpoint. To get started quickly, create an account at openrouter.ai, generate an API key, and set AGENT_API_KEY to that value.
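Since OpenRouter is the default provider, a minimal setup only needs the key (the key value is a placeholder and the title model is an illustrative choice):

```shell
# Key from openrouter.ai; the default endpoint and model need no changes
AGENT_API_KEY=your-openrouter-key
# Optional: a cheaper model for conversation title generation
AGENT_TITLE_MODEL=openai/gpt-4o-mini
```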
Redis connection URL for the BullMQ worker queue. Optional in Docker: when omitted, the container starts an embedded Redis instance. When no Redis is available (e.g. local development), the agent runs in-process instead of through the queue.