Overview
The `mainswitchAILocal` command starts a local proxy server that exposes OpenAI-, Gemini-, and Claude-compatible API interfaces for CLI models.
Usage
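The original usage block is not preserved on this page; a minimal invocation, assuming the binary is on your `PATH` and using the `--config` flag documented below (the config path is illustrative), might look like:

```shell
# Start the proxy server with an explicit configuration file
mainswitchAILocal --config ./config.yaml
```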
Server Flags
- `--config` — Path to the configuration file. By default, the server looks for `config.yaml` in the current working directory.
- Server password for authentication. For security, prefer the `SERVER_PASSWORD` environment variable over passing the password on the command line.
- Google Cloud Project ID for Gemini API access. Only required when using Vertex AI or project-specific Gemini endpoints.
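As noted above, the password is best supplied through the environment. A sketch, where only the binary name, the `--config` flag, and `SERVER_PASSWORD` come from this page and the values are placeholders:

```shell
# Supply the server password via the environment rather than a flag
export SERVER_PASSWORD='change-me'   # placeholder value
mainswitchAILocal --config ./config.yaml
```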
Environment Variables
The server supports extensive configuration through environment variables, especially for cloud deployments.

Storage Backends
PostgreSQL Token Store

Enable PostgreSQL-backed token storage by setting `PGSTORE_DSN` to a connection string.
Git Token Store
Enable version-controlled token storage using Git:
- `GITSTORE_GIT_URL` — Git repository URL for token storage
- Git username for authentication
- Git access token (personal access token or password)
- Local path for the git repository clone
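Of the variables above, only `GITSTORE_GIT_URL` is named elsewhere on this page; a sketch with an illustrative repository URL:

```shell
# GITSTORE_GIT_URL is documented on this page; the URL below is illustrative
export GITSTORE_GIT_URL='https://github.com/example/token-store.git'
# The username, access-token, and clone-path variables must also be set;
# their exact names are not preserved on this page, so consult the full docs.
```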
Object Store (S3-Compatible)
Enable S3-compatible object storage for tokens:
- `OBJECTSTORE_ENDPOINT` — S3-compatible endpoint URL (supports http:// or https://)
- Access key ID
- Secret access key
- Bucket name for token storage
- Local cache directory
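As with the Git store, only `OBJECTSTORE_ENDPOINT` is named elsewhere on this page; a sketch with an illustrative endpoint:

```shell
# OBJECTSTORE_ENDPOINT is documented on this page; the URL below is illustrative
export OBJECTSTORE_ENDPOINT='https://s3.example.com'
# The access-key, secret-key, bucket, and cache-directory variables must also
# be set; their exact names are not preserved here, so consult the full docs.
```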
Cloud Deployment
Set the deployment mode to `cloud` to enable cloud deployment mode. In this mode, the server waits for configuration before starting.

Examples
Output
When the server starts successfully, you’ll see:

Security Considerations
Configuration Precedence
The server loads configuration in the following order:

- PostgreSQL Store (if `PGSTORE_DSN` is set)
- Object Store (if `OBJECTSTORE_ENDPOINT` is set)
- Git Store (if `GITSTORE_GIT_URL` is set)
- Local file (specified via `--config`, or the default `config.yaml`)
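The precedence above can be sketched as a small shell function (illustrative logic, not the server's actual implementation; only the three environment variable names and the `config.yaml` default come from this page):

```shell
# Pick a token store backend using the documented precedence order
select_store() {
  if [ -n "$PGSTORE_DSN" ]; then
    echo "postgres"
  elif [ -n "$OBJECTSTORE_ENDPOINT" ]; then
    echo "objectstore"
  elif [ -n "$GITSTORE_GIT_URL" ]; then
    echo "git"
  else
    # Fall back to the local file, defaulting to config.yaml
    echo "file:${CONFIG_PATH:-config.yaml}"
  fi
}
```

With none of the variables set, the function falls through to the local file default; setting a higher-precedence variable overrides any lower one.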
Related Commands
- Login Commands - Authenticate with AI providers
- Memory Commands - Manage routing history and preferences
- Heartbeat Commands - Monitor provider health