AI Providers
Configure multiple AI providers — Anthropic, OpenAI, Google, Ollama, and AWS Bedrock
Foxl supports multiple AI providers. You can use Foxl credits through the cloud relay, or bring your own API keys for direct access.
Default: Foxl Relay
By default, Foxl routes AI requests through the cloud relay at relay.foxl.ai. This uses your Foxl credits and provides access to Claude models via AWS Bedrock.
No configuration needed — sign in and start chatting.
CLI Providers (Subscription SSO)
Use your existing subscriptions from Claude, Gemini, or OpenAI directly in Foxl — no API key needed. Each CLI runs locally as a subprocess and handles authentication through your subscription.
CLI providers are desktop-only and consume no Foxl credits.
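Because each CLI runs as a local subprocess, a provider is only usable when its binary is on your `PATH`. A minimal detection sketch (the probing logic here is illustrative, not Foxl's internals):

```python
import shutil

# Maps each CLI provider to the binary it needs on PATH.
CLI_BINARIES = {
    "Claude Code (SSO)": "claude",
    "Gemini CLI": "gemini",
    "Codex CLI": "codex",
}

def detect_cli_providers():
    """Return each provider mapped to its resolved binary path, or None if missing."""
    return {name: shutil.which(binary) for name, binary in CLI_BINARIES.items()}

print(detect_cli_providers())
```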
Claude Code (Claude Pro/Max)
Use your Claude Pro or Max subscription.
- Install: `npm install -g @anthropic-ai/claude-code`
- Authenticate: `claude auth login`
- Select Claude Code (SSO) in Settings > Providers
Models: Opus 4.6, Sonnet 4.6, Haiku 4.5. Credentials are auto-detected from the macOS Keychain or `~/.claude/.credentials.json`.
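The file-based fallback can be probed like this; a sketch assuming the credentials file is plain JSON, with the macOS Keychain lookup omitted:

```python
import json
from pathlib import Path

def find_claude_credentials(path=Path.home() / ".claude" / ".credentials.json"):
    """Return parsed credentials if the file exists and is valid JSON, else None.

    Assumes a JSON file format; the Keychain lookup performed first on macOS
    is not shown here.
    """
    if not path.is_file():
        return None
    try:
        return json.loads(path.read_text())
    except json.JSONDecodeError:
        return None
```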
Gemini CLI (Google AI)
Use your Google AI subscription or API key.
- Install: `npm install -g @google/gemini-cli`
- Authenticate: `gemini auth login` or set `GEMINI_API_KEY`
- Select Gemini CLI in Settings > Providers
Models: Gemini 2.5 Pro, Gemini 2.5 Flash. Credentials are read from `~/.gemini/settings.json` or the `GEMINI_API_KEY` environment variable.
Codex CLI (OpenAI)
Use your OpenAI subscription.
- Install: `npm install -g @openai/codex`
- Authenticate: `codex login` or set `OPENAI_API_KEY`
- Select Codex CLI in Settings > Providers
Models: o3, o4 Mini, GPT 4.1. Credentials are read from `~/.codex/auth.json`, the macOS Keychain, or the `OPENAI_API_KEY` environment variable.
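All three CLI providers resolve credentials from an ordered list of sources. Using Codex as the example, a sketch of that precedence, environment variable before auth file (the Keychain step is omitted, and the ordering here is illustrative):

```python
import os
from pathlib import Path

def resolve_codex_credentials(env=os.environ):
    """Return (source, value) for the first credential source found.

    Checks the OPENAI_API_KEY environment variable first, then the
    ~/.codex/auth.json file. Returns (None, None) if neither exists.
    """
    if env.get("OPENAI_API_KEY"):
        return ("env", env["OPENAI_API_KEY"])
    auth_file = Path.home() / ".codex" / "auth.json"
    if auth_file.is_file():
        return ("file", str(auth_file))
    return (None, None)
```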
Bring Your Own Key (BYOK)
Add your own API keys to use models directly without consuming Foxl credits. Works on both desktop and web (app.foxl.ai).
Anthropic
Direct access to Claude models via the Anthropic API.
- Go to Settings > Providers
- Select Anthropic
- Enter your API key from console.anthropic.com
- Select a model (Claude Opus 4.6, Sonnet 4.6, Haiku 4.5)
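With a key entered, requests go straight to Anthropic's public Messages API. An illustrative direct request built with the standard library and not sent (the model ID is a placeholder; substitute the one you selected):

```python
import json
import urllib.request

def build_anthropic_request(api_key, model, prompt):
    """Build (but do not send) a direct Anthropic Messages API request.

    Endpoint and headers follow the public Anthropic API; the model ID
    is whatever you chose in Settings > Providers.
    """
    body = {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=json.dumps(body).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        method="POST",
    )
```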
OpenAI
Access GPT-4o, GPT-4, and other OpenAI models.
- Go to Settings > Providers
- Select OpenAI
- Enter your API key from platform.openai.com
- Select a model
Google (Gemini)
Access Gemini models via Google AI.
- Go to Settings > Providers
- Select Google
- Enter your API key from aistudio.google.com
- Select a Gemini model
Ollama (Local Models)
Run AI models entirely on your machine with zero cost and full privacy.
- Install Ollama from ollama.com
- Pull a model: `ollama pull llama3` or `ollama pull mistral`
- In Foxl Settings > Providers, select Ollama
- Foxl auto-detects running Ollama models
Ollama models run entirely on your machine. No internet connection, no API calls, no credits consumed. Perfect for privacy-sensitive work.
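Auto-detection can be reproduced with Ollama's local REST API: `GET /api/tags` lists the models you've pulled. A sketch that returns an empty list when the server isn't running (this mirrors, but is not, Foxl's own detection code):

```python
import json
import urllib.request
from urllib.error import URLError

def list_ollama_models(base_url="http://localhost:11434"):
    """Return the names of locally pulled Ollama models via /api/tags.

    Returns [] if the Ollama server is unreachable or responds unexpectedly.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return []
```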
AWS Bedrock
For users with their own AWS account and Bedrock access.
- Configure AWS credentials (`~/.aws/credentials` or environment variables)
- In Settings > Providers, select AWS Bedrock
- Ensure the models you want are enabled in your AWS Bedrock console
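If you haven't set up AWS credentials before, a minimal `~/.aws/credentials` file looks like this (placeholder values, not real keys):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...
```

The same values can instead be supplied via the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables; the region lives in `~/.aws/config` or `AWS_REGION`.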
Provider Health Check
The Settings > Providers page shows the status of each configured provider:
- Connected: Provider is reachable and API key is valid
- Error: API key is invalid or provider is unreachable
- Not configured: No API key entered
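The three statuses follow directly from two checks: is a key present, and did the probe succeed. Encoded as a sketch (Foxl's actual checks are internal; this just captures the mapping the list above describes):

```python
def provider_status(api_key, reachable, key_valid):
    """Derive the Providers-page status from a key and a probe result."""
    if not api_key:
        return "Not configured"
    if reachable and key_valid:
        return "Connected"
    return "Error"
```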
Model Switching
You can switch between models and providers at any time using the model selector in the chat input area. Your selection persists across conversations.
Cost Comparison
| Provider | Cost | Privacy | Speed |
|---|---|---|---|
| Foxl Relay | Credits (included in plan) | Data routed through relay | Fast (AWS Bedrock) |
| Claude Code SSO | Free (subscription) | Local CLI to Claude | Fast |
| Gemini CLI | Free (subscription/API key) | Local CLI to Google | Fast |
| Codex CLI | Free (subscription) | Local CLI to OpenAI | Fast |
| Anthropic BYOK | Pay-per-use to Anthropic | Direct to Anthropic | Fast |
| OpenAI BYOK | Pay-per-use to OpenAI | Direct to OpenAI | Fast |
| Google BYOK | Pay-per-use to Google | Direct to Google | Fast |
| Ollama | Free | 100% local, no network | Depends on hardware |