Foxl Docs

AI Providers

Configure multiple AI providers — Anthropic, OpenAI, Google, Ollama, and AWS Bedrock

Foxl supports multiple AI providers. You can use Foxl credits through the cloud relay, or bring your own API keys for direct access.

Default: Foxl Relay

By default, Foxl routes AI requests through the cloud relay at relay.foxl.ai. This uses your Foxl credits and provides access to Claude models via AWS Bedrock.

No configuration needed — sign in and start chatting.

CLI Providers (Subscription SSO)

Use your existing Claude, Gemini, or OpenAI subscription directly in Foxl — no API key needed. Each CLI runs locally as a subprocess and handles authentication through your subscription.

CLI providers are desktop-only and consume no Foxl credits.

Claude Code (Claude Pro/Max)

Use your Claude Pro or Max subscription.

  1. Install: npm install -g @anthropic-ai/claude-code
  2. Authenticate: claude auth login
  3. Select Claude Code (SSO) in Settings > Providers

Models: Opus 4.6, Sonnet 4.6, Haiku 4.5. Credentials auto-detected from macOS Keychain or ~/.claude/.credentials.json.
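
If the provider doesn't show up after these steps, a quick terminal sanity check can confirm the install and cached credentials (the credentials path is the one listed above; on macOS the credentials may live in the Keychain instead of the file):

```shell
# Confirm the CLI is on PATH and report its version
claude --version

# Credentials are cached here after authenticating
ls -l ~/.claude/.credentials.json
```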

Gemini CLI (Google AI)

Use your Google AI subscription or API key.

  1. Install: npm install -g @google/gemini-cli
  2. Authenticate: gemini auth login or set GEMINI_API_KEY
  3. Select Gemini CLI in Settings > Providers

Models: Gemini 2.5 Pro, Gemini 2.5 Flash. Credentials from ~/.gemini/settings.json or environment variable.
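
Both auth paths from the steps above can be exercised from a terminal (a sketch; the key placeholder is yours to fill in):

```shell
# Option A: subscription login (interactive browser flow)
gemini auth login

# Option B: API-key auth via environment variable
export GEMINI_API_KEY="your-key-here"   # key from aistudio.google.com

# Confirm the CLI is installed and on PATH
gemini --version
```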

Codex CLI (OpenAI)

Use your OpenAI subscription.

  1. Install: npm install -g @openai/codex
  2. Authenticate: codex login or set OPENAI_API_KEY
  3. Select Codex CLI in Settings > Providers

Models: o3, o4 Mini, GPT 4.1. Credentials from ~/.codex/auth.json, macOS Keychain, or environment variable.
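
The two auth paths above look like this in a terminal (a sketch; the key placeholder is yours to fill in):

```shell
# Subscription login (opens a browser)
codex login

# Or API-key auth via environment variable
export OPENAI_API_KEY="your-key-here"

# After login, cached credentials should appear here
ls -l ~/.codex/auth.json
```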

Bring Your Own Key (BYOK)

Add your own API keys to use models directly without consuming Foxl credits. Works on both desktop and web (app.foxl.ai).

Anthropic

Direct access to Claude models via the Anthropic API.

  1. Go to Settings > Providers
  2. Select Anthropic
  3. Enter your API key from console.anthropic.com
  4. Select a model (Claude Opus 4.6, Sonnet 4.6, Haiku 4.5)
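
To confirm a key works before entering it, you can call the Anthropic models endpoint directly (one way to check; any authenticated request will do):

```shell
# Lists models available to this key; a 401 response means the key is invalid
curl -s https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01"
```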

OpenAI

Access GPT-4o, GPT-4, and other OpenAI models.

  1. Go to Settings > Providers
  2. Select OpenAI
  3. Enter your API key from platform.openai.com
  4. Select a model
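
A quick key check against the OpenAI API before entering it (one way to verify; any authenticated request will do):

```shell
# A 200 response with a model list confirms the key is valid
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```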

Google (Gemini)

Access Gemini models via Google AI.

  1. Go to Settings > Providers
  2. Select Google
  3. Enter your API key from aistudio.google.com
  4. Select a Gemini model
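
A quick key check against the Google AI API before entering it (one way to verify):

```shell
# Lists Gemini models visible to this key; an error body means the key is bad
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY"
```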

Ollama (Local Models)

Run AI models entirely on your machine with zero cost and full privacy.

  1. Install Ollama from ollama.com
  2. Pull a model: ollama pull llama3 or ollama pull mistral
  3. In Foxl Settings > Providers, select Ollama
  4. Foxl auto-detects running Ollama models

Ollama models run entirely on your machine. No internet connection, no API calls, no credits consumed. Perfect for privacy-sensitive work.
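
Auto-detection depends on the local Ollama server being up. You can verify that yourself from a terminal (Ollama's HTTP API listens on port 11434 by default):

```shell
# Pull a model and confirm the local server is serving it
ollama pull llama3
ollama list

# Query the local HTTP API for installed models
curl -s http://localhost:11434/api/tags
```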

AWS Bedrock

For users with their own AWS account and Bedrock access.

  1. Configure AWS credentials (~/.aws/credentials or environment variables)
  2. In Settings > Providers, select AWS Bedrock
  3. Ensure the models you want are enabled in your AWS Bedrock console
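
Steps 1 and 3 can be verified with the AWS CLI before switching providers (a sketch; substitute your own region):

```shell
# Confirm the configured credentials resolve to an identity
aws sts get-caller-identity

# List Bedrock foundation models enabled in this account/region
aws bedrock list-foundation-models --region us-east-1 \
  --query 'modelSummaries[].modelId'
```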

Provider Health Check

The Settings > Providers page shows the status of each configured provider:

  • Connected: Provider is reachable and API key is valid
  • Error: API key is invalid or provider is unreachable
  • Not configured: No API key entered

Model Switching

You can switch between models and providers at any time using the model selector in the chat input area. Your selection persists across conversations.

Cost Comparison

| Provider | Cost | Privacy | Speed |
| --- | --- | --- | --- |
| Foxl Relay | Credits (included in plan) | Data routed through relay | Fast (AWS Bedrock) |
| Claude Code SSO | Free (subscription) | Local CLI to Claude | Fast |
| Gemini CLI | Free (subscription/API key) | Local CLI to Google | Fast |
| Codex CLI | Free (subscription) | Local CLI to OpenAI | Fast |
| Anthropic BYOK | Pay-per-use to Anthropic | Direct to Anthropic | Fast |
| OpenAI BYOK | Pay-per-use to OpenAI | Direct to OpenAI | Fast |
| Google BYOK | Pay-per-use to Google | Direct to Google | Fast |
| Ollama | Free | 100% local, no network | Depends on hardware |
