
Bring Your Own Key

Use your own API key for unlimited AI usage with any supported provider.

Supported Providers

| Provider   | Models                             | How to Get Key         |
|------------|------------------------------------|------------------------|
| OpenAI     | GPT-4o, GPT-4.1, GPT-4.1 Mini      | platform.openai.com    |
| Anthropic  | Claude Opus 4, Sonnet 4, Haiku     | console.anthropic.com  |
| Google     | Gemini 2.5 Pro, Flash              | aistudio.google.com    |
| Ollama     | Llama, Mistral, CodeLlama (local)  | ollama.com             |
| OpenRouter | 200+ models                        | openrouter.ai          |

Setup

  1. Open the AI chat panel
  2. Click the model dropdown
  3. Select your provider
  4. Enter your API key
  5. Choose a model
  6. Start chatting
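Under the hood, a bring-your-own-key request is just a normal call to the provider's API, authenticated with your key. A minimal sketch of the OpenAI-style chat-completions shape (the endpoint and payload fields below belong to the provider's public API, not to CfxKit internals):

```python
import json

def build_chat_request(api_key: str, model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat completion request.

    Your key travels in the Authorization header of each request;
    no other service ever needs to see it.
    """
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("sk-...", "gpt-4o", "Hello")
```

Swapping providers only changes the URL and the key; the message shape stays essentially the same.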

Your API key is stored locally and never sent to CfxKit servers.

Ollama (Local AI)

Run AI models locally on your machine — completely free and private.

  1. Install Ollama
  2. Pull a model: ollama pull llama3.2
  3. In CfxKit, select Ollama as provider
  4. The default URL is http://localhost:11434
  5. Select your model and chat
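With the default URL from step 4, chat requests go to Ollama's local HTTP API. A small sketch of what a call to its /api/chat endpoint looks like (request shape per Ollama's REST API; everything stays on localhost):

```python
from urllib.parse import urljoin

OLLAMA_URL = "http://localhost:11434"  # default from step 4

def ollama_chat_request(model: str, prompt: str) -> dict:
    """Build a request for Ollama's /api/chat endpoint.

    The model name matches what you pulled, e.g. "llama3.2".
    "stream": False asks for one complete response instead of chunks.
    """
    return {
        "url": urljoin(OLLAMA_URL, "/api/chat"),
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
    }

req = ollama_chat_request("llama3.2", "Hello")
```

If the server isn't reachable at that URL, check that the Ollama app (or `ollama serve`) is running.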

TIP

Local models don't have access to the CfxKit knowledge base (97K+ chunks). For FiveM/RedM-specific answers, use CfxKit AI 1.0 or a cloud provider with a larger model.

OpenRouter

Access 200+ models through a single API key.

  1. Get a key from openrouter.ai
  2. Select OpenRouter in CfxKit
  3. Enter your key
  4. Browse and select from available models
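The point of OpenRouter is that one key and one endpoint serve every model; only the model ID changes, namespaced by provider. A sketch (the model IDs below are illustrative examples of the slug format, not a guaranteed catalog):

```python
def openrouter_body(model: str, prompt: str) -> dict:
    """Build the JSON body for OpenRouter's OpenAI-compatible
    chat completions endpoint (https://openrouter.ai/api/v1)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same key, same endpoint; switching models is just a string change.
bodies = [
    openrouter_body(m, "Hello")
    for m in ("openai/gpt-4o", "anthropic/claude-sonnet-4", "google/gemini-2.5-pro")
]
```

Model availability and exact slugs are listed in OpenRouter's model browser.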

Built by Boltise