# Bring Your Own Key
Use your own API key for unlimited AI usage with any supported provider.
## Supported Providers
| Provider | Models | How to Get Key |
|---|---|---|
| OpenAI | GPT-4o, GPT-4.1, GPT-4.1 Mini | platform.openai.com |
| Anthropic | Claude Opus 4, Sonnet 4, Haiku | console.anthropic.com |
| Google | Gemini 2.5 Pro, Flash | aistudio.google.com |
| Ollama | Llama, Mistral, CodeLlama (local) | ollama.com |
| OpenRouter | 200+ models | openrouter.ai |
## Setup
- Open the AI chat panel
- Click the model dropdown
- Select your provider
- Enter your API key
- Choose a model
- Start chatting
Your API key is stored locally and never sent to CfxKit servers.
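Before pasting a key in, you can sanity-check it from a terminal. This is only a sketch: it assumes an OpenAI key using OpenAI's standard Bearer scheme (Anthropic, for example, uses an `x-api-key` header instead), and `sk-example` is a placeholder, not a real key:

```shell
# Placeholder key -- substitute your real one (and never commit it to source control).
OPENAI_API_KEY="sk-example"
AUTH_HEADER="Authorization: Bearer $OPENAI_API_KEY"
# Hitting the models endpoint is a cheap validity check (needs network access):
# curl -s https://api.openai.com/v1/models -H "$AUTH_HEADER"
echo "$AUTH_HEADER"
```

A `200` response with a model list means the key is live; a `401` means it was mistyped or revoked.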
## Ollama (Local AI)
Run AI models locally on your machine — completely free and private.
- Install Ollama
- Pull a model: `ollama pull llama3.2`
- In CfxKit, select Ollama as provider
- The default URL is `http://localhost:11434`
- Select your model and chat
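The steps above can be sketched end to end. The `ollama pull` command and the `/api/tags` endpoint are Ollama's own; the lines that need an installed, running server are left commented:

```shell
# Default address of a local Ollama server (the URL CfxKit expects out of the box).
OLLAMA_URL="http://localhost:11434"
# Pull a model, then confirm the server sees it (requires ollama installed and running):
# ollama pull llama3.2
# curl -s "$OLLAMA_URL/api/tags"
echo "$OLLAMA_URL/api/tags"
```

If `curl` returns a JSON list containing `llama3.2`, CfxKit will be able to use it.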
> **TIP:** Local models don't have access to the CfxKit knowledge base (97K+ chunks). For FiveM/RedM-specific answers, use CfxKit AI 1.0 or a cloud provider with a larger model.
## OpenRouter
Access 200+ models through a single API key.
- Get a key from openrouter.ai
- Select OpenRouter in CfxKit
- Enter your key
- Browse and select from available models
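As a sketch of why one key is enough: OpenRouter exposes an OpenAI-compatible API, and every model is addressed by a single `vendor/model` ID. The key below is a placeholder, and the `curl` line needs network access:

```shell
OPENROUTER_KEY="sk-or-example"   # placeholder, not a real key
MODEL="openai/gpt-4o"            # OpenRouter model IDs use the vendor/model form
# Browse the full catalog (the models endpoint is public):
# curl -s https://openrouter.ai/api/v1/models
echo "$MODEL"
```

Whatever ID you pick from the catalog is what you select in CfxKit's model dropdown.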