
Model Selection

Choose and configure models for your Built-in Agent.

The Built-in Agent uses the Vercel AI SDK under the hood, giving you access to models from OpenAI, Anthropic, and Google, plus the ability to use any custom AI SDK model.

Supported Models

Specify a model using the "provider:model" format (the "provider/model" separator also works).

OpenAI

Model | Specifier
--- | ---
GPT-5 | openai:gpt-5
GPT-5 Mini | openai:gpt-5-mini
GPT-5 Nano | openai:gpt-5-nano
GPT-4.1 | openai:gpt-4.1
GPT-4.1 Mini | openai:gpt-4.1-mini
GPT-4.1 Nano | openai:gpt-4.1-nano
o3 | openai:o3
o3-mini | openai:o3-mini
o4-mini | openai:o4-mini
const agent = new BuiltInAgent({
  model: "openai:gpt-4.1",
});

Anthropic

Model | Specifier
--- | ---
Claude Sonnet 4.5 | anthropic:claude-sonnet-4-5
Claude Sonnet 4 | anthropic:claude-sonnet-4
Claude 3.7 Sonnet | anthropic:claude-3-7-sonnet
Claude Opus 4.1 | anthropic:claude-opus-4-1
Claude Opus 4 | anthropic:claude-opus-4
Claude 3.5 Haiku | anthropic:claude-3-5-haiku
const agent = new BuiltInAgent({
  model: "anthropic:claude-sonnet-4-5",
});

Google

Model | Specifier
--- | ---
Gemini 2.5 Pro | google:gemini-2.5-pro
Gemini 2.5 Flash | google:gemini-2.5-flash
Gemini 2.5 Flash Lite | google:gemini-2.5-flash-lite
const agent = new BuiltInAgent({
  model: "google:gemini-2.5-pro",
});

Environment Variables

Set the API key for your chosen provider:

# OpenAI
OPENAI_API_KEY=sk-...

# Anthropic
ANTHROPIC_API_KEY=sk-ant-...

# Google
GOOGLE_API_KEY=...

Alternatively, pass the API key directly in your configuration:

const agent = new BuiltInAgent({
  model: "openai:gpt-4.1",
  apiKey: process.env.MY_OPENAI_KEY, // [!code highlight]
});
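
The precedence implied above (an explicit apiKey wins; otherwise the provider's environment variable is used) can be pictured as a small helper. This is a hypothetical sketch for illustration, not CopilotKit's actual internals; the resolveApiKey name and ENV_VARS mapping are assumptions:

```typescript
// Hypothetical sketch: map each provider to its environment variable,
// preferring an explicitly passed key when one is given.
const ENV_VARS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GOOGLE_API_KEY",
};

function resolveApiKey(provider: string, explicit?: string): string | undefined {
  if (explicit) return explicit; // an explicit apiKey always wins
  const envVar = ENV_VARS[provider];
  return envVar ? process.env[envVar] : undefined;
}
```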

Custom Models (AI SDK)

For models not in the built-in list, you can pass any Vercel AI SDK LanguageModel instance directly:

import { BuiltInAgent } from "@copilotkit/runtime/v2";
import { createOpenAI } from "@ai-sdk/openai"; // [!code highlight]

const customProvider = createOpenAI({ // [!code highlight]
  apiKey: process.env.MY_API_KEY, // [!code highlight]
  baseURL: "https://my-proxy.example.com/v1", // [!code highlight]
}); // [!code highlight]

const agent = new BuiltInAgent({
  model: customProvider("my-fine-tuned-model"), // [!code highlight]
});

This works with any AI SDK provider, including Azure OpenAI, AWS Bedrock, Ollama, and any OpenAI-compatible endpoint:

import { createAzure } from "@ai-sdk/azure";

const azure = createAzure({
  resourceName: "my-resource",
  apiKey: process.env.AZURE_API_KEY,
});

const agent = new BuiltInAgent({
  model: azure("my-deployment"),
});
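
For local models, Ollama exposes an OpenAI-compatible API (at http://localhost:11434/v1 by default), so the same createOpenAI helper shown above can target it. The model name below is illustrative; use any model you have pulled locally:

```typescript
import { BuiltInAgent } from "@copilotkit/runtime/v2";
import { createOpenAI } from "@ai-sdk/openai";

// Point the OpenAI provider at Ollama's OpenAI-compatible endpoint
const ollama = createOpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // Ollama ignores the key, but the SDK requires a value
});

const agent = new BuiltInAgent({
  model: ollama("llama3.1"), // any model fetched with `ollama pull`
});
```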

How It Works

Under the hood, the Built-in Agent resolves model strings to AI SDK provider instances:

  • "openai:gpt-4.1" → @ai-sdk/openai → openai("gpt-4.1")
  • "anthropic:claude-sonnet-4-5" → @ai-sdk/anthropic → anthropic("claude-sonnet-4-5")
  • "google:gemini-2.5-pro" → @ai-sdk/google → google("gemini-2.5-pro")

Both "provider:model" and "provider/model" separators are supported and work identically.
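
That resolution step can be sketched as a small parser that splits a specifier on the first ":" or "/" into a provider and a model id. The parseModelSpecifier helper below is a hypothetical illustration, not CopilotKit's actual implementation:

```typescript
// Hypothetical sketch: split "provider:model" or "provider/model" into parts.
function parseModelSpecifier(spec: string): { provider: string; model: string } {
  const match = spec.match(/^([^:/]+)[:/](.+)$/);
  if (!match) {
    throw new Error(`Invalid model specifier: "${spec}"`);
  }
  return { provider: match[1], model: match[2] };
}
```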

Need a different AI SDK or full control?

The Custom Agent lets you bring your own AI SDK, TanStack AI, or any custom LLM backend, while CopilotKit handles the rest.
