Read through this guide to set up xAI's Grok API and get familiar with how it is
used across the different apps.
Set up Grok API
First, create an account or sign in at the xAI Console. Next, navigate to the "API Keys" section and click "Create new API key". Save your API key somewhere safe and do not share it with anyone. Once you have your API key, paste it into your .env file.
Available Models
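For example, assuming the key is stored under the `XAI_API_KEY` variable (the default name the AI SDK's xAI provider reads; check your codebase if it expects a different one):

```shell
# .env — keep this file out of version control
XAI_API_KEY=xai-your-api-key-here
```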
Grok models are defined in the unified model registry at `lib/ai/models.ts`.
| Model | ID | Features |
|---|---|---|
| Grok 4 | grok-4-latest | — |
| Grok 4.1 | grok-4-1 | — |
| Grok 4.1 Fast Reasoning | grok-4-1-fast-reasoning | Thinking/Reasoning |
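The table above maps onto entries in the registry. A hypothetical sketch of what an entry in `lib/ai/models.ts` might look like (the field names here are assumptions, not the actual schema):

```typescript
// Hypothetical registry-entry shape; the real schema in lib/ai/models.ts may differ.
interface ModelEntry {
  label: string;       // display name shown in the model picker
  id: string;          // provider model ID sent with API requests
  provider: 'xai';     // routing key consumed by the provider-routing helpers
  reasoning?: boolean; // true for models that expose chain-of-thought
}

const grokModels: ModelEntry[] = [
  { label: 'Grok 4', id: 'grok-4-latest', provider: 'xai' },
  { label: 'Grok 4.1', id: 'grok-4-1', provider: 'xai' },
  { label: 'Grok 4.1 Fast Reasoning', id: 'grok-4-1-fast-reasoning', provider: 'xai', reasoning: true },
];
```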
Grok 4.1 Fast Reasoning supports thinking/reasoning mode, allowing the model
to show its chain of thought before providing a final answer.
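When consuming a streamed response from a reasoning model, the chain-of-thought tokens typically arrive as stream parts separate from the answer text. A minimal sketch of splitting the two (the part shape below is a simplified assumption, not the AI SDK's actual stream-part type):

```typescript
// Simplified stream-part shape; the AI SDK's real part types are richer.
type StreamPart =
  | { type: 'reasoning'; text: string } // chain-of-thought tokens
  | { type: 'text'; text: string };     // final-answer tokens

// Accumulate reasoning and answer text separately so the UI can
// render the chain of thought apart from the final reply.
function splitReasoning(parts: StreamPart[]): { reasoning: string; answer: string } {
  let reasoning = '';
  let answer = '';
  for (const part of parts) {
    if (part.type === 'reasoning') reasoning += part.text;
    else answer += part.text;
  }
  return { reasoning, answer };
}
```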
Apps Using Grok
Grok is integrated through the Vercel AI SDK 6.0. Provider routing is handled by `lib/ai/ai-utils.ts` using `customModel()`, `getProviderFromModelId()`, and `getModelInstance()`.
Chat
Multi-provider chat app — Grok is available as an LLM provider
Structured Output
Generate structured JSON output using Grok models
Structure
The codebase uses Vercel AI SDK 6.0 with a unified model registry instead of direct xAI API calls. Models are registered in `lib/ai/models.ts` and routing is managed through `lib/ai/ai-utils.ts`.
The typical flow for an AI request:
- The user selects a model from the unified model registry
- The request is routed through `getModelInstance()` in `lib/ai/ai-utils.ts`
- The provider is determined via `getProviderFromModelId()`
- The model is instantiated with `customModel()`
- The response is streamed back to the user
- Results are stored in Supabase
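The routing steps above can be sketched roughly as follows. The function names come from the doc, but the bodies are illustrative guesses; the real implementations live in `lib/ai/ai-utils.ts`:

```typescript
// Hypothetical routing sketch; the actual logic in lib/ai/ai-utils.ts may differ.
type Provider = 'xai' | 'other';

// Map a model ID to its provider, e.g. grok-* models route to xAI.
function getProviderFromModelId(modelId: string): Provider {
  return modelId.startsWith('grok-') ? 'xai' : 'other';
}

// Resolve a model ID to something the caller can invoke.
function getModelInstance(modelId: string): { provider: Provider; id: string } {
  const provider = getProviderFromModelId(modelId);
  // The real code would wrap this with customModel() to produce an SDK model instance.
  return { provider, id: modelId };
}
```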