xAI’s Grok models bring strong reasoning capabilities to the table, including a fast reasoning mode that shows its chain of thought. If you want to give your users access to cutting-edge models from xAI, this is how to set it up.

Get your API key

1. Create or sign in to your xAI account. Head to the xAI Console and sign up or log in.
2. Generate an API key. Navigate to API Keys and click Create new API key.
3. Add it to your env. Paste the key into your .env.local file:

XAI_API_KEY=your_xai_api_key
Save your API key somewhere safe right away — you won’t be able to retrieve it later.
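Because a missing key otherwise only surfaces when the first request fails, it can be worth validating it at startup. A minimal sketch, assuming a Node environment; the helper name is illustrative and not part of the codebase:

```typescript
// Hypothetical startup check: fail fast if XAI_API_KEY is missing,
// rather than erroring mid-request. Accepts an env object so it can
// be exercised without touching the real process.env.
function requireXaiApiKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.XAI_API_KEY;
  if (!key) {
    throw new Error("XAI_API_KEY is not set. Add it to .env.local");
  }
  return key;
}
```

Calling `requireXaiApiKey()` once during app initialization gives a clear error message instead of an opaque provider failure later.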

Available models

All models are defined in the unified model registry at lib/ai/models.ts.
| Model | ID | Features |
| --- | --- | --- |
| Grok 4 | grok-4-latest | |
| Grok 4.1 | grok-4-1 | |
| Grok 4.1 Fast Reasoning | grok-4-1-fast-reasoning | Thinking/Reasoning |
Grok 4.1 Fast Reasoning supports thinking/reasoning mode — the model shows its chain of thought before giving you the final answer. It’s great for complex analysis and problem-solving tasks.
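The entries above could be represented in the registry roughly as follows. This is a sketch only: the field names and types are assumptions for illustration, not the actual definitions in lib/ai/models.ts.

```typescript
// Illustrative shape of a registry entry (not the repo's real types).
type GrokModel = {
  id: string;
  name: string;
  reasoning: boolean; // true if the model exposes thinking/reasoning mode
};

const grokModels: GrokModel[] = [
  { id: "grok-4-latest", name: "Grok 4", reasoning: false },
  { id: "grok-4-1", name: "Grok 4.1", reasoning: false },
  { id: "grok-4-1-fast-reasoning", name: "Grok 4.1 Fast Reasoning", reasoning: true },
];

// Look up a registry entry by model ID.
function findGrokModel(id: string): GrokModel | undefined {
  return grokModels.find((m) => m.id === id);
}
```

A `reasoning` flag like this lets the UI decide whether to render a chain-of-thought panel for a given model.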

Apps using Grok

Grok is integrated through Vercel AI SDK 6.0, with provider routing handled by lib/ai/ai-utils.ts.

- Chat: multi-provider chat app; Grok is available as an LLM provider
- Marketing Plan: generate structured marketing plans using Grok models
- Launch Simulator: generate Product Hunt launch simulations using Grok models

How it works

The codebase uses Vercel AI SDK 6.0 with a unified model registry — no direct xAI API calls needed. Here’s the typical flow for an AI request:
  1. You select a model from the unified registry
  2. The request goes through getModelInstance() in lib/ai/ai-utils.ts
  3. The provider is determined via getProviderFromModelId()
  4. The model is instantiated with customModel()
  5. The response is streamed back to you
  6. Results are stored in PostgreSQL
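Step 3 can be sketched as a prefix match on the model ID. This is an assumption about how getProviderFromModelId behaves; the real lib/ai/ai-utils.ts may route differently, and the non-Grok mappings below are purely illustrative:

```typescript
type Provider = "xai" | "openai" | "anthropic";

// Hypothetical routing: infer the provider from the model ID prefix.
// All Grok models ("grok-*") route to xAI.
function getProviderFromModelId(modelId: string): Provider {
  if (modelId.startsWith("grok-")) return "xai";
  if (modelId.startsWith("gpt-")) return "openai"; // illustrative fallback
  return "anthropic"; // illustrative fallback
}
```

Keying routing off the ID prefix keeps the registry as the single source of truth: adding a new Grok model requires no routing changes.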

Structure

Understand the full project structure of the codebase.