Read through this guide to set up OpenAI and get familiar with how it is used across the different apps.
Set up OpenAI
First, create an OpenAI account or sign in. Next, navigate to the API key page and choose “Create new secret key”, optionally naming the key. Save the key somewhere safe and do not share it with anyone. Once you have your API key, paste it into your `.env` file as `OPENAI_API_KEY` (the environment variable the SDK's OpenAI provider reads by default).
Available Models
OpenAI models are defined in the unified model registry at `lib/ai/models.ts`. All models support vision and internet access; o3 additionally supports reasoning.
| Model | ID | Features |
|---|---|---|
| GPT-5 | gpt-5 | Vision, Internet |
| GPT-5 Mini | gpt-5-mini | Vision, Internet |
| GPT-5 Nano | gpt-5-nano | Vision, Internet |
| GPT-4o | gpt-4o | Vision, Internet |
| o3 | o3 | Vision, Internet, Reasoning |
lib/ai/models.ts:
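The registry file itself is not reproduced here, so the following is a minimal sketch of what it might contain, derived from the table above. The interface name `ModelDefinition`, the helper `getModelById()`, and the exact field names are assumptions, not the real file's contents.

```typescript
// Hypothetical sketch of lib/ai/models.ts — shape and names assumed
// from the model table above; the real registry may differ.
export interface ModelDefinition {
  id: string;          // provider model ID, e.g. "gpt-5"
  label: string;       // display name shown in the UI
  provider: "openai";  // this excerpt covers only the OpenAI entries
  features: Array<"vision" | "internet" | "reasoning">;
}

export const models: ModelDefinition[] = [
  { id: "gpt-5",      label: "GPT-5",      provider: "openai", features: ["vision", "internet"] },
  { id: "gpt-5-mini", label: "GPT-5 Mini", provider: "openai", features: ["vision", "internet"] },
  { id: "gpt-5-nano", label: "GPT-5 Nano", provider: "openai", features: ["vision", "internet"] },
  { id: "gpt-4o",     label: "GPT-4o",     provider: "openai", features: ["vision", "internet"] },
  { id: "o3",         label: "o3",         provider: "openai", features: ["vision", "internet", "reasoning"] },
];

// Convenience lookup by model ID (hypothetical helper).
export function getModelById(id: string): ModelDefinition | undefined {
  return models.find((m) => m.id === id);
}
```

Keeping every model in one typed array like this lets the rest of the codebase iterate over models for UI pickers and check feature flags without provider-specific branches.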
Apps Using OpenAI
OpenAI is integrated through the Vercel AI SDK 6.0. Provider routing is handled by `lib/ai/ai-utils.ts` using `customModel()`, `getProviderFromModelId()`, and `getModelInstance()`.
Chat
Multi-provider chat app — OpenAI is available as a primary LLM provider
Image Studio
Generate images using the GPT-Image model
Vision
Analyze and describe images using OpenAI vision models
Audio
Record audio and summarize transcriptions using GPT-5 Mini
Structured Output
Generate structured JSON output using OpenAI models
Structure
The codebase uses Vercel AI SDK 6.0 with a unified model registry instead of direct OpenAI API calls for chat-based interactions. Models are registered in `lib/ai/models.ts` and routing is managed through `lib/ai/ai-utils.ts`.
The typical flow for an AI request:
- The user selects a model from the unified model registry
- The request is routed through `getModelInstance()` in `lib/ai/ai-utils.ts`
- The provider is determined via `getProviderFromModelId()`
- The model is instantiated with `customModel()`
- The response is streamed back to the user
- Results are stored in Supabase
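The routing steps above can be sketched as follows. This is a simplified stand-in, not the real `lib/ai/ai-utils.ts`: the function signatures, the prefix-based provider detection, and the returned handle shape are all assumptions made for illustration.

```typescript
// Hedged sketch of the routing helpers described above — signatures
// and logic are assumed, not copied from lib/ai/ai-utils.ts.
type Provider = "openai" | "other";

// Determine the provider from the model ID (prefix convention assumed).
function getProviderFromModelId(modelId: string): Provider {
  if (modelId.startsWith("gpt-") || modelId.startsWith("o3")) return "openai";
  return "other";
}

// Stand-in for customModel(): wrap provider + ID into a model handle.
// The real implementation would return an AI SDK model instance.
function customModel(provider: Provider, modelId: string) {
  return { provider, modelId };
}

// Single entry point a request handler would call.
function getModelInstance(modelId: string) {
  const provider = getProviderFromModelId(modelId);
  return customModel(provider, modelId);
}
```

Centralizing routing in one `getModelInstance()` entry point means app code (Chat, Vision, Audio, and so on) never needs to know which provider backs a given model ID.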

