Read through this guide to set up OpenAI and get familiar with how it is used across the different apps.

Set up OpenAI

First, create an OpenAI account or sign in. Next, navigate to the API key page and click “Create new secret key”, optionally naming the key. Save the key somewhere safe and do not share it with anyone. Once you have your API key, paste it into your .env file:

```shell
OPENAI_API_KEY=your_openai_api_key
```
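With the key in .env, it can help to fail fast at startup when it is missing. This is a minimal sketch (the helper name `requireOpenAIKey` is hypothetical, not part of the codebase); only the `OPENAI_API_KEY` variable name comes from the setup above:

```typescript
// Minimal sketch: fail fast if the OpenAI key is not configured.
// Assumes the key is exposed via an env record such as process.env.
function requireOpenAIKey(env: Record<string, string | undefined>): string {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("OPENAI_API_KEY is not set - add it to your .env file");
  }
  return key;
}
```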

Available Models

OpenAI models are defined in the unified model registry at lib/ai/models.ts. Every model supports vision and internet access; o3 additionally supports reasoning.
| Model | ID | Features |
| --- | --- | --- |
| GPT-5 | `gpt-5` | Vision, Internet |
| GPT-5 Mini | `gpt-5-mini` | Vision, Internet |
| GPT-5 Nano | `gpt-5-nano` | Vision, Internet |
| GPT-4o | `gpt-4o` | Vision, Internet |
| o3 | `o3` | Vision, Internet, Reasoning |
Here is an example of how a model is defined in lib/ai/models.ts:
```typescript
{
  id: "gpt-5",
  name: "GPT-5",
  provider: "openai",
  features: {
    vision: true,
    internet: true,
  },
}
```
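The entry above fits a simple shape. The following is a hedged sketch of the type such entries might conform to; the actual type in lib/ai/models.ts may differ (the names `ModelDefinition` and `ModelFeatures` are assumptions):

```typescript
// Hypothetical shape for entries in the model registry;
// the real type in lib/ai/models.ts may include more fields.
type ModelFeatures = {
  vision: boolean;
  internet: boolean;
  reasoning?: boolean; // e.g. set for o3
};

type ModelDefinition = {
  id: string;         // API model identifier, e.g. "gpt-5"
  name: string;       // display name
  provider: "openai"; // the registry may also hold other providers
  features: ModelFeatures;
};

// The GPT-5 entry from the example above, typed against this sketch.
const gpt5: ModelDefinition = {
  id: "gpt-5",
  name: "GPT-5",
  provider: "openai",
  features: { vision: true, internet: true },
};
```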

Apps Using OpenAI

OpenAI is integrated through the Vercel AI SDK 6.0. Provider routing is handled by lib/ai/ai-utils.ts using customModel(), getProviderFromModelId(), and getModelInstance().
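A rough sketch of how the routing step might look. The function name `getProviderFromModelId` comes from the text above, but the body and the lookup table here are assumptions, not the actual implementation in lib/ai/ai-utils.ts:

```typescript
// Assumed mapping of model ids to providers; the real lookup in
// lib/ai/ai-utils.ts derives this from the registry in lib/ai/models.ts.
const providerByModelId: Record<string, string> = {
  "gpt-5": "openai",
  "gpt-5-mini": "openai",
  "gpt-5-nano": "openai",
  "gpt-4o": "openai",
  "o3": "openai",
};

function getProviderFromModelId(modelId: string): string {
  const provider = providerByModelId[modelId];
  if (!provider) throw new Error(`Unknown model: ${modelId}`);
  return provider;
}
```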

Structure

The codebase uses Vercel AI SDK 6.0 with a unified model registry instead of direct OpenAI API calls for chat-based interactions. Models are registered in lib/ai/models.ts and routing is managed through lib/ai/ai-utils.ts. The typical flow for an AI request:
  1. The user selects a model from the unified model registry
  2. The request is routed through getModelInstance() in lib/ai/ai-utils.ts
  3. The provider is determined via getProviderFromModelId()
  4. The model is instantiated with customModel()
  5. The response is streamed back to the user
  6. Results are stored in Supabase
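The steps above can be sketched end to end. Only `getModelInstance` and `getProviderFromModelId` come from the text; `handleChatRequest`, the response string, and the omitted Supabase write are placeholders, not the actual implementation:

```typescript
// End-to-end sketch of an AI request, following the numbered steps above.
type ModelInstance = { modelId: string; provider: string };

function getProviderFromModelId(modelId: string): string {
  // Assumption: every model id in this guide maps to OpenAI.
  return "openai";
}

function getModelInstance(modelId: string): ModelInstance {
  const provider = getProviderFromModelId(modelId); // step 3
  return { modelId, provider };                     // step 4 (stands in for customModel())
}

async function handleChatRequest(modelId: string, prompt: string): Promise<string> {
  const model = getModelInstance(modelId); // step 2
  // step 5: in the real flow the response is streamed; a plain string stands in here.
  const response = `[${model.provider}/${model.modelId}] reply to: ${prompt}`;
  // step 6: persisting results to Supabase is omitted in this sketch.
  return response;
}
```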
More information on the structure of the codebase can be found here:

Structure

Understand the project structure of the codebase