OpenAI is the Swiss Army knife of this codebase — it powers chat, image generation, document embeddings, audio summaries, and several structured-output apps. If you're setting up just one provider to start, this is the smoothest default.

Get your API key

1. Create or sign into your OpenAI account
   Head to platform.openai.com and sign up or log in.

2. Generate a secret key
   Go to the API keys page and click Create new secret key. Give it a name you'll recognize.

3. Add it to your env
   Paste the key in your .env.local file:

   OPENAI_API_KEY=your_openai_api_key
Save your API key somewhere safe right away — OpenAI only shows it once. If you lose it, you’ll need to create a new one.
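A small startup check can catch a missing key before the first request fails with a confusing provider error. A minimal sketch (the function name is illustrative, not part of this repo):

```typescript
// Sketch: fail fast if the OpenAI key is missing from the environment.
export function validateOpenAIKey(env: Record<string, string | undefined>): string {
  const key = env.OPENAI_API_KEY;
  if (!key || key.trim() === "") {
    throw new Error("OPENAI_API_KEY is not set — add it to .env.local");
  }
  return key;
}

// Typical usage at server startup:
// const apiKey = validateOpenAIKey(process.env);
```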

Available models

All models are defined in the unified model registry at lib/ai/models.ts. Every OpenAI model supports vision and internet access.
| Model | ID | Features |
| --- | --- | --- |
| GPT-5 | gpt-5 | Vision, Internet |
| GPT-5 Mini | gpt-5-mini | Vision, Internet |
| GPT-5 Nano | gpt-5-nano | Vision, Internet |
| GPT-4o | gpt-4o | Vision, Internet |
| o3 | o3 | Vision, Internet, Reasoning |
Here’s the shape used in the shared registry:
{
  "gpt-5": {
    name: "GPT-5",
    provider: "openai",
    vision: true,
    hasInternet: true,
  }
}
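The single-entry shape above generalizes to a typed map. The following is a hedged sketch of what such a registry could look like — the type names and the helper are illustrative assumptions, not the actual exports of lib/ai/models.ts:

```typescript
// Illustrative types only — the real definitions live in lib/ai/models.ts.
type Provider = "openai" | "anthropic" | "google"; // other providers assumed

interface ModelInfo {
  name: string;
  provider: Provider;
  vision: boolean;
  hasInternet: boolean;
  reasoning?: boolean; // e.g. o3 adds reasoning support
}

const models: Record<string, ModelInfo> = {
  "gpt-5": { name: "GPT-5", provider: "openai", vision: true, hasInternet: true },
  "o3": { name: "o3", provider: "openai", vision: true, hasInternet: true, reasoning: true },
};

// With this shape, listing a provider's model IDs is a simple filter:
export const openaiModelIds = Object.keys(models).filter(
  (id) => models[id].provider === "openai"
);
```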

Apps using OpenAI

OpenAI shows up through two paths in this repo: shared text/vision models go through the Vercel AI SDK and lib/ai/ai-utils.ts, while Image Studio uses the direct OpenAI image API.

Chat

Multi-provider chat app — OpenAI is a primary LLM provider

Image Studio

Generate images using the GPT-Image model

Audio

Record audio and summarize transcriptions using GPT-5 Mini

Vector RAG

OpenAI embeddings power the PDF/document workflow

Marketing Plan

Generate structured marketing plans

Launch Simulator

Generate Product Hunt launch simulations

How it works

The codebase uses Vercel AI SDK 6.0 with a unified model registry — no direct OpenAI API calls for chat-based interactions. Here’s the typical flow for an AI request:
  1. You select a model from the unified registry
  2. The request goes through getModelInstance() in lib/ai/ai-utils.ts
  3. The provider is determined via getProviderFromModelId()
  4. The model is instantiated with customModel()
  5. The response is streamed back to you
  6. Results are stored in PostgreSQL
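The lookup in steps 2–3 can be sketched as a pure function over the registry. The function name matches the one mentioned above, but the body here is an assumption about how lib/ai/ai-utils.ts might implement it, not the actual source:

```typescript
// Sketch of provider resolution — the real logic lives in lib/ai/ai-utils.ts.
// The registry entries here are a small illustrative subset.
const registry: Record<string, { provider: string }> = {
  "gpt-5": { provider: "openai" },
  "gpt-5-mini": { provider: "openai" },
  "o3": { provider: "openai" },
};

export function getProviderFromModelId(modelId: string): string {
  const entry = registry[modelId];
  if (!entry) throw new Error(`Unknown model id: ${modelId}`);
  return entry.provider;
}
```

Keeping resolution as a pure registry lookup means adding a model is a one-line registry change with no branching logic to update.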

Structure

Understand the full project structure of the codebase.