Read through this guide to set up Google Gemini & get familiar with how it is
used across the different apps.
## Set up Google Gemini
First, go to Google AI Studio and sign in with your Google account. Next, navigate to the API keys section and create a new API key. Make sure to save this somewhere safe and do not share it with anyone. Once you have your API key, paste it into your .env file:
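For example (assuming the app reads the variable name GOOGLE_GENERATIVE_AI_API_KEY, the default environment variable for the AI SDK's Google provider — check the project's .env.example for the exact name):

```
GOOGLE_GENERATIVE_AI_API_KEY=your-api-key-here
```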
## Available Models

Google Gemini models are defined in the unified model registry at lib/ai/models.ts.
| Model | ID | Features |
|---|---|---|
| Gemini 3 Pro | gemini-3-pro-preview | Vision, Search Grounding |
| Gemini 3 Pro Image | gemini-3-pro-image-preview | Vision, Search Grounding, Image Generation |
| Gemini 3.1 Flash Lite | gemini-3.1-flash-lite-preview | Vision, Search Grounding |
| Gemini 2.5 Flash | gemini-2.5-flash | Vision, Search Grounding, Thinking/Reasoning |
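The registry entries can be imagined along these lines. This is a sketch only — the field names (id, label, features) are assumptions for illustration, not the actual types in lib/ai/models.ts:

```typescript
// Hypothetical sketch of a unified model registry entry.
// Field names are assumptions, not the repo's actual types.
type ModelFeature = 'vision' | 'search-grounding' | 'thinking' | 'image-generation';

interface ModelDefinition {
  id: string;              // provider model ID, e.g. "gemini-2.5-flash"
  label: string;           // human-readable name shown in the UI
  features: ModelFeature[];
}

const geminiModels: ModelDefinition[] = [
  { id: 'gemini-3-pro-preview', label: 'Gemini 3 Pro', features: ['vision', 'search-grounding'] },
  { id: 'gemini-3-pro-image-preview', label: 'Gemini 3 Pro Image', features: ['vision', 'search-grounding', 'image-generation'] },
  { id: 'gemini-3.1-flash-lite-preview', label: 'Gemini 3.1 Flash Lite', features: ['vision', 'search-grounding'] },
  { id: 'gemini-2.5-flash', label: 'Gemini 2.5 Flash', features: ['vision', 'search-grounding', 'thinking'] },
];

// Look up a model by ID and check whether it supports a capability.
function supportsFeature(id: string, feature: ModelFeature): boolean {
  return geminiModels.find((m) => m.id === id)?.features.includes(feature) ?? false;
}
```

A registry like this lets the UI enable or disable features (e.g. an image-generation button) per selected model without hard-coding provider logic.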
Gemini models support search grounding, which enables native web search
capabilities. This allows the model to access up-to-date information directly
from the web when answering questions.
Gemini 2.5 Flash supports thinking capability, allowing the model to show
its chain of thought before providing a final answer.
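As a sketch, enabling thinking typically means passing Google-specific options through the AI SDK's providerOptions field on a streamText()/generateText() call. The thinkingConfig shape below follows the AI SDK's Google provider documentation, but treat the exact field names as an assumption to verify against lib/ai/ai-utils.ts:

```typescript
// Hypothetical sketch: Google-specific options that would be spread into a
// streamText()/generateText() call via providerOptions. Field names follow
// the AI SDK Google provider docs; this is not code from this repo.
// (Search grounding is enabled separately, via the provider's Google Search tool.)
const geminiCallOptions = {
  providerOptions: {
    google: {
      // Ask Gemini 2.5 Flash to include its reasoning ("thinking") in the stream.
      thinkingConfig: {
        includeThoughts: true,
      },
    },
  },
};
```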
## Apps Using Google Gemini

Google Gemini is integrated through the Vercel AI SDK 6.0. Provider routing is handled by lib/ai/ai-utils.ts using customModel(), getProviderFromModelId(), and getModelInstance().
### Chat
Multi-provider chat app where Gemini is available as an LLM provider with search grounding support.

### Image Studio
Generate images using the Gemini 3 Pro Image model.

### Structured Output
Generate structured JSON output using Gemini models.
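Conceptually, structured output means constraining the model to emit JSON that matches a schema, then validating the result. The app most likely relies on the AI SDK's built-in structured-output support for this, so the dependency-free sketch below (with an invented Recipe shape) is illustration only:

```typescript
// Hypothetical sketch: validate a model's JSON text against an expected shape.
// The Recipe interface is invented for illustration; in the real app, schema
// validation is handled by the AI SDK's structured-output support.
interface Recipe {
  name: string;
  ingredients: string[];
}

function parseRecipe(modelOutput: string): Recipe {
  const data = JSON.parse(modelOutput);
  if (typeof data.name !== 'string' || !Array.isArray(data.ingredients)) {
    throw new Error('Model output did not match the Recipe schema');
  }
  return data as Recipe;
}
```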
## Structure

The codebase uses Vercel AI SDK 6.0 with a unified model registry. Models are registered in lib/ai/models.ts and routing is managed through lib/ai/ai-utils.ts.
The typical flow for an AI request:
- The user selects a model from the unified model registry
- The request is routed through getModelInstance() in lib/ai/ai-utils.ts
- The provider is determined via getProviderFromModelId()
- The model is instantiated with customModel()
- The response is streamed back to the user
- Results are stored in Supabase
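The routing steps above can be sketched as pure functions. These are simplified guesses at what lib/ai/ai-utils.ts does — in the real code, customModel() returns an AI SDK model instance rather than a plain descriptor:

```typescript
// Hypothetical sketch of provider routing. Prefix-based matching and the
// ModelDescriptor type are assumptions for illustration.
type Provider = 'google' | 'openai' | 'anthropic';

function getProviderFromModelId(modelId: string): Provider {
  if (modelId.startsWith('gemini-')) return 'google';
  if (modelId.startsWith('gpt-')) return 'openai';
  return 'anthropic';
}

interface ModelDescriptor {
  provider: Provider;
  modelId: string;
}

function customModel(modelId: string): ModelDescriptor {
  return { provider: getProviderFromModelId(modelId), modelId };
}

function getModelInstance(modelId: string): ModelDescriptor {
  // In the app, this would construct the provider-specific AI SDK model
  // instance; here it just returns the descriptor.
  return customModel(modelId);
}
```

Centralizing routing this way means each app (Chat, Image Studio, Structured Output) only needs a model ID; the provider wiring stays in one place.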