## Get your API key

1. **Create or sign into your Groq account.** Head to the Groq Console and sign up or log in.
2. **Generate an API key.** Go to the API keys page and click **Create API key**.
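Once created, the key is usually supplied to the app through an environment variable; `GROQ_API_KEY` is the variable the Vercel AI SDK's Groq provider reads by default. A minimal sketch (the value below is a placeholder, not a real key):

```shell
# Make the key available to your app as an environment variable
# (or put the same line, without `export`, in a .env file your app loads).
# Replace the placeholder with the real key from the Groq Console.
export GROQ_API_KEY="gsk_placeholder"
```

Keep the key out of version control; load it from the environment rather than hard-coding it.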
## Available models

All models are defined in the unified model registry at `lib/ai/models.ts`. Both models support vision.

| Model | ID | Features |
|---|---|---|
| Llama 4 Scout | `meta-llama/llama-4-scout-17b-16e-instruct` | Vision |
| Llama 4 Maverick | `meta-llama/llama-4-maverick-17b-128e-instruct` | Vision |
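The registry itself is not reproduced here; the sketch below shows what a unified registry with these two entries commonly looks like. The `ModelDefinition` type and `getModelById` helper are illustrative names, not the actual exports of `lib/ai/models.ts`:

```typescript
// Illustrative sketch of a unified model registry.
// Type and function names are assumptions, not the repo's real exports.
interface ModelDefinition {
  label: string;      // human-readable name shown in the UI
  id: string;         // provider-qualified model ID
  features: string[]; // capabilities, e.g. "vision"
}

const models: ModelDefinition[] = [
  {
    label: "Llama 4 Scout",
    id: "meta-llama/llama-4-scout-17b-16e-instruct",
    features: ["vision"],
  },
  {
    label: "Llama 4 Maverick",
    id: "meta-llama/llama-4-maverick-17b-128e-instruct",
    features: ["vision"],
  },
];

// Look up a model definition by ID; returns undefined for unknown IDs.
function getModelById(id: string): ModelDefinition | undefined {
  return models.find((m) => m.id === id);
}
```

Centralizing model metadata like this lets the UI and the routing layer share a single source of truth for IDs and capabilities.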
## Apps using Groq

Groq is integrated through the Vercel AI SDK 6.0, with provider routing handled by `lib/ai/ai-utils.ts`.

- **Chat**: a multi-provider chat app; Groq is available as an LLM provider.
- **Marketing Plan**: generates structured marketing plans using Groq models.
- **Launch Simulator**: generates Product Hunt launch simulations using Groq models.

The shipped Audio app does not use Groq today; it uses Replicate Whisper for transcription and OpenAI for summaries.
## How it works

The codebase uses Vercel AI SDK 6.0 with a unified model registry, so no direct Groq API calls are needed. A typical AI request flows like this:

1. You select a model from the unified registry.
2. The request goes through `getModelInstance()` in `lib/ai/ai-utils.ts`.
3. The provider is determined via `getProviderFromModelId()`.
4. The model is instantiated with `customModel()`.
5. The response is streamed back to you.
6. Results are stored in PostgreSQL.
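The routing step above can be sketched as follows. The mapping rules are an assumption inferred from the model IDs in this page; the real `getProviderFromModelId()` in `lib/ai/ai-utils.ts` may use different logic:

```typescript
type Provider = "groq" | "openai" | "replicate";

// Hypothetical routing logic: infer the provider from the model ID prefix.
// The actual getProviderFromModelId() in lib/ai/ai-utils.ts may differ.
function getProviderFromModelId(modelId: string): Provider {
  // Llama 4 models in the registry are served through Groq.
  if (modelId.startsWith("meta-llama/")) return "groq";
  if (modelId.startsWith("gpt-")) return "openai";
  // Fall back to Replicate for everything else (e.g. Whisper models).
  return "replicate";
}
```

Given the provider name, a `getModelInstance()`-style helper would then call the matching AI SDK provider factory (for Groq, `createGroq` from `@ai-sdk/groq`) and hand the resulting model to the streaming call.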
## Structure

Understand the full project structure of the codebase.

