Groq is all about speed. Its custom LPU (Language Processing Unit) hardware runs Llama models at blazing-fast inference speeds, making it a great fit for features where your users don't want to wait around. If responsiveness matters to your product, Groq is a strong addition.

Get your API key

1. Create or sign into your Groq account

   Head to the Groq Console and sign up or log in.

2. Generate an API key

   Go to the API keys page and click Create API key.

3. Add it to your env

   Paste the key into your .env.local file:

   GROQ_API_KEY=your_groq_api_key

   Save your API key right away; you won't be able to see it again after creation.
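With the key in place, you can fail fast at startup if it ever goes missing. The helper below is a hypothetical sketch (not part of the codebase), assuming a Node.js environment where you would pass in process.env:

```typescript
// Hypothetical startup check (not in the codebase): throw early if the
// key that .env.local should provide is missing.
type Env = Record<string, string | undefined>;

function requireGroqKey(env: Env): string {
  const key = env.GROQ_API_KEY;
  if (!key) {
    throw new Error("GROQ_API_KEY is not set - add it to .env.local");
  }
  return key;
}

// In an app you would call requireGroqKey(process.env); shown here with
// an explicit env object for clarity.
console.log(requireGroqKey({ GROQ_API_KEY: "gsk_example" })); // prints gsk_example
```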

Available models

All models are defined in the unified model registry at lib/ai/models.ts. Both models support vision.

Model               ID                                              Features
Llama 4 Scout       meta-llama/llama-4-scout-17b-16e-instruct       Vision
Llama 4 Maverick    meta-llama/llama-4-maverick-17b-128e-instruct   Vision
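To make the registry idea concrete, here is an illustrative sketch of what entries like these might look like. The field names and lookup helper are assumptions for illustration, not the actual definitions in lib/ai/models.ts:

```typescript
// Hypothetical shape of a unified model registry (the real one lives in
// lib/ai/models.ts; field names here are assumptions).
interface ModelDefinition {
  label: string;
  id: string;
  provider: "groq" | "openai" | "replicate";
  vision: boolean;
}

const models: ModelDefinition[] = [
  {
    label: "Llama 4 Scout",
    id: "meta-llama/llama-4-scout-17b-16e-instruct",
    provider: "groq",
    vision: true,
  },
  {
    label: "Llama 4 Maverick",
    id: "meta-llama/llama-4-maverick-17b-128e-instruct",
    provider: "groq",
    vision: true,
  },
];

// Look up a model definition by its ID.
function getModelById(id: string): ModelDefinition | undefined {
  return models.find((m) => m.id === id);
}

console.log(getModelById("meta-llama/llama-4-scout-17b-16e-instruct")?.vision); // true
```

A flat array with a find-by-ID helper keeps one source of truth for model metadata, so UI pickers and routing code never disagree about which models exist.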
Groq’s speed advantage is most noticeable in chat and structured generation tasks. If you’re building something that needs fast turnaround, these models deliver.

Apps using Groq

Groq is integrated through Vercel AI SDK 6.0, with provider routing handled by lib/ai/ai-utils.ts.

Chat

Multi-provider chat app — Groq is available as an LLM provider

Marketing Plan

Generate structured marketing plans using Groq models

Launch Simulator

Generate Product Hunt launch simulations using Groq models
The shipped Audio app does not use Groq today. It uses Replicate Whisper for transcription and OpenAI for summaries.

How it works

The codebase uses Vercel AI SDK 6.0 with a unified model registry — no direct Groq API calls needed. Here’s the typical flow for an AI request:
  1. You select a model from the unified registry
  2. The request goes through getModelInstance() in lib/ai/ai-utils.ts
  3. The provider is determined via getProviderFromModelId()
  4. The model is instantiated with customModel()
  5. The response is streamed back to you
  6. Results are stored in PostgreSQL
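Steps 2-4 above can be sketched in TypeScript. The function names mirror those in lib/ai/ai-utils.ts, but the bodies below are assumptions for illustration (the routing rule in particular is a guess); streaming and storage are handled by the AI SDK and your database layer:

```typescript
// Illustrative sketch of steps 2-4. Function names mirror
// lib/ai/ai-utils.ts, but these implementations are assumptions.
type Provider = "groq" | "openai";

// Step 3: determine the provider from the model ID. Here we assume the
// meta-llama/* IDs from the registry route to Groq.
function getProviderFromModelId(modelId: string): Provider {
  return modelId.startsWith("meta-llama/") ? "groq" : "openai";
}

// Step 4: instantiate the model. In the real code, customModel() would
// wrap an AI SDK provider instance; here we return a plain descriptor.
function customModel(modelId: string): { provider: Provider; modelId: string } {
  return { provider: getProviderFromModelId(modelId), modelId };
}

const model = customModel("meta-llama/llama-4-scout-17b-16e-instruct");
console.log(model.provider); // "groq"
```

Centralizing this routing in one helper is what lets apps like Chat swap providers without touching their request-handling code.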

Structure

Understand the full project structure of the codebase.