Anthropic’s Claude models are known for strong reasoning, long-context handling, and thoughtful responses. If you want to offer your users a premium chat experience alongside OpenAI, Anthropic is a great pick.

Get your API key

1. Create or sign in to your Anthropic account

   Head to the Anthropic Console and sign up or log in.

2. Generate an API key

   Go to the API keys page and click Create API key.

3. Add it to your env

   Paste the key into your .env.local file:

   ANTHROPIC_API_KEY=your_anthropic_api_key

   Save your API key right away; you won't be able to see it again after creation.
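Because the key is only read from the environment at runtime, a misspelled variable name fails silently until the first request. A small fail-fast check at startup can catch this early. The helper below is a hypothetical addition, not part of the codebase:

```typescript
// Hypothetical startup check (not in the codebase): throw immediately if a
// required environment variable is missing, instead of failing on first use.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage at app startup:
// const anthropicKey = requireEnv("ANTHROPIC_API_KEY");
```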

Available models

All models are defined in the unified model registry at lib/ai/models.ts. Every Anthropic model supports vision and internet access.
Model               ID                  Features
Claude Opus 4.5     claude-opus-4-5     Vision, Internet
Claude Sonnet 4.5   claude-sonnet-4-5   Vision, Internet, Thinking/Reasoning
Claude Haiku 4.5    claude-haiku-4-5    Vision, Internet
Claude Sonnet 4.5 supports thinking/reasoning mode: the model shows its chain of thought before giving you a final answer. This is useful for complex tasks where you want transparency into how the model reached its conclusion.
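The table above can be expressed as registry entries with per-model capability flags. The exact shape in lib/ai/models.ts may differ; the type and field names below are illustrative assumptions:

```typescript
// Illustrative sketch of registry entries for the Anthropic models above.
// The actual shape in lib/ai/models.ts may differ.
type AnthropicModel = {
  id: string;
  name: string;
  vision: boolean;
  internet: boolean;
  reasoning: boolean; // thinking/reasoning mode (Sonnet 4.5 only, per the table)
};

const anthropicModels: AnthropicModel[] = [
  { id: "claude-opus-4-5",   name: "Claude Opus 4.5",   vision: true, internet: true, reasoning: false },
  { id: "claude-sonnet-4-5", name: "Claude Sonnet 4.5", vision: true, internet: true, reasoning: true },
  { id: "claude-haiku-4-5",  name: "Claude Haiku 4.5",  vision: true, internet: true, reasoning: false },
];

// Look up a model by ID, e.g. when validating an incoming chat request.
function findModel(id: string): AnthropicModel | undefined {
  return anthropicModels.find((m) => m.id === id);
}
```

A lookup like this lets the app reject unknown model IDs before any provider call is made.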

Apps using Anthropic

Anthropic is integrated through Vercel AI SDK 6.0, with provider routing handled by lib/ai/ai-utils.ts.

Chat

Multi-provider chat app — Anthropic is available as an LLM provider

Marketing Plan

Generate structured marketing plans using Claude models

Launch Simulator

Generate Product Hunt launch simulations using Claude models

How it works

The codebase uses Vercel AI SDK 6.0 with a unified model registry, so no direct Anthropic API calls are needed. Here's the typical flow for an AI request:
  1. You select a model from the unified registry
  2. The request goes through getModelInstance() in lib/ai/ai-utils.ts
  3. The provider is determined via getProviderFromModelId()
  4. The model is instantiated with customModel()
  5. The response is streamed back to you
  6. Results are stored in PostgreSQL
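Steps 2–4 above hinge on mapping a model ID to its provider. A minimal sketch of how getProviderFromModelId() might work is below; the real implementation in lib/ai/ai-utils.ts may use different logic, and the prefix rules (and the gpt- example) are assumptions for illustration:

```typescript
// Illustrative sketch of provider routing by model-ID prefix. The actual
// getProviderFromModelId() in lib/ai/ai-utils.ts may differ.
type Provider = "anthropic" | "openai";

function getProviderFromModelId(modelId: string): Provider {
  if (modelId.startsWith("claude-")) return "anthropic";
  if (modelId.startsWith("gpt-")) return "openai";
  throw new Error(`Unknown model ID: ${modelId}`);
}
```

With routing resolved, the flow can instantiate the right provider client and stream the response back to the caller.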

Structure

Understand the full project structure of the codebase.