Build a powerful AI assistant with multiple LLMs, generative UI, web browsing, and image analysis
We’ll use multiple AI models, the Vercel AI SDK, Supabase, and the Serper API to create an enterprise-grade AI assistant with generative UI, web browsing, and image analysis capabilities.
Enterprise-grade AI assistant with multiple models
To build your advanced AI assistant, you’ll need to have several services set up. If you haven’t done so yet, please start with these:
Set up user authentication & PostgreSQL database using Supabase
Set up file storage using Cloudflare R2 for image uploads
Required for GPT models. Get API access from OpenAI.
To use all available models, you’ll need to set up additional providers:
Required for Llama models. Sign up and get an API key.
Required for Grok models. Get API access from xAI.
Required for Claude models. Get API access from Anthropic.
Web scraping is handled by Jina AI, which is free and doesn’t require an API key.
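Jina Reader works by prefixing the target URL with `https://r.jina.ai/`, which returns an LLM-friendly markdown rendition of the page. A minimal sketch (the helper names here are illustrative, not from the codebase):

```typescript
// Jina Reader: prefix any public URL with https://r.jina.ai/ to get the page
// back as markdown. Free tier, no API key required.
// (Helper names are illustrative, not taken from the codebase.)
function jinaReaderUrl(target: string): string {
  return `https://r.jina.ai/${target}`;
}

async function readPage(target: string): Promise<string> {
  const res = await fetch(jinaReaderUrl(target));
  if (!res.ok) throw new Error(`Jina Reader request failed: ${res.status}`);
  return res.text(); // markdown-formatted page content
}
```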
The chat feature uses tables that should already exist if you followed the Quick Setup guide and ran the Supabase migrations.
The required tables are:
- `chats`: Stores chat sessions and metadata
- `messages`: Manages message history and content
- `images`: Handles image uploads and analysis results
- `chat_documents`: Stores generated documents and versions

You can find the relevant SQL code, along with the necessary functions and RLS policies, under `supabase/migrations/20240000000003_chat.sql`.
All tables include Row Level Security (RLS) policies to ensure users can only access their own data.
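A typical RLS policy follows the pattern below. This is a sketch of the pattern only; the exact policy names and columns are in `20240000000003_chat.sql`:

```sql
-- Illustrative RLS pattern: each user can only read their own chats.
-- The actual policy names and columns in the migration may differ.
alter table chats enable row level security;

create policy "Users can view their own chats"
  on chats for select
  using (auth.uid() = user_id);
```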
The chat application is organized in a modular structure:
API Routes (`app/api/(apps)/(chat)/*`):

- `chat/route.ts`: Main chat endpoint with multi-model support
- `document/route.ts`: Document generation and versioning
- `images/upload/route.ts`: Image upload and analysis
- `history/route.ts`: Chat history management

Chat App (`app/(apps)/chat/*`):

- `page.tsx`: Main chat interface with model selection
- `info.tsx`: App information and features display
- `prompt.ts`: System prompts for different modes
- `tools.ts`: Tool definitions (document, web browsing, app suggestions)
- `toolConfig.ts`: Feature flags and configurations

Components (`components/chat/*`):

- `widgets/`: Generative UI components (app cards)

AI Configuration (`lib/ai/*`):

- `ai-utils.ts`: Model provider integration
- `chat.ts`: Message handling utilities
- `models.ts`: Available models configuration

You can customize available models in `lib/ai/models.ts`.
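The models list might look something like the following. The field names here are a guess at the shape, not the actual contents of `lib/ai/models.ts`, so check the real file before copying:

```typescript
// Hypothetical shape for lib/ai/models.ts; verify against the actual file.
interface ChatModel {
  id: string;        // model identifier passed to the provider
  provider: "openai" | "anthropic" | "xai" | "llama";
  label: string;     // name shown in the model picker
  premium: boolean;  // premium models require credits
}

const models: ChatModel[] = [
  { id: "gpt-4o-mini", provider: "openai", label: "GPT-4o mini", premium: false },
  { id: "claude-3-5-haiku", provider: "anthropic", label: "Claude 3.5 Haiku", premium: false },
  { id: "gpt-4o", provider: "openai", label: "GPT-4o", premium: true },
];

// Example: list only the models available without credits.
const freeModels = models.filter((m) => !m.premium).map((m) => m.id);
```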
The chat application uses the Vercel AI SDK for seamless AI model integration and streaming responses.
The Vercel AI SDK handles the complexity of managing different AI providers, allowing us to switch between models seamlessly while maintaining a consistent API interface.
The chat application includes a flexible credit system to manage access to premium features:
Free Features:

- `gpt-4o-mini`, `claude-3-5-haiku`, `llama-3.1-70b`

Premium Features (require credits)
The credit system is implemented through two main components:
The credit system is integrated into the chat API route (`app/api/(apps)/(chat)/chat/route.ts`), which checks the user’s available credits and records usage on each request.
You can customize the credit system by:

- Modifying `FREE_MODELS` in `usage-limits.ts`
- Adjusting credit costs in `canUseConfiguration`
- Implementing your own paywall by modifying the chat API route
- Removing the credit system entirely for open access
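As a rough sketch of how `FREE_MODELS` and `canUseConfiguration` could fit together (the real signatures and costs in `usage-limits.ts` may differ):

```typescript
// Illustrative only: the actual usage-limits.ts implementation may differ.
const FREE_MODELS = ["gpt-4o-mini", "claude-3-5-haiku", "llama-3.1-70b"];

// Hypothetical per-model credit costs.
const CREDIT_COSTS: Record<string, number> = {
  "gpt-4o": 2,
  "claude-3-5-sonnet": 3,
};

function canUseConfiguration(
  model: string,
  credits: number
): { allowed: boolean; cost: number } {
  if (FREE_MODELS.includes(model)) return { allowed: true, cost: 0 };
  const cost = CREDIT_COSTS[model] ?? 1; // default cost for unlisted premium models
  return { allowed: credits >= cost, cost };
}
```

Under this shape, removing the credit system amounts to making the check always return `allowed: true`.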
The API returns credit usage information in the `x-credit-usage` header.
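On the client, you can read that header from the fetch response. The JSON payload shape below is an assumption; inspect the actual header value in your network tab before relying on it:

```typescript
// Assumed payload shape for the x-credit-usage header (verify against the API).
interface CreditUsage {
  cost: number;      // credits spent on this request
  remaining: number; // credits left on the account
}

function parseCreditUsage(headers: Headers): CreditUsage | null {
  const raw = headers.get("x-credit-usage");
  return raw ? (JSON.parse(raw) as CreditUsage) : null;
}
```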
The credit system is designed to be flexible and can be easily modified or replaced with your own payment/subscription system.