We’ll use multiple AI models, the Vercel AI SDK, Supabase, and the Serper API to create an enterprise-grade AI assistant with generative UI, web browsing, and image analysis capabilities.

Enterprise-grade AI assistant with multiple models

Features

  • Multiple AI Models: Switch between OpenAI, Claude, Groq, and others
  • Generative UI: Dynamic components and interactive app suggestions
  • Image Analysis: Process and analyze images with vision models
  • Web Search: Real-time internet access via Serper API
  • Web Scraping: Extract content from URLs using Jina AI
  • Document Generation: Create and edit documents in a Canvas interface
  • Multi-language Support: Communicate in various languages

Prerequisites

To build your advanced AI assistant, you’ll need several services set up. If you haven’t done so yet, complete the Quick Setup guide before continuing.

To use all available models, you’ll need to set up additional providers:

Setting up Serper API

  1. Visit serper.dev
  2. Create an account and get your API key
  3. Add to your environment variables:
SERPER_API_KEY=your_api_key_here
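
To sanity-check the key, a direct call to the search endpoint looks roughly like this (a minimal sketch; the searchWeb name is illustrative, and the response fields reflect Serper's documented shape):

// Minimal Serper search call; assumes SERPER_API_KEY is set in the environment.
export async function searchWeb(query: string) {
  const res = await fetch("https://google.serper.dev/search", {
    method: "POST",
    headers: {
      "X-API-KEY": process.env.SERPER_API_KEY!,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ q: query }),
  });
  if (!res.ok) throw new Error(`Serper request failed: ${res.status}`);
  const data = await res.json();
  return data.organic; // standard results: { title, link, snippet }[]
}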

Web scraping is handled by Jina AI, which is free and doesn’t require an API key.
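
Jina’s reader works by prefixing the target URL, so a scraping helper can stay tiny (again a sketch; scrapeUrl is an illustrative name):

// Fetch a URL's readable content through Jina AI's reader endpoint; no API key needed.
export async function scrapeUrl(url: string): Promise<string> {
  const res = await fetch(`https://r.jina.ai/${url}`);
  if (!res.ok) throw new Error(`Jina reader failed: ${res.status}`);
  return res.text(); // page content returned as plain text / markdown
}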

Database Setup

The chat feature uses tables that should already exist if you followed the Quick Setup guide and ran the Supabase migrations.

The required tables are:

  • chats: Stores chat sessions and metadata
  • messages: Manages message history and content
  • images: Handles image uploads and analysis results
  • chat_documents: Stores generated documents and versions

You can find the relevant SQL, along with the necessary functions and RLS policies, in supabase/migrations/20240000000003_chat.sql

All tables include Row Level Security (RLS) policies to ensure users can only access their own data.
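
In practice that means client-side queries need no explicit user filter; the database only ever returns the signed-in user’s rows. A sketch using supabase-js (the environment variable names and the created_at column are assumptions; check the migration for the actual schema):

import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

// RLS restricts this to chats owned by the authenticated user,
// even though the query itself contains no user filter.
export async function loadChats() {
  const { data, error } = await supabase
    .from("chats")
    .select("*")
    .order("created_at", { ascending: false });
  if (error) throw error;
  return data;
}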

App Structure

The chat application is organized in a modular structure:

  1. API Routes (app/api/(apps)/(chat)/*):

    • chat/route.ts: Main chat endpoint with multi-model support
    • document/route.ts: Document generation and versioning
    • images/upload/route.ts: Image upload and analysis
    • history/route.ts: Chat history management
  2. Chat App (app/(apps)/chat/*):

    • page.tsx: Main chat interface with model selection
    • info.tsx: App information and features display
    • prompt.ts: System prompts for different modes
    • tools.ts: Tool definitions (document, web browsing, app suggestions; see the sketch below)
    • toolConfig.ts: Feature flags and configurations
  3. Components (components/chat/*):

    • widgets/: Generative UI components (app cards)
  4. AI Configuration (lib/ai/*):

    • ai-utils.ts: Model provider integration
    • chat.ts: Message handling utilities
    • models.ts: Available models configuration
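
To make the structure above more concrete, here is a sketch of what a tool definition in tools.ts can look like, assuming an AI SDK 4-style tool() helper with zod parameters. The description, parameter names, import path, and the searchWeb helper from the Serper snippet are illustrative rather than the app’s actual definitions:

import { tool } from "ai";
import { z } from "zod";
import { searchWeb } from "@/lib/serper"; // hypothetical path for the Serper helper above

// Illustrative web-search tool; the real definitions live in app/(apps)/chat/tools.ts.
export const webSearchTool = tool({
  description: "Search the web for up-to-date information",
  parameters: z.object({
    query: z.string().describe("The search query"),
  }),
  execute: async ({ query }) => {
    const results = await searchWeb(query);
    return results.slice(0, 5); // keep the prompt small
  },
});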

Model Configuration

You can customize available models in lib/ai/models.ts:

lib/ai/models.ts
export const AI_MODEL_DISPLAY = {
  "gpt-4o-mini": {
    name: "GPT-4o mini",
    logo: "/providers/openai.webp",
    vision: true,
  },
  "claude-3-5-sonnet-latest": {
    name: "Claude 3.5 Sonnet",
    logo: "/providers/anthropic.jpeg",
    vision: true,
  },
  // Add or remove models as needed
};
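
These IDs are resolved to concrete providers in lib/ai/ai-utils.ts. A hedged sketch of that mapping, assuming the official @ai-sdk provider packages (getModel and the exact branching are illustrative):

import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { groq } from "@ai-sdk/groq";

// Illustrative provider lookup; the actual integration lives in lib/ai/ai-utils.ts.
export function getModel(modelId: string) {
  if (modelId.startsWith("claude")) return anthropic(modelId);
  if (modelId.startsWith("llama")) return groq(modelId);
  return openai(modelId);
}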

Vercel AI SDK Integration

The chat application uses the Vercel AI SDK for seamless AI model integration and streaming responses. This provides:

  • Unified interface for multiple AI providers
  • Real-time streaming responses
  • Built-in TypeScript support
  • Efficient message handling

The Vercel AI SDK handles the complexity of managing different AI providers, allowing us to switch between models while maintaining a consistent API interface.
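
As a rough sketch of what that looks like in the chat route (AI SDK 4-style API, reusing the hypothetical getModel helper from the model configuration section; the real route also handles tools, images, and credits):

import { streamText, convertToCoreMessages } from "ai";
import { getModel } from "@/lib/ai/ai-utils";

// Simplified shape of app/api/(apps)/(chat)/chat/route.ts.
export async function POST(req: Request) {
  const { messages, modelId } = await req.json();

  // Stream the completion from whichever provider backs the selected model.
  const result = streamText({
    model: getModel(modelId),
    messages: convertToCoreMessages(messages),
  });

  return result.toDataStreamResponse();
}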

Credit System

The chat application includes a flexible credit system to manage access to premium features:

Features & Limitations

  • Free Features:

    • Access to basic models (gpt-4o-mini, claude-3-5-haiku, llama-3.1-70b)
    • Standard chat functionality
    • Image analysis with free models
  • Premium Features (require credits):

    • Advanced AI models
    • Web browsing capabilities
    • Additional features as configured

Implementation

The credit system is implemented through two main components, defined in usage-limits.ts:

// Customize which models are free
export const FREE_MODELS = [
  "gpt-4o-mini",
  "claude-3-5-haiku-latest",
  "llama-3.1-70b-versatile",
] as const;

// Credit validation logic
export function canUseConfiguration(
  credits: number,
  config: {
    modelId?: AIModel;
    isBrowseEnabled: boolean;
  }
) {
  // ... credit check logic
}
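
The elided check boils down to “free configuration, or enough credits to cover the premium features.” One possible shape, continuing the snippet above; the cost values and the return shape are placeholders, not the app’s real pricing:

// Hypothetical costs; the actual values live in usage-limits.ts.
const PREMIUM_MODEL_COST = 1;
const BROWSE_COST = 1;

export function canUseConfiguration(
  credits: number,
  config: { modelId?: AIModel; isBrowseEnabled: boolean }
) {
  let cost = 0;
  if (config.modelId && !(FREE_MODELS as readonly string[]).includes(config.modelId)) {
    cost += PREMIUM_MODEL_COST; // advanced models are premium
  }
  if (config.isBrowseEnabled) {
    cost += BROWSE_COST; // web browsing always requires credits
  }
  return { allowed: cost === 0 || credits >= cost, cost };
}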

The credit system is integrated into the chat API route (app/api/(apps)/(chat)/chat/route.ts), which:

  1. Checks user’s available credits
  2. Validates feature access
  3. Deducts credits for premium usage
  4. Returns credit usage information in response headers
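
Put together, the relevant part of the handler can look like this. getUserCredits and deductCredits are hypothetical helpers standing in for however you track balances, getModel and canUseConfiguration come from the earlier sketches, and the headers option assumes the AI SDK 4-style toDataStreamResponse:

import { streamText, convertToCoreMessages } from "ai";

export async function POST(req: Request) {
  const { messages, modelId, isBrowseEnabled } = await req.json();
  const credits = await getUserCredits(req); // hypothetical: look up the user's balance

  // Steps 1 and 2: check available credits and validate feature access
  const check = canUseConfiguration(credits, { modelId, isBrowseEnabled });
  if (!check.allowed) {
    return new Response("Insufficient credits", { status: 402 }); // e.g. Payment Required
  }

  // Step 3: deduct credits for premium usage (hypothetical helper)
  await deductCredits(req, check.cost);

  const result = streamText({
    model: getModel(modelId),
    messages: convertToCoreMessages(messages),
  });

  // Step 4: return credit usage information in the response headers
  return result.toDataStreamResponse({
    headers: {
      "x-credit-usage": JSON.stringify({
        cost: check.cost,
        remaining: credits - check.cost,
        features: isBrowseEnabled ? ["web browsing"] : [],
      }),
    },
  });
}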

You can customize the credit system by:

  • Modifying FREE_MODELS in usage-limits.ts
  • Adjusting credit costs in canUseConfiguration
  • Implementing your own paywall by modifying the chat API route
  • Removing the credit system entirely for open access

Credit Headers

The API returns credit usage information in the x-credit-usage header:

{
  cost: number;        // Credits used
  remaining: number;   // Available balance
  features: string[];  // Premium features accessed
}
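
On the client you can read it straight off the fetch response; a small sketch (the /api/chat path follows from the route group layout above, and sendChat is just an illustrative helper):

type CreditUsage = { cost: number; remaining: number; features: string[] };

// Send a chat request and surface the credit usage reported by the API.
export async function sendChat(messages: unknown[], modelId: string) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, modelId }),
  });

  const header = res.headers.get("x-credit-usage");
  const usage: CreditUsage | null = header ? JSON.parse(header) : null;
  return { response: res, usage };
}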

The credit system is designed to be flexible and can be easily modified or replaced with your own payment/subscription system.