Chat is the most advanced app in AnotherWrapper Premium. It is not a basic
chatbot demo. It is a multi-model AI workspace with streaming responses, web
browsing, PDF chat, citations, generative UI, and persistent conversation
history.
What This App Actually Does
From the user’s point of view, this app feels like a serious AI assistant, not just a prompt box. Users can:

- switch between models like GPT, Claude, Gemini, Grok, DeepSeek, and Llama
- chat with streaming responses
- upload PDFs and ask questions about them
- browse the web when browsing is enabled
- work with images and other multimodal inputs
- generate documents and UI blocks inside the chat flow
- keep conversation history in their account
- see citations when answers are grounded in uploaded documents
Why It Feels More Capable Than A Basic Chat Demo
The current chat app combines several systems working together:

- Vercel AI SDK for streaming, tools, and provider abstraction
- multiple providers so users are not locked into one model family
- native web search tools for supported providers
- RAG so uploaded PDFs can be searched semantically
- Supabase for auth, chat history, and document records
- object storage for uploads
- credit gating so premium models and premium actions can be monetized
Supported Models
| Provider | Models |
|---|---|
| OpenAI | GPT-5, GPT-5 mini, o3 |
| Anthropic | Claude Opus 4.5, Sonnet 4.5, Haiku 4.5 |
| Google | Gemini 3 Pro, Gemini 2.5 Flash |
| Groq | Llama 4 Scout, Llama 4 Maverick |
| xAI | Grok 4, Grok 4.1 |
| DeepSeek | DeepSeek models |
The model list is defined in `lib/ai/models.ts`. You can add, remove, or modify available models there.
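As a rough sketch of what a registry like `lib/ai/models.ts` typically contains (the actual shape and entries in the repo may differ; the ids and flags below are illustrative assumptions):

```typescript
// Hypothetical model registry sketch; the real lib/ai/models.ts may differ.
interface ChatModel {
  id: string; // provider-specific model id
  label: string; // name shown in the model picker
  provider: "openai" | "anthropic" | "google" | "groq" | "xai" | "deepseek";
  premium: boolean; // whether selecting it costs credits (assumed flag)
  supportsNativeSearch: boolean; // whether browsing can be enabled
}

const models: ChatModel[] = [
  // Illustrative entries only; check the repo for the real list.
  { id: "gpt-5", label: "GPT-5", provider: "openai", premium: true, supportsNativeSearch: true },
  { id: "claude-haiku-4.5", label: "Claude Haiku 4.5", provider: "anthropic", premium: false, supportsNativeSearch: true },
];

function getModel(id: string): ChatModel | undefined {
  return models.find((m) => m.id === id);
}
```

Adding a model is then a matter of appending an entry and making sure the matching provider API key is configured.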
What You Need To Run It
The following services are required to run the chat app:

- **Supabase**: user authentication and the PostgreSQL database
- **Storage**: S3-compatible file storage for image and PDF uploads
- **OpenAI**: required for GPT models and PDF embeddings
- **Anthropic**: required for Claude models
- **Google AI**: required for Gemini models; follow the Google provider guide first
- **Groq**: required for Llama models; follow the Groq provider guide first
- **xAI (Grok)**: required for Grok models; follow the xAI provider guide first
- **DeepSeek**: required for DeepSeek models; follow the DeepSeek provider guide first
Setting up Web Search
Web search is optional and uses the native tool support exposed by the selected model provider. There is no separate search-provider setup in the current chat implementation. Supported native search providers in the app:

- OpenAI
- Anthropic
- xAI
Search availability depends on the selected model/provider. Models without
native web search support, such as some Groq or DeepSeek paths, will not get
a fallback search provider automatically.
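Because there is no fallback, the browsing toggle has to be gated on the selected provider. A minimal sketch of that check (the function name and provider strings are illustrative, not the repo's actual API):

```typescript
// Providers whose models expose native web search, per the list above.
const NATIVE_SEARCH_PROVIDERS = new Set(["openai", "anthropic", "xai"]);

// Sketch: browsing is only honored when the user enabled it AND the
// selected provider supports native search. No fallback is attempted.
function canBrowse(provider: string, browsingEnabled: boolean): boolean {
  return browsingEnabled && NATIVE_SEARCH_PROVIDERS.has(provider);
}
```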
How A Chat Request Works
Here is the simple version of the pipeline:

- the user sends a message
- the app checks the chosen model, browsing mode, and attached documents
- if documents are active, the app retrieves relevant chunks from the vector database
- the server builds the system prompt and tool list
- the answer streams back in real time
- the final messages and metadata are saved in Supabase
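The steps above can be sketched as one orchestration function, with the external systems (vector retrieval, the model, persistence) injected as plain functions so the flow itself stays visible. This is an illustration of the control flow only; the real route handler streams tokens via the AI SDK rather than returning a string:

```typescript
// Injected stand-ins for the real systems (pgvector search, the AI SDK
// model call, and Supabase persistence).
interface Deps {
  retrieve: (docIds: string[], query: string) => string[];
  generate: (system: string, user: string) => string;
  save: (messages: { role: string; content: string }[]) => void;
}

function handleChat(deps: Deps, userMessage: string, docIds: string[]): string {
  // 1. retrieve context only when documents are attached
  const chunks = docIds.length > 0 ? deps.retrieve(docIds, userMessage) : [];
  // 2. build the system prompt, injecting retrieved context
  const system =
    "You are a helpful assistant." +
    (chunks.length ? `\nContext:\n${chunks.join("\n")}` : "");
  // 3. call the model (the real app streams this step)
  const answer = deps.generate(system, userMessage);
  // 4. persist the finished turn
  deps.save([
    { role: "user", content: userMessage },
    { role: "assistant", content: answer },
  ]);
  return answer;
}
```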
PDF Chat, Vector Search, And Citations
The document workflow is one of the strongest parts of the app. When a user uploads a PDF:

- the file is stored
- text is extracted
- the text is split into chunks
- each chunk is embedded with OpenAI
- those embeddings are stored in Supabase with pgvector
- later, user questions are matched against those chunks
- the retrieved context is injected into the prompt
- citations are shown in the UI
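The chunking step in that flow is easy to picture in isolation. A minimal sketch of overlapping fixed-size chunking (the sizes and the function name are illustrative; the repo's splitter under `lib/rag/` may work differently):

```typescript
// Split extracted PDF text into overlapping chunks before embedding.
// Overlap keeps sentences that straddle a boundary retrievable from
// either side. Sizes here are assumptions, not the repo's defaults.
function chunkText(text: string, size = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  const step = size - overlap; // advance less than `size` to overlap
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```

Each returned chunk would then be embedded (with OpenAI, per the flow above) and inserted into the pgvector table alongside its document id.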
Generative UI And Tools
This chat app is not text-only. It also supports tool-driven interactions such as:

- document creation
- document updates
- app suggestions
- provider-native web search
Credit Gating
The repo includes an app-wide credit layer. In practical terms:

- free models can stay free
- premium models can cost credits
- browsing can also cost credits
- the app returns usage metadata so the UI can show what happened
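Conceptually, the per-message cost is just a sum of surcharges for the premium features used. A sketch of that idea (the function, field names, and credit amounts are all assumptions; the real pricing lives in the repo's credit layer):

```typescript
interface CostInput {
  premiumModel: boolean; // user picked a credit-gated model
  browsing: boolean; // native web search was enabled for this message
}

// Illustrative credit math: each premium feature adds to the cost,
// so a free model with browsing off costs nothing.
function messageCost({ premiumModel, browsing }: CostInput): number {
  let credits = 0;
  if (premiumModel) credits += 5; // assumed premium-model surcharge
  if (browsing) credits += 2; // assumed browsing surcharge
  return credits;
}
```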
Good First Customizations
If you are building your own product on top of this app, the usual first edits are:

- update the model list in `lib/ai/models.ts`
- change which models are free vs. premium
- adjust the system prompt in `app/(apps)/chat/prompt.ts`
- add or remove tools under `app/(apps)/chat/tools/`
- tune the document retrieval flow under `lib/rag/`
Verification
Your chat setup is working if:

- the page loads and streams responses
- switching models changes the provider/model badge
- browsing works on models that support native search
- uploaded PDFs can be indexed and cited
- new messages persist after refresh

