Demo: screen recording of the chatbot in action (`Screen.Recording.2025-05-21.at.4.50.37.AM.mov`).
- Dual-mode chatbot: Normal LLM responses OR targeted web search results
- Domain-specific search: Select from dropdown options (GitHub, LinkedIn, Articles, News, etc.)
- Real-time results: Get live data from specific platforms
- Example: Select "LinkedIn" → Ask for "big tech recruiters hiring software engineers" → Receive 5-10 actual recruiter profiles
- User selects search category from dropdown and submits query
- Chat context passed to LLM to generate optimized search query
- Refined query sent to Exa AI API for domain-specific results
- Search results are fed back into the LLM, which generates the final response to the user (a sketch of this pipeline follows this list)
- Maintains conversational flow while providing current information
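A minimal sketch of that pipeline, assuming the `exa-js` client and the AI SDK's `generateText`; the function name, prompts, model choice, and domain map below are illustrative, not this repository's actual implementation:

```ts
// Hypothetical sketch of the dropdown → query refinement → Exa search → answer flow.
import Exa from "exa-js";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const exa = new Exa(process.env.EXA_API_KEY);

// Illustrative mapping from dropdown categories to the domains to search.
const DOMAINS: Record<string, string[]> = {
  LinkedIn: ["linkedin.com"],
  GitHub: ["github.com"],
};

export async function answerWithSearch(category: string, chatContext: string, userQuery: string) {
  // 1. Use the chat context to generate an optimized search query.
  const { text: searchQuery } = await generateText({
    model: openai("gpt-4o-mini"),
    prompt: `Conversation so far:\n${chatContext}\n\nWrite one concise web search query for: ${userQuery}`,
  });

  // 2. Send the refined query to Exa, restricted to the selected category's domains.
  const { results } = await exa.searchAndContents(searchQuery, {
    numResults: 8,
    includeDomains: DOMAINS[category],
  });

  // 3. Feed the results back into the LLM to generate the final response.
  const { text: answer } = await generateText({
    model: openai("gpt-4o-mini"),
    prompt: `Answer "${userQuery}" using only these search results:\n${JSON.stringify(results)}`,
  });
  return answer;
}
```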
- Exa Search (see the usage sketch below)
  - Find webpages using Exa's embeddings-based or Google-style keyword search
  - Get clean, up-to-date, parsed HTML from Exa search results
  - Based on a link, find and return pages that are similar in meaning
  - Get direct answers to questions using Exa's Answer API
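A hedged look at what those capabilities resemble through the `exa-js` client; exact option names can vary by version, and the queries shown are placeholders:

```ts
// Illustrative exa-js calls; treat the query strings and option values as placeholders.
import Exa from "exa-js";

const exa = new Exa(process.env.EXA_API_KEY);

export async function demoExa() {
  // Embeddings-based ("neural") or keyword search for webpages.
  const search = await exa.search("frameworks for building AI chatbots", {
    type: "neural",
    numResults: 5,
  });

  // Clean, parsed contents for those results.
  const contents = await exa.getContents(search.results.map((r) => r.url));

  // Pages similar in meaning to a given link.
  const similar = await exa.findSimilar("https://exa.ai", { numResults: 5 });

  // A direct answer to a question via the Answer API.
  const answer = await exa.answer("Who founded Exa?");

  return { contents, similar, answer };
}
```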
- Next.js App Router
- AI SDK (see the route sketch after this list)
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
- Data Persistence (see the sketch after this list)
  - Neon Serverless Postgres for saving chat history and user data
  - Vercel Blob for efficient file storage
- Auth.js
  - Simple and secure authentication
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
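As noted in the AI SDK item above, here is a minimal sketch of how the AI SDK is usually wired in a Next.js App Router project; the route path, model, and stream helper are assumptions rather than this repository's code:

```ts
// app/api/chat/route.ts — hypothetical streaming chat endpoint built on the AI SDK.
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
  });
  return result.toDataStreamResponse();
}
```

On the client side, the AI SDK's `useChat` hook consumes an endpoint like this and manages message state, input handling, and streaming updates for the chat UI.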
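Similarly, a rough sketch of the persistence layer under those assumptions (Neon's serverless driver plus `@vercel/blob`); the table and column names are invented for illustration:

```ts
// Hypothetical persistence helpers; the schema shown here is illustrative only.
import { neon } from "@neondatabase/serverless";
import { put } from "@vercel/blob";

const sql = neon(process.env.DATABASE_URL!);

// Save one chat message to Neon Serverless Postgres.
export async function saveMessage(chatId: string, role: string, content: string) {
  await sql`INSERT INTO messages (chat_id, role, content) VALUES (${chatId}, ${role}, ${content})`;
}

// Store an uploaded file in Vercel Blob and return its public URL.
export async function saveAttachment(filename: string, data: Blob) {
  const { url } = await put(filename, data, { access: "public" });
  return url;
}
```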
You will need to populate the environment variables defined in `.env.example` to run this chatbot. It's recommended to use Vercel Environment Variables for this, but a local `.env` file is all that is necessary.
- Install Vercel CLI: `npm i -g vercel`
- Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
- Download your environment variables: `vercel env pull`
```bash
pnpm install
pnpm dev
```

It should now be running on localhost:3000.