
It’s my portfolio, reimagined as a chat conversation, answered by AI.
Tech Stack · Sources of Inspiration · Deploy Your Own · Running locally
Nathan's AI is a unique twist on the traditional portfolio. Instead of scrolling through pages of information, visitors can simply ask questions to learn about my career, skills, projects, and experiences. Built with Next.js, Tailwind CSS, and Vercel’s AI SDK, this chatbot acts as an interactive resume, letting you explore my journey in a conversational way.
- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Supports Anthropic (default), OpenAI, Cohere, and other model providers
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Vercel Postgres powered by Neon for saving chat history
- @upstash/ratelimit
  - Rate limiting to prevent excessive usage of the chat
- motion
  - A modern animation library for JavaScript and React
  - Clean and easy-to-use animations
I'm proud of my designs, but they didn't spring purely from my own imagination. Here are links to the websites I used as inspiration for my design.
You can deploy your own version of Nathan's AI Chatbot to Vercel with one click:
Nathan's AI depends on multiple services to function properly, and as you saw by deploying to Vercel, you need a few environment variables to make it work. Follow these steps to configure the necessary environment variables.
Grab your Anthropic API Key and paste it into your `.env.local` file:

```bash
ANTHROPIC_API_KEY="your-api-key-here"
```
Nathan's AI currently uses Anthropic, but you can switch to another provider by updating the model inside `streamText()` in `lib/chat/actions.tsx`. For more details, check out the AI SDK documentation. If you switch providers, remember to update the relevant environment variables accordingly.
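As a rough sketch of what that swap might look like (this assumes AI SDK v4-style packages and an illustrative model name — check the repo's actual versions and the AI SDK docs before copying):

```typescript
// Hypothetical sketch: switching the provider passed to streamText().
// Assumes the @ai-sdk/* provider packages; model names are illustrative.
import { streamText } from 'ai';
// import { anthropic } from '@ai-sdk/anthropic'; // the current default
import { openai } from '@ai-sdk/openai';

const result = streamText({
  // was: model: anthropic('claude-3-5-sonnet-latest'),
  model: openai('gpt-4o'),
  messages: [{ role: 'user', content: 'Tell me about Nathan.' }],
});
```

If you make a change like this, you would also set `OPENAI_API_KEY` in `.env.local` instead of `ANTHROPIC_API_KEY`.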
To prevent users (or bots) from consuming all your AI credits in a caffeine-fueled chat spree, we use Upstash for rate limiting.
- Create a Redis database on Upstash.
- Add these variables to your `.env.local` file:

```bash
KV_URL="your-kv-url"
KV_REST_API_URL="your-rest-api-url"
KV_REST_API_TOKEN="your-api-token"
KV_REST_API_READ_ONLY_TOKEN="your-read-only-token"
```
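To illustrate the idea behind the rate limiting (the real `@upstash/ratelimit` library stores its counters in Redis so limits hold across serverless invocations), here's a minimal in-memory sliding-window sketch — a conceptual stand-in, not the library's actual implementation:

```typescript
// Minimal in-memory sketch of sliding-window rate limiting.
// @upstash/ratelimit does the same bookkeeping, but in Redis.
class SlidingWindowLimiter {
  private hits = new Map<string, number[]>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request identified by `key` is allowed.
  allow(key: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Keep only timestamps still inside the window.
    const recent = (this.hits.get(key) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(key, recent);
      return false; // over the limit: reject
    }
    recent.push(now);
    this.hits.set(key, recent);
    return true;
  }
}
```

With a limit of 2 requests per second, a third request inside the same second is rejected, but a request after the window slides past the earlier hits is allowed again.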
If you want to save chat logs for fine-tuning responses (definitely not because you're nosy), you'll need a Postgres database.
- Create a Postgres database on Neon.
- Add your connection string to `.env.local`:

```bash
POSTGRES_URL="your-database-url"
```
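For reference, persisting a chat turn with `@vercel/postgres` might look something like this — the table and column names here are purely illustrative, not the repo's actual schema, and the snippet needs a live database with `POSTGRES_URL` set:

```typescript
// Hypothetical sketch: saving a chat message with @vercel/postgres.
// Table/column names are illustrative; match them to the real schema.
import { sql } from '@vercel/postgres';

export async function saveMessage(
  chatId: string,
  role: string,
  content: string,
) {
  // The client reads POSTGRES_URL from the environment automatically.
  await sql`
    INSERT INTO messages (chat_id, role, content, created_at)
    VALUES (${chatId}, ${role}, ${content}, NOW())
  `;
}
```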
Customize the AI's responses by replacing all content inside the `./content/**` directory with your own experiences, education, or whatever makes your AI unique. Adapt the files that define the content structure as needed to fit your requirements.
If your name happens to be Nathan, congratulations! The chatbot is already personalized for you. No changes needed.
For everyone else, you'll need to update all occurrences of its current name.
Run a simple search, for example:

```bash
grep -rn "Nathan's AI" .
```

This will show all occurrences so you can update them efficiently.
That part's not my problem, sorry; you'll have to find your own way to replace them.
You will need to use the environment variables defined in `.env.example` to run Nathan's AI locally. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.
Note: You should not commit your `.env` file, or it will expose secrets that allow others to control access to your various authentication provider accounts.
- Install the Vercel CLI: `npm i -g vercel`
- Link your local instance with your Vercel and GitHub accounts (this creates a `.vercel` directory): `vercel link`
- Download your environment variables: `vercel env pull`
Then install dependencies and start the dev server:

```bash
bun install
bun run dev
```
That's it, you are all set! If you run into any problems or have any questions, please don't hesitate to ask me.