This is a simple chat application built with Next.js. It uses Ollama for real-time AI chat responses and Ngrok to provide a secure tunnel during local development.
- Real-time chat interface
- Easy deployment with Next.js
- Secure tunnel for local development using Ngrok
- Integrated with Ollama API for intelligent chat responses
Requirements:

- Node.js (v14 or later)
- Ngrok (optional, for tunneling a local Ollama instance)
- Ollama with the llama3.1 model
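Assuming Ollama is already installed, the llama3.1 model named above can be fetched ahead of time with the standard Ollama CLI:

```shell
# Download the llama3.1 model weights to the local Ollama store,
# then list installed models to confirm the pull succeeded.
ollama pull llama3.1
ollama list
```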
Clone the repository:

```bash
git clone https://github.com/heinhtoo/ollama-chat.git
cd ollama-chat
```
Before running the app, make sure to install the necessary packages:
```bash
npm install
```
Create a `.env.local` file at the root of the project and set the base URL of your Ollama API:

```env
OLLAMA_BASEURL="http://localhost:11434/api"
```
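With the base URL configured, a call to Ollama's chat endpoint can be sketched as follows. The `buildChatRequest` helper and the message shape here are illustrative assumptions, not code from this repository; the endpoint and request fields (`model`, `messages`, `stream`) follow Ollama's REST API:

```typescript
// Illustrative sketch: build a request to Ollama's /chat endpoint.
// `buildChatRequest` is a hypothetical helper, not part of this repo.

type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

// OLLAMA_BASEURL already ends in /api, so only /chat is appended.
function buildChatRequest(baseUrl: string, model: string, messages: ChatMessage[]) {
  return {
    url: `${baseUrl}/chat`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false returns a single JSON response instead of chunks
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Example usage (e.g. inside a Next.js API route):
// const { url, init } = buildChatRequest(process.env.OLLAMA_BASEURL!, "llama3.1", [
//   { role: "user", content: "Hello!" },
// ]);
// const res = await fetch(url, init);
// const data = await res.json(); // the reply text is in data.message.content
```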
Now you can start the development server:
```bash
npm run dev
```
Once development is complete, you can deploy your Next.js app to Vercel or any other preferred hosting service.
To create a secure tunnel to your local Ollama instance, run Ngrok on port 11434:

```bash
ngrok http 11434 --host-header="localhost:11434"
```
Copy the forwarding URL provided by Ngrok and update the `OLLAMA_BASEURL` environment variable on your production server.
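For example, if Ngrok reports a forwarding URL of `https://example.ngrok-free.app` (a placeholder; yours will differ), the production value keeps the same `/api` suffix used locally:

```env
OLLAMA_BASEURL="https://example.ngrok-free.app/api"
```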
- Blog Post: Deploy ollama chat app on Vercel