A Modern Interface for Local AI

Experience a streamlined, open-source client for Ollama models. Built with a focus on simplicity, efficiency, and inclusivity.

[Screenshot: the Ollama UI interface]

Get Started in Minutes 🚀

Quick setup guide to get you up and running

1. Prerequisites

  • Ollama running at localhost:11434
  • Node.js (Latest LTS)
  • Yarn package manager
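Before installing, it can help to confirm the prerequisites above are actually in place. The sketch below checks for the listed tools and probes Ollama's version endpoint on its default port; the tool names and the `/api/version` endpoint are Ollama/Node defaults, adjust if your setup differs.

```shell
# Pre-flight check for the prerequisites listed above.
check_prereqs() {
  for cmd in git node yarn; do
    if command -v "$cmd" >/dev/null 2>&1; then
      echo "found:   $cmd"
    else
      echo "missing: $cmd"
    fi
  done
  # Ollama serves a small JSON version endpoint on its default port.
  if curl -sf http://localhost:11434/api/version >/dev/null 2>&1; then
    echo "ollama:  reachable at localhost:11434"
  else
    echo "ollama:  not reachable (start it with 'ollama serve')"
  fi
  return 0
}

check_prereqs
```

If anything is reported missing, install it first; the steps below assume all three tools and a running Ollama server.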
2. Installation

git clone https://github.com/ArtyLLaMa/OllamaUI.git
cd OllamaUI
yarn install
yarn run dev
3. Ready to Go!

Access the UI at http://localhost:3000

Vision support requires Ollama v0.4.0+
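To check whether your Ollama install meets the v0.4.0 requirement, you can query its version endpoint and compare. This is a sketch: it assumes the default port, Ollama's documented `/api/version` JSON response, and a `sort` that supports `-V` (GNU coreutils).

```shell
# Returns 0 if version $1 >= version $2 (relies on GNU 'sort -V').
version_at_least() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Extract the "version" field from Ollama's /api/version response.
ollama_version=$(curl -sf http://localhost:11434/api/version \
  | sed -n 's/.*"version":"\([^"]*\)".*/\1/p')

if [ -n "$ollama_version" ] && version_at_least "$ollama_version" "0.4.0"; then
  echo "Ollama $ollama_version: vision support available"
else
  echo "Vision support needs Ollama v0.4.0+ (and a running server)"
fi
```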