Setting Up Open-WebUI with Ollama: A Comprehensive Guide for CPU and GPU Systems
Are you looking for a private, flexible, and efficient way to run Open-WebUI with Ollama, whether you have a CPU-only Linux machine or a powerful GPU setup? Look no further! This post provides a detailed guide to deploying Open-WebUI and Ollama with support for both configurations. Leveraging Docker Compose for seamless deployment, this solution is designed to be robust, scalable, and user-friendly.
Why Open-WebUI with Ollama?
Open-WebUI is a versatile platform for running large language models easily. Paired with Ollama, a tool designed to simplify model management and execution, this combination becomes a powerhouse for AI workflows. With support for both CPU and GPU configurations, this setup stays accessible on modest systems while taking advantage of hardware acceleration where it's available. I use it myself on a Vostro running Linux Mint, with an i5 and 16 GB of RAM.
To make things even better, I’ve created a GitHub repository where you can find all the necessary files and scripts to deploy this setup effortlessly. Check it out here: OpenWeb-UI_with_Ollama.
Key Features of This Setup
- Single-Container Solution: Open-WebUI and Ollama are bundled into a single container image for simplicity.
- Flexible Support: Includes options for both CPU-only and GPU-enabled deployments.
- Docker Compose Integration: Easy deployment and management of containers.
- Persistent Data Storage: All configurations and data are stored in designated volumes, ensuring durability.
Step-by-Step Guide
Follow these steps to set up Open-WebUI with Ollama on your system:
1. Clone the Repository
First, clone the GitHub repository to your local machine:
git clone https://github.com/ciberjohn/lealdasilva.git
cd lealdasilva/OpenWeb-UI_with_Ollama
2. Review the Installation Scripts
The repository includes two installation scripts:
- setup_ollama_and_openwebui_cpu.sh: for systems with CPU-only support.
- setup_ollama_and_openwebui_gpu.sh: for systems with NVIDIA GPU support.
Each script:
- Installs Docker and Docker Compose if not already installed.
- Sets up the necessary project directories and Docker Compose configuration.
- Deploys the containerised Open-WebUI and Ollama stack with the appropriate configuration (a rough sketch of this workflow follows the list).
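I won't reproduce the full scripts here, but the CPU variant boils down to something like the sketch below. This is only an outline under my own assumptions; the directory name and Compose file location are illustrative, so check the actual scripts in the repository:
#!/usr/bin/env bash
set -euo pipefail

# Install Docker (with its Compose plugin) via the official convenience script, if missing
if ! command -v docker >/dev/null 2>&1; then
  curl -fsSL https://get.docker.com | sh
fi

# Create a project directory and place the chosen Compose file in it
mkdir -p "$HOME/openwebui-ollama"
cp docker-compose.cpu.yml "$HOME/openwebui-ollama/docker-compose.yml"
cd "$HOME/openwebui-ollama"

# Start Open-WebUI + Ollama in the background
docker compose up -d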
3. Choose and Run the Script
Make the chosen script executable and run it. For example:
CPU-Only Version:
chmod +x setup_ollama_and_openwebui_cpu.sh
./setup_ollama_and_openwebui_cpu.sh
GPU Version:
chmod +x setup_ollama_and_openwebui_gpu.sh
./setup_ollama_and_openwebui_gpu.sh
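Whichever variant you run, it's worth confirming the container came up before moving on. A quick check (the name matches the container_name set in the Compose files):
docker ps --filter "name=open-webui"

# Follow the startup logs if something looks off
docker logs -f open-webui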
4. Access Open-WebUI
Once the script completes, you can access Open-WebUI in your browser at:
http://localhost:3000
All data for Open-WebUI and Ollama is stored under the designated volumes, ensuring your configurations and data persist across container restarts.
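If you want to see where that data lives, or pre-load a model from the command line, something like this should work. Note that llama3.2 is just an example model tag, and I'm assuming the :ollama image bundles the ollama CLI (which that tag is built to do):
# List the named volumes created by Compose
docker volume ls

# Pull a model through the bundled Ollama instance
docker exec -it open-webui ollama pull llama3.2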
Inside the Docker Compose Configurations
Here’s an overview of the configurations for both setups:
CPU-Only Configuration (docker-compose.cpu.yml):
version: "3.8"
services:
open-webui:
image: ghcr.io/open-webui/open-webui:ollama
container_name: open-webui
ports:
- "3000:8080"
volumes:
- ollama:/root/.ollama
- open-webui:/app/backend/data
restart: always
volumes:
ollama:
open-webui:
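If you'd rather skip the script and drive Compose yourself, pointing it at this file directly is enough (run from the directory containing it):
docker compose -f docker-compose.cpu.yml up -d

# Tear it down later; the named volumes survive
docker compose -f docker-compose.cpu.yml down
To serve the UI on a different host port, change the left-hand side of the "3000:8080" mapping.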
GPU Configuration (docker-compose.gpu.yml):
version: "3.8"
services:
open-webui:
image: ghcr.io/open-webui/open-webui:ollama
container_name: open-webui
ports:
- "3000:8080"
volumes:
- ollama:/root/.ollama
- open-webui:/app/backend/data
deploy:
resources:
reservations:
devices:
- driver: "nvidia"
count: all
capabilities: ["gpu"]
restart: always
volumes:
ollama:
open-webui:
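One caveat worth flagging: the deploy.resources block only works if the host can actually hand GPUs to containers, which requires the NVIDIA driver plus the NVIDIA Container Toolkit. A quick sanity check (the CUDA image tag here is just an example):
# Is the driver visible on the host?
nvidia-smi

# Can Docker pass the GPU into a container?
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi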
Why Did I Spend My Monday Night Assembling This Solution?
- Flexibility: Easily switch between CPU and GPU configurations based on your system.
- Scalability: Add more models or configurations as your requirements grow.
- Ease of Use: With Docker Compose, managing and deploying containers is a breeze.
Get Started Now
Ready to harness the power of Open-WebUI and Ollama? Head over to the GitHub repository and follow the instructions:
OpenWeb-UI_with_Ollama on GitHub
Feel free to reach out if you have any questions or feedback.
https://linkstack.lealdasilva.com/@joaosilva
Let’s build and share great tools together!