Open WebUI is an open-source front-end tool that connects to locally running LLMs (e.g., LLaMA, Mistral, Gemma, etc.) to provide a ChatGPT-like interactive UI. Below is a step-by-step guide for beginners.
Prerequisite: Overview of Open WebUI
- GitHub: Open WebUI (GitHub)
  https://github.com/open-webui/open-webui
- License: Open WebUI License (based on the BSD 3-Clause License, with additional branding terms)
- Documentation: Open WebUI Docs
  https://docs.openwebui.com/
Open WebUI connects to local LLM runtimes (e.g., Ollama, LM Studio, llama.cpp, etc.) via their HTTP APIs and provides a web UI for chatting.
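As a quick sanity check that this API connection can work, you can query Ollama's model-list endpoint directly (a minimal sketch, assuming Ollama is already running on its default port 11434):

```
# Lists the models the local Ollama server exposes over its HTTP API.
curl http://localhost:11434/api/tags
```

If this returns a JSON list of models, Open WebUI should be able to connect to the same URL.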
Usage steps (assuming Docker)
Here is how to use it in environments where Docker is available (Windows/macOS/Linux).
Preparation: Steps to install "Docker Desktop" (Windows version)
Follow these instructions to install Docker Desktop for Windows and launch it in Linux container mode.

Make sure Docker (Linux container mode) is running
Open PowerShell or Command Prompt
- Search for “PowerShell” in the Start menu and launch it
- Or you can use “Command Prompt (cmd)”
Run the following command from PowerShell or Command Prompt to check your docker version:
```
docker version
```

Example of execution result: if the version is displayed correctly, you are good to go.

```
Client:
 Version:           28.0.4
 API version:       1.48
 Go version:        go1.23.7
 Git commit:        b8034c0
 Built:             Tue Mar 25 15:07:48 2025
 OS/Arch:           windows/amd64
 Context:           desktop-linux

Server: Docker Desktop 4.40.0 (187762)
 Engine:
  Version:          28.0.4
  API version:      1.48 (minimum version 1.24)
  Go version:       go1.23.7
  Git commit:       6430e49
  Built:            Tue Mar 25 15:07:22 2025
  OS/Arch:          linux/amd64
  Experimental:     false
 ...
```
Launching Open WebUI (Docker)
The following command launches Open WebUI in a Docker container, exposes it on port 3000, connects it to the Ollama server, runs it in the background, and persists its data.
Run the command:
```
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://192.168.1.10:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

*If port 3000 is already in use, change the host side of the mapping to any free port, e.g. -p 3001:8080 (see the port check below).
Example URL to access: http://localhost:3001
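If you are not sure whether port 3000 is free, you can check from PowerShell or Command Prompt before starting the container (a minimal sketch; empty output means nothing is listening on that port):

```
# Show any process already listening on port 3000 (the last column is its PID).
netstat -ano | findstr :3000
```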
Command description
| Item | Meaning | Explanation |
|---|---|---|
| docker run | Docker command | Creates and starts a new Docker container. |
| -d | Detached mode | Runs the container in the background (logs are not shown in the terminal). |
| -p 3000:8080 | Port mapping | Forwards the host's port 3000 to port 8080 inside the container, so the UI is reachable from a browser at http://localhost:3000. *If port 3000 is already in use, change the host side, e.g. -p 3001:8080, and access http://localhost:3001 instead. |
| -e OLLAMA_BASE_URL=http://192.168.1.10:11434 | Environment variable | Tells Open WebUI the URL of the Ollama API. This example assumes Ollama is running at 192.168.1.10:11434. |
| -v open-webui:/app/backend/data | Persistent data volume | Mounts a Docker volume named open-webui at the app's data directory /app/backend/data, so Open WebUI settings, conversation history, etc. are kept across restarts. |
| --name open-webui | Container name | Names this container open-webui. Note that two containers with the same name cannot exist at the same time. |
| --restart always | Auto-restart policy | Docker automatically restarts the container whenever it stops, including after the PC reboots, which makes it suitable for service-style operation. |
| ghcr.io/open-webui/open-webui:main | Docker image to use | Pulls the official Open WebUI image with the main tag from the GitHub Container Registry and runs it. |
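If you prefer Docker Compose over a long docker run line, the same configuration can be expressed as a compose file (a minimal sketch; the file name docker-compose.yml and the Ollama URL are assumptions, so adjust them to your environment):

```
# docker-compose.yml — equivalent of the docker run command above (assumed layout).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"          # host port 3000 -> container port 8080
    environment:
      - OLLAMA_BASE_URL=http://192.168.1.10:11434  # assumed Ollama address
    volumes:
      - open-webui:/app/backend/data  # persist settings and chat history
    restart: always

volumes:
  open-webui:
```

Start it with `docker compose up -d` from the directory containing the file.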
Note: Main docker commands involved
| Command | Purpose |
|---|---|
| docker run | Creates a new container and starts it (first run, or after deleting and recreating). |
| docker start CONTAINER_NAME | Starts an existing, stopped container (without recreating it). |
| docker restart CONTAINER_NAME | Stops and then restarts a container (in one command). |
| docker stop CONTAINER_NAME | Stops a container temporarily (does not delete it). |
| docker rm CONTAINER_NAME | Deletes a container permanently (its name and configuration are removed). |
| docker ps -a | Lists all containers, including stopped ones. |
| docker volume ls | Lists Docker volumes. |
Example 1) Stopping → starting open-webui

```
docker stop open-webui
docker start open-webui
```

Example 2) Deleting → recreating open-webui

```
docker rm open-webui
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://192.168.1.10:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

Where the open-webui data is stored when you run the above command
| Item | Location |
|---|---|
| Executable/code | Inside the container (stored on the host under /var/lib/docker/; direct access is discouraged). |
| Data (e.g., chat history) | Saved in the Docker volume specified by -v open-webui:/app/backend/data. |
| Web UI | Served by the front-end app running inside the container. |
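To see where Docker actually stores that volume, you can inspect it (a minimal sketch; on Docker Desktop for Windows the reported mountpoint lives inside the Docker VM, not directly on the Windows filesystem):

```
# Shows the volume's metadata, including its mountpoint on the Docker host/VM.
docker volume inspect open-webui
```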
Preparation: Steps to install "Ollama" (Windows version)
This article explains the steps to install Ollama and launch a local LLM.

Verify that Ollama is running
```
ollama run MODEL_NAME
```

For example, run `ollama run llama3` to start a model.
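If you are unsure which models are already installed, you can list and download them first (a minimal sketch; llama3 is just an example model name):

```
# List models already downloaded to this machine.
ollama list

# Download a model without starting an interactive session (llama3 is an example).
ollama pull llama3
```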
Access with a browser
Visit the following URL:
http://localhost:PORT_NUMBER

Change the port number to whatever host port you mapped (e.g., 3001) as needed. By default:

http://localhost:3000

A login/registration screen will be displayed; sign up with any name and password to log in.



Talk to a model
- When you start it for the first time, it detects your Ollama models and uses them automatically (for a direct backend check, see the sketch after this list).
- If you have multiple models, you can switch between them for each chat.
- You can also edit prompt templates, system prompts, and more.
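If no models show up in the UI, a direct request to the Ollama API can tell you whether the backend itself is responding (a minimal sketch, assuming llama3 is installed and Ollama listens on the default port; curl.exe is used to bypass the PowerShell curl alias):

```
# Ask Ollama directly for a completion, bypassing Open WebUI.
# "llama3" is an example model name; replace it with one shown by `ollama list`.
curl.exe http://localhost:11434/api/generate -d "{\"model\": \"llama3\", \"prompt\": \"Hello\", \"stream\": false}"
```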

How to start it the second and subsequent times
Launch Docker Desktop (Linux container mode)
For Windows, make sure you have Docker Desktop launched (Linux container mode).
Launch Ollama

```
ollama run MODEL_NAME
```

Example: `ollama run llama3`

Ollama runs a daemon in the background, so once started it stays resident. Closing the terminal may stop the Ollama process; in that case, start it again with the same command.
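To confirm the daemon is up without starting a chat session, you can hit the server's root endpoint (a minimal sketch, assuming the default port):

```
# Prints "Ollama is running" if the daemon is listening on the default port.
curl http://localhost:11434
```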
Launch Open WebUI
If you deployed Open WebUI as a Docker container:

```
docker start open-webui
```

*open-webui is the container name; if you gave the container a different name, start it with that name.

Access Open WebUI in your browser

http://localhost:PORT_NUMBER

By default the port number is 3000 or 8080, depending on your port mapping; you can check the actual exposed port with `docker ps` (see the sketch below).
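A quick way to pull out just the port mapping for this container (a minimal sketch using docker's built-in Go-template formatting):

```
# Prints only the port mappings of the open-webui container,
# e.g. "0.0.0.0:3000->8080/tcp".
docker ps --filter "name=open-webui" --format "{{.Ports}}"
```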
Official documentation
- Open WebUI Official GitHub
  https://github.com/open-webui/open-webui
- Open WebUI documentation
  https://docs.openwebui.com/
Frequently asked questions

| Question | Answer |
|---|---|
| No models appear | Ollama is not running, or the API connection URL (OLLAMA_BASE_URL) is specified incorrectly. |
| I want to use a port other than 3000 | Change the host side of the port mapping, e.g. from -p 3000:8080 to -p 3001:8080. |
| Does it understand Japanese? | Yes, if the model supports it (LLaMA 3, Qwen, Gemma, etc.). |
| How well does it work with GPU-backed models? | If the model runs under Ollama, there is no problem; performance depends on your GPU environment. |
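A common cause of the "no models appear" case: if Ollama runs directly on the same Windows machine as Docker Desktop, localhost inside the container points at the container itself, not your PC. A sketch of the usual fix (host.docker.internal is provided by Docker Desktop and resolves to the host machine):

```
# Use host.docker.internal instead of localhost when Ollama runs directly on the
# Windows host and Open WebUI runs inside a Docker Desktop container.
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://host.docker.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```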