Open WebUI

AI Chat & Assistants

A self-hosted AI chat UI compatible with Ollama and OpenAI-compatible APIs. Capable of running fully offline.

4.4
Web / Self-hosted (Docker)

What is Open WebUI?

Open WebUI is an open-source, self-hosted AI chat platform. It supports a wide range of LLM runners, including Ollama and OpenAI-compatible APIs, and can operate completely offline. It offers advanced features such as RAG (Retrieval-Augmented Generation), voice and video calls, document processing, and Python tool calling. It supports nine vector databases, including ChromaDB, PostgreSQL, Qdrant, and Milvus, and includes prompt injection protection via LLM-Guard. It is one of the most popular self-hosted AI tools of 2026 for individuals and organizations that prioritize privacy.
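As a sketch of how a self-hosted deployment typically starts, the snippet below follows the project's documented Docker quick start (image name, port mapping, and data volume as published in the Open WebUI docs; verify against the current documentation before use):

```shell
# Run Open WebUI in Docker, persisting chat data in a named volume.
# The web UI becomes available at http://localhost:3000.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Because all data stays in the local `open-webui` volume and the container needs no outbound connection once the image is pulled, this setup can run fully offline.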

Open WebUI screenshot

Pricing

Free (open source)

Key features

Multi-model support
RAG
Voice & video calls
Document processing
Prompt injection protection
Support for 9 vector databases

Pros and cons

Pros

  • Completely free and open source
  • Works offline
  • Excellent privacy protection

Cons

  • Requires technical knowledge such as Docker for setup
  • Japanese localization of the UI is incomplete
  • Requires your own GPU/server

Frequently asked questions

Q. Is Open WebUI free?

A. Yes, it is completely open source and free. However, you need a server or PC (possibly with a GPU) to run it.

Q. What is the difference from Ollama?

A. Ollama is an engine for running LLMs locally, while Open WebUI is its frontend (user interface). By connecting Open WebUI to Ollama, you can use local AI through a ChatGPT-like UI.
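The division of labor described above can be sketched with Ollama's CLI; the model name `llama3` is just an example, and the endpoint shown is Ollama's default local API that Open WebUI connects to:

```shell
# Ollama runs the models; Open WebUI is only the chat interface on top.
# Pull an example model with Ollama's CLI...
ollama pull llama3
# ...then check the local API that Open WebUI will query for available models:
curl http://localhost:11434/api/tags
```

Once Open WebUI is pointed at that Ollama endpoint (by default `http://localhost:11434`, configurable via its connection settings), every model pulled this way appears in the model picker of the chat UI.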

Related tools

Explore more on AIpedia