
Self-Host a local AI platform! Ollama + Open WebUI

Christian Lempa · 45,504 views · 1 month ago

Check out Twingate and supercharge your security: https://bit.ly/3Y1OaZi

In this video, I'll show you my new self-hosted AI platform, deployed in my HomeLab using the free and open-source Ollama platform. I'll walk you through setting up Open WebUI for an easy-to-use web interface with advanced features even ChatGPT might envy, and through securing it with Traefik and Authentik. Plus, I'll share some valuable tips to avoid common pitfalls when building your own local AI server.
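
If you want to script against the server afterwards, here is a minimal sketch of talking to Ollama's REST API from Python. It assumes Ollama is running on its default port 11434 and that you've already pulled a model ("llama3" below is just an example name, swap in whatever you use):

import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "llama3") -> str:
    """Send a single non-streaming prompt to Ollama and return the reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

print(ask("Explain in one sentence what Ollama is."))

This uses only the Python standard library, so it runs on a bare LXC container without extra packages; Open WebUI talks to this same Ollama API under the hood.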

References

- Ollama GPU Requirements: https://github.com/ollama/ollama/blob/main/docs/gpu.md
- AMD ROCm: https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/native_linux/install-radeon.html

Learn more

- Traefik: https://www.patreon.com/posts/114924172
- Authentik: https://www.patreon.com/posts/100779796

________________

💜 Support me and become a Fan!
→ https://christianlempa.de/patreon
→ https://www.youtube.com/channel/UCZNhwA1B5YqiY1nLzmM0ZRg/join

💬 Join our Community!
→ https://christianlempa.de/discord

👉 Follow me everywhere
→ https://christianlempa.de

________________

Read my Tech Documentation
https://christianlempa.de/docs

My Gear and Equipment
https://christianlempa.de/kit

________________

Timestamps:

00:00 Introduction
02:38 Hardware Requirements
06:58 Software Planning
08:35 Problems with Proxmox…
10:55 Installing a new LXC Container
14:44 Install Ollama on Linux
17:41 Ollama basics
19:56 Install Open WebUI
27:37 Open WebUI basics
31:05 Using AI models
35:01 Web Searching
37:18 Why I still don’t trust AI

________________
Links may include affiliate links.
