LLM Chat App in Python w/ Ollama-py and Streamlit

Decoder · 10,859 views · 1 year ago

In this video I walk through the new Ollama Python library and use it to build a chat app with a UI powered by Streamlit. After reviewing some important methods from the library, I touch on Python generators as we construct our chat app, step by step.

Check out my other Ollama videos: https://youtube.com/playlist?list=PL4041kTesIWby5zznE5UySIsGPrGuEqdB

Links:
Code from video - https://decoder.sh/videos/llm-chat-app-in-python-w_-ollama_py-and-streamlit
Ollama-py - https://github.com/ollama/ollama-python
Streamlit - https://streamlit.io/
My website - https://decoder.sh

Timestamps:
00:00 - Intro
00:26 - Why not use the CLI?
01:17 - Looking at the ollama-py library
02:26 - Setting up Python environment
04:05 - Reviewing Ollama functions
04:14 - list()
04:52 - show()
05:44 - chat()
06:55 - Looking at Streamlit
07:59 - Start writing our app
08:51 - App: user input
11:16 - App: message history
13:09 - App: adding ollama response
15:00 - App: choosing a model
17:07 - Introducing generators
18:52 - App: streaming responses
21:22 - App: review
22:10 - Where to find the code
22:27 - Thank you for 2k
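For reference, here is a minimal sketch of the kind of app the video builds: model selection via ollama.list(), message history in st.session_state, and a generator that streams ollama.chat() chunks into st.write_stream. It is not the code from the video (that is linked above), and the exact shape of the list() response varies by ollama-python version; this sketch assumes the early dict-based API where each model entry has a "name" key.

import ollama
import streamlit as st

st.title("Chat with Ollama")

# Offer the locally installed models; assumes ollama.list() returns
# {"models": [{"name": ...}, ...]} (newer library versions use "model").
models = [m["name"] for m in ollama.list()["models"]]
model = st.selectbox("Choose a model", models)

# Keep the conversation across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state["messages"] = []

# Replay the history so far.
for msg in st.session_state["messages"]:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

def stream_response(model, messages):
    """Generator yielding text chunks as Ollama streams the reply."""
    for chunk in ollama.chat(model=model, messages=messages, stream=True):
        yield chunk["message"]["content"]

if prompt := st.chat_input("Ask something..."):
    # Record and show the user's message.
    st.session_state["messages"].append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Stream the assistant's reply token by token, then save the full text.
    with st.chat_message("assistant"):
        reply = st.write_stream(stream_response(model, st.session_state["messages"]))
    st.session_state["messages"].append({"role": "assistant", "content": reply})

Save this as app.py and run it with `streamlit run app.py` (with the Ollama server running and at least one model pulled).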
