Function Calling with LLM using LangChain + Ollama

Yankee Maharjan · 4,341 · 5 months ago

Function calling in Large Language Models is essential for giving a model access to real-time data, letting it take actions on your behalf, or having it perform computations. Learn how to use function calling in your LLM application with LangChain, Ollama, and Streamlit!

---

🔨 Tools:
- LangChain: https://python.langchain.com/docs/introduction/
- Ollama: https://ollama.dev/
- Streamlit: https://docs.streamlit.io/

---

Code Example: https://github.com/yankeexe/llm-function-calling-demo
Ollama Blog (Tool/Function Calling): https://ollama.com/blog/tool-support

---

⚡️ Follow me:
- Github: https://github.com/yankeexe
- LinkedIn: https://www.linkedin.com/in/yankeemaharjan
- Twitter (X): https://x.com/yankexe
- Website: https://yankee.dev

---

🎞️ Chapters
0:00 Intro
1:33 Project setup
2:05 Coding
14:55 Internal: LLM Output with Tool
15:43 Internal: Function Context - JSON Schema
17:30 Outro
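For a quick idea of what the video covers, here is a minimal sketch of tool/function calling with LangChain and Ollama. The model name, the `get_stock_price` tool, and the hard-coded prices are illustrative assumptions, not the exact code from the demo; see the GitHub repo above for the full project.

```python
# Minimal sketch: tool calling with LangChain + Ollama.
# Assumes Ollama is running locally and a tool-capable model has been pulled
# (e.g. `ollama pull llama3.1`). Tool name and data are placeholders.
from langchain_core.tools import tool
from langchain_ollama import ChatOllama


@tool
def get_stock_price(symbol: str) -> str:
    """Return the latest price for a stock ticker symbol."""
    # Placeholder lookup; a real tool would call a market-data API here.
    prices = {"AAPL": "227.50", "NVDA": "135.10"}
    return prices.get(symbol.upper(), "unknown")


llm = ChatOllama(model="llama3.1")                 # any Ollama model with tool support
llm_with_tools = llm.bind_tools([get_stock_price])  # exposes the tool's JSON schema to the model

# 1. The model decides whether a tool is needed and emits a structured tool call.
ai_msg = llm_with_tools.invoke("What is the current price of AAPL?")

# 2. Execute each requested tool call and print the result.
for call in ai_msg.tool_calls:
    if call["name"] == "get_stock_price":
        print(get_stock_price.invoke(call["args"]))
```

Note that `bind_tools` is what hands the function's JSON schema to the model (the "Function Context" chapter); the model only proposes the call, and your code is still responsible for executing it.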
