Intro and demo to LLM models and ollama on AKS

Houssem Dellai · 1,260 · 8 months ago

Simply put, this is how you can deploy LLM/SLM models such as Mistral, Microsoft Phi, and Llama on your preferred environment, whether that is your local machine, a Docker container, or Kubernetes (AKS). The demo uses Ollama as the platform for deploying and managing the models.

Disclaimer: This is part of my Udemy course: https://www.udemy.com/instructor/course/5139294
Follow me on Twitter for more content: https://twitter.com/houssemdellai
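
For a quick sense of what such a deployment looks like from the client side, here is a minimal sketch (not taken from the video) that sends a prompt to an Ollama server over its HTTP API. It assumes a model such as mistral has already been pulled (e.g. with `ollama pull mistral`) and that the server is reachable on the default port 11434, whether it runs locally, in a Docker container, or behind a Kubernetes Service on AKS:

```python
# Minimal sketch: query an Ollama server's /api/generate endpoint.
# Assumes the server is running and the model has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # adjust for a Docker port mapping or a k8s Service
MODEL = "mistral"                      # any pulled model, e.g. "phi3" or "llama3"

def generate(prompt: str) -> str:
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Explain Kubernetes in one sentence."))
```

The same call works unchanged on AKS once the Ollama pod is exposed through a Service; only OLLAMA_URL needs to point at that Service instead of localhost.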
