Building a fully local research assistant from scratch with Ollama

LangChain · 52,752 views · 3 months ago

Research and summarization is one of the most popular agent use cases identified in our State of AI Agents survey, and we've seen high demand for lightweight research assistants that run with local LLMs for personal use. Here, we show an assistant that uses Ollama for local model access and LangGraph for agent orchestration. In the video, we build it from scratch, show how to use it with LangGraph Studio, and configure it for different LLMs.
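To make the orchestration concrete, here is a minimal sketch of the kind of iterative research loop such an assistant runs: generate a search query, fetch results, fold them into a running summary, then reflect and repeat up to a configurable number of loops. The function names, stubs, and loop shape below are illustrative assumptions, not the actual research-rabbit code; in the real assistant the stubbed steps are local LLM calls via Ollama and the loop is expressed as a LangGraph graph (see the linked repo).

```python
from dataclasses import dataclass, field

@dataclass
class ResearchState:
    """Running state threaded through the research loop (hypothetical schema)."""
    topic: str
    summary: str = ""
    loops: int = 0
    queries: list = field(default_factory=list)

def generate_query(state):
    # Stub: in the real assistant, a local LLM proposes the next search query,
    # informed by the topic and the summary so far.
    return f"{state.topic} (follow-up {state.loops + 1})"

def web_search(query):
    # Stub: in the real assistant, this calls a web-search API.
    return f"results for: {query}"

def summarize(state, results):
    # Stub: in the real assistant, a local LLM merges new results
    # into the running summary.
    return (state.summary + " " + results).strip()

def research(topic, max_loops=3):
    state = ResearchState(topic=topic)
    # Reflection loop with a configurable cap on iterations.
    while state.loops < max_loops:
        query = generate_query(state)
        state.queries.append(query)
        state.summary = summarize(state, web_search(query))
        state.loops += 1
    return state

state = research("local LLM agents", max_loops=2)
```

Swapping the stubs for real LLM and search calls is the only change needed to turn this control flow into a working assistant; keeping the state in one dataclass mirrors how LangGraph threads a shared state object between graph nodes.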

Code:
https://github.com/langchain-ai/research-rabbit

Video notes:
https://mirror-feeling-d80.notion.site/Fully-Local-Research-Summarization-157808527b1780bfa5e5f4650630ae08?pvs=73
