Context Is The Next Frontier by Jacob Buckman, CEO of Manifest AI

Democratize Intelligence

Jacob Buckman, co-founder and CEO of Manifest AI, speaking at the Democratize Intelligence Summit on January 24th, 2025 in San Francisco, CA. In his talk, Buckman argues that context length is the next major step in the development of large language models (LLMs): models have grown in size and training data, but the amount of text they can process at once remains limited, leading to inefficiencies.

Buckman frames transformers, today's dominant architecture, as recurrent neural networks (RNNs) whose memory (the KV cache) grows with every token. Increasing context length, he argues, will improve AI capabilities: models can retain more information without frequent resets, the need for fine-tuning falls away, and AI can shift from centralized hyperscalers to decentralized systems. The obstacle is that traditional transformers struggle with long contexts because attention cost scales quadratically with context length, which makes sub-quadratic architectures the key to overcoming this limitation. Buckman presents his team's open-source work on "power attention," a technique for more efficient long-context processing, and predicts that by 2025 all major AI companies will be developing models with sub-quadratic architectures, and that by 2027 transformers will be mostly replaced.

Follow Democratize Intelligence at www.demi.so
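To make the "transformers are RNNs with a growing memory" framing concrete, here is a minimal NumPy sketch, not taken from the talk and not Manifest AI's power attention: it contrasts standard per-token attention over a KV cache that grows with every token (quadratic total cost over a sequence) with a generic linear-attention-style recurrence that keeps a fixed-size state (cost per step independent of context length). All function names and shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def attention_step(q, k_cache, v_cache):
    """One decoding step of standard attention.
    The cache grows by one row per token, so step t costs O(t)
    and a full pass over n tokens costs O(n^2)."""
    scores = k_cache @ q / np.sqrt(q.shape[-1])      # (t,)
    weights = softmax(scores)
    return weights @ v_cache                          # (d,)

def linear_attention_step(q, k, v, state, norm):
    """One step of a linear-attention-style recurrence.
    The state is a fixed-size (d, d) matrix, so every step costs
    O(d^2) no matter how long the context already is."""
    phi = lambda x: np.maximum(x, 0.0) + 1e-6         # simple positive feature map
    state = state + np.outer(phi(k), v)               # accumulate key-value associations
    norm = norm + phi(k)
    out = phi(q) @ state / (phi(q) @ norm)
    return out, state, norm

d, n = 8, 16
rng = np.random.default_rng(0)
q_seq, k_seq, v_seq = (rng.normal(size=(n, d)) for _ in range(3))

# Standard attention: the KV cache grows with every token processed.
k_cache, v_cache = np.zeros((0, d)), np.zeros((0, d))
for t in range(n):
    k_cache = np.vstack([k_cache, k_seq[t]])
    v_cache = np.vstack([v_cache, v_seq[t]])
    out = attention_step(q_seq[t], k_cache, v_cache)

# Recurrent formulation: constant-size state, no growing cache.
state, norm = np.zeros((d, d)), np.zeros(d)
for t in range(n):
    out, state, norm = linear_attention_step(q_seq[t], k_seq[t], v_seq[t], state, norm)

print("KV cache rows after", n, "tokens:", k_cache.shape[0])  # grows with n
print("recurrent state shape:", state.shape)                   # fixed (d, d)
```

The toy recurrence compresses the whole context into a fixed memory budget, which is why context length stops being the bottleneck; how that fixed-size state is constructed so as not to lose information is exactly where approaches such as the team's power attention differ from this simple sketch.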
