Today we're joined by Abdul Fatir Ansari, a machine learning scientist at AWS AI Labs in Berlin, to discuss his paper, "Chronos: Learning the Language of Time Series" - https://arxiv.org/abs/2403.07815. Fatir explains the challenges of leveraging pre-trained language models for time series forecasting. We explore the advantages of Chronos over statistical models, as well as its promising results in zero-shot forecasting benchmarks. Finally, we address critiques of Chronos, the ongoing research to improve synthetic data quality, and the potential for integrating Chronos into production systems.

🎧 / 🎥 Listen or watch the full episode on our page: https://twimlai.com/go/685

🔔 Subscribe to our channel for more great content just like this: https://youtube.com/twimlai?sub_confirmation=1

🗣️ CONNECT WITH US!
===============================
Subscribe to the TWIML AI Podcast: https://twimlai.com/podcast/twimlai/
Follow us on Twitter: https://twitter.com/twimlai
Follow us on LinkedIn: https://www.linkedin.com/company/twimlai/
Join our Slack Community: https://twimlai.com/community/
Subscribe to our newsletter: https://twimlai.com/newsletter/
Want to get in touch? Send us a message: https://twimlai.com/contact/

📖 CHAPTERS
===============================
00:00 - Introduction
02:11 - Inspiration for Chronos
04:30 - Overview of statistical models
07:04 - Overfitting
08:17 - LLMs in time series forecasting
10:20 - Tokenization
15:25 - Why T5?
16:35 - Data augmentation
25:28 - Evaluation
27:45 - Results
31:15 - In-domain vs. zero-shot
33:35 - Performance across different patterns
36:25 - Critique of Chronos
40:15 - Chronos in production
41:00 - Future of Chronos
42:00 - Conclusion

🔗 LINKS & RESOURCES
===============================
Large Language Models Are Zero-Shot Time Series Forecasters - https://arxiv.org/pdf/2310.07820
Time-LLM: Time Series Forecasting by Reprogramming Large Language Models - https://arxiv.org/abs/2310.01728
LLM4TS: Aligning Pre-Trained LLMs as Data-Efficient Time-Series Forecasters - https://arxiv.org/pdf/2308.08469
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting - https://arxiv.org/abs/2310.08278
Unified Training of Universal Time Series Forecasting Transformers (Moirai) - https://arxiv.org/abs/2402.02592

📸 Camera: https://amzn.to/3TQ3zsg
🎙️ Microphone: https://amzn.to/3t5zXeV
🚦 Lights: https://amzn.to/3TQlX49
🎛️ Audio Interface: https://amzn.to/3TVFAIq
🎚️ Stream Deck: https://amzn.to/3zzm7F5