Join Hugo Bowne-Anderson and Ferras Hamad (Machine Learning Leader at DoorDash, formerly at Netflix, Meta, and Uber) for a fireside chat exploring the evolving landscape of machine learning and AI systems. Drawing from Ferras’s experience at some of the most innovative tech companies, this conversation dives into the challenges and opportunities of building and scaling ML systems that bridge infrastructure and application layers.

Key Topics of Discussion

- From Infrastructure to Business Value: Insights into how companies like Netflix, Meta, Uber, and DoorDash approach the ML lifecycle, from infrastructure design to application-level outcomes that drive business impact.
- The Convergence of ML Tools: A look at the trend of ML platforms converging to support both advanced users and non-specialists, addressing diverse personas and use cases.
- LLMs and In-Context Learning: How the rise of large language models is reshaping traditional ML systems, from tooling requirements to integration into production environments.
- Team Collaboration in ML Development: The importance of cross-functional relationships between data scientists, engineers, and platform teams to foster innovation and efficiency.
- Skill Sets for the Future: How the blending of roles like ML engineers, data scientists, and software engineers is creating demand for “full-stack” ML professionals.
- Operationalizing ML Across Industries: Lessons on scaling ML operations in sectors from streaming to delivery, with practical advice for companies at every stage of their data journey.

This session is ideal for software engineers, ML practitioners, and technical leaders seeking insights into the rapidly evolving ML and AI landscape. Whether you're tackling infrastructure challenges, deploying models at scale, or just starting with ML, you'll leave with valuable takeaways to guide your work.

Chapters:
00:00 Welcome and Introduction
00:10 Guest Introduction: Ferras Hamad
01:53 Metaflow and Its Impact
06:27 Diverse Applications of ML at DoorDash
08:43 Knowledge Graphs: The Backbone of AI?
13:24 Challenges with LLMs and Their Integration
13:54 Comparing ML Approaches Across Companies
20:59 Generative AI in Production
27:11 The Rise of Open Source in ML
28:55 Challenges with Traditional ML Models
30:40 Future of Feature Stores and Prompt Management in GenAI
32:30 Traditional ML Models vs. LLMs
35:55 Convergence of ML Tools for All Users
38:17 Collaboration Between Teams in ML
41:59 Hiring Data Scientists for Ideas
43:34 Centralized vs. Embedded Data Science Teams
49:05 Skills for Future ML Practitioners
51:21 Advice for Scaling ML Infrastructure
53:14 Future Opportunities and Challenges in ML
55:48 Conclusion and Final Thoughts