Drift Detection: An Introduction with Seldon

Deployed machine learning models can fail spectacularly in response to seemingly benign changes to the underlying process being modelled. Concerningly, when labels are not available, as is often the case in deployment settings, this failure can occur silently and go unnoticed, posing great risk to an organisation. Drift detection is the discipline focused on detecting such changes, and awareness of its importance is growing among machine learning practitioners and researchers.

In this video, Seldon researcher Oliver Cobb provides an introduction to the subject, explaining how drift can occur, why it pays to detect it, and how it can be detected in a principled manner. Of particular focus are the practicalities and challenges of detecting drift as quickly as possible in deployment settings where high-dimensional, unlabelled data arrives continuously.

Those interested in adding drift detection functionality to their own projects are encouraged to check out our open-source Python library alibi-detect.

Want to find out more about how Seldon can help your organisation? Trial Seldon Deploy for free: https://go.seldon.io/deploytrial

You can also check out the alibi-detect GitHub page here: https://github.com/SeldonIO/alibi-detect

0:00 - Introduction
1:12 - Preliminaries
3:37 - What is drift?
5:35 - Types of drift
9:52 - Change or chance?
13:35 - Online drift detection
17:08 - Windowing strategies
27:17 - Relationship to outlier detection
28:30 - Specifying test statistics
33:32 - Anatomy of a drift detector
34:30 - End-to-end example
36:00 - Alibi Detect summary
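
As a rough illustration of the workflow described above (fit a detector on reference data, then test incoming batches for drift), here is a minimal sketch using alibi-detect's KSDrift detector. The reference and test arrays are synthetic stand-ins, and the 0.05 significance level is just an illustrative choice, not a recommendation.

import numpy as np
from alibi_detect.cd import KSDrift

# Reference data drawn from the process the model was trained on
# (synthetic stand-in for illustration).
x_ref = np.random.randn(1000, 10)

# Feature-wise Kolmogorov-Smirnov drift detector; p_val is the
# significance level applied before multivariate correction.
detector = KSDrift(x_ref, p_val=0.05)

# A new batch arriving in deployment, deliberately shifted here
# so that drift should be flagged.
x_new = np.random.randn(200, 10) + 1.0

preds = detector.predict(x_new)
print(preds['data']['is_drift'])  # 1 if drift is detected, 0 otherwise
print(preds['data']['p_val'])     # per-feature p-values from the KS tests

The library also includes online detectors that process instances one at a time using windowing strategies, which is the setting covered in the "Online drift detection" and "Windowing strategies" chapters of the video.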
