
Augur: A Step Towards Realistic Drift Detection in Production ML Systems

April 2022 White Paper
Sebastián Echeverría, Lena Pons, Jeff Chrabaszcz (Govini)


Publisher:

Software Engineering Institute

Abstract

The inference quality of deployed machine learning (ML) models degrades over time due to differences between training and production data, typically referred to as drift. While large organizations rely on periodic retraining to mitigate drift, the reality is that not all organizations have the data and the resources required to do so. We propose a process for drift behavior analysis at model development time that determines the set of metrics and thresholds to monitor for runtime drift detection. Better understanding of how models will react to drift before they are deployed, combined with a mechanism for detecting this drift in production, is an important aspect of Responsible AI. The toolset and experiments reported in this paper provide an initial demonstration of (1) drift behavior analysis as a part of the model development process, (2) metrics and thresholds that need to be monitored for drift detection in production, and (3) libraries for drift detection that can be embedded in production monitoring infrastructures.
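To make the idea of a monitored metric and threshold concrete, the sketch below implements one common drift metric, the Population Stability Index (PSI), comparing a feature's training distribution against production samples. This is an illustrative example only, not the Augur toolset's implementation; the `psi` function, the bin count, and the 0.2 threshold are assumptions drawn from common practice, not from this paper.

```python
import math
import random

def psi(expected, actual, bins=10):
    """Population Stability Index between two samples of one feature.

    Bins are derived from the range of the expected (training) sample;
    production values outside that range are clipped into the edge bins.
    """
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0
    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            i = min(max(int((x - lo) / width), 0), bins - 1)
            counts[i] += 1
        # Floor each proportion to avoid log(0) when a bin is empty.
        return [max(c / len(sample), 1e-6) for c in counts]
    p, q = proportions(expected), proportions(actual)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

random.seed(0)
train = [random.gauss(0.0, 1.0) for _ in range(5000)]       # training data
prod_ok = [random.gauss(0.0, 1.0) for _ in range(5000)]     # same distribution
prod_drift = [random.gauss(1.5, 1.0) for _ in range(5000)]  # shifted mean: drift

THRESHOLD = 0.2  # rule-of-thumb PSI threshold for flagging drift (assumed)
print(psi(train, prod_ok) > THRESHOLD)     # expect False: no drift flagged
print(psi(train, prod_drift) > THRESHOLD)  # expect True: drift flagged
```

In the process the paper proposes, a threshold like the one above would not be a fixed rule of thumb but would be calibrated per model during development-time drift behavior analysis, then exported for a production monitor to evaluate on incoming data.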