DriftWatch
DriftWatch is a web app (with optional Slack alerts) that monitors machine-learning models in production for data drift, concept drift, and silent performance decay. You connect it to your prediction logs and, optionally, ground-truth labels; it automatically profiles feature distributions, detects anomalies, and flags segments where the model is failing (e.g., new customer cohorts, geographies, or devices).

It generates plain-English incident reports, suggests likely root causes (feature pipeline changes, upstream schema shifts, seasonality), and recommends concrete actions: retrain triggers, threshold adjustments, or rollback.

It is not a full MLOps platform; that market is crowded and expensive. It is a focused "smoke alarm" for teams that already ship models but don't have time to build robust monitoring. Pricing is usage-based per model and event volume, targeting small-to-mid-sized teams that need reliability without enterprise overhead.
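To make the "smoke alarm" idea concrete, here is a minimal sketch of the kind of per-feature check such a monitor might run: a Population Stability Index (PSI) comparing a baseline window of a feature against a recent production window. This is purely illustrative and not DriftWatch's actual implementation; the function name, binning scheme, and thresholds are assumptions based on common industry practice.

```python
import math
from typing import List

def psi(baseline: List[float], current: List[float], bins: int = 10) -> float:
    """Population Stability Index between two samples of one feature.

    Both samples are histogrammed over their combined range; PSI sums
    (p - q) * ln(p / q) over the bins. Higher values mean more drift.
    """
    lo = min(min(baseline), min(current))
    hi = max(max(baseline), max(current))
    width = (hi - lo) / bins or 1.0  # guard against a zero-width range

    def hist(sample: List[float]) -> List[float]:
        counts = [0] * bins
        for x in sample:
            i = min(int((x - lo) / width), bins - 1)  # clamp max into last bin
            counts[i] += 1
        n = len(sample)
        eps = 1e-6  # floor empty bins to avoid log(0)
        return [max(c / n, eps) for c in counts]

    p, q = hist(baseline), hist(current)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

# Common rule of thumb: PSI < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 drift.
baseline = [0.1 * i for i in range(100)]        # uniform-ish baseline window
shifted = [0.1 * i + 4.0 for i in range(100)]   # same shape, shifted mean
assert psi(baseline, baseline) < 0.1            # identical windows: stable
assert psi(baseline, shifted) > 0.25            # shifted window: flagged
```

A production monitor would run a check like this per feature per time window and open an incident when the score crosses a threshold; the rule-of-thumb cutoffs above are a widely used convention, not a product guarantee.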