This section describes the Champion/Challenger feature in ModelOp Center, and how to use it to help determine the best model method or approach to promote to production.

Introduction

Data science requires experimentation with a variety of methods, approaches, frameworks, and potentially even coding languages. Champion/Challenger is an industry-standard technique for comparing the performance of competing model strategies, or the performance of a recently updated model against a previous version of the same model. While most data scientists conduct their experimentation in their model development environment, data scientists, as well as managerial and governance reviewers, often want to verify that a candidate model performs better than any currently running version of the model. The Champion/Challenger feature in ModelOp Center presents a side-by-side comparison of the performance of different models, or of different versions of the same model, to help users determine which one is best suited for promotion to production.

The following phases lay the groundwork for doing a Champion/Challenger comparison.

  1. Define the evaluation metrics for the model. See Model Efficacy Metrics and Monitoring: Overview.

  2. Automate evaluation tests. This is done either by manually running metrics jobs (see Running a Metrics Job Manually) or through an automated MLC Process; you can build an MLC Process to automatically execute metrics tests for a particular version of a model. See Model Lifecycle Management Overview for more information. A minimal sketch of a metrics function that such a job could run appears after this list.

  3. Conduct a side-by-side comparison of the test results in the Champion/Challenger page of the Command Center (see next section).
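
As a concrete illustration of steps 1 and 2, the following is a minimal sketch of a Python metrics function that a metrics job could execute. It assumes the ModelOp Center convention of tagging the metrics function with a "# modelop.metrics" smart comment; the column names (label_value, score) and the chosen metrics are hypothetical placeholders that should be adapted to your model's labeled, scored data.

  # sample_metrics.py - minimal sketch of an evaluation metrics function.
  # The "# modelop.metrics" smart comment and the yielded dictionary follow
  # the ModelOp Center metrics-function convention; column names below are
  # hypothetical and should match your own labeled, scored data.

  import pandas as pd
  from sklearn.metrics import accuracy_score, f1_score, roc_auc_score


  # modelop.metrics
  def metrics(df: pd.DataFrame):
      """Compute evaluation metrics on a DataFrame of scored, labeled records."""
      actuals = df["label_value"]                # ground-truth labels
      scores = df["score"]                       # model output probabilities
      predictions = (scores > 0.5).astype(int)   # thresholded class predictions

      # The yielded dictionary becomes the test result that the
      # Champion/Challenger page compares across models.
      yield {
          "AUC": roc_auc_score(actuals, scores),
          "Accuracy": accuracy_score(actuals, predictions),
          "F1": f1_score(actuals, predictions),
      }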

Champion/Challenger Comparison

Once metrics results have been generated (see steps 1 and 2 above), use the following steps in the Champion/Challenger feature of ModelOp Center to compare test results side-by-side.

  1. In the Command Center, navigate to Models.

  2. Choose the (two or more) models you would like to compare.

  3. Select the test results for each of the models.

  4. View the metrics side-by-side to decide which model is performing better. An illustrative sketch of what such a comparison looks like follows this list.
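
The sketch below is purely illustrative (it is not a ModelOp Center API call): it shows the kind of side-by-side view the Champion/Challenger page provides, using hypothetical placeholder values for a champion and a challenger.

  # compare_results.py - illustrative sketch of a side-by-side comparison of
  # champion vs. challenger test results. Values are hypothetical placeholders;
  # in practice these dictionaries come from the metrics jobs run in steps 1-2.

  import pandas as pd

  champion_results = {"AUC": 0.81, "Accuracy": 0.78, "F1": 0.74}     # current production model
  challenger_results = {"AUC": 0.85, "Accuracy": 0.80, "F1": 0.77}   # candidate model / new version

  # One row per metric, one column per model, plus the challenger's improvement.
  comparison = pd.DataFrame({"Champion": champion_results,
                             "Challenger": challenger_results})
  comparison["Delta"] = comparison["Challenger"] - comparison["Champion"]

  print(comparison)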

Next Article: Model Monitoring: Overview >