Champion/Challenger Model Comparison

This section describes the Champion/Challenger feature in ModelOp Center and how to use it to determine which model, method, or approach to promote to production.

Introduction

Data Science requires experimentation using a variety of methods, approaches, frameworks, and potentially even coding languages. Champion/Challenger is an industry-standard technique for comparing the performance of competing model strategies, or the performance of a recently updated model against a previous version of the same model. The Champion/Challenger feature in ModelOp Center provides a side-by-side comparison of the performance of different models to help determine which one to promote to production.
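
Conceptually, a Champion/Challenger comparison computes the same evaluation metrics for each competing model against the same data. The following minimal sketch illustrates the idea outside of ModelOp Center; all names, data, and metric choices here are hypothetical and only for illustration.

```python
# Illustrative champion/challenger comparison: evaluate both models'
# outputs on the same holdout data with the same metrics.
from sklearn.metrics import f1_score, roc_auc_score

# Hypothetical holdout labels and the two models' predicted probabilities.
y_true = [0, 1, 1, 0, 1, 0, 1, 1]
champion_scores = [0.2, 0.7, 0.6, 0.4, 0.8, 0.3, 0.5, 0.9]
challenger_scores = [0.1, 0.8, 0.7, 0.2, 0.9, 0.2, 0.6, 0.8]

def evaluate(labels, scores, threshold=0.5):
    """Compute the evaluation metrics agreed on for this model."""
    predictions = [1 if s >= threshold else 0 for s in scores]
    return {
        "f1_score": f1_score(labels, predictions),
        "auc": roc_auc_score(labels, scores),
    }

# Print the two models' metrics side by side.
for name, scores in [("champion", champion_scores),
                     ("challenger", challenger_scores)]:
    print(name, evaluate(y_true, scores))
```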

The following phases lay the groundwork for a Champion/Challenger comparison.

  1. Define the evaluation metrics for the model (a sketch of a metrics function appears after this list). See Model Efficacy Metrics and Monitoring.

  2. Build evaluation tests - this can be done through an automated MLC Process or manually:

    1. Automated: build an MLC Process to automatically execute metrics tests for a particular version of a model. See ModelOp Life Cycle Manager: Automation for more information.

    2. Manual: Run Batch Metrics Jobs from the Command Center or the CLI. See Model Batch Jobs and Tests and ModelOp CLI Reference.

  3. Conduct a side-by-side comparison of the test results in the Champion/Challenger page of the Command Center.
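
For step 1, the model's evaluation metrics are exposed through a metrics function in the model's source code. The sketch below is a minimal example, assuming ModelOp's smart-comment convention (# modelop.metrics) and hypothetical label, prediction, and score column names; see Model Efficacy Metrics and Monitoring for the authoritative contract.

```python
import pandas as pd
from sklearn.metrics import f1_score, roc_auc_score

# modelop.metrics
def metrics(df: pd.DataFrame):
    # 'label', 'prediction', and 'score' are hypothetical column names
    # assumed to be present in the test data the metrics job receives.
    yield {
        "f1_score": f1_score(df["label"], df["prediction"]),
        "auc": roc_auc_score(df["label"], df["score"]),
    }
```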

Champion/Challenger Comparison

Once metrics results have been generated (steps 1 and 2 above), use the following steps to activate the Champion/Challenger feature in ModelOp Center and compare test results side by side.

  1. In the Command Center, navigate to Models > Champion/Challenger.

  2. Choose a model from the list of stored models. The tests associated with that model appear.

  3. Select the specific test result that you want to use in the comparison (there may be more than one), and click Next.

  4. The results of the test are displayed.

  5. Click Submit to add this test result to your comparison. The selected test result is visualized on the screen.

  6. Click Add Test Result in the upper-right corner to add another test result to the comparison.

  7. Repeat steps 2-5 for each model / test result that you want to include in the side-by-side comparison. The Champion/Challenger page then displays all of the selected test results next to each other.