MLOps Conference

On DataOps, productionizing ML models, and running experiments at scale.

Milecia McGregor

@milecia

Tuning Hyperparameters with DVC Experiments

Submitted Jun 14, 2021

When you start exploring multiple model architectures with different hyperparameter values, you need a way to iterate quickly. There are many ways to handle this, but all of them take time, and you may not be able to return to a particular point to resume or restart training.

In this talk, you will learn how you can use the open-source tool DVC to compare training metrics using two methods for tuning hyperparameters: grid search and random search. You’ll learn how to save and track the changes in your data, code, and metrics without adding a lot of commits to your Git history. This approach will scale with your data and projects and ensure that your team can reproduce results easily.
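As a rough illustration of the workflow described above (not code from the talk itself), the Python sketch below queues DVC experiments for a grid search and a random search over two hyperparameters and then compares the resulting metrics. The parameter names (train.lr, train.n_est) and value ranges are hypothetical stand-ins for whatever your project's params.yaml defines; the DVC commands themselves (dvc exp run --queue, --set-param, dvc exp show) are part of the standard DVC experiments CLI.

import itertools
import random
import subprocess

def queue_experiment(lr, n_est):
    # Queue one experiment with the given hyperparameters; nothing is
    # trained until the queue is executed below.
    subprocess.run(
        ["dvc", "exp", "run", "--queue",
         "--set-param", f"train.lr={lr}",         # hypothetical param name
         "--set-param", f"train.n_est={n_est}"],  # hypothetical param name
        check=True,
    )

# Grid search: try every combination of a small set of candidate values.
for lr, n_est in itertools.product([0.001, 0.01, 0.1], [50, 100, 200]):
    queue_experiment(lr, n_est)

# Random search: sample a fixed number of configurations from wider ranges.
for _ in range(5):
    queue_experiment(round(random.uniform(0.0001, 0.1), 5),
                     random.randint(50, 300))

# Train all queued experiments, then compare their metrics side by side.
subprocess.run(["dvc", "exp", "run", "--run-all"], check=True)
subprocess.run(["dvc", "exp", "show"], check=True)

Because each run is tracked as a DVC experiment rather than a separate Git commit, you can review the metrics table from dvc exp show and promote only the configurations worth keeping into your Git history.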
