Welcome to Module 4!
This module introduces powerful techniques for model calibration and validation. We will explore ModelSkill, a dedicated Python package for comparing MIKE+ model outputs against observed data.
You’ll see how ModelSkill builds upon MIKE IO, leveraging your existing skills in data handling.
Throughout this module, you will learn to:
- Prepare Observation and ModelResult objects.
- Match observational data with model results.
- Visualize model performance using standard validation plots.
- Quantify model accuracy using statistical skill scores.
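ModelSkill computes skill scores for you, but it helps to know what the numbers mean before reading them off a report. Below is a minimal sketch of two of the most common metrics, bias and RMSE, computed by hand with plain NumPy on made-up values; this is illustrative only, not the ModelSkill API:

```python
import numpy as np

# Hypothetical observed and modelled water levels [m] (made-up sample data)
obs = np.array([1.2, 1.5, 1.1, 1.8, 1.4])
mod = np.array([1.3, 1.4, 1.2, 1.7, 1.6])

# Bias: mean error; positive means the model tends to overestimate
bias = np.mean(mod - obs)

# RMSE: root-mean-square error; penalizes large deviations more heavily
rmse = np.sqrt(np.mean((mod - obs) ** 2))

print(f"bias = {bias:.3f} m, rmse = {rmse:.3f} m")
```

A near-zero bias with a large RMSE is a common pattern: the model is right on average but wrong at individual time steps, which is exactly the kind of behavior the validation plots in this module help you diagnose.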
This module culminates in a practical homework assignment where you’ll apply these skills to validate a sample MIKE+ model.
Let’s dive in!
Where can I download sample data to follow along?
All of the sample data used in this module is available for download: