View skill metrics and skill scores for selected models. Go to Validation for additional details.
Choose the system of measurement you want to use. Your preference is saved between sessions.
SI - International System of Units (the metric system).
US - United States customary units, used in the United States and most U.S. territories.

Display skill metrics or skill scores for the selected region.
Limit your view to a specific area by selecting from a list of shapefiles that may be associated with your account. See Upload Shapefiles and Locations for additional detail.
Display skill metrics or skill scores for the selected variable.
Salient offers a suite of skill metrics to assess forecast error. These metrics cover full probabilistic distributions (CRPS), categorical forecasts (RPS), and simple deterministic forecast values (MAE).
Continuous Ranked Probability Score (CRPS): CRPS is a scoring rule used to assess the accuracy of probabilistic forecasts of continuous variables. It measures the difference between the cumulative distribution function of the forecast and the cumulative distribution function of the observed outcome, rewarding forecasts that are both accurate (close to the observed value) and sharp (a narrow predicted range), as shown in the figure below. A lower CRPS indicates better calibration and accuracy of the probabilistic forecasts.
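For ensemble forecasts, CRPS is often estimated with the standard empirical formula CRPS = E|X − y| − ½·E|X − X′|, where X, X′ are ensemble members and y is the observation. A minimal sketch (the function name and sample values are illustrative, not Salient's implementation):

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against a scalar observation.

    CRPS = E|X - y| - 0.5 * E|X - X'|: the first term rewards accuracy,
    the second corrects for ensemble spread. Lower is better; a perfect,
    fully certain forecast scores 0.
    """
    n = len(members)
    accuracy = sum(abs(x - obs) for x in members) / n
    spread = sum(abs(a - b) for a in members for b in members) / (n * n)
    return accuracy - 0.5 * spread

# A sharp, accurate forecast scores lower (better) than a diffuse one:
sharp = crps_ensemble([19.5, 20.0, 20.5], obs=20.0)
wide = crps_ensemble([15.0, 20.0, 25.0], obs=20.0)
```

This illustrates why CRPS rewards both accuracy and sharpness: both ensembles are centered on the observation, but the narrower one receives the lower score.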
Ranked Probability Score (RPS): RPS is a scoring rule used to assess the quality of categorical probabilistic forecasts. It measures how well the predicted probabilities for each category match the observed outcome, taking the ordering of the categories into account so that probability placed near the observed category is penalized less than probability placed far from it. A lower RPS indicates better calibration and accuracy of the probabilistic forecasts.
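RPS is conventionally computed as the sum of squared differences between the cumulative forecast probabilities and the cumulative observed (0/1) outcome across the ordered categories. A minimal sketch, using an illustrative tercile (below/near/above normal) forecast:

```python
def rps(forecast_probs, observed_category):
    """Ranked Probability Score for an ordered categorical forecast.

    Sums squared differences between cumulative forecast probabilities
    and the cumulative observed outcome. Lower is better; 0 is perfect.
    """
    cum_forecast = 0.0
    cum_observed = 0.0
    score = 0.0
    for category, prob in enumerate(forecast_probs):
        cum_forecast += prob
        cum_observed += 1.0 if category == observed_category else 0.0
        score += (cum_forecast - cum_observed) ** 2
    return score

# Observed outcome is the "above normal" tercile (category 2).
# Placing more probability on the observed category scores lower (better):
good = rps([0.2, 0.3, 0.5], observed_category=2)
poor = rps([0.5, 0.3, 0.2], observed_category=2)
```

Because the score is built from cumulative probabilities, a forecast that misses by one category is penalized less than one that misses by two, which is what "takes the ranking into account" means in practice.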
Mean Absolute Error (MAE): MAE answers the question, “What is the average magnitude of the forecast errors?”
Range: 0 to ∞. Perfect score: 0.
Characteristics: Simple, familiar. Does not indicate the direction of the deviations.
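MAE is simply the average of the absolute differences between forecast and observed values. A minimal sketch with illustrative sample values:

```python
def mae(forecasts, observations):
    """Mean Absolute Error of a deterministic forecast.

    Averages |forecast - observed| over all pairs. Because absolute
    values are used, the score reports magnitude only, not whether
    forecasts ran high or low.
    """
    return sum(abs(f, ) if False else abs(f - o)
               for f, o in zip(forecasts, observations)) / len(forecasts)

# Errors of +1, -1, and 0 degrees average to an MAE of 2/3:
error = mae([21.0, 19.0, 20.0], [20.0, 20.0, 20.0])
```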
Display skill metrics for the selected model.
Select a benchmark model to display a skill score.
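A common convention for a skill score (the source does not state Salient's exact formula, so treat this as an assumption) is the fractional improvement of the model's metric over the benchmark's:

```python
def skill_score(model_score, benchmark_score):
    """Skill score relative to a benchmark, for negatively oriented
    metrics (lower is better) such as CRPS, RPS, or MAE.

    NOTE: illustrative convention, not necessarily Salient's formula.
    1.0 = perfect skill; 0.0 = no improvement over the benchmark;
    negative = worse than the benchmark.
    """
    return 1.0 - model_score / benchmark_score

# A model CRPS of 0.8 against a benchmark CRPS of 1.0
# is a 20% improvement, i.e. a skill score of 0.2:
score = skill_score(0.8, 1.0)
```

Under this convention, the skill score is dimensionless, so it can be compared across variables and regions even when the underlying metrics have different units.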