ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 0 months ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |
| Property | Value |
|---|---|
| URL | https://catboost.ai/docs/en/concepts/loss-functions |
| Last Crawled | 2026-04-08 18:41:32 (1 day ago) |
| First Indexed | 2024-11-18 16:12:15 (1 year ago) |
| HTTP Status Code | 200 |
| Meta Title | Objectives and metrics \| CatBoost |
| Meta Description | This section contains basic information regarding the supported metrics for various machine learning problems. Regression. Multiregression. Classification. |
| Meta Canonical | null |
| Boilerpipe Text | This section contains basic information regarding the supported metrics for various machine learning problems.
Regression
Multiregression
Classification
Multiclassification
Multilabel classification
Ranking
Refer to the
Variables used in formulas
section for the description of commonly used variables in the listed metrics.
Metrics can be calculated during the training or separately from the training for a specified model. The calculated values are written to files and can be plotted by
visualization tools
(both during and after the training) for further analysis.
User-defined parameters
Some metrics provide user-defined parameters. These parameters must be set together with the metric name when it is being specified.
The parameters for each metric are set in the following format:
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
The supported parameters vary from one metric to another and are listed alongside the corresponding descriptions.
Usage examples
Quantile:alpha=0.1
List of most important parameters
The following table contains the description of parameters that are used in several metrics. The default values vary from one metric to another and are listed alongside the corresponding descriptions.
Parameter:
use_weights
Description
Use object/group weights to calculate metrics if the specified value is
true
and set all weights to
1
regardless of the input data if the specified value is
false
.
Note
This parameter cannot be used with the optimized objective. If weights are present, they are necessarily used to calculate the optimized objective. This behaviour cannot be disabled.
Parameter:
top
Description
The number of top samples in a group that are used to calculate the ranking metric. Top samples are either the samples with the largest approx values or the ones with the lowest target values if approx values are the same.
Enable, disable and configure metrics calculation
The calculation of metrics can be resource-intensive. It creates a bottleneck in some cases, for example, if many metrics are calculated during the training or the computation is performed on GPU.
The training can be sped up by disabling the calculation of some metrics for the training dataset. Use the
hints=skip_train~true
parameter to disable the calculation of the specified metrics.
Note
The calculation of some metrics is disabled by default for the training dataset to speed up the training. Use the
hints=skip_train~false
parameter to enable the calculation.
Metrics that are not calculated by default for the train dataset
PFound
YetiRank
NDCG
YetiRankPairwise
AUC
NormalizedGini
FilteredDCG
DCG
Usage examples
Enable the calculation of the AUC metric:
AUC:hints=skip_train~false
Disable the calculation of the Logloss metric:
Logloss:hints=skip_train~true
Another way to speed up the training is to set up the frequency of iterations to calculate the values of metrics. Use one of the following parameters:
Command-line version parameters:
--metric-period
Python parameters:
metric_period
R parameters:
metric_period
For example, use the following parameter in Python or R to calculate metrics once per 50 iterations:
metric_period=50 |
| Markdown | # Objectives and metrics
- [User-defined parameters](https://catboost.ai/docs/en/concepts/loss-functions#user-defined-parameters)
- [Enable, disable and configure metrics calculation](https://catboost.ai/docs/en/concepts/loss-functions#enable-disable-configure-metrics)
This section contains basic information regarding the supported metrics for various machine learning problems.
- [Regression](https://catboost.ai/docs/en/concepts/loss-functions-regression)
- [Multiregression](https://catboost.ai/docs/en/concepts/loss-functions-multiregression)
- [Classification](https://catboost.ai/docs/en/concepts/loss-functions-classification)
- [Multiclassification](https://catboost.ai/docs/en/concepts/loss-functions-multiclassification)
- [Multilabel classification](https://catboost.ai/docs/en/concepts/loss-functions-multilabel-classification)
- [Ranking](https://catboost.ai/docs/en/concepts/loss-functions-ranking)
Refer to the [Variables used in formulas](https://catboost.ai/docs/en/concepts/loss-functions-variables-used) section for the description of commonly used variables in the listed metrics.
Metrics can be calculated during the training or separately from the training for a specified model. The calculated values are written to files and can be plotted by [visualization tools](https://catboost.ai/docs/en/features/visualization) (both during and after the training) for further analysis.
## User-defined parameters
Some metrics provide user-defined parameters. These parameters must be set together with the metric name when it is being specified.
The parameters for each metric are set in the following format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
The supported parameters vary from one metric to another and are listed alongside the corresponding descriptions.
#### Usage examples
```
Quantile:alpha=0.1
```
#### List of most important parameters
The following table contains the description of parameters that are used in several metrics. The default values vary from one metric to another and are listed alongside the corresponding descriptions.
**Parameter:** `use_weights`
#### Description
Use object/group weights to calculate metrics if the specified value is "true" and set all weights to "1" regardless of the input data if the specified value is "false".
Note
This parameter cannot be used with the optimized objective. If weights are present, they are necessarily used to calculate the optimized objective. This behaviour cannot be disabled.
**Parameter:** `top`
#### Description
The number of top samples in a group that are used to calculate the ranking metric. Top samples are either the samples with the largest approx values or the ones with the lowest target values if approx values are the same.
## Enable, disable and configure metrics calculation
The calculation of metrics can be resource-intensive. It creates a bottleneck in some cases, for example, if many metrics are calculated during the training or the computation is performed on GPU.
The training can be sped up by disabling the calculation of some metrics for the training dataset. Use the `hints=skip_train~true` parameter to disable the calculation of the specified metrics.
Note
The calculation of some metrics is disabled by default for the training dataset to speed up the training. Use the `hints=skip_train~false` parameter to enable the calculation.
Metrics that are not calculated by default for the train dataset
- PFound
- YetiRank
- NDCG
- YetiRankPairwise
- AUC
- NormalizedGini
- FilteredDCG
- DCG
Usage examples
Enable the calculation of the AUC metric:
```
AUC:hints=skip_train~false
```
Disable the calculation of the Logloss metric:
```
Logloss:hints=skip_train~true
```
Another way to speed up the training is to set up the frequency of iterations to calculate the values of metrics. Use one of the following parameters:
**Command-line version parameters:** `--metric-period`
**Python parameters:** `metric_period`
**R parameters:** `metric_period`
For example, use the following parameter in Python or R to calculate metrics once per 50 iterations:
```
metric_period=50
```
 |
| Readable Markdown | This section contains basic information regarding the supported metrics for various machine learning problems.
- [Regression](https://catboost.ai/docs/en/concepts/loss-functions-regression)
- [Multiregression](https://catboost.ai/docs/en/concepts/loss-functions-multiregression)
- [Classification](https://catboost.ai/docs/en/concepts/loss-functions-classification)
- [Multiclassification](https://catboost.ai/docs/en/concepts/loss-functions-multiclassification)
- [Multilabel classification](https://catboost.ai/docs/en/concepts/loss-functions-multilabel-classification)
- [Ranking](https://catboost.ai/docs/en/concepts/loss-functions-ranking)
Refer to the [Variables used in formulas](https://catboost.ai/docs/en/concepts/loss-functions-variables-used) section for the description of commonly used variables in the listed metrics.
Metrics can be calculated during the training or separately from the training for a specified model. The calculated values are written to files and can be plotted by [visualization tools](https://catboost.ai/docs/en/features/visualization) (both during and after the training) for further analysis.
## User-defined parameters
Some metrics provide user-defined parameters. These parameters must be set together with the metric name when it is being specified.
The parameters for each metric are set in the following format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
The supported parameters vary from one metric to another and are listed alongside the corresponding descriptions.
#### Usage examples
```
Quantile:alpha=0.1
```
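Metric strings in the `<Metric>[:<parameter>=<value>]` format above can be assembled programmatically. A minimal sketch — `format_metric` is a hypothetical helper, not part of the CatBoost API:

```python
# Hypothetical helper that builds a metric string in the documented
# <Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>] format.
def format_metric(name, **params):
    if not params:
        return name
    joined = ";".join(f"{key}={value}" for key, value in params.items())
    return f"{name}:{joined}"

print(format_metric("Quantile", alpha=0.1))  # Quantile:alpha=0.1
```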
#### List of most important parameters
The following table contains the description of parameters that are used in several metrics. The default values vary from one metric to another and are listed alongside the corresponding descriptions.
**Parameter:** `use_weights`
#### Description
Use object/group weights to calculate metrics if the specified value is "true" and set all weights to "1" regardless of the input data if the specified value is "false".
Note
This parameter cannot be used with the optimized objective. If weights are present, they are necessarily used to calculate the optimized objective. This behaviour cannot be disabled.
**Parameter:** `top`
#### Description
The number of top samples in a group that are used to calculate the ranking metric. Top samples are either the samples with the largest approx values or the ones with the lowest target values if approx values are the same.
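Both `use_weights` and `top` attach to the metric name with the same `:` and `;` syntax. A minimal parsing sketch, assuming that syntax — `parse_metric` is illustrative, not CatBoost's own parser:

```python
# Split a metric specification such as "NDCG:top=10;use_weights=false"
# into its name and a dict of string-valued parameters.
def parse_metric(spec):
    name, sep, rest = spec.partition(":")
    params = {}
    if sep:
        for pair in rest.split(";"):
            key, _, value = pair.partition("=")
            params[key] = value
    return name, params

name, params = parse_metric("NDCG:top=10;use_weights=false")
# name == "NDCG", params == {"top": "10", "use_weights": "false"}
```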
## Enable, disable and configure metrics calculation
The calculation of metrics can be resource-intensive. It creates a bottleneck in some cases, for example, if many metrics are calculated during the training or the computation is performed on GPU.
The training can be sped up by disabling the calculation of some metrics for the training dataset. Use the `hints=skip_train~true` parameter to disable the calculation of the specified metrics.
Note
The calculation of some metrics is disabled by default for the training dataset to speed up the training. Use the `hints=skip_train~false` parameter to enable the calculation.
Metrics that are not calculated by default for the train dataset
- PFound
- YetiRank
- NDCG
- YetiRankPairwise
- AUC
- NormalizedGini
- FilteredDCG
- DCG
Usage examples
Enable the calculation of the AUC metric:
```
AUC:hints=skip_train~false
```
Disable the calculation of the Logloss metric:
```
Logloss:hints=skip_train~true
```
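In the Python package, metric strings with hints are passed as ordinary training parameters such as `loss_function` and `custom_metric`. A sketch, assuming those parameter names; the dict is illustrative and would be unpacked into, e.g., `CatBoostClassifier(**params)`:

```python
# Illustrative parameter set: optimize Logloss while also reporting AUC
# on the training dataset (AUC is skipped there by default).
params = {
    "loss_function": "Logloss",
    "custom_metric": ["AUC:hints=skip_train~false"],
}
```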
Another way to speed up the training is to set up the frequency of iterations to calculate the values of metrics. Use one of the following parameters:
**Command-line version parameters:** `--metric-period`
**Python parameters:** `metric_period`
**R parameters:** `metric_period`
For example, use the following parameter in Python or R to calculate metrics once per 50 iterations:
```
metric_period=50
``` |
| Shard | 169 (laksa) |
| Root Hash | 17435841955170310369 |
| Unparsed URL | ai,catboost!/docs/en/concepts/loss-functions s443 |