ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 0.1 months ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |
| Property | Value |
|---|---|
| URL | https://catboost.ai/docs/en/features/loss-functions-desc |
| Last Crawled | 2026-04-07 02:34:17 (4 days ago) |
| First Indexed | 2024-11-20 02:48:00 (1 year ago) |
| HTTP Status Code | 200 |
| Meta Title | Implemented metrics \| CatBoost |
| Meta Description | CatBoost provides built-in metrics for various machine learning problems. These functions can be used for model optimization or reference purposes. See the Obje |
| Meta Canonical | null |
| Markdown |
# Implemented metrics
## In this article:
- [Python package](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#python-package)
- [Parameters for trained model](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#parameters-for-trained-model)
- [Parameters for trained or applied model](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#parameters-for-trained-or-applied-model)
- [R package](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#r-package)
- [loss\_function](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#loss_function)
- [custom\_loss](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#custom_loss)
- [use-best-model](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#use-best-model1)
- [eval-metric](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#eval-metric1)
- [Command-line version](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#command-line-version)
- [\--loss-function](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#loss-function1)
- [\--custom-metric](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#custom-metric)
- [\--use-best-model](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#use-best-model2)
- [\--eval-metric](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#eval-metric2)
- [\--logging-level](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#logging-level)
CatBoost provides built-in metrics for various machine learning problems. These functions can be used for model optimization or reference purposes. See the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on the calculation principles.
Choose the implementation for more details.
- [python](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#python)
- [r-package](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#r-package)
- [cli](https://catboost.ai/docs/en/features/en/features/loss-functions-desc#command-line-version)
## Python package
The following parameters can be set for the corresponding classes and are used when the model is trained.
### Parameters for trained model
Classes:
- [CatBoost](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboost)
- [CatBoostClassifier](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostclassifier)
- [CatBoostRegressor](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostregressor)
#### loss-function
The [metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
A custom python object can also be set as the value of this parameter (see an [example](https://catboost.ai/docs/en/features/en/concepts/python-usages-examples)).
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
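The specification format above is simple enough to sketch in a few lines of Python. The helper below is purely illustrative (it is not part of the CatBoost API); it splits a specification string into the metric name and its optional parameters:

```python
def parse_metric_spec(spec):
    """Split '<Metric>[:<p1>=<v1>;..;<pN>=<vN>]' into (name, params).

    Illustrative helper, not part of the CatBoost API.
    """
    name, _, tail = spec.partition(":")
    params = {}
    for pair in filter(None, tail.split(";")):
        key, _, value = pair.partition("=")
        params[key] = value
    return name, params

print(parse_metric_spec("Quantile:alpha=0.1"))  # ('Quantile', {'alpha': '0.1'})
print(parse_metric_spec("RMSE"))                # ('RMSE', {})
```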
#### custom\_metric
[Metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
CrossEntropy
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
Quantile:alpha=0.1
```
- Calculate the values of Logloss and AUC
```
['Logloss', 'AUC']
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/features/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
Use the [visualization tools](https://catboost.ai/docs/en/features/en/features/visualization) to see a live chart with the dynamics of the specified metrics.
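As a sketch of what reading those per-iteration values back might look like, the snippet below parses a small tab-separated sample; the sample contents and column layout are assumptions for illustration, not copied from CatBoost output:

```python
import csv
import io

# Hypothetical learn_error.tsv contents after training with
# custom_metric=['Logloss', 'AUC']; the column layout is an assumption.
sample = "iter\tLogloss\tAUC\n0\t0.6931\t0.5000\n1\t0.6402\t0.7321\n"

# Parse the TSV into one dict per iteration, keyed by column name.
rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))
print(rows[1]["AUC"])  # 0.7321
```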
#### use-best-model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `--eval-metric` (`eval_metric`).
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
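The two steps above can be sketched as follows, using made-up per-iteration validation values and assuming a metric where lower is better (e.g. RMSE):

```python
# Per-iteration eval-metric values on the validation dataset (made up).
val_metric = [0.90, 0.71, 0.65, 0.68, 0.70]

# Step 2: find the iteration with the optimal (here: minimal) value,
# then keep only the trees built up to that iteration.
best_iter = min(range(len(val_metric)), key=val_metric.__getitem__)
trees_kept = best_iter + 1
print(trees_kept)  # 3
```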
#### eval-metric
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/eval-metric__supported-metrics)
A user-defined function can also be set as the value (see an [example](https://catboost.ai/docs/en/features/en/concepts/python-usages-examples)).
Examples:
```
R2
```
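Since this metric also drives overfitting detection, a minimal sketch of that use is shown below. CatBoost's actual detectors are more elaborate; this only illustrates the idea of stopping once the eval metric stops improving for a number of iterations:

```python
def overfitting_detector(metric_values, patience=2, greater_is_better=False):
    """Return the iteration at which training would stop, or None.

    Sketch of eval-metric-driven early stopping: stop once the metric
    has not improved for `patience` consecutive iterations.
    (Illustrative only; not CatBoost's actual detector.)
    """
    best, since_improvement = None, 0
    for i, value in enumerate(metric_values):
        improved = best is None or (
            value > best if greater_is_better else value < best
        )
        if improved:
            best, since_improvement = value, 0
        else:
            since_improvement += 1
            if since_improvement >= patience:
                return i
    return None

print(overfitting_detector([0.9, 0.7, 0.72, 0.73, 0.6]))  # 3
```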
The following parameters can be set for the corresponding methods and are used when the model is trained or applied.
### Parameters for trained or applied model
Classes:
- [fit](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboost_fit) ([CatBoost](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboost))
- [fit](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostclassifier_fit) ([CatBoostClassifier](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostclassifier))
- [fit](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostregressor_fit) ([CatBoostRegressor](https://catboost.ai/docs/en/features/en/concepts/python-reference_catboostregressor))
#### use\_best\_model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `--eval-metric` (`eval_metric`).
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
#### verbose
Output the measured evaluation metric to stderr.
#### plot
Plot the following information during training:
- the metric values;
- the custom loss values;
- the loss function change during feature selection;
- the time elapsed since training started;
- the remaining time until the end of training.
This [option can be used](https://catboost.ai/docs/en/features/en/features/visualization_jupyter-notebook) if training is performed in a Jupyter notebook.
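A rough sketch of the kind of per-iteration line such console output produces is below; the layout is loosely modeled on CatBoost's output and is an assumption, not the exact format:

```python
import time

def progress_line(iteration, total, started_at, metric_value):
    """Build one progress line with elapsed and estimated remaining time.

    The layout loosely imitates CatBoost's console output; it is an
    illustrative assumption, not the exact format.
    """
    elapsed = time.monotonic() - started_at
    # Naive remaining-time estimate: average time per finished iteration
    # multiplied by the number of iterations still to run.
    remaining = elapsed / (iteration + 1) * (total - iteration - 1)
    return (f"{iteration}:\tlearn: {metric_value:.4f}"
            f"\ttotal: {elapsed:.2f}s\tremaining: {remaining:.2f}s")

start = time.monotonic()
print(progress_line(0, 100, start, 0.6931))
```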
## R package
The following parameters can be set for the corresponding methods and are used when the model is trained or applied.
Method: [catboost.train](https://catboost.ai/docs/en/features/en/concepts/r-reference_catboost-train)
### loss\_function
**Description**
The [metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
### custom\_loss
**Parameters**
[Metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
c('CrossEntropy')
```
Or simply:
```
'CrossEntropy'
```
- Calculate the values of Logloss and AUC
```
c('Logloss', 'AUC')
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
c('Quantile:alpha=0.1')
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/features/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
### use-best-model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `--eval-metric` (`eval_metric`).
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
### eval-metric
**Parameters**
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/eval-metric__supported-metrics)
```
Quantile:alpha=0.3
```
## Command-line version
The following command keys can be specified for the corresponding commands and are used when the model is trained or applied.
Params for the [catboost fit](https://catboost.ai/docs/en/features/en/references/training-parameters/) command:
### \--loss-function
The [metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
### \--custom-metric
[Metric](https://catboost.ai/docs/en/features/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric 1>[:<parameter 1>=<value>;..;<parameter N>=<value>],<Metric 2>[:<parameter 1>=<value>;..;<parameter N>=<value>],..,<Metric N>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
CrossEntropy
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
Quantile:alpha=0.1
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/features/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
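Note that the `--custom-metric` format above separates metrics with commas while per-metric parameters use semicolons, so splitting a value into individual specifications is straightforward. The helper below is illustrative, not part of the CatBoost CLI:

```python
def split_custom_metrics(spec):
    """Split a comma-separated --custom-metric value into individual
    metric specifications. Commas separate metrics; semicolons separate
    a metric's parameters, so a plain comma split is enough.
    Illustrative helper, not CatBoost code."""
    return [metric for metric in spec.split(",") if metric]

print(split_custom_metrics("Logloss,AUC,Quantile:alpha=0.1"))
# ['Logloss', 'AUC', 'Quantile:alpha=0.1']
```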
### \--use-best-model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `--eval-metric` (`eval_metric`).
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
### \--eval-metric
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/features/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/features/en/references/eval-metric__supported-metrics)
Examples:
```
R2
```
```
Quantile:alpha=0.3
```
### \--logging-level
The logging level to output to stdout.
Possible values:
- Silent — Do not output any logging information to stdout.
- Verbose — Output the following data to stdout:
- optimized metric
- elapsed time of training
- remaining time of training
- Info — Output additional information and the number of trees.
- Debug — Output debugging information.
 |
| Readable Markdown | CatBoost provides built-in metrics for various machine learning problems. These functions can be used for model optimization or reference purposes. See the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on the calculation principles.
Choose the implementation for more details.
- [python](https://catboost.ai/docs/en/features/loss-functions-desc#python)
- [r-package](https://catboost.ai/docs/en/features/loss-functions-desc#r-package)
- [cli](https://catboost.ai/docs/en/features/loss-functions-desc#command-line-version)
## Python package
The following parameters can be set for the corresponding classes and are used when the model is trained.
### Parameters for trained model
Classes:
- [CatBoost](https://catboost.ai/docs/en/concepts/python-reference_catboost)
- [CatBoostClassifier](https://catboost.ai/docs/en/concepts/python-reference_catboostclassifier)
- [CatBoostRegressor](https://catboost.ai/docs/en/concepts/python-reference_catboostregressor)
#### loss-function
The [metric](https://catboost.ai/docs/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
A custom python object can also be set as the value of this parameter (see an [example](https://catboost.ai/docs/en/concepts/python-usages-examples)).
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
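The `<Metric>[:<parameter>=<value>;..]` layout above can also be assembled programmatically. The helper below is hypothetical (it is not part of the CatBoost API) and is shown only to make the string format concrete:

```python
# Hypothetical helper (not part of the CatBoost API) that assembles a
# metric specification string in the documented format:
#   <Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
def metric_spec(name, **params):
    if not params:
        return name
    return name + ":" + ";".join(f"{k}={v}" for k, v in params.items())

print(metric_spec("RMSE"))                 # RMSE
print(metric_spec("Quantile", alpha=0.1))  # Quantile:alpha=0.1
```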
#### custom\_metric
[Metric](https://catboost.ai/docs/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
CrossEntropy
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
Quantile:alpha=0.1
```
- Calculate the values of Logloss and AUC
```
['Logloss', 'AUC']
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
Use the [visualization tools](https://catboost.ai/docs/en/features/visualization) to see a live chart with the dynamics of the specified metrics.
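As a sketch, the metric history written to `learn_error.tsv` can be loaded with the standard library alone. The tab-separated layout assumed here (an `iter` column followed by one column per metric) should be checked against the header of your own file:

```python
import csv
import io

# Inline stand-in for a learn_error.tsv written to --train-dir; the column
# layout (iter, then one column per metric) is an assumption to verify
# against your own file's header.
sample = "iter\tLogloss\tAUC\n0\t0.6931\t0.5000\n1\t0.6543\t0.7123\n"
rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))
logloss = [float(r["Logloss"]) for r in rows]
print(logloss)  # [0.6931, 0.6543]
```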
#### use\_best\_model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `eval_metric`.
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
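The selection step above can be sketched in plain Python. The metric values below are made up for illustration; the point is only that trees built after the best validation iteration are discarded (shown here for a metric that is minimized):

```python
# Made-up per-iteration eval metric values on the validation set.
eval_values = [0.52, 0.47, 0.44, 0.45, 0.46]

# Best iteration for a minimized metric (e.g. RMSE).
best_iter = min(range(len(eval_values)), key=eval_values.__getitem__)

# Trees built after the best iteration are not saved in the final model.
trees_kept = best_iter + 1
print(best_iter, trees_kept)  # 2 3
```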
#### eval\_metric
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/eval-metric__supported-metrics)
A user-defined function can also be set as the value (see an [example](https://catboost.ai/docs/en/concepts/python-usages-examples)).
Examples:
```
R2
```
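A minimal sketch of such a user-defined metric object, assuming the `evaluate` / `is_max_optimal` / `get_final_error` interface described on the usage-examples page (here an unweighted MAE; an instance would be passed as `eval_metric`):

```python
# Sketch of a custom eval metric object; the three-method interface is
# assumed from the CatBoost usage-examples page, and weights are ignored
# in this unweighted illustration.
class MAEMetric:
    def is_max_optimal(self):
        return False  # smaller values are better

    def evaluate(self, approxes, target, weight):
        # approxes holds one indexed container per approx dimension.
        approx = approxes[0]
        error_sum = sum(abs(a - t) for a, t in zip(approx, target))
        return error_sum, float(len(target))

    def get_final_error(self, error, weight):
        return error / (weight + 1e-38)

m = MAEMetric()
err, w = m.evaluate([[1.0, 2.0]], [1.5, 2.5], None)
print(m.get_final_error(err, w))  # 0.5
```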
### Parameters for trained or applied model
The following parameters can be set for the corresponding methods and are used when the model is trained or applied.
Methods:
- [fit](https://catboost.ai/docs/en/concepts/python-reference_catboost_fit) ([CatBoost](https://catboost.ai/docs/en/concepts/python-reference_catboost))
- [fit](https://catboost.ai/docs/en/concepts/python-reference_catboostclassifier_fit) ([CatBoostClassifier](https://catboost.ai/docs/en/concepts/python-reference_catboostclassifier))
- [fit](https://catboost.ai/docs/en/concepts/python-reference_catboostregressor_fit) ([CatBoostRegressor](https://catboost.ai/docs/en/concepts/python-reference_catboostregressor))
#### use\_best\_model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `eval_metric`.
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
#### verbose
Output the measured evaluation metric to stderr.
#### plot
Plot the following information during training:
- the metric values;
- the custom loss values;
- the loss function change during feature selection;
- the time elapsed since training started;
- the remaining time until the end of training.
This [option can be used](https://catboost.ai/docs/en/features/visualization_jupyter-notebook) if training is performed in a Jupyter notebook.
## R package
The following parameters can be set for the corresponding methods and are used when the model is trained or applied.
Method: [catboost.train](https://catboost.ai/docs/en/concepts/r-reference_catboost-train)
### loss\_function
**Description**
The [metric](https://catboost.ai/docs/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics:
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
### custom\_loss
**Description**
[Metric](https://catboost.ai/docs/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
c('CrossEntropy')
```
Or simply:
```
'CrossEntropy'
```
- Calculate the values of Logloss and AUC
```
c('Logloss', 'AUC')
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
c('Quantile:alpha=0.1')
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
### use\_best\_model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `eval_metric`.
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
### eval\_metric
**Description**
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/eval-metric__supported-metrics)
```
Quantile:alpha=0.3
```
## Command-line version
The following command keys can be specified for the corresponding commands and are used when the model is trained or applied.
Params for the [catboost fit](https://catboost.ai/docs/en/references/training-parameters/) command:
### \--loss-function
The [metric](https://catboost.ai/docs/en/concepts/loss-functions) to use in training. The specified value also determines the machine learning problem to solve. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
Supported metrics:
- RMSE
- Logloss
- MAE
- CrossEntropy
- Quantile
- LogLinQuantile
- Lq
- MultiRMSE
- MultiClass
- MultiClassOneVsAll
- MultiLogloss
- MultiCrossEntropy
- MAPE
- Poisson
- PairLogit
- PairLogitPairwise
- QueryRMSE
- QuerySoftMax
- GroupQuantile
- Tweedie
- YetiRank
- YetiRankPairwise
- StochasticFilter
- StochasticRank
For example, use the following construction to calculate the value of Quantile with the coefficient α = 0.1:
```
Quantile:alpha=0.1
```
### \--custom-metric
[Metric](https://catboost.ai/docs/en/concepts/loss-functions) values to output during training. These functions are not optimized and are displayed for informational purposes only. Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric 1>[:<parameter 1>=<value>;..;<parameter N>=<value>],<Metric 2>[:<parameter 1>=<value>;..;<parameter N>=<value>],..,<Metric N>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/custom-metric__supported-metrics)
Examples:
- Calculate the value of CrossEntropy
```
CrossEntropy
```
- Calculate the value of Quantile with the coefficient α = 0.1
```
Quantile:alpha=0.1
```
Values of all custom metrics for learn and validation datasets are saved to the [Metric](https://catboost.ai/docs/en/concepts/output-data_loss-function) output files (`learn_error.tsv` and `test_error.tsv` respectively). The directory for these files is specified in the `--train-dir` (`train_dir`) parameter.
### \--use-best-model
If this parameter is set, the number of trees that are saved in the resulting model is defined as follows:
1. Build the number of trees defined by the training parameters.
2. Use the validation dataset to identify the iteration with the optimal value of the metric specified in `--eval-metric`.
No trees are saved after this iteration.
This option requires a validation dataset to be provided.
### \--eval-metric
The metric used for overfitting detection (if enabled) and best model selection (if enabled). Some metrics support optional parameters (see the [Objectives and metrics](https://catboost.ai/docs/en/concepts/loss-functions) section for details on each metric).
Format:
```
<Metric>[:<parameter 1>=<value>;..;<parameter N>=<value>]
```
[Supported metrics](https://catboost.ai/docs/en/references/eval-metric__supported-metrics)
Examples:
```
R2
```
```
Quantile:alpha=0.3
```
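Putting the keys from this section together, a `catboost fit` invocation might be assembled as below. The command is only built as a list, not executed, and the train-dir name is a placeholder:

```python
# Sketch: assembling a `catboost fit` command line that combines the keys
# described above. Built as a list only; pass it to subprocess.run if the
# catboost CLI is installed. The --train-dir value is a placeholder, and
# the data-input keys are omitted for brevity.
argv = [
    "catboost", "fit",
    "--loss-function", "Logloss",
    "--custom-metric", "CrossEntropy,Quantile:alpha=0.1",
    "--eval-metric", "AUC",
    "--train-dir", "catboost_info",
]
print(" ".join(argv))
```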
### \--logging-level
The logging level to output to stdout.
Possible values:
- Silent — Do not output any logging information to stdout.
- Verbose — Output the following data to stdout:
- optimized metric
- elapsed time of training
- remaining time of training
- Info — Output additional information and the number of trees.
- Debug — Output debugging information.