Accuracy Score

  • Computes the number or fraction of correct predictions.

  • It is computed, from the confusion matrix, as:

         |  Predicted
         |  PP    PN
---------+------------
Real  P  |  TP    FN
      N  |  FP    TN

\text{accuracy} =\frac{\text{TP} +\text{TN}}{\text{P} + \text{N}} =\frac{\text{TP}+\text{TN}}{\text{TP}+\text{TN}+\text{FP}+\text{FN}}

\text{accuracy}(y, \hat{y})=\frac{1}{n_\text{samples}} \sum_{i=0}^{n_\text{samples}-1} 1(\hat{y}_i = y_i)

where 1(x) is the indicator function, defined as:

1_A(x)=\left\{ \begin{array}{c l} 1 & \quad \textrm{if } x \in A \\ 0 & \quad \textrm{otherwise} \end{array} \right.
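The formula above can be sketched directly in NumPy: the boolean comparison plays the role of the indicator function, and its mean is the accuracy. The arrays below mirror the `scikit-learn` example that follows.

```python
import numpy as np

y_true = np.array([0, 1, 2, 3])
y_pred = np.array([0, 2, 1, 3])

# The comparison yields 1(y_hat_i == y_i) elementwise;
# averaging those 0/1 values gives the fraction of hits.
accuracy = np.mean(y_pred == y_true)
print(accuracy)  # 0.5
```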

[1]:
from sklearn.metrics import accuracy_score

y_true = [0, 1, 2, 3]
y_pred = [0, 2, 1, 3]

accuracy_score(
    # -------------------------------------------------------------------------
    # Ground truth (correct) labels.
    y_true=y_true,
    # -------------------------------------------------------------------------
    # Predicted labels, as returned by a classifier.
    y_pred=y_pred,
    # -------------------------------------------------------------------------
    # If False, return the number of correctly classified samples. Otherwise,
    # return the fraction of correctly classified samples.
    normalize=True,
    # -------------------------------------------------------------------------
    # Sample weights.
    sample_weight=None,
)
[1]:
0.5
[2]:
accuracy_score(
    y_true=y_true,
    y_pred=y_pred,
    normalize=False,
)
[2]:
2
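The `sample_weight` parameter was left at its default above; as a sketch with illustrative weights, each correct or incorrect prediction contributes its weight instead of counting once:

```python
from sklearn.metrics import accuracy_score

y_true = [0, 1, 2, 3]
y_pred = [0, 2, 1, 3]

# The hits are at positions 0 and 3; doubling the weight of the
# last sample gives (1 + 2) / (1 + 1 + 1 + 2) = 0.6.
accuracy_score(
    y_true=y_true,
    y_pred=y_pred,
    sample_weight=[1, 1, 1, 2],
)
```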