# Understanding ROC

## TL;DR

In this post we explore the Receiver Operating Characteristic (ROC) curve and how it can be used to evaluate a predictive model.

I started looking in more detail at the ROC curve and AUC after I was tasked with evaluating an existing model, to verify how good it was and whether there was a quick improvement to be made (as the team was not happy with its accuracy).

## AUC and ROC

I won't go into the basic definitions of ROC and AUC; you can find them on Wikipedia.

I am going to start instead with an example. Let's imagine we have a binary classifier that outputs probabilities. For each prediction our classifier outputs a probability, and it is up to us to pick a threshold above which we consider the prediction as belonging to one class rather than the other.
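As a minimal sketch of that thresholding step (the function name and default threshold here are my own illustration, not from any particular library):

```python
def classify(prob, threshold=0.5):
    """Turn a predicted probability into a hard 0/1 class label
    by comparing it against a chosen threshold."""
    return 1 if prob >= threshold else 0

print(classify(0.7))                 # -> 1 with the default threshold of 0.5
print(classify(0.7, threshold=0.8))  # -> 0 with a stricter threshold
```

Notice that the same probability can land in either class depending on the threshold we pick, which is exactly the knob the ROC curve explores.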

The ROC curve helps us determine the quality of a classifier as that threshold varies, and the AUC (the area under that curve) summarizes its quality in a single number.

```
true_labels = [0, 0, 0, 0, 1, 1, 1, 1]
predictions = [0.1, 0.35, 0.5, 0.7, 0.8, 0.85, 0.9, 0.9]
```
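To make the idea concrete, here is a rough pure-Python sketch (no scikit-learn) of how one could trace the ROC curve, by sweeping the threshold over the observed scores and recording a (false positive rate, true positive rate) point at each one, then integrating with the trapezoidal rule to get the AUC. The toy labels and scores below are perfectly separable, so the AUC comes out as 1.0:

```python
true_labels = [0, 0, 0, 0, 1, 1, 1, 1]
predictions = [0.1, 0.35, 0.5, 0.7, 0.8, 0.85, 0.9, 0.9]

def roc_points(labels, scores):
    """Sweep the decision threshold over the observed scores, from
    highest to lowest, and record one (FPR, TPR) point per threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = [(0.0, 0.0)]  # a threshold above every score: nothing predicted positive
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for l, s in zip(labels, scores) if s >= t and l == 1)
        fp = sum(1 for l, s in zip(labels, scores) if s >= t and l == 0)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal integration of TPR over FPR."""
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

pts = roc_points(true_labels, predictions)
print(auc(pts))  # 1.0 here, since every positive scores above every negative
```

A real evaluation would of course lean on `sklearn.metrics.roc_curve` and `roc_auc_score` rather than hand-rolled loops, but the sketch shows what those functions compute under the hood.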

