Classic evaluation methods for image detection: the PR curve and the ROC curve

Keywords: PR curve, ROC curve, Machine Learning, image processing

To make this concrete, suppose we want to detect the people in an image. The classifier labels every pixel as either a person pixel or a non-person pixel. The target is the person, so pixels detected as a person are the Positives and pixels detected as non-person are the Negatives. A detection of the target should be reported, and a detection of a non-target should be rejected. Reporting something that is actually not the target is a false alarm (a wrong report, i.e. a non-target mistaken for the target); failing to detect something that should have been detected is a miss (a missed detection).

  • True/False = correctly / incorrectly
  • Positives/Negatives = labeled as target / labeled as non-target

With these two pieces, the four terms are easy to read (a short sketch after the list shows how to count them from pixel masks):

  • True positives (TP) = pixels correctly labeled as target (correct detections)
  • False positives (FP) = pixels incorrectly labeled as target (false alarms, over-reporting)
  • True negatives (TN) = pixels correctly labeled as non-target (correct rejections)
  • False negatives (FN) = pixels incorrectly labeled as non-target (missed detections, under-reporting)
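
As a quick illustration, the four counts can be tallied directly from binary masks. This is a minimal sketch, assuming NumPy boolean arrays `pred_mask` (detector output) and `gt_mask` (ground truth); the array and function names are illustrative, not from the original post:

```python
import numpy as np

def confusion_counts(pred_mask: np.ndarray, gt_mask: np.ndarray):
    """Count TP, FP, TN, FN for a binary pixel-level detection.

    pred_mask: True where the detector reports "target" (person).
    gt_mask:   True where the pixel really belongs to the target.
    """
    tp = int(np.sum(pred_mask & gt_mask))    # reported target, really target (correct detections)
    fp = int(np.sum(pred_mask & ~gt_mask))   # reported target, actually background (false alarms)
    tn = int(np.sum(~pred_mask & ~gt_mask))  # reported background, really background (correct rejections)
    fn = int(np.sum(~pred_mask & gt_mask))   # target pixels that were missed (missed detections)
    return tp, fp, tn, fn
```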

These four counts are the foundation of the common evaluation methods. From their values we can compute one point in ROC space and one point in PR space; evaluating at several operating points (for example by sweeping the detection threshold, or over several images) gives several points, and connecting them yields the so-called ROC curve and PR curve (the sketch after the definitions below makes the ratios explicit).

  1. ROC space (oriented toward the ground truth)
     x-axis: False Positive Rate (FPR) = the fraction of non-target pixels incorrectly detected as target (lower is better)
     y-axis: True Positive Rate (TPR) = the fraction of target pixels correctly detected (higher is better)

  2. PR space (oriented toward the correctness of the detections)
     x-axis: Recall = TPR, the fraction of actual target pixels correctly detected (higher is better)
     y-axis: Precision = the fraction of detected target pixels that are truly targets, i.e. detection precision (higher is better)
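
To make the ratios explicit, here is a hedged sketch building on the hypothetical `confusion_counts` helper above; the per-pixel score map and the threshold sweep are assumptions for illustration, not part of the original post:

```python
def roc_pr_point(tp, fp, tn, fn):
    """One point in ROC space (FPR, TPR) and one in PR space (Recall, Precision)."""
    fpr = fp / (fp + tn) if (fp + tn) else 0.0        # x-axis of ROC space
    tpr = tp / (tp + fn) if (tp + fn) else 0.0        # y-axis of ROC space, same as Recall
    precision = tp / (tp + fp) if (tp + fp) else 1.0  # y-axis of PR space
    return (fpr, tpr), (tpr, precision)

def curves(score_map, gt_mask, thresholds):
    """Sweep the detection threshold over a per-pixel score map to trace both curves."""
    roc, pr = [], []
    for t in thresholds:
        tp, fp, tn, fn = confusion_counts(score_map >= t, gt_mask)
        roc_point, pr_point = roc_pr_point(tp, fp, tn, fn)
        roc.append(roc_point)
        pr.append(pr_point)
    return roc, pr
```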

Summary figure

From the paper The Relationship Between Precision-Recall and ROC Curves (Davis & Goadrich, ICML 2006).


For a detailed discussion, see the paper; the key passage is quoted below.

2. Review of ROC and Precision-Recall

In a binary decision problem, a classifier labels examples as either positive or negative. The decision made by the classifier can be represented in a structure known as a confusion matrix or contingency table. The confusion matrix has four categories: True positives (TP) are examples correctly labeled as positives. False positives (FP) refer to negative examples incorrectly labeled as positive. True negatives (TN) correspond to negatives correctly labeled as negative. Finally, false negatives (FN) refer to positive examples incorrectly labeled as negative.

A confusion matrix is shown in Figure 2(a). The confusion matrix can be used to construct a point in either ROC space or PR space. Given the confusion matrix, we are able to define the metrics used in each space as in Figure 2(b). In ROC space, one plots the False Positive Rate (FPR) on the x-axis and the True Positive Rate (TPR) on the y-axis. The FPR measures the fraction of negative examples that are misclassified as positive. The TPR measures the fraction of positive examples that are correctly labeled. In PR space, one plots Recall on the x-axis and Precision on the y-axis. Recall is the same as TPR, whereas Precision measures that fraction of examples classified as positive that are truly positive. Figure 2(b) gives the definitions for each metric. We will treat the metrics as functions that act on the underlying confusion matrix which defines a point in either ROC space or PR space. Thus, given a confusion matrix A, RECALL(A) returns the Recall associated with A.
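
The last sentence of the excerpt treats each metric as a function of the confusion matrix itself. Here is a small sketch of that notation; the 2x2 layout chosen below (rows = actual class, columns = predicted class) is only an assumption for illustration:

```python
# A is a 2x2 confusion matrix: A = [[TP, FN], [FP, TN]]
# rows: actual positive / actual negative; columns: predicted positive / predicted negative
def RECALL(A):
    tp, fn = A[0][0], A[0][1]
    return tp / (tp + fn)

def PRECISION(A):
    tp, fp = A[0][0], A[1][0]
    return tp / (tp + fp)

def FPR(A):
    fp, tn = A[1][0], A[1][1]
    return fp / (fp + tn)

A = [[80, 20], [10, 90]]                # example counts, chosen arbitrarily
print(RECALL(A), PRECISION(A), FPR(A))  # 0.8, 0.888..., 0.1
```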
