A Lazy Man's Approach to Benchmarking: Semisupervised Classifier Evaluation and Recalibration
| Authors | |
|---|---|
| Publication date | 2013 |
| Book title | Proceedings: 2013 IEEE Conference on Computer Vision and Pattern Recognition |
| Book subtitle | CVPR 2013: 23-28 June 2013, Portland, Oregon, USA |
| ISBN | |
| ISBN (electronic) | |
| Event | IEEE Conference on Computer Vision and Pattern Recognition: CVPR 2013 |
| Pages (from-to) | 3262-3269 |
| Publisher | Los Alamitos, CA: IEEE Computer Society, Conference Publishing Services |
| Organisations | |
| Abstract | How many labeled examples are needed to estimate a classifier's performance on a new dataset? We study the case where data is plentiful, but labels are expensive. We show that by making a few reasonable assumptions on the structure of the data, it is possible to estimate performance curves, with confidence bounds, using a small number of ground truth labels. Our approach, which we call Semisupervised Performance Evaluation (SPE), is based on a generative model for the classifier's confidence scores. In addition to estimating the performance of classifiers on new datasets, SPE can be used to recalibrate a classifier by re-estimating the class-conditional confidence distributions. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1109/CVPR.2013.419 |
| Downloads | CVPR2013 (Accepted author manuscript) |
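The core idea in the abstract — model the classifier's confidence scores with a generative model, anchor the class-conditional distributions with a handful of ground-truth labels, and read off an estimated performance figure for the whole dataset — can be sketched as follows. This is a minimal illustration assuming a 1-D two-Gaussian model of the scores and equal class priors; it is not the paper's exact model, and all names and parameters here are hypothetical.

```python
import math
import random

random.seed(0)

# Unlabeled pool: confidence scores from a hypothetical binary classifier.
# Simulated so that negatives cluster near -1 and positives near +1.
neg = [random.gauss(-1.0, 0.7) for _ in range(5000)]
pos = [random.gauss(+1.0, 0.7) for _ in range(5000)]

# A small labeled sample (20 points per class) is all we spend on labels.
lab_neg = random.sample(neg, 20)
lab_pos = random.sample(pos, 20)

def fit_gauss(xs):
    """Estimate mean and standard deviation of a class-conditional score model."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, math.sqrt(v)

mu0, sd0 = fit_gauss(lab_neg)
mu1, sd1 = fit_gauss(lab_pos)

def norm_cdf(x, mu, sd):
    """Gaussian CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

# Estimated accuracy at threshold 0 under the fitted generative model,
# assuming equal class priors (an assumption of this sketch).
thr = 0.0
acc_est = 0.5 * norm_cdf(thr, mu0, sd0) + 0.5 * (1.0 - norm_cdf(thr, mu1, sd1))

# Empirical accuracy on the full (secretly labeled) pool, for comparison only.
acc_true = (sum(s <= thr for s in neg) + sum(s > thr for s in pos)) / (len(neg) + len(pos))

print(round(acc_est, 2), round(acc_true, 2))
```

Re-estimating the two class-conditional distributions on a new dataset, as done above with `fit_gauss`, is also what recalibration amounts to in this picture: the mapping from raw score to estimated class probability is rebuilt from the fitted distributions rather than carried over from the training set.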