Article

The true sample complexity of active learning

Journal

MACHINE LEARNING
Volume 80, Issue 2-3, Pages 111-139

Publisher

SPRINGER
DOI: 10.1007/s10994-010-5174-y

Keywords

Active learning; Sample complexity; Selective sampling; Sequential design; Learning theory; Classification


We describe and explore a new perspective on the sample complexity of active learning. In many situations where it was generally believed that active learning does not help, we show that active learning does help in the limit, often with exponential improvements in sample complexity. This contrasts with the traditional analysis of active learning problems such as non-homogeneous linear separators or depth-limited decision trees, in which Ω(1/ε) lower bounds are common. Such lower bounds should be interpreted carefully; indeed, we prove that it is always possible to learn an ε-good classifier with a number of samples asymptotically smaller than this. These new insights arise from a subtle variation on the traditional definition of sample complexity, not previously recognized in the active learning literature.
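The "exponential improvement" the abstract refers to is easiest to see on the textbook example of learning a threshold classifier on [0, 1]: a passive learner needs on the order of 1/ε random labels to locate the threshold to accuracy ε, while an active learner can binary-search with roughly log2(1/ε) label queries. The sketch below is a minimal illustration of that gap, not code from the paper; the function name and the target threshold 0.37 are hypothetical choices for the demo.

```python
import math

def active_learn_threshold(label, lo=0.0, hi=1.0, eps=0.01):
    """Binary-search for a threshold classifier on [0, 1].

    `label(x)` returns 1 if x is at or above the true threshold, else 0.
    Each call counts as one label query; the search stops once the
    interval of uncertainty is narrower than eps.
    """
    queries = 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1
        if label(mid):
            hi = mid   # threshold is at or below mid
        else:
            lo = mid   # threshold is above mid
    return (lo + hi) / 2, queries

# Hypothetical target threshold for the demo.
theta = 0.37
eps = 0.001
est, q = active_learn_threshold(lambda x: int(x >= theta), eps=eps)

# Active querying halves the interval each round, so it needs at most
# ceil(log2(1/eps)) labels, versus roughly 1/eps random labels for a
# passive learner at the same accuracy.
assert abs(est - theta) <= eps
assert q <= math.ceil(math.log2(1 / eps))
```

The paper's point is subtler than this example alone: even in harder cases where worst-case Ω(1/ε) label bounds hold, the authors show the asymptotic (limit) sample complexity can still beat them under their revised definition.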

