Article

Semi-supervised learning of hidden conditional random fields for time-series classification

Journal

NEUROCOMPUTING
Volume 119, Pages 339-349

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2013.03.024

Keywords

Semi-supervised learning; Hidden conditional random fields; Time-series classification; Large-margin methods


Annotating class labels for a large number of time-series data is generally an expensive task. We propose novel semi-supervised learning algorithms that can improve classification accuracy significantly by exploiting a relatively large amount of unlabeled data in conjunction with a few labeled samples. Our algorithms use the unlabeled data as regularizers that favor classifiers with higher confidence on the unlabeled data. For the state-of-the-art conditional probabilistic sequence model, the hidden conditional random field, we first suggest the entropy minimization algorithm previously applied to static classification setups. More sophisticated margin-based approaches are then introduced, motivated by the semi-supervised support vector machines originally designed for non-sequential data. We provide effective ways to incorporate and minimize the hat loss function for sequence data via probabilistic treatment in a principled manner. We show the performance improvement achieved by our methods on several semi-supervised time-series data classification scenarios. (C) 2013 Elsevier B.V. All rights reserved.
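The two regularizers mentioned in the abstract can be illustrated numerically. The sketch below is not the paper's HCRF formulation; it is a minimal sketch using a plain softmax classifier, where the function names and the weight `lam` are illustrative assumptions. It shows (a) the entropy term on unlabeled data that entropy minimization adds to the supervised loss, and (b) the hat loss max(0, 1 - |f(x)|) used by semi-supervised SVMs.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def hat_loss(margins):
    """Hat loss max(0, 1 - |f(x)|) from semi-supervised SVMs: it penalizes
    unlabeled points that fall inside the margin, regardless of side."""
    return np.maximum(0.0, 1.0 - np.abs(margins))

def semi_supervised_loss(W, X_lab, y_lab, X_unlab, lam=0.1):
    """Supervised cross-entropy plus an entropy regularizer on unlabeled data.

    Minimizing the entropy term pushes the classifier toward confident
    (low-entropy) predictions on the unlabeled samples; `lam` trades off
    the two terms and is an assumed hyperparameter here.
    """
    p_lab = softmax(X_lab @ W)
    ce = -np.mean(np.log(p_lab[np.arange(len(y_lab)), y_lab] + 1e-12))
    p_unl = softmax(X_unlab @ W)
    ent = -np.mean(np.sum(p_unl * np.log(p_unl + 1e-12), axis=1))
    return ce + lam * ent
```

In the paper's setting the class posteriors would come from an HCRF over label sequences rather than a static softmax, but the shape of the objective (supervised loss plus an unlabeled-data regularizer) is the same.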
