Article

Bayes optimality in linear discriminant analysis

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2007.70717

Keywords

linear discriminant analysis; feature extraction; Bayes optimal; convex optimization; pattern recognition; data mining; data visualization

Funding

  1. NIDCD NIH HHS [R01 DC 005241] Funding Source: Medline

Abstract

We present an algorithm that provides the one-dimensional subspace where the Bayes error is minimized for the C-class problem with homoscedastic Gaussian distributions. Our main result shows that the set of possible one-dimensional spaces v for which the order of the projected class means is identical defines a convex region with an associated convex Bayes error function g(v). This allows the error function to be minimized with standard convex optimization algorithms. Our algorithm is then extended to the minimization of the Bayes error in the more general case of heteroscedastic distributions by means of an appropriate kernel mapping function. This result is further extended to obtain the d-dimensional solution for any given d by iteratively applying our algorithm to the null space of the (d - 1)-dimensional solution. We also show how this result can be used to improve upon the outcomes provided by existing algorithms, and we derive a low-computational-cost linear approximation. Extensive experimental validation demonstrates the use of these algorithms in classification, data analysis, and visualization.
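
To make the objective concrete: for a direction v, the projected class means are v'mu_i and the shared projected variance is v'Sigma v, so with equal priors the Bayes rule in the projection assigns a sample to the nearest projected mean, and g(v) is one minus the average probability mass each class places in its own decision interval. The Python sketch below evaluates this objective and searches for a one-dimensional direction with a generic multi-start local optimizer. It is an illustration under stated assumptions (homoscedastic Gaussians, equal priors), not the authors' convex-region algorithm, and the function names (bayes_error_1d, minimize_bayes_error) are ours.

```python
# Minimal sketch of the 1-D Bayes-error objective g(v) for C homoscedastic
# Gaussian classes with equal priors. The optimizer is a generic multi-start
# local method, NOT the paper's convex-region algorithm; names are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def bayes_error_1d(v, means, cov):
    """Bayes error of the C-class problem after projecting onto direction v."""
    v = v / np.linalg.norm(v)              # only the direction matters, not the scale
    m = np.sort(means @ v)                 # projected class means, in increasing order
    s = np.sqrt(v @ cov @ v)               # shared standard deviation after projection
    # With equal priors and equal variances, decision boundaries are the
    # midpoints between adjacent ordered means.
    b = (m[:-1] + m[1:]) / 2.0
    lo = np.concatenate(([-np.inf], b))    # left edge of each class's decision interval
    hi = np.concatenate((b, [np.inf]))     # right edge of each class's decision interval
    acc = norm.cdf((hi - m) / s) - norm.cdf((lo - m) / s)
    return 1.0 - acc.mean()                # equal priors assumed

def minimize_bayes_error(means, cov, n_restarts=10, seed=0):
    """Search for a 1-D subspace minimizing the Bayes error (multi-start local search)."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_restarts):
        v0 = rng.standard_normal(means.shape[1])
        res = minimize(bayes_error_1d, v0, args=(means, cov), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best.x / np.linalg.norm(best.x), best.fun

# Example: three Gaussian classes sharing one covariance matrix.
means = np.array([[0.0, 0.0], [2.0, 1.0], [4.0, -1.0]])
cov = np.array([[1.0, 0.3], [0.3, 1.0]])
v, err = minimize_bayes_error(means, cov)
print("direction:", v, "Bayes error:", err)
```

The multi-start local search here stands in for the paper's key observation: g(v) is convex only within each region of directions that preserves a given ordering of the projected means, so one convex solve per ordering, rather than random restarts, is what the proposed algorithm exploits.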
