Journal
PATTERN RECOGNITION
Volume 47, Issue 9, Pages 3046-3059
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2014.03.006
Keywords
Clustering; Cross-entropy; Memory compression
Funding
- National Center of Science (Poland) [2011/01/B/ST6/01887, 2013/09/N/ST6/01178]
Abstract
We build a general and easily applicable clustering theory, which we call cross-entropy clustering (CEC for short), which joins the advantages of classical k-means (easy implementation and speed) with those of EM (affine invariance and the ability to adapt to clusters of desired shapes). Moreover, contrary to k-means and EM, CEC finds the optimal number of clusters by automatically removing groups which have negative information cost. Although CEC, like EM, can be built on an arbitrary family of densities, in the most important case of Gaussian CEC the division into clusters is affine invariant. (C) 2014 Elsevier Ltd. All rights reserved.
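The abstract's mechanism can be sketched in code. The following is a minimal, hypothetical illustration of hard-assignment Gaussian CEC (not the authors' implementation): each point is assigned to the cluster minimizing its information cost -ln p_i plus the Gaussian cross-entropy term, and clusters whose weight drops too low are removed, a simple proxy for the paper's negative-information-cost criterion. All function and parameter names here (`gaussian_cec`, `min_weight`, etc.) are invented for illustration.

```python
import numpy as np

def gaussian_cec(X, k=6, max_iter=50, min_weight=0.02, seed=0):
    """Hard-assignment cross-entropy clustering with full Gaussian models.

    Simplified sketch: a point x costs
        -ln p_i + 0.5 * (d*ln(2*pi) + ln det(Sigma_i) + (x-mu_i)^T Sigma_i^{-1} (x-mu_i))
    under cluster i; clusters whose weight p_i falls below `min_weight`
    are dropped (a crude stand-in for removing negative-cost groups).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    labels = rng.integers(0, k, size=n)          # random initial partition
    for _ in range(max_iter):
        # Re-estimate weight, mean, covariance of each surviving cluster.
        params = []
        for c in np.unique(labels):
            pts = X[labels == c]
            p = len(pts) / n
            mu = pts.mean(axis=0)
            cov = np.cov(pts.T, bias=True) + 1e-6 * np.eye(d)  # regularize
            params.append((c, p, mu, cov))
        # Remove too-small clusters (but never all of them).
        kept = [t for t in params if t[1] >= min_weight] or params
        # Cost of every point under every remaining cluster.
        costs = np.empty((n, len(kept)))
        for j, (c, p, mu, cov) in enumerate(kept):
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            diff = X - mu
            maha = np.einsum('ij,jk,ik->i', diff, inv, diff)
            costs[:, j] = -np.log(p) + 0.5 * (d * np.log(2 * np.pi)
                                              + logdet + maha)
        new_labels = np.array([kept[j][0] for j in costs.argmin(axis=1)])
        if np.array_equal(new_labels, labels):   # converged
            break
        labels = new_labels
    return labels
```

Starting from a deliberately over-specified `k`, the weight-based removal step lets the number of surviving clusters shrink toward the data's actual structure, which is the behavior the abstract highlights over k-means and EM.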