This title appears in the Scientific Report 2005.
Please use the identifier: http://hdl.handle.net/2128/22927 in citations.
Please use the identifier: http://dx.doi.org/10.1209/epl/i2004-10483-y in citations.
Hierarchical clustering using mutual information
| Personal Name(s): | Kraskov, A.; Stoegbauer, H.; Andrzejak, R. G.; Grassberger, P. |
|---|---|
| Contributing Institute: | John von Neumann-Institut für Computing (NIC) |
| Published in: | Europhysics Letters (EPL), 70 (2005), pp. 278-284 |
| Imprint: | Les Ulis: EDP Sciences, 2005 |
| Physical Description: | pp. 278-284 |
| DOI: | 10.1209/epl/i2004-10483-y |
| Document Type: | Journal Article |
| Research Program: | Betrieb und Weiterentwicklung des Höchstleistungsrechners |
| Series Title: | Europhysics Letters, 70 |
| Link: | Get full text (Open Access) |
We present a conceptually simple method for hierarchical clustering of data, called the mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: the MI between three objects X, Y, and Z equals the MI between X and Y, plus the MI between Z and the combined object (XY). We use this both in the Shannon (probabilistic) and in the Kolmogorov (algorithmic) version of information theory. We apply our method to the construction of phylogenetic trees from mitochondrial DNA sequences and to the output of independent component analysis (ICA), as illustrated with the ECG of a pregnant woman.
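In the Shannon setting, the grouping property stated in the abstract is an exact identity: writing I(X, Y, Z) for the total correlation H(X) + H(Y) + H(Z) - H(X, Y, Z), one has I(X, Y, Z) = I(X, Y) + I((X, Y), Z). As a minimal sketch (not the authors' code), the identity can be checked numerically for a random joint distribution over three binary variables:

```python
import itertools
import math
import random

# Hedged sketch: verify the MI grouping property
#   I(X, Y, Z) = I(X, Y) + I((X, Y), Z)
# for a randomly chosen joint pmf over three binary variables.
# Variable names and helpers here are illustrative, not from the paper.

random.seed(0)

# Random joint distribution p(x, y, z) over {0, 1}^3, normalized to sum to 1.
weights = [random.random() for _ in range(8)]
total = sum(weights)
joint = {xyz: w / total
         for xyz, w in zip(itertools.product((0, 1), repeat=3), weights)}

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, axes):
    """Marginalize the joint pmf onto the given coordinate positions."""
    out = {}
    for xyz, p in joint.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + p
    return out

h_x = entropy(marginal(joint, (0,)))
h_y = entropy(marginal(joint, (1,)))
h_z = entropy(marginal(joint, (2,)))
h_xy = entropy(marginal(joint, (0, 1)))
h_xyz = entropy(joint)

i_xyz = h_x + h_y + h_z - h_xyz   # total correlation I(X, Y, Z)
i_xy = h_x + h_y - h_xy           # I(X, Y)
i_xy_z = h_xy + h_z - h_xyz       # I((X, Y), Z), with (X, Y) as one object

print(abs(i_xyz - (i_xy + i_xy_z)) < 1e-12)
```

Because all three quantities reduce to sums and differences of joint entropies, the identity holds for any joint distribution, which is what lets MIC merge the two most similar objects into one combined object and recurse without losing information.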