Open Access Article
Entropy 2011, 13(2), 450-465; doi:10.3390/e13020450

Information Theoretic Hierarchical Clustering

1 Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, University of Tehran, PO Box 1439957131, Tehran 14395-515, Iran
2 Department of Electrical and Computer Engineering, Michigan State University, East Lansing, MI 48824, USA
3 Radiology Image Analysis Laboratory, Henry Ford Health System, Detroit, MI 48202, USA
4 School of Cognitive Sciences, Institute for Research in Fundamental Sciences (IPM), PO Box 1954856316, Tehran, Iran
* Author to whom correspondence should be addressed.
Received: 8 December 2010 / Revised: 31 December 2010 / Accepted: 27 January 2011 / Published: 10 February 2011
(This article belongs to the Special Issue Advances in Information Theory)
Abstract

Hierarchical clustering is widely used in practice because clusters can be assigned and analyzed at several levels simultaneously, which is especially useful when estimating the number of clusters is challenging. However, because of the conventional proximity measures these algorithms employ, they can detect only compact, mass-shaped clusters and have difficulty identifying complex data structures. Here, we introduce two bottom-up hierarchical approaches that exploit an information theoretic proximity measure to explore the nonlinear boundaries between clusters and to capture data structures beyond second-order statistics. Experimental results on both artificial and real datasets demonstrate the superiority of the proposed algorithms over conventional and information theoretic clustering algorithms reported in the literature, especially in detecting the true number of clusters.
Keywords: information theory; Rényi’s entropy; quadratic mutual information; hierarchical clustering; proximity measure
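The bottom-up, information theoretic approach described in the abstract can be illustrated with a minimal sketch: agglomerative merging in which the proximity between two clusters is a Parzen-window (Gaussian kernel) estimate of the Cauchy–Schwarz divergence, a dissimilarity measure associated with Rényi’s quadratic entropy. This is an illustrative assumption, not the authors’ exact algorithm; the function names (`gauss_ip`, `cs_divergence`, `agglomerate`) and the fixed kernel width `sigma` are hypothetical choices made for the example.

```python
import numpy as np

def gauss_ip(X, Y, sigma=1.0):
    """Cross information potential: mean Gaussian kernel over all point pairs."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)).mean()

def cs_divergence(A, B, sigma=1.0):
    """Cauchy-Schwarz divergence between Parzen density estimates of A and B.

    Zero when the estimated densities coincide; large when they barely overlap.
    """
    return -np.log(gauss_ip(A, B, sigma) ** 2 /
                   (gauss_ip(A, A, sigma) * gauss_ip(B, B, sigma)))

def agglomerate(X, n_clusters, sigma=1.0):
    """Bottom-up clustering: start from singletons, repeatedly merge the
    pair of clusters with the smallest information theoretic divergence."""
    clusters = [X[i:i + 1] for i in range(len(X))]
    while len(clusters) > n_clusters:
        i, j = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: cs_divergence(clusters[ab[0]],
                                                clusters[ab[1]], sigma))
        merged = np.vstack([clusters[i], clusters[j]])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters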
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Aghagolzadeh, M.; Soltanian-Zadeh, H.; Araabi, B.N. Information Theoretic Hierarchical Clustering. Entropy 2011, 13, 450-465.


Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland