Hybrid sampling on mutual information entropy-based clustering ensembles for optimizations

Feng Yeh Wang, Cheng Yang, Zhiyi Lin, Yuanxiang Li, Yuan Yuan

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we focus on the design of bivariate EDAs for discrete optimization problems and propose a new approach named HSMIEC. Because current EDAs spend considerable time in the statistical learning process when the relationships among the variables are complicated, we employ Selfish Gene theory (SG) in this approach, and a Mutual Information and Entropy based Cluster (MIEC) model is built to optimize the probability distribution of the virtual population. This model uses a hybrid sampling method that considers both clustering accuracy and clustering diversity, and an incremental learning and resampling scheme is applied to optimize the parameters of the correlations among the variables. Experimental results on several benchmark problems demonstrate that HSMIEC often performs better than other EDAs, such as BMDA, COMIT, MIMIC and ECGA.
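The abstract's core building block — measuring the mutual information entropy between pairs of problem variables in a population and grouping strongly dependent variables into clusters — can be illustrated with a minimal sketch. This is not the authors' actual MIEC model; the function names and the greedy threshold clustering are illustrative assumptions:

```python
import numpy as np

def pairwise_mutual_information(pop):
    """Pairwise mutual information (in bits) between the binary variables
    of a population.

    pop: (n_individuals, n_vars) array of 0/1 values.
    Returns a symmetric (n_vars, n_vars) MI matrix.
    """
    n, d = pop.shape
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            total = 0.0
            for a in (0, 1):
                for b in (0, 1):
                    # Joint and marginal frequencies estimated from the population.
                    p_ab = np.mean((pop[:, i] == a) & (pop[:, j] == b))
                    p_a = np.mean(pop[:, i] == a)
                    p_b = np.mean(pop[:, j] == b)
                    if p_ab > 0:
                        total += p_ab * np.log2(p_ab / (p_a * p_b))
            mi[i, j] = mi[j, i] = total
    return mi

def greedy_mi_clusters(mi, threshold=0.1):
    """Greedily group variables whose pairwise MI exceeds a threshold.

    A toy stand-in for a clustering step: each unassigned variable seeds a
    cluster and absorbs all remaining variables strongly linked to it.
    """
    d = mi.shape[0]
    unassigned = set(range(d))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster = {seed}
        for v in list(unassigned):
            if mi[seed, v] >= threshold:
                cluster.add(v)
                unassigned.remove(v)
        clusters.append(sorted(cluster))
    return clusters
```

In a bivariate EDA, such a dependency structure would then guide how new candidate solutions are sampled: variables in the same cluster are drawn from their joint distribution rather than independently.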

Original language: English
Pages (from-to): 1457-1464
Number of pages: 8
Journal: Neurocomputing
Volume: 73
Issue number: 7-9
Early online date: 16 Dec 2009
DOIs
Publication status: Published - Mar 2010
Event: 17th European Symposium on Artificial Neural Networks: Advances in Computational Intelligence and Learning - Bruges, Belgium
Duration: 22 Apr 2009 - 24 Apr 2009

Bibliographical note

17th European Symposium on Artificial Neural Networks - Advances in Computational Intelligence and Learning, ESANN 2009, 22-24 April 2009
Bruges, Belgium.

Keywords

  • clustering ensembles
  • estimation of distribution algorithm
  • mutual information entropy
  • selfish gene theory

