
Training Distributed GP Ensemble with a Selective Algorithm based on Clustering and Pruning for Pattern Classification

Gianluigi Folino; Clara Pizzuti; Giandomenico Spezzano
2008

Abstract

A boosting algorithm based on cellular genetic programming for building an ensemble of predictors is proposed. The method evolves a population of trees for a fixed number of rounds and, after each round, chooses the predictors to include in the ensemble by applying a clustering algorithm to the population of classifiers. Clustering the population allows the selection of the most diverse and fittest trees, those that best contribute to improving classification accuracy. The proposed method runs on a distributed hybrid environment that combines the island and cellular models of parallel genetic programming. The combination of the two models provides an efficient implementation of distributed GP and, at the same time, generates small, accurate decision trees. The large amount of memory required to store the ensemble makes the method costly to deploy. The paper shows that, by applying suitable pruning strategies, it is possible to select a subset of the classifiers without increasing misclassification errors; indeed, for some data sets, with up to 30% pruning, ensemble accuracy even increases. Experimental results show that the combination of clustering and pruning enhances the classification accuracy of the ensemble approach.
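The selection step described in the abstract — clustering the evolved classifiers and keeping the fittest member of each cluster, then combining the survivors by voting — can be sketched as follows. This is a minimal illustration under assumptions, not the paper's actual algorithm: classifiers are represented only by their prediction vectors on a validation set, clustering is a simple greedy grouping by Hamming distance (standing in for the paper's clustering algorithm), and all names and data are hypothetical.

```python
# Hypothetical sketch of clustering-based ensemble selection:
# group classifiers whose predictions nearly coincide, then keep
# the most accurate member of each group and combine by majority vote.

def hamming(a, b):
    """Number of examples on which two prediction vectors disagree."""
    return sum(x != y for x, y in zip(a, b))

def cluster_predictors(preds, radius):
    """Greedy clustering: a classifier joins the first cluster whose
    representative disagrees with it on at most `radius` examples."""
    clusters = []
    for i, p in enumerate(preds):
        for c in clusters:
            if hamming(preds[c[0]], p) <= radius:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def select_ensemble(preds, truth, radius):
    """From each cluster keep the member with the best validation
    accuracy -- diverse across clusters, fittest within each."""
    acc = [sum(p == t for p, t in zip(pr, truth)) / len(truth)
           for pr in preds]
    clusters = cluster_predictors(preds, radius)
    return [max(c, key=lambda i: acc[i]) for c in clusters]

def majority_vote(preds, members):
    """Combine the selected members' predictions by majority vote."""
    n = len(preds[0])
    return [max(set(col), key=col.count)
            for col in (tuple(preds[m][j] for m in members)
                        for j in range(n))]

# Toy validation data: five classifiers' predictions on six examples.
truth = [0, 1, 1, 0, 1, 0]
preds = [
    [0, 1, 1, 0, 1, 0],   # perfect classifier
    [0, 1, 1, 0, 1, 1],   # near-duplicate of the first
    [1, 0, 1, 0, 1, 0],   # different error pattern
    [1, 0, 1, 0, 0, 0],   # near-duplicate of the third
    [0, 1, 0, 1, 1, 0],   # yet another pattern
]
members = select_ensemble(preds, truth, radius=1)
voted = majority_vote(preds, members)
print(members, voted)
```

The near-duplicate classifiers collapse into single clusters, so only three of the five trees are kept; their majority vote already matches the validation labels. Pruning, in this sketch, would amount to dropping the lowest-accuracy members of `members` as long as the voted accuracy does not degrade.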
Istituto di Calcolo e Reti ad Alte Prestazioni - ICAR
Ensemble learning
genetic programming
clustering
classification

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/118970
Citations
  • Scopus 40