Improving backpropagation learning with feature selection

Rudy Setiono, Huan Liu

Research output: Contribution to journal › Article › peer-review

20 Scopus citations


Training data often contain redundant, irrelevant, and noisy attributes. Using proper data to train a network can speed up training, simplify the learned structure, and improve its performance. A two-phase training algorithm is proposed. In the first phase, the number of input units of the network is determined by an information-based method: only those attributes that meet certain criteria for inclusion are selected as inputs to the network. In the second phase, the number of hidden units is selected automatically based on the performance of the network on the training data; one hidden unit is added at a time, and only when it is necessary. Experimental results show that the new algorithm achieves faster learning, a simpler network, and improved performance.
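The two phases can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the abstract does not specify the information-based criterion, the inclusion threshold, or the exact growth rule, so the information-gain measure, the `threshold` value, the squared-error backpropagation trainer, and the `target_acc`/`max_hidden` parameters are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)


def entropy(labels):
    # Shannon entropy (in bits) of a discrete label vector.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))


def information_gain(feature, labels):
    # H(Y) - H(Y | X) for one discrete attribute.
    cond = 0.0
    for v in np.unique(feature):
        mask = feature == v
        cond += mask.mean() * entropy(labels[mask])
    return entropy(labels) - cond


def select_features(X, y, threshold=0.1):
    # Phase 1: keep only attributes whose information gain exceeds
    # an inclusion threshold (threshold value is illustrative).
    gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
    return np.flatnonzero(np.array(gains) > threshold)


def train_mlp(X, y, n_hidden, epochs=3000, lr=1.0):
    # Tiny one-hidden-layer sigmoid network trained by plain
    # gradient-descent backpropagation; returns training accuracy.
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, n_hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = sig(X @ W1 + b1)
        out = sig(h @ W2 + b2)
        d2 = (out - y) * out * (1.0 - out)      # output-layer delta
        d1 = np.outer(d2, W2) * h * (1.0 - h)   # hidden-layer delta
        W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum()
        W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
    out = sig(sig(X @ W1 + b1) @ W2 + b2)
    return np.mean((out > 0.5) == y)


def grow_network(X, y, target_acc=1.0, max_hidden=8):
    # Phase 2: start with one hidden unit and add another only
    # while training accuracy still falls short of the target.
    for n_hidden in range(1, max_hidden + 1):
        acc = train_mlp(X, y, n_hidden)
        if acc >= target_acc:
            break
    return n_hidden, acc
```

For example, on a dataset where attribute 0 determines the label and attribute 1 is noise, `select_features` keeps only column 0, and `grow_network` is then run on the reduced input, adding hidden units one at a time until the training-set target is met.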

Original language: English (US)
Pages (from-to): 129-139
Number of pages: 11
Journal: Applied Intelligence
Issue number: 2
State: Published - 1996
Externally published: Yes


Keywords
  • Backpropagation
  • Feature selection
  • Feedforward neural network
  • Information theory

ASJC Scopus subject areas

  • Artificial Intelligence


