Adaptive Construction of Hierarchical Neural Network Classifiers: New Modification of the Algorithm (article)
Citation information for this article was obtained from Web of Science and Scopus.
The article is published in a journal indexed in Web of Science and/or Scopus.
Date of the last search for the article in external sources: March 29, 2018.
Abstract: Multiclass classification problems are usually hard to solve.
As the number of classes grows, classification algorithms degrade rapidly,
both in error rate and in computational cost. Neural networks (NN) of the
multi-layer perceptron (MLP) type solving such problems are subject to the
same effect: a greater number of classes requires more neurons in the
hidden layer(s) (HL) to build a more complex separation surface, making
the NN more prone to overtraining.
An alternative is to build a hierarchical classifier system, uniting the
target classes into several groups and solving the recognition problem
within each group recursively at the lower levels of the hierarchy.
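The recursive scheme of such a hierarchy can be sketched as below; the `Node` structure, `predict` function, and the toy two-level tree are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch of recursive classification in a hierarchical
# classifier tree. Names and structure are hypothetical assumptions.

class Node:
    def __init__(self, classify_group, children):
        # classify_group(x) returns a group index for sample x
        # (in the paper's setting this role is played by a small MLP)
        self.classify_group = classify_group
        # children maps a group index to a subtree Node or a final label
        self.children = children

def predict(node, x):
    """Descend the hierarchy until a leaf (final class label) is reached."""
    group = node.classify_group(x)
    child = node.children[group]
    if isinstance(child, Node):
        return predict(child, x)  # recurse into the subgroup
    return child  # leaf: final class label

# Toy two-level hierarchy over 1-D samples: the root separates
# classes {0, 1} from {2, 3}; each child resolves its own pair.
leaf_left = Node(lambda x: 0 if x < 1 else 1, {0: 0, 1: 1})
leaf_right = Node(lambda x: 0 if x < 3 else 1, {0: 2, 1: 3})
root = Node(lambda x: 0 if x < 2 else 1, {0: leaf_left, 1: leaf_right})

print([predict(root, x) for x in [0.5, 1.5, 2.5, 3.5]])  # → [0, 1, 2, 3]
```

Each node thus only needs to separate a few groups, which is exactly why a small MLP per node can suffice.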
The authors of this study are developing an algorithm for adaptive construction
of such a hierarchical NN classifier (HNNC) [1]. Each node of
the constructed hierarchical tree is an MLP with a single HL consisting
of only a few neurons. Such an MLP is, by design, unable to recognize
all the required classes of a multiclass classification problem. So after it
has been trained for a specified number of epochs, it is applied to all the
samples of the training set and its output is analyzed. If the majority
of samples from some class k "vote" at the MLP output as belonging to
another class m, the desired output for class k is changed to coincide
with that for class m. In this way, classes are united into groups; since
this modification is performed in a way favorable to the MLP, we obtain a
system with positive feedback that rapidly converges to a trained NN
recognizing, with high accuracy, several adaptively formed groups of classes.
Afterwards, each group is in turn subjected to further recognition,
thus providing adaptive construction of the HNNC.
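The "voting" relabeling step can be illustrated roughly as follows; the majority threshold of 0.5 and the flat label lists are assumptions made for the sketch, not the authors' exact criterion:

```python
from collections import Counter

def merge_votes(true_labels, predicted_labels, threshold=0.5):
    """Sketch of the 'voting' relabeling step: if more than `threshold`
    of the samples of class k are predicted as some other class m,
    class k's desired output is changed to that of class m.
    The 0.5 ('majority') threshold is an illustrative assumption."""
    relabel = {}
    for k in set(true_labels):
        # predictions made for the samples that truly belong to class k
        preds = [p for t, p in zip(true_labels, predicted_labels) if t == k]
        winner, count = Counter(preds).most_common(1)[0]
        if winner != k and count / len(preds) > threshold:
            relabel[k] = winner  # class k joins class m's group
    return relabel

# Toy example: class 2's samples mostly "vote" as class 0,
# so class 2 is merged into class 0's group.
true_y = [0, 0, 1, 1, 2, 2, 2]
pred_y = [0, 0, 1, 1, 0, 0, 2]
print(merge_votes(true_y, pred_y))  # → {2: 0}
```

The returned mapping would then be applied to the desired outputs before the next training epochs, which is what creates the positive feedback described above.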
In this study, a new modification has been introduced into the algorithm.
Now target classes may not only be united: under certain conditions, a
target class may also split into two new classes, potentially simplifying
class separation borders and increasing the efficiency and stability of
the algorithm.
The presentation reports the results of the algorithm, with and without
the new modification, on several well-known benchmark problems.
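The abstract does not specify the splitting criterion or mechanism, so the fragment below only illustrates one plausible way a class could be split into two new pseudo-classes, via a minimal 2-means clustering of its samples; it is not the authors' method:

```python
import random

def split_class(samples, iters=20, seed=0):
    """Split one class's 1-D samples into two pseudo-classes using a
    minimal 2-means clustering. Purely illustrative: the paper's actual
    splitting condition and mechanism are not given in the abstract."""
    rng = random.Random(seed)
    c0, c1 = rng.sample(samples, 2)  # two initial cluster centers
    for _ in range(iters):
        # assign each sample to the nearer center, then recompute centers
        g0 = [x for x in samples if abs(x - c0) <= abs(x - c1)]
        g1 = [x for x in samples if abs(x - c0) > abs(x - c1)]
        if g0:
            c0 = sum(g0) / len(g0)
        if g1:
            c1 = sum(g1) / len(g1)
    # label each sample with its sub-class (0 or 1)
    return [0 if abs(x - c0) <= abs(x - c1) else 1 for x in samples]

# A bimodal class separates cleanly into two sub-classes
labels = split_class([0.1, 0.2, 0.15, 5.0, 5.1, 4.9])
print(labels)
```

A split like this could simplify the separation border whenever a single target class occupies two disjoint regions of the feature space.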