A key disadvantage of the error backpropagation algorithm for multi-layer feedforward neural networks is overlearning, also known as over-adaptation. We discuss this issue and derive necessary and sufficient conditions for the over-learning problem, supported by experiments. Using these conditions and the concept of replication, this paper proposes methods for selecting training sets that prevent overlearning.
For a classifier, size is a fundamental property alongside classification ability. In pursuit of high performance, many classifiers disregard their size and accumulate a large number of rules, some necessary and some irrelevant. This can be a disadvantage, because redundant rules seriously degrade the classifier's efficiency.
It is therefore necessary to eliminate these unnecessary rules. We discuss a range of experiments both with and without overlearning (overfitting) issues. In this paper, we propose ParFeatArch Generator, a new algorithm that uses particle swarm optimization (PSO) to generate neural network architectures with optimal features and parameters.
Choosing the best architecture for a neural network is usually a trial-and-error process: the number of layers is chosen based on previous experience, and the network is then trained and tested. Likewise, when a neural network serves as the classifier inside a feature selection algorithm, its number of layers is usually fixed before the feature selection begins.
In this work, we propose ParFeatArch Generator, a new PSO-based generation algorithm that combines feature selection, neural network architecture selection, and parameter optimization, so that a network topology with optimal parameters is generated simultaneously within a single search.
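To make the combined search concrete, the sketch below shows one common way a single PSO particle could jointly encode a feature subset (binary mask) and a feedforward architecture (integer hidden-layer widths), with a standard velocity update. All names, constants, and the sigmoid-threshold rule for the binary part are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical particle encoding for a joint feature/architecture search.
# The first N_FEATURES entries are a binary feature mask; the remaining
# entries are hidden-layer widths (0 = layer absent).
import math
import random

random.seed(0)

N_FEATURES = 8   # assumed dataset dimensionality
MAX_HIDDEN = 3   # assumed cap on hidden layers
MAX_UNITS = 32   # assumed cap on units per hidden layer

def new_particle():
    """Random position (mask + widths) and a matching velocity vector."""
    position = ([random.randint(0, 1) for _ in range(N_FEATURES)] +
                [random.randint(0, MAX_UNITS) for _ in range(MAX_HIDDEN)])
    velocity = [random.uniform(-1, 1) for _ in range(len(position))]
    return position, velocity

def update(position, velocity, personal_best, global_best,
           w=0.7, c1=1.5, c2=1.5):
    """Standard PSO velocity update; the binary part uses the common
    sigmoid/threshold rule, the integer part rounds and clips."""
    new_pos, new_vel = [], []
    for i, x in enumerate(position):
        v = (w * velocity[i]
             + c1 * random.random() * (personal_best[i] - x)
             + c2 * random.random() * (global_best[i] - x))
        if i < N_FEATURES:  # binary feature bit
            prob = 1.0 / (1.0 + math.exp(-v))
            x = 1 if random.random() < prob else 0
        else:               # integer hidden-layer width
            x = min(MAX_UNITS, max(0, round(x + v)))
        new_pos.append(x)
        new_vel.append(v)
    return new_pos, new_vel

pos, vel = new_particle()
best = pos[:]                  # stand-in for the particle's personal best
gbest = new_particle()[0]      # stand-in for the swarm's global best
pos, vel = update(pos, vel, best, gbest)
print(pos)
```

Because features and architecture share one position vector, a single swarm explores both choices at once, which is the essence of combining the two searches rather than fixing the topology in advance.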
The algorithm performs feature selection and evaluates each candidate neural network topology to determine its quality. Given a data set, the proposed algorithm thus yields the optimal features of that data set together with an optimal neural network classifier, with optimal parameters, for those features.
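A quality evaluation of this kind typically rewards classification performance while penalizing the number of selected features and the network's size. The sketch below illustrates that trade-off; the penalty weights and the toy accuracy stand-in (a real implementation would train and validate the network on the selected features) are assumptions for illustration only.

```python
# Hypothetical fitness function for a candidate (feature mask, topology).
import random

random.seed(1)

def toy_accuracy(feature_mask, layer_widths):
    """Stand-in for training/validating a network on the selected
    features; returns a plausible validation accuracy."""
    if not any(feature_mask) or not any(layer_widths):
        return 0.0  # no features or no hidden units: useless model
    return random.uniform(0.6, 0.95)

def fitness(feature_mask, layer_widths, alpha=0.01, beta=0.001):
    """Reward accuracy, penalize feature count and total units, so the
    search prefers small networks that use few features."""
    acc = toy_accuracy(feature_mask, layer_widths)
    return acc - alpha * sum(feature_mask) - beta * sum(layer_widths)

score = fitness([1, 0, 1, 1], [16, 8])
print(round(score, 3))
```

With such a fitness function, the swarm's global best converges toward candidates that balance accuracy against classifier size, matching the paper's goal of avoiding oversized models.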