ISBN-13: 9783639843217 / English / Paperback / 2015 / 60 pp.
Nowadays there are many learning algorithms with which neural networks can be trained. One well-known example is the Cascade Correlation algorithm, which produces networks with a cascade architecture. In Cascade Correlation, the weights of each neuron are updated iteratively by gradient descent. The "time for learning" of such networks could be shortened by computing the weights of a newly inserted hidden neuron directly, using a logit transformation and the least squares method; computing the weights directly is expected to make learning faster. The performance of different versions of this new algorithm is evaluated by comparing the mean squared activity error as new hidden units are added during training, on a set of classification benchmarks such as the two-spirals problem.
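The core idea of the direct weight computation can be sketched as follows: for a sigmoid hidden unit, the desired activations in (0,1) are mapped through the logit (the inverse sigmoid), which turns the nonlinear fitting problem into a linear one that ordinary least squares can solve in a single step. This is a minimal illustrative sketch, not the book's implementation; all function names and the toy data are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def direct_weights(X, y, eps=1e-6):
    """Hypothetical helper: solve sigmoid(X @ w) ≈ y for w
    via logit transformation and least squares, instead of
    iterative gradient descent."""
    y = np.clip(y, eps, 1.0 - eps)     # keep the logit finite
    z = np.log(y / (1.0 - y))          # logit transformation
    w, *_ = np.linalg.lstsq(X, z, rcond=None)
    return w

# Toy check: recover known weights from noiseless sigmoid outputs.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(50, 2)), np.ones((50, 1))])  # inputs + bias column
w_true = np.array([1.5, -2.0, 0.5])
y = sigmoid(X @ w_true)
w_est = direct_weights(X, y)
print(np.allclose(w_est, w_true, atol=1e-4))
```

Because the targets here are exact sigmoid outputs, the logit linearization is exact and least squares recovers the weights in one shot; with noisy targets the result is only an approximation, which is why comparing the resulting mean squared error against gradient descent is the natural evaluation.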