There are many learning algorithms with which neural networks can be trained. One well-known algorithm is the Cascade Correlation algorithm, which produces networks with a cascade architecture. In Cascade Correlation, the weights of each neuron are updated iteratively by gradient descent. The training time of such networks could be reduced by computing the weights of a newly inserted hidden neuron directly, using a logit transformation followed by a least squares fit. It is expected that this direct weight calculation will make learning faster. Performance...
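To illustrate the idea of replacing iterative weight updates with a direct calculation, the following is a minimal sketch of fitting the input weights of a single sigmoid hidden unit via the logit transformation and ordinary least squares. The function name, the use of NumPy, and the choice of a sigmoid activation are assumptions for illustration, not the exact procedure of the paper.

```python
import numpy as np

def fit_hidden_unit_weights(X, t, eps=1e-6):
    """Directly estimate the weights of a sigmoid hidden unit.

    X : (n_samples, n_inputs) inputs to the new unit (in a cascade
        architecture this would include the outputs of previously
        installed hidden units); a bias column is appended internally.
    t : (n_samples,) desired activations of the new unit, in (0, 1).

    Instead of iterating gradient descent, the desired activations are
    mapped through the logit (inverse sigmoid) and the weights are
    obtained from a single ordinary least squares fit.
    """
    t = np.clip(t, eps, 1.0 - eps)                  # keep the logit finite
    z = np.log(t / (1.0 - t))                       # logit transformation
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias column
    w, *_ = np.linalg.lstsq(Xb, z, rcond=None)      # least squares solution
    return w

# Usage sketch: recover the weights of a synthetic sigmoid unit.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5, 0.3])            # last entry is the bias
t = 1.0 / (1.0 + np.exp(-(X @ w_true[:3] + w_true[3])))
print(fit_hidden_unit_weights(X, t))                # approximately w_true
```

Because the least squares problem is solved in closed form, this step avoids the repeated passes over the training data that iterative gradient descent requires, which is the source of the expected speed-up.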