ISBN-13: 9786200240651 / English
Sigmoidal feedforward neural networks with one or more hidden layers are among the most popular methods for solving learning problems. This book proposes a new approach to training sigmoidal feedforward networks based on a random learning rate and momentum constant. Random weight initialization is generally recommended because, if the weights and thresholds are initialized to equal values, they tend to move together during training, which restricts the network's degrees of freedom (the number of distinct weights and thresholds); random initial weights allow every weight to take a different value. Since the curvature of the error surface is not uniform, the symmetry-breaking capability of a random learning rate and momentum constant, with or without random weight initialization, is found to drive training toward lower minima of the error function. The aim of the book is therefore to identify optimal choices for random learning rates and momentum constants.
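To illustrate the idea, the following is a minimal sketch, not the book's actual method: a small sigmoidal network trained by backpropagation with momentum, where the learning rate and momentum constant are redrawn at random each epoch. The network size, the XOR task, and the uniform sampling ranges for the learning rate and momentum are all assumptions chosen for the example.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy task (assumption): XOR with a 2-3-1 sigmoidal network.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
n_in, n_hid = 2, 3

# Random weight initialization breaks symmetry: every weight starts distinct.
# Each row holds n_in input weights plus a trailing bias (threshold).
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)] for _ in range(n_hid)]
w_out = [random.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]
v_hid = [[0.0] * (n_in + 1) for _ in range(n_hid)]  # momentum terms
v_out = [0.0] * (n_hid + 1)

def forward(x):
    h = [sigmoid(sum(w[i] * xi for i, xi in enumerate(x)) + w[-1]) for w in w_hid]
    o = sigmoid(sum(w_out[j] * hj for j, hj in enumerate(h)) + w_out[-1])
    return h, o

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mse()
for epoch in range(3000):
    # Redraw the learning rate and momentum constant every epoch.
    # The uniform ranges below are illustrative assumptions, not the
    # book's prescribed choice.
    eta = random.uniform(0.1, 0.9)
    alpha = random.uniform(0.0, 0.9)
    for x, t in data:
        h, o = forward(x)
        delta_o = (o - t) * o * (1 - o)
        delta_h = [delta_o * w_out[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
        # Momentum update for output-layer weights and bias.
        for j in range(n_hid):
            v_out[j] = alpha * v_out[j] - eta * delta_o * h[j]
            w_out[j] += v_out[j]
        v_out[-1] = alpha * v_out[-1] - eta * delta_o
        w_out[-1] += v_out[-1]
        # Momentum update for hidden-layer weights and biases.
        for j in range(n_hid):
            for i in range(n_in):
                v_hid[j][i] = alpha * v_hid[j][i] - eta * delta_h[j] * x[i]
                w_hid[j][i] += v_hid[j][i]
            v_hid[j][-1] = alpha * v_hid[j][-1] - eta * delta_h[j]
            w_hid[j][-1] += v_hid[j][-1]

loss_after = mse()
print(f"MSE before: {loss_before:.4f}, after: {loss_after:.4f}")
```

In this sketch the per-epoch randomness in `eta` and `alpha` plays the same symmetry-breaking role the book attributes to random learning rates and momentum constants; how the sampling ranges should actually be chosen is the subject of the book itself.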