"Neural Networks and Statistical Learning by Ke-Lin Du and M. N. S. Swamy can be seen as a central reference point for the mathematical understanding and implementation of the core ideas of neuronal networks and statistical learning techniques." (Jan Pablo Burgard, SIAM Review, Vol. 62 (4), 2020)
Introduction.- Fundamentals of Machine Learning.- Perceptrons.- Multilayer perceptrons: architecture and error backpropagation.- Multilayer perceptrons: other learning techniques.- Hopfield networks, simulated annealing and chaotic neural networks.- Associative memory networks.- Clustering I: Basic clustering models and algorithms.- Clustering II: topics in clustering.- Radial basis function networks.- Recurrent neural networks.- Principal component analysis.- Nonnegative matrix factorization and compressed sensing.- Independent component analysis.- Discriminant analysis.- Support vector machines.- Other kernel methods.- Reinforcement learning.- Probabilistic and Bayesian networks.- Combining multiple learners: data fusion and ensemble learning.- Introduction to fuzzy sets and logic.- Neurofuzzy systems.- Neural circuits.- Pattern recognition for biometrics and bioinformatics.- Data mining.- Appendix A. Mathematical preliminaries.- Appendix B. Benchmarks and resources.
Ke-Lin Du is currently the founder and CEO at Xonlink Inc., China. He is also an Affiliate Associate Professor at the Department of Electrical and Computer Engineering, Concordia University, Canada. In the past, he held positions at Huawei Technologies, the China Academy of Telecommunication Technology, the Chinese University of Hong Kong, the Hong Kong University of Science and Technology, Concordia University, and Enjoyor Inc. He has published four books and over 50 papers, and filed over 30 patents. A Senior Member of the IEEE, his current research interests include signal processing, neural networks, intelligent systems, and wireless communications.
M. N. S. Swamy is currently a Research Professor and holder of the Concordia Tier I Research Chair in Signal Processing at the Department of Electrical and Computer Engineering, Concordia University, where he was Dean of the Faculty of Engineering and Computer Science from 1977 to 1993 and the founding Chair of the EE department. He has published extensively in the areas of circuits, systems and signal processing, has co-authored nine books, and holds five patents. Professor Swamy is a Fellow of the IEEE, the IET (UK) and the EIC (Canada), and has received many IEEE-CAS awards, including the Guillemin-Cauer Award in 1986, as well as the Education Award and the Golden Jubilee Medal, both in 2000. He has been the Editor-in-Chief of the journal Circuits, Systems and Signal Processing (CSSP) since 1999. Recently, CSSP instituted a Best Paper Award in his name.
This book provides a broad yet detailed introduction to neural networks and machine learning in a statistical framework. A single, comprehensive resource for study and further research, it explores the major popular neural network models and statistical learning approaches, with examples and exercises that allow readers to gain a practical working understanding of the content. This updated edition presents recently published results and includes six new chapters covering recent advances in computational learning theory, sparse coding, deep learning, big data and cloud computing.
Each chapter features state-of-the-art descriptions and significant research findings. The topics covered include:
• multilayer perceptrons
• the Hopfield network
• associative memory models
• clustering models and algorithms
• the radial basis function network
• recurrent neural networks
• nonnegative matrix factorization
• independent component analysis
• probabilistic and Bayesian networks
• fuzzy sets and logic
Focusing on prominent accomplishments and their practical aspects, this book provides academic and technical staff, as well as graduate students and researchers, with a solid foundation and comprehensive reference for the fields of neural networks, pattern recognition, signal processing, and machine learning.