To physicists and engineers, the real world is perceived and broken down as data, models and algorithms. Data is noisy by nature, and classical statistical tools have so far been successful in dealing with relatively small levels of randomness. The recent emergence of Big Data, and the computing power required to analyse it, have rendered classical tools outdated and insufficient. Tools such as random matrix theory and the study of large sample covariance matrices can efficiently process these big data sets and help make sense of modern deep-learning algorithms. Presenting an...