EUSIPCO'18 tutorial

I will be giving a tutorial at the 26th European Signal Processing Conference (EUSIPCO'18) in Rome, the Eternal City, Italy, together with my Ph.D. supervisor Prof. Romain Couillet and my colleague Xiaoyi Mai on the topic of “Random Matrix Advances in Machine Learning and Neural Nets”.

For more information, please visit the EUSIPCO'18 website.

Abstract

The advent of the Big Data era has triggered a renewed interest in machine learning and (deep) neural networks. These methods, however, suffer from a double plague: (i) as they involve nonlinear operators, they are difficult to fathom and offer few guarantees, limits, or means of hyperparameter control, and (ii) they were often developed from small dimensional intuitions and tend to be inefficient when dealing with large dimensional datasets. Recent advances in random matrix theory manage to deal with both problems at once: by assuming that both the dimension and the size of the datasets are large, concentration phenomena arise that allow for a renewed understanding of machine learning approaches and the possibility to control and improve them, sometimes opening the door to completely new paradigms.
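
To give a flavour of the concentration phenomena mentioned above, here is a minimal NumPy sketch (a toy example of my own, not taken from the tutorial material, assuming i.i.d. standard Gaussian data with identity covariance): once the dimension p and the sample size n are both large with ratio c = p/n, the eigenvalues of the sample covariance matrix no longer spread arbitrarily but settle on the Marchenko–Pastur support [(1 − √c)², (1 + √c)²].

```python
# Toy illustration of eigenvalue concentration in the large p, large n regime.
# Assumes i.i.d. standard Gaussian data (identity population covariance).
import numpy as np

rng = np.random.default_rng(0)
p, n = 500, 2000                  # dimension and sample size, both "large"
c = p / n                         # aspect ratio p/n

X = rng.standard_normal((p, n))           # p x n data matrix
eigvals = np.linalg.eigvalsh(X @ X.T / n)  # eigenvalues of the sample covariance

print(f"empirical eigenvalue range: [{eigvals.min():.3f}, {eigvals.max():.3f}]")
print(f"Marchenko-Pastur support:   "
      f"[{(1 - np.sqrt(c))**2:.3f}, {(1 + np.sqrt(c))**2:.3f}]")
```

For p = 500 and n = 2000 (c = 0.25), the empirical extreme eigenvalues land very close to the predicted edges 0.25 and 2.25, even though each individual entry of the sample covariance still fluctuates.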

The objective of the tutorial is twofold. It will first provide a simple and didactic introduction to the basic notions of random matrix theory, for the audience to get accustomed to the insights and necessary tools of the domain (∼1h). A second, longer part (∼2h) will cover recent advances in the application of random matrix theory to machine learning (kernel methods, classification and clustering, semi-supervised learning, etc.) as well as to neural networks (random features and extreme learning machines, backpropagation dynamics). In the end, the audience will get a good grasp of the non-trivial phenomena arising when dealing with large dimensional datasets and of the solutions and methods offered by random matrix theory to embrace large dimensional machine learning.
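
As a small teaser of these non-trivial phenomena (again a toy sketch of my own, assuming i.i.d. standard Gaussian data, not material from the slides): the normalized pairwise squared distances ‖xᵢ − xⱼ‖²/p between independent p-dimensional points all concentrate around the same constant as p grows, so the off-diagonal entries of a Gaussian kernel matrix become nearly identical and the classical small dimensional intuition behind kernel methods breaks down.

```python
# Toy check of distance concentration in high dimension.
# Assumes n independent points with i.i.d. standard Gaussian entries.
import numpy as np

rng = np.random.default_rng(1)
n = 200
for p in (10, 100, 1000, 10000):
    X = rng.standard_normal((n, p))          # n points in dimension p
    G = X @ X.T / p                          # normalized Gram matrix
    d = np.diag(G)
    sq = d[:, None] + d[None, :] - 2 * G     # normalized squared distances
    off = sq[~np.eye(n, dtype=bool)]         # keep off-diagonal entries only
    print(f"p={p:6d}: mean={off.mean():.3f}, std={off.std():.4f}")
```

The mean stays near 2 while the standard deviation shrinks as p grows, i.e. all pairwise distances look essentially the same in high dimension.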

Please find the slides here.