My Erdős number is 4, via Romain Couillet, Zhidong Bai, and Gutti Jogesh Babu.
I am looking for self-motivated research interns with a strong background in math/stats and a general interest in machine learning, on the following topics:
A Random Matrix Approach to Graph Convolutional Networks: This project aims to study the theoretical properties of the popular graph convolutional networks (GCNs) with random matrix theory (RMT) and, eventually, to improve their practical implementation.
So excited to announce the 1st Workshop on High-dimensional Learning Dynamics (HiLD Workshop) at ICML 2023, Honolulu, Hawaii!
The workshop aims to bring together experts from random matrix theory, optimization, high-dimensional statistics/probability, and statistical physics to share their perspectives, and to connect them with experts working at the crossover with machine learning.
We will have Sanjeev Arora, SueYeon Chung, Murat A. Erdogdu, Surya Ganguli, and Andrea Montanari presenting their insights on high-dimensional learning dynamics.
Please come and submit your papers! See the Call for Papers here.
I will be talking about the interface between RMT and ML at the School of Physical & Mathematical Sciences, Nanyang Technological University (NTU); see slides here.
I will be talking about some recent work on the interaction between RMT and machine learning at the SDS Workshop on “Topics in Random Matrix Theory” at CUHK-Shenzhen; see more details here!
One paper at NeurIPS'2022 on the eigenspectral structure of the Neural Tangent Kernel (NTK) of fully-connected deep neural networks for Gaussian mixture input data, with a compelling application to “lossless” sparsification and quantization of DNN models! This extends our previous paper at ICLR'2022. See more details here.