My Erdős number is 4, via Romain Couillet, Zhidong Bai, and Gutti Jogesh Babu.
News
Slides for a short course on “Random Matrix Theory and Its Applications in ML” @ Jiangsu Normal University: part 1 on Random Matrix Theory, and part 2 on Deep Learning Applications. See the recordings: part 1 and part 2.
Happy to share some of my thoughts on the interplay between deep learning (theory), random matrix theory, and high-dimensional statistics at Northeast Normal University; see the slides here.
Excited to announce the 1st Workshop on High-dimensional Learning Dynamics (HiLD Workshop) at ICML 2023, Honolulu, Hawaii!
The workshop aims to bring together experts from random matrix theory, optimization, high-dimensional statistics/probability, and statistical physics to share their perspectives and to foster exchange with the machine learning community.
Sanjeev Arora, SueYeon Chung, Murat A. Erdogdu, Surya Ganguli, and Andrea Montanari will present their insights on high-dimensional learning dynamics.
Please come and submit your papers! See the Call for Papers here and the recording here!
One paper at NeurIPS'2022 on the eigenspectral structure of the Neural Tangent Kernel (NTK) of fully-connected deep neural networks for Gaussian mixture input data, with a compelling application to “lossless” sparsification and quantization of DNN models! This extends our previous paper at ICLR'2022. See more details here.
Upcoming book “Random Matrix Methods for Machine Learning” with Cambridge University Press; see more details here!
One paper at ICLR'2022 on an incredibly efficient and theoretically guaranteed random-feature compression technique; see more details here.
I’m very grateful to be supported by the CCF-Hikvision Open Fund for my research on the theory of neural network model compression (together with Dr. Kai Wan); see more details here.
One oral paper (top 1% of submissions) at NeurIPS'2021 on the Hessian eigenspectra of generalized linear models; check our preprint here!
One paper on “Sparse sketches with small inversion bias” accepted at COLT'2021; check our preprint here!
For more information, please refer to my detailed CV.