[October 2024] I will give a talk on “Understanding and Scaling Large and Deep Neural Networks or Random Matrix Theory for Extremely Large-Scale ML” at Shanghai Jiao Tong University. See slides here (please use Adobe Reader for the animations).
[August 2024] I will be serving as an Area Chair of ICLR 2025.
[July 2024] Short course at IRIT (UT3 campus, 118 Route de Narbonne, Toulouse, France), July 1st to 4th, 2024. See slides here: Part 1, Part 2, Part 3, and Part 4.
[June 2024] One paper at EUSIPCO'2024 on a novel and efficient RMT-improved Direction-of-Arrival (DoA) estimation method. We show that the standard ESPRIT (Estimation of Signal Parameters via Rotational Invariance Techniques) approach is biased in the large-array regime, but can be effectively debiased using RMT. The paper is listed as a Best Student Paper Candidate at EUSIPCO'2024! Check our preprint here.
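For context, here is a minimal sketch of the *standard* least-squares ESPRIT estimator on a half-wavelength uniform linear array, i.e., the classical baseline the item above says is biased for large arrays. It is not the RMT-debiased method from the paper, and all parameter values (array size, snapshot count, angles, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M, N, K = 10, 200, 2                     # sensors, snapshots, sources
true_deg = np.array([-20.0, 30.0])       # true directions of arrival (degrees)

# Steering matrix of a half-wavelength-spaced ULA: a_m(theta) = exp(j*pi*m*sin(theta))
m = np.arange(M)[:, None]
A = np.exp(1j * np.pi * m * np.sin(np.radians(true_deg))[None, :])

# Complex Gaussian source signals plus additive noise (roughly 20 dB SNR)
S = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = A @ S + noise

# Sample covariance and its K-dimensional signal subspace
R = X @ X.conj().T / N
eigvals, eigvecs = np.linalg.eigh(R)     # eigenvalues in ascending order
Us = eigvecs[:, -K:]                     # top-K eigenvectors span the signal subspace

# ESPRIT exploits rotational invariance between the two overlapping subarrays:
# Us[1:] ~ Us[:-1] @ Psi, whose eigenvalues carry the phases pi*sin(theta_k)
Psi = np.linalg.pinv(Us[:-1]) @ Us[1:]
phases = np.angle(np.linalg.eigvals(Psi))
est_deg = np.sort(np.degrees(np.arcsin(phases / np.pi)))
print(est_deg)   # close to the true angles [-20, 30] at this SNR
```

At this modest array size and high SNR the least-squares estimate is accurate; the bias analyzed in the paper emerges when the number of sensors grows comparably to the number of snapshots, which is precisely the regime where RMT corrections apply.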
[May 2024] One paper at ICML'2024 on RMT for Deep Equilibrium Models (DEQs, a typical family of implicit NN models), which provides very explicit connections between implicit and explicit NNs. Given a DEQ, check our preprint here for the recipe to design an “equivalent” simple explicit NN!
[Apr 2024] Our paper on an achievable analytic solution to the Information Bottleneck for Gaussian mixtures will be presented at ISIT'2024; check here for details!
[Jan 2023] I am very grateful to be supported by the National Natural Science Foundation of China (Youth Program) for my research on the Fundamental Limit of Pruning Deep Neural Network Models via Random Matrix Methods.
[Aug 2022] One paper at NeurIPS'2022 on the eigenspectral structure of the Neural Tangent Kernel (NTK) of fully-connected deep neural networks with Gaussian mixture input data, with a compelling application to “lossless” sparsification and quantization of DNN models! This extends our previous paper at ICLR'2022. See more details here.
[Jun 2022] Our book “Random Matrix Methods for Machine Learning” is published by Cambridge University Press; see more details here!
For more information please refer to my detailed CV.