[Jan 2025] I will be giving a talk on “Examples and Counterexamples of Gaussian Universality in Large-dimensional Machine Learning” at the RMTA 2025 Conference in Changchun, Jilin, China. See the slides here.
[Dec 2024] I will be serving as an Area Chair for IJCNN 2025.
[Nov 2024] I will be serving as an Area Chair for ICML 2025.
[Oct 2024] Our work on communication-efficient and robust federated domain adaptation using the random features technique has been accepted at IEEE Transactions on Knowledge and Data Engineering (TKDE). Check out our preprint here and code here!
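For readers unfamiliar with random features: the idea is to replace an implicit kernel by an explicit, low-dimensional random feature map, so that kernel evaluations reduce to inner products of short vectors (which is also what makes the technique communication-friendly). Below is a minimal sketch of classical random Fourier features (Rahimi & Recht, 2007) for the Gaussian kernel; it illustrates the generic technique only, not the federated method of the paper, and all dimensions are arbitrary choices.

```python
# Minimal sketch of the random features technique: approximate the Gaussian
# kernel k(x, y) = exp(-||x - y||^2 / 2) by an explicit random feature map
# (random Fourier features). Illustration only; sizes and bandwidth are
# arbitrary assumptions, not those of the paper.
import numpy as np

rng = np.random.default_rng(0)
d, D = 50, 2000                          # input dimension, number of random features

W = rng.standard_normal((D, d))          # random frequencies ~ N(0, I)
b = rng.uniform(0, 2 * np.pi, size=D)    # random phases

def phi(X):
    """Map inputs of shape (n, d) to random features of shape (n, D)."""
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

X = rng.standard_normal((5, d)) / np.sqrt(d)   # a few unit-scale sample points
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K_rf = phi(X) @ phi(X).T                 # kernel approximated by inner products

print(np.abs(K_exact - K_rf).max())      # approximation error decays as 1/sqrt(D)
```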
[Oct 2024] Our work on efficient high-dimensional approximate nearest neighbor (ANN) search has been accepted at SIGMOD'2025. Check out our preprint here and code here!
[Aug 2024] I will be serving as an Area Chair for ICLR 2025.
[Jun 2024] One paper at EUSIPCO'2024 on a novel and efficient RMT-improved direction-of-arrival (DoA) estimation method. We show that the standard ESPRIT (Estimation of Signal Parameters via Rotational Invariance Techniques) approach is biased for large arrays, but can be effectively debiased using RMT. The paper was selected as a Best Student Paper Candidate at EUSIPCO'2024! Check out the extended version here.
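For context, here is a minimal numpy sketch of the standard ESPRIT baseline on a uniform linear array, i.e., the estimator whose large-array bias the paper corrects. The array geometry, sizes, and SNR below are illustrative assumptions, and the RMT debiasing step itself is in the paper, not reproduced here.

```python
# Minimal sketch of standard ESPRIT for direction-of-arrival estimation on a
# uniform linear array with half-wavelength spacing. Illustration only.
import numpy as np

rng = np.random.default_rng(1)
M, T = 20, 100                           # number of sensors, number of snapshots
thetas = np.deg2rad([-10.0, 15.0])       # true DoAs
K = len(thetas)

# Steering matrix: a(theta)_m = exp(j * pi * m * sin(theta))
m = np.arange(M)[:, None]
A = np.exp(1j * np.pi * m * np.sin(thetas)[None, :])

# Snapshots X = A S + N, with unit-power noise and 3 dB source signals
S = rng.standard_normal((K, T)) + 1j * rng.standard_normal((K, T))
N = (rng.standard_normal((M, T)) + 1j * rng.standard_normal((M, T))) / np.sqrt(2)
X = A @ S + N

# Signal subspace: K dominant eigenvectors of the sample covariance matrix
R = X @ X.conj().T / T
_, eigvecs = np.linalg.eigh(R)           # eigenvalues in ascending order
Us = eigvecs[:, -K:]

# Rotational invariance: Us[1:] ~ Us[:-1] @ Psi, solved in least squares;
# the eigenvalues of Psi are exp(j * pi * sin(theta_k))
Psi = np.linalg.pinv(Us[:-1]) @ Us[1:]
est = np.rad2deg(np.arcsin(np.angle(np.linalg.eigvals(Psi)) / np.pi))
print(np.sort(est))                      # classical estimates; biased when M and T are comparable
```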
[May 2024] One paper at ICML'2024 on RMT for the Deep Equilibrium Model (DEQ, a typical implicit NN model), which provides explicit connections between implicit and explicit NNs. Given a DEQ, check out our preprint here for the recipe to design an “equivalent” simple explicit NN!
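As a reminder of what a DEQ computes: instead of stacking layers, an implicit model defines its feature as the fixed point of a single layer map. The toy sketch below is my illustration, not the paper's model; the weight scaling is chosen so the layer map is a contraction, and the sizes are arbitrary.

```python
# Toy Deep Equilibrium (implicit) model: the feature z* solves the fixed-point
# equation z = tanh(W z + U x). Sizes and scalings are illustrative; W is
# shrunk so the map is a contraction and the fixed point is unique.
import numpy as np

rng = np.random.default_rng(2)
d_in, d_z = 10, 64

W = rng.standard_normal((d_z, d_z)) * 0.4 / np.sqrt(d_z)   # spectral norm < 1
U = rng.standard_normal((d_z, d_in)) / np.sqrt(d_in)

def deq_forward(x, n_iter=200, tol=1e-10):
    """Solve z* = tanh(W z* + U x) by plain fixed-point iteration."""
    z = np.zeros(d_z)
    for _ in range(n_iter):
        z_next = np.tanh(W @ z + U @ x)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z

x = rng.standard_normal(d_in)
z_star = deq_forward(x)
print(np.linalg.norm(z_star - np.tanh(W @ z_star + U @ x)))  # ~0: fixed point reached
```

The fixed point z* plays the role of the feature of an “infinitely deep” weight-tied network; the paper's recipe then produces a simple explicit NN that is “equivalent” to such an implicit model.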
[Jan 2023] I’m very grateful to be supported by the National Natural Science Foundation of China (Youth Program) for my research on the Fundamental Limit of Pruning Deep Neural Network Models via Random Matrix Methods.
[Aug 2022] One paper at NeurIPS'2022 on the eigenspectral structure of the Neural Tangent Kernel (NTK) of fully-connected deep neural networks with Gaussian mixture input data, with a compelling application to “lossless” sparsification and quantization of DNN models! This extends our previous paper at ICLR'2022. See more details here.
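To make the object concrete, the sketch below computes the empirical NTK Gram matrix of a one-hidden-layer ReLU network on two-class Gaussian mixture data and inspects its top eigenvalues. This is purely illustrative: the paper treats deep fully-connected networks, and the architecture, widths, and normalizations here are my assumptions.

```python
# Empirical NTK Gram matrix of a one-hidden-layer ReLU network
# f(x) = a^T relu(W x) / sqrt(m), on two-class Gaussian mixture data.
# All sizes and normalizations are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
d, m, n = 100, 500, 200                  # input dim, hidden width, sample size

mu = np.ones(d) / np.sqrt(d)             # class mean, unit norm
y = 2 * rng.integers(0, 2, n) - 1        # labels in {-1, +1}
X = y[:, None] * mu[None, :] + rng.standard_normal((n, d)) / np.sqrt(d)

W = rng.standard_normal((m, d))          # hidden-layer weights
a = rng.standard_normal(m)               # output-layer weights

# NTK(x, x') = <grad_theta f(x), grad_theta f(x')>, summed over both layers:
#   (1/m) relu(Wx)^T relu(Wx') + (1/m) sum_i a_i^2 1{w_i.x>0} 1{w_i.x'>0} x^T x'
Z = X @ W.T                              # pre-activations, shape (n, m)
H = np.maximum(Z, 0)                     # relu(W x)
D = (Z > 0).astype(float)                # relu'(W x)
K = (H @ H.T + ((D * a) @ (D * a).T) * (X @ X.T)) / m

eigs = np.linalg.eigvalsh(K)
print(eigs[-5:])                         # dominant eigenvalues of the empirical NTK
```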
[Jun 2022] Our book “Random Matrix Methods for Machine Learning” with Cambridge University Press is out; see more details here!