One paper at NeurIPS'2022 on the eigenspectral structure of the Neural Tangent Kernel (NTK) of fully-connected deep neural networks for Gaussian mixture input data, with a compelling application to “lossless” sparsification and quantization of DNN models! This extends our previous paper at ICLR'2022. See more details here.
Upcoming book “Random Matrix Methods for Machine Learning” with Cambridge University Press, see more details here!
One paper at ICLR'2022 on a highly efficient and theoretically guaranteed random feature compression technique; see more details here.
I’m very grateful to be supported by the CCF-Hikvision Open Fund for my research on the theory of neural network model compression (together with Dr. Kai Wan); see more details here.
For more information please refer to my detailed CV.