NSFC62206101: Fundamental Limits of Pruning Deep Neural Network Models via Random Matrix Methods
This project (2023.01–2025.12), which I lead, focuses on the fundamental theoretical limits of pruning and quantization of deep neural networks. Its objective is to develop, using mathematical tools from random matrix theory, high-dimensional statistics, and optimization theory, a quantitative theory characterizing the “performance–complexity tradeoff” in modern deep neural networks. This project has led to the following scientific publications:

Z. Ling, L. Li, Z. Feng, Y. Zhang, F. Zhou, R. C. Qiu, Z. Liao, “Deep Equilibrium Models are Almost Equivalent to Not-so-deep Explicit Models for High-dimensional Gaussian Mixtures”, 2024.

Y. Du, Z. Ling, R. C. Qiu, Z. Liao, “High-dimensional Learning Dynamics of Deep Neural Nets in the Neural Tangent Regime”, High-dimensional Learning Dynamics Workshop, The Fortieth International Conference on Machine Learning (ICML'2023), 2023.

Z. Ling, Z. Liao, R. C. Qiu, “On the Equivalence Between Implicit and Explicit Neural Networks: A High-dimensional Viewpoint”, High-dimensional Learning Dynamics Workshop, The Fortieth International Conference on Machine Learning (ICML'2023), 2023.

J. Wang, S. Zhang, J. Cai, Z. Liao, C. Arenz, R. Betzholz, “Robustness of random-control quantum-state tomography”, Physical Review A, 108(2), 022408, 2023. preprint

Y. Chitour, Z. Liao, R. Couillet, “A geometric approach of gradient descent algorithms in linear neural networks”, Mathematical Control and Related Fields, 13(3) (2023), 918–945. preprint

L. Gu, Y. Du, Y. Zhang, D. Xie, S. Pu, R. C. Qiu, Z. Liao, ““Lossless” Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach”, The 36th Conference on Neural Information Processing Systems (NeurIPS'2022), 2022.
CCF-Hikvision Open Fund 20210008: Random Matrix Theory and Information Bottleneck for Neural Network Compression
This project, co-led by Prof. Kai Wan and myself as PIs, investigates efficient compression schemes for large-scale neural network models with strong theoretical guarantees. The project has led to the following scientific publications:

H. Tiomoko, Z. Liao, R. Couillet, “Random matrices in service of ML footprint: ternary random features with no performance loss”, The Tenth International Conference on Learning Representations (ICLR'2022), 2022. preprint

L. Gu, Y. Du, Y. Zhang, D. Xie, S. Pu, R. C. Qiu, Z. Liao, ““Lossless” Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach”, The 36th Conference on Neural Information Processing Systems (NeurIPS'2022), 2022.

Y. Song, K. Wan, Z. Liao, G. Caire, “An Achievable and Analytic Solution to Information Bottleneck for Gaussian Mixtures”, 2023.
See more details of the project (in Chinese) here.
NSFC12141107: Mathematical Theory and Methods for Reconfigurable Intelligent Surface (RIS)-Assisted Wireless Communication
This project (2022.01–2025.12), led by Prof. R. C. Qiu, investigates the information-theoretic limits of RIS-assisted wireless communication systems, dynamical systems, and the theory of large-dimensional random matrices.
See the project homepage here.