Projects

NSFC-62206101: Fundamental Limits of Pruning Deep Neural Network Models via Random Matrix Methods

This project (2023.01-2025.12) is led by me and focuses on the fundamental theoretical limits of pruning and quantization of deep neural networks. Its objective is to develop mathematical tools from random matrix theory, high-dimensional statistics, and optimization theory into a quantitative theory that characterizes the “performance–complexity tradeoff” of modern deep neural nets. The project has led to the following scientific publications:

  1. Z. Ling, L. Li, Z. Feng, Y. Zhang, F. Zhou, R. C. Qiu, Z. Liao, “Deep Equilibrium Models are Almost Equivalent to Not-so-deep Explicit Models for High-dimensional Gaussian Mixtures”, 2024.

  2. Y. Du, Z. Ling, R. C. Qiu, Z. Liao, “High-dimensional Learning Dynamics of Deep Neural Nets in the Neural Tangent Regime”, High-dimensional Learning Dynamics Workshop, The Fortieth International Conference on Machine Learning (ICML'2023), 2023.

  3. Z. Ling, Z. Liao, R. C. Qiu, “On the Equivalence Between Implicit and Explicit Neural Networks: A High-dimensional Viewpoint”, High-dimensional Learning Dynamics Workshop, The Fortieth International Conference on Machine Learning (ICML'2023), 2023.

  4. J. Wang, S. Zhang, J. Cai, Z. Liao, C. Arenz, R. Betzholz, “Robustness of random-control quantum-state tomography”, Physical Review A, 108(2), 022408, 2023. preprint

  5. Y. Chitour, Z. Liao, R. Couillet, “A geometric approach of gradient descent algorithms in linear neural networks”, Mathematical Control and Related Fields, 13(3), 918–945, 2023. preprint

  6. L. Gu, Y. Du, Y. Zhang, D. Xie, S. Pu, R. C. Qiu, Z. Liao, ““Lossless” Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach”, The 36th Conference on Neural Information Processing Systems (NeurIPS'2022), 2022.


CCF-Hikvision Open Fund 20210008: Random Matrix Theory and Information Bottleneck for Neural Network Compression

This project is co-led by Prof. Kai Wan and me as PIs, and investigates efficient compression schemes for large-scale neural network models with strong theoretical guarantees. The project has led to the following scientific publications:

  1. H. Tiomoko, Z. Liao, R. Couillet, “Random matrices in service of ML footprint: ternary random features with no performance loss”, The Tenth International Conference on Learning Representations (ICLR'2022), 2022. preprint

  2. L. Gu, Y. Du, Y. Zhang, D. Xie, S. Pu, R. C. Qiu, Z. Liao, ““Lossless” Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach”, The 36th Conference on Neural Information Processing Systems (NeurIPS'2022), 2022.

  3. Y. Song, K. Wan, Z. Liao, G. Caire, “An Achievable and Analytic Solution to Information Bottleneck for Gaussian Mixtures”, 2023.

More details on the project are available (in Chinese) here.


NSFC-12141107: Mathematical Theory and Methods for Reconfigurable Intelligent Surface (RIS)-Assisted Wireless Communication

This project (2022.01-2025.12) is led by Prof. R. C. Qiu and investigates the information-theoretic limits of RIS-assisted wireless communication systems, the associated dynamical systems, and the theory of large-dimensional random matrices.

See the project homepage here.