Home on Zhenyu Liao's Page (https://zhenyu-liao.github.io/)

EUSIPCO'18 tutorial (https://zhenyu-liao.github.io/posts/eusipco18/), 01 Sep 2018
I will be giving a tutorial at the 26th European Signal Processing Conference (EUSIPCO'18) in Rome, the Eternal City, Italy, together with my Ph.D. supervisor Prof. Romain Couillet and my colleague Xiaoyi Mai, on the topic of "Random Matrix Advances in Machine Learning and Neural Nets".
For more information, please visit EUSIPCO'18.
Abstract
The advent of the Big Data era has triggered a renewed interest in machine learning and (deep) neural networks.

GRETSI'17 (https://zhenyu-liao.github.io/posts/gretsi17/), 01 Sep 2018
I will be presenting our work on Random Feature Maps at Colloque GRETSI'17 this September at Juan-les-Pins, France. The slides (in French) are available here.

ICASSP'17 (https://zhenyu-liao.github.io/posts/icassp17/), 01 Sep 2018
I will be presenting my work on LS-SVM at the 42nd IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2017), New Orleans, USA.
Here is the link to the four-page conference paper. An extended journal version, which contains all proofs in detail, is available here.
The slides presented at ICASSP 2017 are available here.

ICASSP'19 (https://zhenyu-liao.github.io/posts/icassp19/), 01 Sep 2018
My colleague Xiaoyi Mai will be presenting our paper on high-dimensional logistic regression at ICASSP'19, 12-17 May, Brighton, UK.
The slides are available here.

ICML'18 (https://zhenyu-liao.github.io/posts/icml18/), 01 Sep 2018
I will be presenting our work on Random Feature-based Clustering as well as Gradient Descent Dynamics at ICML 2018 this July. Please find the two papers below:
[On the Spectrum of Random Features Maps of High Dimensional Data], with slides here.
[The Dynamics of Learning: A Random Matrix Approach], with slides here.

Ph.D. Mid-term (https://zhenyu-liao.github.io/posts/phd_mid/), 01 Sep 2018
I have just finished my Ph.D. mid-term evaluation at CentraleSupelec. Here are the links to my mid-term report as well as the slides.

Activities (https://zhenyu-liao.github.io/posts/activities/)
Talks:
Invited talk on "A Data-dependent Theory of Overparameterization: Phase Transition, Double Descent, and Beyond" at the Workshop on the Theory of Over-parameterized Machine Learning (TOPML) 2021, April 20-21, 2021. See the slides, two-page abstract, and paper.
Invited talk on "Performance-complexity Trade-off in Large Dimensional Spectral Clustering" at the Statistics Seminar, Research School of Finance, Actuarial Studies and Statistics, Australian National University, Canberra, March 4, 2021. See the slides here.
Invited talk on "Performance-complexity Trade-off in Large Dimensional Spectral Clustering", STA 290 Seminar, Department of Statistics, University of California, Davis, January 21, 2021.

Book (https://zhenyu-liao.github.io/book/)
Random Matrix Methods for Machine Learning, Romain Couillet and Zhenyu Liao, Cambridge University Press, 2022.
Online ordering: Cambridge University Press and Amazon. If you purchase on the CUP webpage above, a 20% discount flyer is available here.
Additional resources for readers: a pre-production version of the book is available here, with exercise solutions available here. MATLAB and Python codes to reproduce the figures in the book are publicly available in this repository.

Collaborators (https://zhenyu-liao.github.io/collaborators/)
I have been very fortunate to work with a number of great collaborators over the years.
Senior collaborators:
Prof. Romain Couillet: University Grenoble Alpes, Inria, CNRS, Grenoble INP, LIG, France. Holder of the UGA MIAI LargeDATA Chair.
Prof. Michael Mahoney: Department of Statistics, International Computer Science Institute (ICSI), and Lawrence Berkeley National Laboratory (LBNL) at UC Berkeley, USA. Director of the UC Berkeley FODA (Foundations of Data Analysis) Institute grant.

Projects (https://zhenyu-liao.github.io/projects/)
NSFC-62206101: Fundamental Limits of Pruning Deep Neural Network Models via Random Matrix Methods
This project (2023.01-2025.12) is led by myself and focuses on the fundamental theoretical limits of pruning, as well as quantization, of deep neural networks. Its objective is to propose a quantitative theory, built on the mathematical tools of random matrix theory, high-dimensional statistics, and optimization theory, that characterizes the "performance-complexity trade-off" in modern deep neural nets.

Publications (https://zhenyu-liao.github.io/publications/)
Conferences:
L. Gu, Y. Du, Y. Zhang, D. Xie, S. Pu, R. C. Qiu, Z. Liao, ""Lossless" Compression of Deep Neural Networks: A High-dimensional Neural Tangent Kernel Approach", The 36th Conference on Neural Information Processing Systems (NeurIPS'22), 2022. Preprint.
H. Tiomoko, Z. Liao, R. Couillet, "Random matrices in service of ML footprint: ternary random features with no performance loss", The Tenth International Conference on Learning Representations (ICLR'2022), 2022.

Teaching (https://zhenyu-liao.github.io/teaching/)
Undergraduate level: Deep Learning and Computer Vision
I am teaching the undergraduate-level course "Deep Learning and Computer Vision" this 2021 Fall semester, together with Prof. Xinggang Wang (https://xinggangw.info). Below are the assignments/mini-projects, aiming to improve both your theoretical understanding and your practical (coding) skills.
Mini-project 1: training a linear model with gradient descent; see the description here.
Mini-project 2: training a single-hidden-layer neural network model; see the description here.
Mini-project 3: training a convolutional neural network; see the description here.
Mini-project 4: build your own MNIST-GAN; see the description here.

Graduate level: Random Matrix Theory and Its Application in Large-Scale Systems
I am teaching the graduate (and Ph.D.) level course "Random Matrix Theory and Its Application in Large-Scale Systems".

Useful Links (https://zhenyu-liao.github.io/links/)
Research and scientific writing: 现代科研指北 (a guide to modern scientific research, in Chinese); Advising Statement, by Marc F. Bellemare; Learn LaTeX, for LaTeX beginners; scientific writing for non-native speakers; Linggle, Thesaurus, and DeepL.
Mathematics: WolframAlpha, Mathematics Stack Exchange, ProofWiki, and Matrix Calculus; the NIST Digital Library of Mathematical Functions; AIM Approved Textbooks, for freely available math textbooks.
Computer science: NumPy for MATLAB users; How To Start Open Source; AI-research-tools (in Chinese).
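To give a flavor of the teaching mini-projects above, here is a minimal, self-contained sketch of mini-project 1, training a linear model with gradient descent. This is an illustration only, not the official assignment code: the function name `fit_linear`, the synthetic data, and the learning-rate/step settings are all made-up choices for this sketch.

```python
# Sketch of mini-project 1: fit y = w*x + b by batch gradient descent.
# All names and hyperparameters here are illustrative, not the course's.

def fit_linear(xs, ys, lr=0.05, steps=2000):
    """Least-squares fit of a 1-D linear model via gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the mean-squared error (1/n) * sum_i (w*x_i + b - y_i)^2
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

if __name__ == "__main__":
    # Noise-free data generated from y = 3x + 1; gradient descent should
    # recover w close to 3 and b close to 1.
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [3 * x + 1 for x in xs]
    w, b = fit_linear(xs, ys)
    print(round(w, 2), round(b, 2))  # prints: 3.0 1.0
```

The same loop, rewritten with matrix-vector products on mini-batches, is the usual starting point for the later neural-network mini-projects.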