EUSIPCO'18 tutorial
https://zhenyu-liao.github.io/posts/archive/eusipco18/ (Sat, 01 Sep 2018)
I will be giving a tutorial at the 26th European Signal Processing Conference (EUSIPCO'18) in Rome, the Eternal City, Italy, together with my Ph.D. supervisor Prof. Romain Couillet and my colleague Xiaoyi Mai, on the topic of “Random Matrix Advances in Machine Learning and Neural Nets”.
For more information please visit EUSIPCO'18.
Abstract
The advent of the Big Data era has triggered a renewed interest in machine learning and (deep) neural networks.

GRETSI'17 talk
https://zhenyu-liao.github.io/posts/archive/gretsi17/
I will be presenting our work on Random Feature Maps at Colloque GRETSI'17 this September at Juan-les-Pins, France. The slides (in French) are available here.

ICASSP'17 talk
https://zhenyu-liao.github.io/posts/archive/icassp17/
I will be presenting my work on LS-SVM at the 42nd IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2017), New Orleans, USA.
Here is the link to this four-page conference paper. An extended journal version, which contains all proofs in detail, is available here.
The slides presented at ICASSP 2017 are available here.

ICASSP'19 paper
https://zhenyu-liao.github.io/posts/archive/icassp19/
My colleague Xiaoyi Mai will be presenting our paper on high-dimensional logistic regression at ICASSP'19, 12-17 May, Brighton, UK.
The slides are available here.

ICML'18 papers
https://zhenyu-liao.github.io/posts/archive/icml18/
I will be presenting our work on Random Feature-based Clustering as well as on Gradient Descent Dynamics at ICML 2018 this July. Please find the two papers below:
“On the Spectrum of Random Features Maps of High Dimensional Data”, with the slides available here.
“The Dynamics of Learning: A Random Matrix Approach”, with the slides available here.

Ph.D. Mid-term
https://zhenyu-liao.github.io/posts/archive/phd_mid/
I’ve just finished my Ph.D. mid-term evaluation at CentraleSupélec. Here are the links to my mid-term report as well as the slides.

Activities
https://zhenyu-liao.github.io/posts/archive/activities/

Book
https://zhenyu-liao.github.io/book/
Random Matrix Methods for Machine Learning, Romain Couillet and Zhenyu Liao, Cambridge University Press, 2022.
Online ordering: Cambridge University Press, Amazon. If you purchase on the CUP webpage above, a 20% discount flyer is available here.
Additional resources for readers: a pre-production version of the book is available here, with exercise solutions available here. MATLAB and Python code to reproduce the figures in the book is publicly available in this repository.

Collaborators
https://zhenyu-liao.github.io/collaborators/
I have been very fortunate to work with a number of great collaborators over the years.
Senior collaborators:
- Prof. Romain Couillet: University Grenoble Alpes, Inria, CNRS, Grenoble INP, LIG, France. Holder of the UGA MIAI LargeDATA Chair.
- Prof. Michael Mahoney: Department of Statistics, International Computer Science Institute (ICSI), and Lawrence Berkeley National Laboratory (LBNL) at UC Berkeley, USA. Director of the UC Berkeley FODA (Foundations of Data Analysis) Institute grant.

Joint Workshop on "Math for Data Science"
https://zhenyu-liao.github.io/posts/workshop_math_data/
Time and place: Sep 22-23, 2022, online, 9:00-12:00 Paris time (15:00-18:00 Beijing time).
Detailed program (Paris time):
Program for Thurs 22 Sep 2022 (Chair: Zhenyu Liao)
- 9:00–9:10: Welcome speech, Robert C. Qiu, IEEE Fellow, Dean of EIC, HUST
- 9:10–9:50: Robust statistics and clustering - Application to signal and image processing, Frédéric Pascal, CentraleSupélec, Paris-Saclay
- 9:50–10:30: Context-Tree-Based Lossy Compression, Sheng Yang, CentraleSupélec, Paris-Saclay
- 10:30–10:40: Virtual coffee break
- 10:40–11:20: Learning without labels on multivariate biosignals: From unsupervised to self-supervised learning, Alexandre Gramfort, Inria Saclay, Paris-Saclay
- 11:20–12:00: Identifying, prediction, and control in non-Gaussian stochastic dynamical systems, Ting Gao, Dept.

Posts
https://zhenyu-liao.github.io/post_page/
2022 Joint Workshop on “Mathematics for Data Science” between HUST and University Paris-Saclay
Reading group on “Random Matrix Theory and Machine Learning”
Reading group on “Modern Deep Learning Theory and Practice”

Projects
https://zhenyu-liao.github.io/projects/
NSFC-62206101: Fundamental Limits of Pruning Deep Neural Network Models via Random Matrix Methods
This project (2023.01-2025.12) is led by myself and focuses on the fundamental theoretical limits of pruning and quantization of deep neural networks. Its objective is to develop, using the mathematical tools of random matrix theory, high-dimensional statistics, and optimization theory, a quantitative theory that characterizes the “performance-complexity tradeoff” in modern deep neural nets.

Publications
https://zhenyu-liao.github.io/publications/
Conferences:
Y. Song, K. Wan, Z. Liao, H. Xu, G. Caire, S. Shamai, “An Achievable and Analytic Solution to Information Bottleneck for Gaussian Mixtures”, 2024 IEEE International Symposium on Information Theory (ISIT 2024), 2024.
Y. Wang, Z. Feng, Z. Liao, “FedRF-Adapt: Robust and Communication-Efficient Federated Domain Adaptation via Random Features”, Workshop on Timely and Private Machine Learning over Networks, 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2024), 2024.

Reading group on Modern Deep Learning Theory and Practice
https://zhenyu-liao.github.io/posts/dl_theory_reading/
List of papers
[A] Tensor Programs
[A-1] Yang, Greg. “Wide feedforward or recurrent neural networks of any architecture are Gaussian processes.” Advances in Neural Information Processing Systems 32 (2019).
[A-2] Yang, Greg. “Tensor Programs II: Neural tangent kernel for any architecture.” arXiv preprint arXiv:2006.14548 (2020).
[A-3] Yang, Greg, and Etai Littwin. “Tensor Programs IIb: Architectural universality of neural tangent kernel training dynamics.” International Conference on Machine Learning.

Reading group on RMT and Machine Learning
https://zhenyu-liao.github.io/posts/rmt4ml_reading/
Schedule (date - speaker - papers to be presented):
1. Dec. 27, 2023 - Zhaorui Dong - [B-1-1]
2. Jan. 3, 2024 - Zhuofan Xu - [B-3-5]
3. Jan. 10, 2024 - Xuran Meng - [C-2]
4. Jan. 17, 2024 - Jing Chen - [F-1]
5. Jan. 24, 2024 - Xingkai Wen - [B-3-6]
6. Jan. 31, 2024 - Tingting Zou - [C-5]
7. Feb.

Teaching
https://zhenyu-liao.github.io/teaching/
Undergraduate level
Introduction to Machine Learning: see an introductory lecture on machine learning here.
Deep Learning and Computer Vision: I am teaching the undergraduate-level course “Deep Learning and Computer Vision” in the 2021 Fall semester, together with Prof. Xinggang Wang (https://xinggangw.info). Below are the assignments/mini-projects, which aim to improve your theoretical understanding and practical (coding) skills.
- Mini-project 1: training a linear model with gradient descent, see description here
- Mini-project 2: training a single-hidden-layer neural network model, see description here
- Mini-project 3: training a convolutional neural network, see description here
- Mini-project 4: build your own MNIST-GAN, see description here

Graduate level
Probability and Stochastic Processes I: I am teaching the graduate (and Ph.

Useful Links
https://zhenyu-liao.github.io/links/
Research and scientific writing:
- 现代科研指北 (“A Modern Guide to Scientific Research”, in Chinese)
- Advising Statement, by Marc F. Bellemare
- Learn LaTeX for LaTeX beginners, and a short intro to LaTeX (in Chinese)
- Scientific writing for non-native speakers
- Linggle, Thesaurus, and DeepL

Math:
- WolframAlpha, Mathematics Stack Exchange, ProofWiki, and Matrix Calculus
- Math via visualization: 3Blue1Brown and Seeing Theory
- NIST Digital Library of Mathematical Functions
- AIM Approved Textbooks, for freely available math textbooks

Computer Science and AI:
- NumPy for MATLAB users
- How To Start Open Source
- AI-research-tools (in Chinese)
- Awesome Machine Learning Resources
- Google Dataset Search
- ChatGPT for writing, and Prompt Library from University of Pennsylvania
- Gamma for presentations
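As a side note on the teaching material above: mini-project 1 asks for training a linear model with gradient descent. A minimal NumPy sketch of that idea (on synthetic data of my own choosing, not the course dataset or the official assignment solution) might look like:

```python
import numpy as np

# Synthetic data: y = X @ w_true + noise (illustrative only)
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Full-batch gradient descent on the mean-squared error (1/2n) ||Xw - y||^2
w = np.zeros(d)
lr = 0.1  # step size, chosen small enough for this well-conditioned problem
for _ in range(500):
    grad = X.T @ (X @ w - y) / n  # gradient of the MSE loss at the current w
    w -= lr * grad

# After convergence, w is close to the least-squares solution of (X, y)
print(np.linalg.norm(w - w_true))
```

With enough iterations and a small enough step size, the iterates converge to the ordinary least-squares solution; the same loop structure carries over to the later mini-projects once the model and gradient are swapped out.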