Statistical Physics and Neural Computation

About: Physics, Machine and Intelligence

Lab Introduction

We welcome applications from prospective graduate students (2024 intake), postdocs, and full-time research staff. Openings for students or postdocs: please contact huanghp7@mail.sysu.edu.cn

We are always recruiting undergraduates (second year or above) with a genuine interest in exploring how neural networks work.

  • Master's student Chan Li (blind thesis review scores 91 & 98) was admitted to UC San Diego as a PhD student;
  • Master's student Zijiang Jiang (blind thesis review scores 97 & 99) was admitted to Princeton University as a PhD student in neuroscience;
  • Master's student Wenxuan Zou was admitted to Duke University as a PhD student (group of Brunel, recipient of the highest award in theoretical brain science).

PMI's Landmark

If you want to get to know us, please read through these papers.

1. Entropy landscape of solutions in the binary perceptron problem (2013)

2. Origin of the computational hardness for learning with binary synapses (2014)

3. Statistical mechanics of unsupervised feature learning in a restricted Boltzmann machine with binary synapses (2017)

4. Minimal model of permutation symmetry in unsupervised learning (2019)

5. Weakly correlated synapses promote dimension reduction in deep neural networks (2021)

6. Associative memory model with arbitrary Hebbian length (2021)

7. Eigenvalue spectrum of neural networks with arbitrary Hebbian length (2021)

8. Spectrum of non-Hermitian deep-Hebbian neural networks (2022)

9. Statistical mechanics of continual learning: variational principle and mean-field potential (2022)

Our goal

The research group focuses on the theoretical foundations of various kinds of neural computation, including associative neural networks, restricted Boltzmann machines, recurrent neural networks, and their deep variants. We are also interested in developing theory-grounded algorithms for real-world applications and in relating the theoretical studies to neural mechanisms. Our long-term goal is to uncover basic principles of machine/brain intelligence using physics-based approximations. The plenary talk at the 6th CACSPCS on the scientific questions of machine learning is available here. A complete story about the mechanism of unsupervised learning is available here. My CV is HERE.

News

  • The second New-Year Interdisciplinary Forum on Neural Networks will be held online on Dec 31, 2022; student talk applications are welcome, see the conference website.
  • Promoted to Full Professor on Apr 27, 2022.
  • Fall 2022 online course "Statistical Mechanics of Neural Networks" (2022-09 to 2023-06). 62 registered attendees, with backgrounds spanning mathematics, physics, computer science, psychology, cognitive science, and more. Course features: learning by doing + frontier exploration + exchange of ideas (creative learning). Recruitment of new attendees for the second season closes on Feb 8; send a CV to huanghp7@mail.sysu.edu.cn
  • The lab's outreach WeChat official account "PMI Lab" is now in use.
  • The first New-Year Forum on interdisciplinary studies of neural networks will start on Jan 3, 2022.
  • We are organizing a regular (e.g. monthly) online seminar "INTheory", focusing on the exchange of ideas about the interplay between physics, machine learning, and brain sciences. If you are interested in giving us a talk, please contact me!
  • My book Statistical Mechanics of Neural Networks (SMNN) has been published online; see the Kindle eBook and hardcover on Amazon. The hardcover version has been published by Springer (overseas edition) and by Higher Education Press (mainland edition; available for purchase on 京东商城, or see the Higher Education Press article). The eBook is on Springer Link.
  • A review of statistical physics and neural networks was just published in 《科学》 (Shanghai Scientific & Technical Publishers, founded in 1915), 2022, Vol. 74, Issue 01, p. 40; the PDF is HERE, and an online version of the text is HERE. Viewpoints: On the promotion of interdisciplinary studies.

Referee Services

Physical Review Letters (acrv 20; rjrv 2), Nature Communications (1), Phys Rev X (3), eLife (1), Physical Review E (19), Phys Rev B (2), Phys Rev Research (1), PLoS Comput Biol (2), Communications in Theoretical Physics (4), Journal of Physics: Conference Series (2), Journal of Statistical Mechanics: Theory and Experiment (8), Journal of Physics A: Mathematical and Theoretical (8), J. Stat. Phys (1), Eur. Phys. J. B (1), Neural Networks (2), Scientific Reports (2), Network Neuroscience (1), Frontiers in Computational Neuroscience (1), Neurocomputing (1), MLST (1), Physica A (3), IEEE Transactions on Neural Networks and Learning Systems (2), Chin. Phys. Lett (1), Chin. Phys. B (2), Chaos (1)

Program Committee of international machine learning conferences

Mathematical and Scientific Machine Learning (MSML2022)

Referee Services for PhD Theses

---Sydney University (2022), USTC (2023)

Referee Services for Grant Proposals

--NSFC (2023)

Honors and awards

  • Aug 2021 Excellent Young Scientists Fund, NSFC of China
  • Nov 2020 Fulan Research Incentive Award, School of Physics, SYSU
  • Mar 2017 8th RIKEN Research Incentive Award, RIKEN
  • Jan 2012 JSPS Postdoctoral Fellowship for Foreign Researchers, Japan Society for the Promotion of Science (JSPS)

Grants

  • Start-up fund for young academic leaders, SYSU "Hundred Talents Program" (2018-2019)
  • SYSU special fund for high-level international conferences: International Workshop on Statistical Physics and Neural Computation (SPNC-2019) (2019)
  • NSFC Young Scientists Fund: Statistical physics studies of unsupervised learning in neural networks (2019-2021)
  • NSFC Excellent Young Scientists Fund: Statistical physics of neural networks (2022-2024)

Teaching

  • General Physics, for mathematics and applied mathematics undergraduate students (2018 Fall)
  • Thermodynamics and Statistical Physics, for physics undergraduate students (2018-2021 Fall)
  • College Physics, for computer science, mathematics, psychology undergraduate students (2019-2021 Spring)
  • Nonlinear physics and complex systems, for physics undergraduate students (2022 Spring)
  • Statistical Mechanics of Neural Networks, to be scheduled (2023 Fall); online course (2022 Fall)

Our previous representative achievements

A brief introduction to our research group: In 2014, my collaborators and I gave a theoretical account of the physical origin of the computational hardness of perceptron learning (cited repeatedly by the 2016 Onsager Prize laureate). In 2015, we took the lead in studying the statistical mechanics of restricted Boltzmann machines, and between 2016 and 2017 we proposed the simplest physical model of unsupervised learning, which attracted broad attention from peers. In 2016, we also proposed a phase-transition theory of retinal neural coding, which was confirmed from different angles by an experimental group at Princeton University. During 2017-2018, we constructed a physical model of dimension reduction and decorrelation in deep neural networks and elucidated its mechanism (highlighted in a keynote talk at Cosyne 19). In 2019, we proved theoretically that, for unsupervised learning with a finite number of hidden neurons, the critical data size for spontaneous symmetry breaking does not depend on the number of hidden neurons, and that (weak) correlations among the hidden neurons' receptive fields significantly reduce the data size required for the transition (concept formation), by more than 50%. More importantly, the theory predicts that unsupervised learning is essentially a series of symmetry breakings driven by the data stream (spontaneous symmetry breaking and two-fold permutation symmetry breaking); this theory was published in Physical Review Letters. In the same year, we proposed a training algorithm for restricted Boltzmann machines with discrete weights, and incorporated the three essential elements of unsupervised learning, namely sensory inputs, synapses, and neural activities, into a single physical equation (published as a Rapid Communication in Physical Review E). In 2020, we also proposed a physical model of credit assignment in deep learning (published in Physical Review Letters). Our group has a long-standing focus on the theoretical foundations of neural computation from the perspective of statistical physics.

You can download these papers from arXiv (links to these published versions are also given there). Codes are HERE (under construction).

a. Origin of the computational hardness of the binary perceptron model (2014)

b. A first-order phase transition reveals clustering of neural codewords in retina (2016)

c. Unsupervised feature learning has a second-order phase transition at a critical data size (2016, 2017)

d. Theory of dimension reduction in deep neural networks (2017, 2018)

e. Minimal model of permutation symmetry in unsupervised learning (2019): weakly correlated receptive fields significantly reduce the data size that triggers concept formation (the spontaneous symmetry breaking predicted in our previous works (2016, 2017)); unsupervised learning in neural networks is explained as breaking a series of symmetries driven by data streams (observations). We further demonstrate in a new preprint, arXiv:1911.02344, the computational role of prior knowledge in unsupervised learning, showing that the prior further reduces the minimal data size and reshapes the inherent-symmetry-breaking transitions.

f. A variational principle for unsupervised learning interpreting three elements of learning (2019, arXiv:1911.07662):

Learning in neural networks involves data, synapses and neurons. Understanding the interaction among these three elements is thus important. Previous studies were limited to very simple networks with only a few hidden neurons, owing to a challenging computational obstacle. Here, we propose a variational principle that goes beyond this limitation and is capable of treating arbitrarily many hidden neurons. The theory furthermore interprets the interaction among data, synapses and neurons as an expectation-maximization process, thereby opening a new path towards understanding the black-box mechanisms of learning in a generic architecture.
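
As a rough illustration of this expectation-maximization picture (a minimal sketch of my own, not the algorithm of the paper; all sizes, variable names, and the specific update rule below are illustrative choices), one can alternate an E-step that infers hidden-neuron activities from the data and the current synapses with an M-step that moves the synapses toward the data-driven correlations:

import numpy as np

# Toy setup: binary data and an RBM-like model with N visible and M hidden units.
rng = np.random.default_rng(0)
N, M, P = 20, 5, 200                          # visible units, hidden units, data size
data = rng.choice([-1.0, 1.0], size=(P, N))   # placeholder data stream
W = 0.01 * rng.standard_normal((N, M))        # synapses

lr = 0.05
for epoch in range(100):
    # E-step: mean-field inference of hidden-neuron activities given data and synapses.
    m_hidden = np.tanh(data @ W / np.sqrt(N))         # shape (P, M)
    # Model-driven reconstruction of the visible units from the inferred hidden activities.
    m_visible = np.tanh(m_hidden @ W.T / np.sqrt(N))  # shape (P, N)
    # M-step: contrastive, gradient-like synaptic update coupling data, neurons and synapses.
    pos = data.T @ m_hidden / P                       # data-driven correlations
    neg = m_visible.T @ m_hidden / P                  # model-driven correlations
    W += lr * (pos - neg) / np.sqrt(N)

The point of the sketch is only the structure of the loop: neuron states are inferred given data and synapses, and the synapses are then updated given data and the inferred neurons.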

g. A statistical ensemble model of deep learning (PRL 2020)

h. Emergence of hierarchical modes from deep learning (Phys Rev Research Letter, 2023)

Training deep neural networks in mode space rather than in the traditional weight space not only saves a significant (greater than 50%) fraction of the training cost but also yields more disentangled and scale-invariant emergent representations across the hierarchy, thereby making deep learning faster and more transparent.
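
As a toy illustration of the idea (a sketch of my own under simplifying assumptions, not the exact construction of the paper), the snippet below parametrizes a layer's weight matrix by a few modes, W = sum_k s_k u_k v_k^T, and trains only the mode amplitudes s_k, which involves far fewer trainable parameters than the full weight matrix:

import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, rank = 256, 256, 16                          # illustrative sizes; rank << n_in, n_out

U = rng.standard_normal((n_out, rank)) / np.sqrt(n_out)   # left mode vectors (held fixed here)
V = rng.standard_normal((n_in, rank)) / np.sqrt(n_in)     # right mode vectors (held fixed here)
s = np.zeros(rank)                                        # trainable mode amplitudes

def forward(x, amps):
    # Computes x @ W.T with W = U @ diag(amps) @ V.T, without ever forming W.
    return ((x @ V) * amps) @ U.T

# Toy teacher living in the same mode basis, so the amplitudes can match it exactly.
s_teacher = rng.standard_normal(rank)
X = rng.standard_normal((512, n_in))
Y = forward(X, s_teacher)

lr = 0.5
for step in range(300):
    err = forward(X, s) - Y                                    # residuals, shape (batch, n_out)
    grad_s = np.einsum('pk,po,ok->k', X @ V, err, U) / len(X)  # gradient of the mean squared error w.r.t. s
    s -= lr * grad_s

print("trainable parameters:", rank, "vs full weight matrix:", n_in * n_out)
print("final mean squared error:", float(np.mean((forward(X, s) - Y) ** 2)))

Here only the amplitudes are trained; in a fuller treatment the mode vectors themselves could also be learned, still with far fewer parameters than the full matrix when the number of modes is small.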

Research for what?

“Why should we study this problem, if not because we have fun solving it?"---Nicola Cabibbo (known for the Cabibbo angle; one of his students is Giorgio Parisi)

“If you don't work on important problems, it's not likely that you will do important work”---Richard Hamming

The Back Page

Supervision of Bachelor Thesis

2019: 4 students; among them, Mr Quan scored 91

2020: 6 students; among them, Ms Li scored 95 and graduated with honors

2021: 4 students; among them, Mr Chen scored 95, best thesis of SYSU

2022: 3 students; among them, Mr Zou scored 91.6

Supervision of Master Thesis

2021: Jianwen Zhou, applications of statistical physics to neural dimensionality reduction and associative memory with arbitrary Hebbian length

2023: Wenxuan Zou, mean-field theory of deep learning

2023: Chan Li, Mean-field algorithmic design of deep networks

2023: Zijiang Jiang, Generalized Hopfield model

Prizes won by students

2023, Excellent Master's Thesis of Sun Yat-sen University, Chan Li

2022, Samsung Scholarship, Wenxuan Zou

2021, National Scholarship and C. N. Yang (杨振宁) Prize, Chan Li

2021, Fulan Excellent Graduate Student Scholarship, Zijiang Jiang

2021, Best Poster, 6th National Conference on Statistical Physics and Complex Systems, Chan Li

2021, Best Poster (Second Prize), 3rd Chinese Conference on Computational and Cognitive Neuroscience, Wenxuan Zou

2020, Master Student Entrance Prize, Chan Li

2020, Fulan Master Student Prize, Wenxuan Zou

2020, Three-min Talk Competition Prize (Organized by IOP press), Chan Li, Talk Title: Learning Credit Assignment

2020, Best Poster Prize, Annual Physics Conference in Guangdong Province, Chan Li

Former Members

2018-2020, Dongjie Zhou (Chinese Academy of Sciences, Shanghai, PhD)

2018-2020, Zhenye Huang (Chinese Academy of Sciences, Beijing, PhD)

2018-2020, Nancheng Zheng (company, Guangzhou)

2018-2020, Tianqi Hou (now at HUAWEI theory lab, Hong Kong)

2019-2021, Zimin Chen (Tsinghua University, PhD)

2018-2021, Jianwen Zhou (ITP, CAS, Beijing, PhD)

2020-2022, Yang Zhao (ZJU, PhD)

2018-2023, Wenxuan Zou (Duke U, PhD)

2018-2023, Chan Li (UCSD, PhD)

2018-2023, Zijiang Jiang (Princeton U, PhD)

If you want to become a member of PMI, please consider the following two questions:

1. Are you really interested in theory of neural networks?

2. Are you self-confident in (the potential of) your math and coding ability?

INTheory on-line Seminar

INTheory on-line Seminar Series

Introduction

We hold a regular (e.g. monthly) online seminar, "INTheory", focusing on the exchange of ideas about the interplay between physics, machine learning, and brain sciences. The name is shorthand for "theory of intelligence", or "integrating theory towards understanding intelligence".

Speakers are announced here; the times below are Beijing time, and we use the online meeting software VooV Meeting (click Free Trials to download the application) or Zoom. Videos of past talks are Here.

Seminar Schedule (updating)

*********2022 Fall Season*********

Sept 7, 21:00-22:30, INTheory-8

Talk title: Role of gating in recurrent neural networks

Speaker: Dr. Kamesh Krishnamurthy (talk poster here)

Zoom ID: 828 2975 8872

Oct 27, 16:00-17:15, INTheory-9

Talk title: Representation-assisted sampling with Restricted Boltzmann Machines: from disentanglement to deep tempering

Speaker: Prof. Remi Monasson

Zoom ID: 861 6166 8050

Nov 10, 16:00-17:15, INTheory-10

Talk title: Statistical mechanics of information processing in shallow networks

Speaker: Prof. Adriano Barra

Zoom ID: 831 9944 5505 (pc: 221110)

Previous speaker list

**********2022 Spring Season*********

May 5, 16:00-17:30, INTheory-7

Talk title: Understanding the structure-function mapping in recurrent neural networks

Speaker: Prof. Srdjan Ostojic (talk poster Here)

Apr 14 2022, 17:00-18:30, INTheory-6

Talk title: Optimization with input from Spin Glasses and RMT

Speaker: Prof. Yan Fyodorov (talk poster here)

**********2021 Fall Season**********

Dec 10 2021, 17:00-18:30, INTheory-5

Talk title: Geometric landscapes and phase transitions in deep neural network models

Speaker: Prof.  Riccardo Zecchina (talk poster INTheory-05)

Nov 18 2021, 16:00-17:30, INTheory-4

Talk title: Data structure and learning dynamics in neural networks

Speaker: Prof. Sebastian Goldt (talk poster INTheory-04)

Nov 4 2021, 8:00-9:30, INTheory-3

Talk title: White-Box Deep (Convolution) Networks from the Principle of Rate Reduction

Speaker: Prof. Yi Ma (talk poster INTheory-03)

Oct 22 2021, 15:00~16:30, INTheory-2

Talk title: Statistical physics approach to neuronal dynamics

Speakers: Prof. Moritz Helias and Dr. Alexander van Meegen (talk poster INTheory-02)

Oct 12 2021, 14:00-15:30, INTheory-1

Talk title: Fractional diffusion theory of neural circuits

Speaker: Prof. Pulin Gong  (talk poster INTheory-01)

Group Meeting Schedule

ProjProg: ; ReadNote: May 10 (uploaded to FeiShu).

X-111: Conference Room 111, Xian Weijian Hall (冼为坚堂)

Apr chair (preparing each GM and PMI tree): ZLR

Apr 25 (PMI-226) X-112  Room, 9am~

(1) Spectrum discussion

(2) Paper Talk: Theory of FBM, MS

May 12 (PMI-227) X-111  Room, 2pm~

(1) Chalk talk: from the problem set

(2) 1s3min:  DWK; BRR; WYC

(3) JC:  WSS; HWZ; QJB

 

Statistical Mechanics of Neural Networks online course

We have published a book on the basic tools of statistical mechanics and their applications for understanding the inner workings of neural networks. The book contains 16 chapters, plus extra chapters on a brief history and on perspectives for promising frontiers. For details of the contents, please visit the Amazon page for the book. The book can also be bought from the Chinese website or 京东商城. If you want to learn more about the book or have any comments or corrections, please email me: huanghp7@mail.sysu.edu.cn

The online course about the book runs from Sept 2022 to June 2023; it is better to have the book at hand when the course starts. We have 62 attendees.

Schedule:

  • SMNN05:  982 909 046, Oct 29, 14:00~

Target applicants: undergraduate/graduate students or postdocs.

Prerequisite background: advanced mathematics, probability, linear algebra, Python/C programming, and basics of statistical mechanics (for example this book). Knowledge of neural networks or computational neuroscience is a plus.

Most important point: you show passion for the theory of neural networks and, more generally, for the black box of the brain.

Tentative course schedule: every Saturday 14:00-15:30 (except holidays), starting from the third week of Sept 2022.

Lecture form: details of the book + homework + paper reading/presentation.

///////////////Back Cover of the Book////////////////

This book covers the basic knowledge of statistical mechanics applied to understanding the inner workings of neural networks. Important concepts and techniques, such as the cavity method, mean-field approximation, the replica trick, the Nishimori condition, the variational method, dynamical mean-field theory, unsupervised learning, associative memory models, perceptron models, chaos theory of recurrent neural networks, and eigenspectra of neural networks, are introduced in detail, offering a pedagogical guide for those who become interested in the theory of neural networks. The book focuses on quantitative frameworks of neural network models in which the underlying mechanisms can be precisely isolated by physics, with mathematical beauty and theoretical predictions.

///////////Content ////////////////

Chapter 1:  Introduction

Chapter 2:  Spin Glass Models and Cavity Method

Chapter 3:  Variational Mean-Field Theory and Belief Propagation

Chapter 4:  Monte-Carlo Simulation Methods

Chapter 5:  High-Temperature Expansion Techniques

Chapter 6: Nishimori Model

Chapter 7: Random Energy Model

Chapter 8:  Statistical Mechanics of Hopfield Model

Chapter 9:  Replica Symmetry and Symmetry Breaking

Chapter 10: Statistical Mechanics of Restricted Boltzmann Machine

Chapter 11: Simplest Model of Unsupervised Learning with Binary Synapses

Chapter 12: Inherent-Symmetry Breaking in Unsupervised Learning

Chapter 13: Mean-Field Theory of Ising Perceptron

Chapter 14: Mean-Field Model of Multi-Layered Perceptron

Chapter 15: Mean-Field Theory of Dimension Reduction in Neural Networks

Chapter 16: Chaos Theory of Random Recurrent Networks

Chapter 17: Statistical Mechanics of Random Matrices

Chapter 18: Perspectives

//////////////Corrections (of typos) in the book///////////////////

1. 
