## Advanced Theory

**Special Virtual Edition, Summer/Fall 2020**

**Meetings:** Wednesdays, 10:15 – 11:45 am

**Location:** Zoom, contact **re2365@columbia.edu** for details

**Schedule:**

7/15/2020 Time-dependent mean-field theory for mathematical streetfighters **(Rainer Engelken)**

7/22/2020 Predictive coding in balanced neural networks with noise, chaos, and delays, Article **(Everyone)**

7/29/2020 A solution to the learning dilemma for recurrent networks of spiking neurons, Article **(Everyone)**

8/5/2020 A perturbing approach to random matrix theory applications in neuroscience **(Agostina Palmigiano)**

8/12/2020 How single rate unit properties shape chaotic dynamics and signal transmission in random neural networks **(Samuel Muscinelli)**

8/19/2020 Correlation fractures and other breaking news in theoretical developmental neuroscience **(Bettina Hein)**

8/26/2020 Monkey Data **(Kaushik Lakshminarasimhan)**

9/2/2020 Neural Control of Movement **(Shreya Saxena)**

9/9/2020 Modeling neurophysiological mechanisms of language production **(Serena Di Santo)**

Advanced Theory Seminar (Spring 2020) website

Advanced Theory Seminar (Spring 2019) website

## Intro to Theory

**Spring 2020**

**Faculty: Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller**

**TAs: Matteo Alleman, Dan Biderman, Salomon Muller, Amin Nejatbakhsh, Marjorie Xie**

**Meetings:** Tuesdays & Thursdays, JLGSC L5-084, Lecture 2:00 – 3:30 pm

**Text:** Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**January**

21 (Larry) Introduction to the Course and to Theoretical Neuroscience

23 (Larry) Mathematics Review: Notes

28 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model (Assignment 1, neuron models)

30 (Larry) Adaptation, Synapses, Spiking Networks (Numerical methods)

**February**

4 (Larry) Numerical Methods, Filtering (Assignment 2)

5 Assignment 1 Due

6 (Larry) The Hodgkin-Huxley Model (I&F Model, White Noise, Synapses-Networks)

11 (Larry) Types of Neuron Models and Networks (Assignment 3, Poisson Spiking, Networks)

12 Assignment 2 Due

13 (Ashok) Linear Algebra I (Notes)

18 (Ashok) Linear Algebra II (Notes, Assignment 4, Solutions)

19 Assignment 3 Due

20 (Ashok) Introduction to Probability, Encoding, Decoding (Notes)

25 (Ashok) GLMs (Notes, Assignment 5)

26 Assignment 4 Due

27 COSYNE

**March**

3 COSYNE

5 (Ashok) Decoding, Fisher Information I (Notes)

10 (Ashok) Canceled

12 (Ashok) Information Theory (Notes, Assignment 6, google-1000-english.txt, Recitation Notes)

14 Assignment 5 Due

17 Spring Break

19 Spring Break

24 (Ken) Canceled – PCA and Dimensionality Reduction I

26 (Ken) – PCA and Dimensionality Reduction II (Notes)

27 Assignment 6 Due

31 (Ken) – Rate Networks/E-I networks I (Notes, Assignment 7, Codes)

**April**

2 (Ken) – Rate Networks/E-I networks II (Notes)

7 (Ken) – Unsupervised/Hebbian Learning, Developmental Models (Notes, Assignment 8, Ring-Model)

8 Assignment 7 Due

9 (Ken) – Optimization (Notes)

14 (Ken) – Optimization II (Notes, Assignment 9, Assignment 9 Addendum, Code, Gaussian Problem, Note on Lyapunov functions, Review of simple developmental models, News & Views on feature-map approach, MacKay and Miller, 1990, Miller and MacKay, 1994)

15 Assignment 8 Due

16 (Ashok) Optimization (Notes)

21 (Stefano) Perceptron (Notes, Assignment 10)

23 (Stefano) Multilayer Perceptrons and Mixed Selectivity (Notes)

28 (Stefano) – Deep Learning I (backpropagation) (Assignment 11, Notes, Codes; additional code is available on CourseWorks)

30 (Stefano) – Deep Learning II (convolutional networks) (Notes, Visualizing and Understanding Convolutional Networks, Yamins & DiCarlo)

**May**

1 Assignment 9 & 10 Due

5 (Stefano) Learning in Recurrent Networks (Notes)

7 (Stefano) Continual Learning and Catastrophic Forgetting (Notes)

12 (Stefano) Reinforcement Learning (Notes)

~~14 Research Topic~~

15 Assignment 11 Due

## Mathematical Tools

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Spring 2020**

**Faculty**: Ken Miller (kdm2103@columbia.edu)

**Instructor**: Dan Tyulmankov (dt2586@columbia.edu)

**Teaching Assistant**: Amin Nejatbakhsh (mn2822@columbia.edu)

**Time**: Tuesdays, Thursdays 8:40 – 10:00 am

**Place**: JLGSC L5-084

**Webpage**: https://ctn.zuckermaninstitute.columbia.edu/courses

**Credits**: 3 credits, pass/fail only

**Description**: An introduction to the mathematical concepts used in theoretical neuroscience, aimed at giving the minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who want to rapidly acquire the vocabulary and basic mathematical skills needed to study theoretical neuroscience, or who wish to gain deeper exposure to mathematical concepts than NBHV G4360 offers. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

**Grading**: 100% attendance-based

**Readings** **and exercises**: Lecture notes and optional practice exercises will be provided for each lecture, and supplementary readings will be assigned from various textbooks.

*Page Last Updated: 7/15/2020*