**Introduction to Theoretical Neuroscience (NBHV G4360)**

**Faculty:** Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs:** Matteo Alleman, Dan Biderman, Salomon Muller, Amin Nejatbakhshesfahani, Marjorie Xie

**Meetings:** Tuesdays & Thursdays, JLGSC L5-084, Lecture 2:00-3:30pm

**Text:** *Theoretical Neuroscience* by P. Dayan and L.F. Abbott (MIT Press)

**January**

21 (Larry) Introduction to the Course and to Theoretical Neuroscience

23 (Larry) Mathematics Review

28 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model (Assignment 1)

30 (Larry) Adaptation, Synapses, Spiking Networks

**February**

4 (Larry) Numerical Methods, Filtering (Assignment 2)

5 Assignment 1 Due

6 (Larry) The Hodgkin-Huxley Model

11 (Larry) Types of Neuron Models and Networks (Assignment 3)

12 Assignment 2 Due

13 (Ken) Linear Algebra I

18 (Ken) Linear Algebra II (Assignment 4)

19 Assignment 3 Due

20 (Ken) Linear Algebra III

25 (Ken) PCA and Dimensionality Reduction (Assignment 5)

26 Assignment 4 Due

27 COSYNE

**March**

3 COSYNE

5 (Ken) Rate Networks/E-I Networks I

10 (Ken) Rate Networks/E-I Networks II (Assignment 6)

11 Assignment 5 Due

12 (Ken) Unsupervised/Hebbian Learning, Developmental Models

17 Spring Break

19 Spring Break

24 (Ashok) Introduction to Probability, Encoding, Decoding

25 Assignment 6 Due

26 (Ashok) GLMs

31 (Ashok) Decoding, Fisher Information (Assignment 7)

**April**

2 (Ashok) Decoding, Fisher Information II

7 (Ashok) Information Theory (Assignment 8)

8 Assignment 7 Due

9 (Ashok) Optimization

14 (Ashok) Optimization II (Assignment 9)

15 Assignment 8 Due

16 Research Topic

21 (Stefano) Perceptron (Assignment 10)

22 Assignment 9 Due

23 (Stefano) Multilayer Perceptrons and Mixed Selectivity

28 (Stefano) Deep Learning I (backpropagation) (Assignment 11)

29 Assignment 10 Due

30 (Stefano) Deep Learning II (convolutional networks)

**May**

5 (Stefano) Learning in Recurrent Networks (Assignment 12)

6 Assignment 11 Due

7 (Stefano) Continual Learning and Catastrophic Forgetting

12 (Stefano) Reinforcement Learning

13 Assignment 12 Due

14 Research Topic

**Meetings:** Wednesdays, 10:15-11:45am

**Location:** Jerome L. Greene Science Center, L6-087

See course website for updated schedule

**Control Theory and Reinforcement Learning**

Jan 29 Examples of control theory in neuroscience (Bettina Hein, Laureline Logiaco)

Feb 5 Methods in control theory (Bettina Hein, Laureline Logiaco)

Feb 12 Reinforcement learning (James Murray)

Feb 19 From optimal control theory to reinforcement learning (Samuel Muscinelli)

Feb 26 Cosyne break

Mar 4 Cosyne break

**Latent Variable Models**

Mar 11 Hidden Markov Models (Matt Whiteway)

Mar 18 Latent Dynamical Systems (Josh Glaser)

Mar 25 Gaussian Processes (Rainer Engelken)

Apr 1 Modeling of Behavior (Juri Minxha)

Apr 8 Hackathon on Latent Variable Models

**Miscellaneous**

Apr 15 Phenomenological Renormalization Group (Serena Di Santo)

Apr 22 Additional and diverse perspectives in neuroscience (Laureline Logiaco, Matt Whiteway, Juri Minxha)

Apr 29 Introduction to replica theory (SueYeon Chung)

May 6 Classification and Geometry of General Perceptual Manifolds (SueYeon Chung)

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Faculty**: Ken Miller (kdm2103@columbia.edu)

**Instructor**: Dan Tyulmankov (dt2586@columbia.edu)

**Teaching Assistant**: Amin Nejatbakhsh (mn2822@columbia.edu)

**Time**: Tuesdays, Thursdays 4:10pm-5:25pm (tentative, may change based on demand)

**Place**: JLGSC L5-084 (Jan 21, 23, and 28 only; permanent room TBA)

**Webpage**: https://ctn.zuckermaninstitute.columbia.edu/courses

**Credits**: 3 credits, pass/fail only (registration details TBA)

**Description**: An introduction to mathematical concepts used in theoretical neuroscience, intended to provide the minimal background required for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who want to rapidly acquire the vocabulary and basic mathematical skills needed to study theoretical neuroscience, or who wish to gain deeper exposure to mathematical concepts than NBHV G4360 offers. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

**Grading**: 100% attendance-based

**Readings and exercises**: Lecture notes and optional practice exercises will be provided for each lecture, and supplementary readings will be assigned from various textbooks.

**Topics**:

1. Vocabulary (sets, functions, limits, the definition e = lim_{n→∞} (1 + 1/n)^n, complex numbers, Euler's formula)

2. Calculus (derivatives, integrals, fundamental theorem)

3. ODEs (first-order)

4. Taylor series (proof of Euler's formula)

5. Linear algebra 0 (vectors, matrices, add/multiply/inverse)

6. Linear algebra 1 (vector spaces, independence, span, basis, orthonormal, linear transformations, projections)

7. Linear algebra 2 (rank, range, nullspace, SVD)

8. Linear algebra 3 (change of basis, trace, determinant, eigenvector primer)

9. Linear system dynamics (phase plane, linear systems, eigenvectors)

10. Nonlinear system dynamics 1 (fixed points, stability, linearization, nullclines)

11. Nonlinear system dynamics 2 (bifurcations, chaos)

12. Combinatorics (set algebra, permutations, combinations)

13. Probability 1 (probability space, random variables)

14. Probability 2 (probability density functions, expectation, variance, conditioning)

15. Probability 3 (joint distributions, multivariate Gaussian)

16. Multivariable calculus 1 (partial derivatives, gradient)

17. Multivariable calculus 2 (Hessian, multivariable Taylor series)

18. Multivariable calculus 3 (chain rule, Lagrange multipliers)

19. Convolution (signals & systems, impulse response, LTI systems, filters)

20. Fourier series

21. Discrete Fourier Transform (filters, linear algebra notation)

22. Random signals (stationarity, cross-correlation, auto-correlation, signal detection theory, ROC)

23. Statistics 1 (summary statistics, convergence, MLE, MAP)

24. Statistics 2 (regression, classification)

25. Statistics 3 (machine learning basics)

26. Information Theory (entropy, KL, mutual info)
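As a small taste of topics 1 and 4, the limit definition of e and Euler's formula can both be checked numerically with only the Python standard library (a minimal illustrative sketch, not course material):

```python
import cmath
import math

# Limit definition of e: e = lim_{n -> inf} (1 + 1/n)^n.
# For finite n the error shrinks roughly like e / (2n).
n = 1_000_000
e_approx = (1 + 1 / n) ** n
print(e_approx, "vs", math.e)

# Euler's formula: e^{i*theta} = cos(theta) + i*sin(theta).
theta = 0.75
lhs = cmath.exp(1j * theta)
rhs = complex(math.cos(theta), math.sin(theta))
print(abs(lhs - rhs))  # difference is at the level of floating-point error
```

Increasing `n` tightens the first approximation, which previews the course's treatment of limits and of Taylor series (where Euler's formula is proved).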

**Schedule**:

Tue Jan 21 Vocabulary

Thu Jan 23 Single-variable calculus

Tue Jan 28 ODEs, Amin substituting (Dan away)

Thu Jan 30 Taylor series, Amin substituting (Dan away)

Tue Feb 4 Linear algebra 0

Thu Feb 6 Linear algebra 1

Tue Feb 11 Linear algebra 2

Thu Feb 13 Linear algebra 3

Tue Feb 18 Linear dynamics

Thu Feb 20 Nonlinear dynamics 1

Tue Feb 25 Nonlinear dynamics 2

Thu Feb 27 No class, Cosyne (Dan, Amin away)

Tue Mar 3 No class, Cosyne (Dan, Amin away)

Thu Mar 5 Combinatorics, Amin substituting (Dan away)

Tue Mar 10 Probability 1

Thu Mar 12 Probability 2

Tue Mar 17 No class, Spring break

Thu Mar 19 No class, Spring break

Tue Mar 24 No class, NAISys (Dan, Amin away)

Thu Mar 26 No class, NAISys (Dan, Amin away)

Tue Mar 31 Probability 3

Thu Apr 2 Multivariable calculus 1

Tue Apr 7 Multivariable calculus 2

Thu Apr 9 Multivariable calculus 3

Tue Apr 14 Convolution

Thu Apr 16 Fourier series

Tue Apr 21 Discrete Fourier Transform

Thu Apr 23 Random signals

Tue Apr 28 Statistics 1

Thu Apr 30 Statistics 2