## Spring 2024

**Mathematical Tools for Theoretical Neuroscience**

**Spring 2024:** Download schedule

**Lecturers**:

- Ines Aitsahalia ([email protected])
- Christine Liu ([email protected])
- Ben Antin ([email protected])
- Krishan Kumar ([email protected])
- Erfan Zabeh ([email protected])

**Faculty contact**: Prof. Ken Miller* [email protected]

*(\*Please contact Prof. Miller to sign add/drop forms and other items that require faculty permission.)*

**Time**:

*Lectures:* Tuesdays and Thursdays, 12:10pm – 1:25pm

*Office hours:* TBD

*Recitations:* TBD

**Location**: Jerome L Greene Science Center, room L5-084

**Webpage:** CourseWorks

**Credits:** 3

**Call Num:** 18535

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.

**Registration**:

*Undergraduate and graduate students*: Must register on SSOL.

**Audit Interest Form/Courseworks Access Request:** Link

**Grading:**

- 50% homeworks (approximately bi-weekly)
- 50% participation (attendance, asking/answering questions, office hours, comments on notes, etc.)

Extra credit: +1% on your next homework assignment for finding a typo and +10% for finding an error in the typed lecture notes. (Please add comments directly to the posted files.)

**Computational Statistics (Stat GR6104)**

**Spring 2024:** Schedule

This is a Ph.D.-level course in computational statistics. A link to the most recent previous iteration of this course is here.

**Note:** instructor permission is required for students outside of the Statistics Ph.D. program to take this class.

**Time**: Tu 2:10

**Place**: Jerome L Greene Science Center, room L5-084

**Professor**: Liam Paninski; Email: **liam at stat dot columbia dot edu**. Hours by appointment.

**Course goals** (partially adapted from the preface of Givens and Hoeting's book): Computation plays a central role in modern statistics and machine learning. This course aims to cover topics needed to develop a broad working knowledge of modern computational statistics. We seek to develop a practical understanding of how and why existing methods work, enabling effective use of modern statistical methods. Achieving these goals requires familiarity with diverse topics in statistical computing, computational statistics, computer science, and numerical analysis. Our choice of topics reflects our view of what is central to this evolving field and what will be interesting and useful. A key theme is scalability to problems of high dimensionality, which are of greatest interest in many recent applications.

Some important topics will be omitted because high-quality solutions are already available in most software. For example, the generation of pseudo-random numbers is a classic topic, but the methods built into standard software packages will suffice for our needs. On the other hand, we will spend some time on classical numerical linear algebra ideas, because choosing the right method for solving a linear equation (for example) can have a huge impact on the time it takes to solve a problem in practice, particularly if there is special structure that we can exploit.

**Audience**: The course is aimed at first- and second-year students in the Statistics Ph.D. program. Students from other departments or programs are welcome, space permitting; instructor permission is required.

**Background**: The level of mathematics expected does not extend much beyond standard calculus and linear algebra. Breadth of mathematical training is more helpful than depth; we prefer to focus on the big picture of how algorithms work and to sweep some of the nitty-gritty numerical details under the rug. The expected level of statistics is equivalent to that obtained by a graduate student in the first year of studying the theory of statistics and probability. An understanding of maximum likelihood methods, Bayesian methods, elementary asymptotic theory, Markov chains, and linear models is most important.

**Programming**: With respect to computer programming, good students can learn as they go. We will forgo language-specific examples, algorithms, and coding; I won't be teaching much programming per se, but will rather focus on the overarching ideas and techniques.

For the projects, I recommend choosing a high-level, interactive package that permits flexible design of graphical displays and includes supporting statistics and probability functions, e.g., R, Python, or MATLAB.

**Evaluation**: Final grades will be based on class participation and a student project.

**Topics**:

Deterministic optimization

- Newton-Raphson, conjugate gradients, preconditioning, quasi-Newton methods, Fisher scoring, EM and its various derivatives

- Numerical recipes for linear algebra: matrix inverse, LU, Cholesky decompositions, low-rank updates, SVD, banded matrices, Toeplitz matrices and the FFT, Kronecker products (separable matrices), sparse matrix solvers

- Convex analysis: convex functions, duality, KKT conditions, interior point methods, projected gradients, augmented Lagrangian methods, convex relaxations

- Applications: support vector machines, splines, Gaussian processes, isotonic regression, LASSO and LARS regression
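As a quick illustration of the structure-exploitation theme mentioned above, here is a minimal sketch (not course material; the matrix and sizes are illustrative) comparing a generic dense solve of a linear system with a banded solve that exploits a symmetric tridiagonal structure:

```python
import numpy as np
from scipy.linalg import solve, solveh_banded

# Illustrative example: A is symmetric tridiagonal (2 on the diagonal,
# -1 on the off-diagonals), a structure for which a banded Cholesky
# solve costs O(n) instead of the O(n^3) of a generic dense solver.
n = 1000
main_diag = np.full(n, 2.0)
off_diag = np.full(n - 1, -1.0)
b = np.ones(n)

# Generic approach: build the dense matrix and solve.
A = np.diag(main_diag) + np.diag(off_diag, 1) + np.diag(off_diag, -1)
x_dense = solve(A, b)

# Structured approach: store only the diagonals ("upper form":
# superdiagonal first, main diagonal last) and use the banded solver.
ab = np.zeros((2, n))
ab[0, 1:] = off_diag
ab[1, :] = main_diag
x_banded = solveh_banded(ab, b)

# Both approaches give the same answer; the banded one is far cheaper.
assert np.allclose(x_dense, x_banded)
```

The same principle applies to the Toeplitz, Kronecker, and sparse structures listed above: the answer is identical, but the right solver can change the cost by orders of magnitude.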

Graphical models: dynamic programming, hidden Markov models, forward-backward algorithm, Kalman filter, Markov random fields

Stochastic optimization: Robbins-Monro and Kiefer-Wolfowitz algorithms, simulated annealing, stochastic gradient methods

Deterministic integration: Gaussian quadrature, quasi-Monte Carlo. Application: expectation propagation

Monte Carlo methods

- Rejection sampling, importance sampling, variance reduction methods (Rao-Blackwellization, stratified sampling)

- MCMC methods: Gibbs sampling, Metropolis-Hastings, Langevin methods, Hamiltonian Monte Carlo, slice sampling. Implementation issues: burnin, monitoring convergence

- Sequential Monte Carlo (particle filtering)

- Variational and stochastic variational inference

**References**:

Givens and Hoeting (2005), Computational Statistics

Robert and Casella (2004), Monte Carlo Statistical Methods

Boyd and Vandenberghe (2004), Convex Optimization

Press et al., Numerical Recipes

Sun and Yuan (2006), Optimization Theory and Methods

Fletcher (2000), Practical Methods of Optimization

Searle (2006), Matrix Algebra Useful for Statistics

Spall (2003), Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control

Shewchuk (1994), An Introduction to the Conjugate Gradient Method Without the Agonizing Pain

Boyd et al. (2011), Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers

**Advanced Topics in Theoretical Neuroscience** (Spring 2024)

**Led by**: Rainer Engelken ([email protected]) & Manuel Beiran ([email protected])

You may register via Courseworks, or contact us if you do not have a Columbia ID.

The classes will take place in person every **Wednesday, 10:00-11:30 AM**, location: **L5-084** at the Jerome L. Greene building. The course will begin on Wednesday, January 24th and continue on a weekly basis until May 1st, except for spring break and the weeks overlapping with the Cosyne conference. The full schedule will be shared shortly.

The course is taught by postdocs and is intended to be interactive and oriented towards recent research advances. Topics will include: deep learning models for neuroscience, recurrent neural networks (dynamics and computation), synaptic plasticity rules, and information theory.

**Schedule:**

Neural chaos

*Jan 24th - JLG L5-084*

**Rainer Engelken**: *Lyapunov exponents in spiking networks*

Learning in neural networks

*Jan 31st - JLG L5-084*

**Kaushik Lakshminarasimhan**: *Reinforcement learning in recurrent neural networks*

*Feb 7th - JLG L5-084*

**Owen Marschall**: *Algorithms for training recurrent neural networks*

Neural processing in feed-forward networks

*Feb 14th - JLG L5-084*

**W. Jeffrey Johnston**: *Information theory for sensory processing*

*Feb 21st - JLG L5-084*

**Samuel Muscinelli**: *Random expansion in biological neural networks*

*Feb 28th - JLG L5-084: no class (Cosyne)*

*Mar 6th - JLG L5-084: no class (Cosyne)*

*Mar 13th - JLG L5-084: no class (spring break)*

Control theory

*Mar 20th - JLG L5-084*

**Bin Wang**: *Predictive coding and its network models*

Computations in recurrent networks

*Mar 27th - JLG L5-084*

**Ji Xia**: *Non-normal dynamics*

*Apr 3rd - JLG L5-084*

**Manuel Beiran**: *Ring attractor models*

Neural representations

*Apr 10th - JLG L5-084*

**Tahereh Toosi**: *Deep neural networks as models in neuroscience*

*Apr 17th - JLG L5-084*

**Valeria Fascianelli & Lorenzo Posani**: *Geometry of neural representations*

Student presentations

*Apr 24th - JLG L5-084*

## Fall 2023

**Intro to Theory**

**Computational Neuroscience, Fall 2023**

**Faculty: **Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller, Kim Stachenfeld

**TAs:** Ching Fang, Ishani Ganguly, Francisco Sacadura, Erica Shook

**Meetings: **Tuesdays & Thursdays 2:00-3:30 (JLG L5-084)

**Text: **Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**Extra TA sessions: **

- Math Review I: Calculus (Francisco)
- Monday, September 11, 6 PM. Location L6-086

- Coding Review 1: (Erica)
- Tuesday, September 12, 6 PM. Location L5-084

- Math Review II: Linear Algebra (Ching)
- Wednesday, September 13, 5 PM. Location L6-086

- Coding Review 2: (Ishani)
- Wednesday, September 13, 6 PM. Location L6-086

**September**

5 (Larry) Introduction to Course and to Computational Neuroscience

7 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

12 (Larry) Numerical Methods, Filtering (Assignment 1)

14 (Larry) The Hodgkin-Huxley Model

19 (Larry) Adaptation, Synapses, Synaptic Plasticity (Assignment 2)

21 (Larry) Types of Neuron Models and Networks

22 Assignment 1 Due

26 (Stefano) Generalized Linear Models

28 (Ken) Linear Algebra I (Assignment 3)

29 Assignment 2 Due

**October**

3 (Ken) Linear Algebra II

5 (Ken) PCA and Dimensionality Reduction

10 (Ken) Rate Networks/E-I networks I (Assignment 4)

11 Assignment 3 Due

12 (Ken) Rate Networks/E-I networks II

17 (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5; extra info on ring models; extra info on gaussian distributions)

19 (Ashok) Chaotic Networks

20 Assignment 4 Due

24 (Ashok) Low Rank Networks

26 (Ashok) Introduction to Probability, Encoding, Decoding (Assignment 6)

27 Assignment 5 Due

31 (Ashok) Fisher Information

**November**

2 (Ashok) Optimization (Assignment 7)

7 Holiday

8 Assignment 6 Due

9 (Stefano) Perceptrons

14 (Stefano) Multilayer Perceptrons and Mixed Selectivity (Assignment 8)

15 Assignment 7 Due

16 (Ashok) Dimensionality and kernel methods

21 (Stefano) Deep Learning (Assignment 9)

22 Assignment 8 Due

23 Holiday

28 (Stefano) Learning in Recurrent Networks

30 (Stefano) Continual Learning and Catastrophic Forgetting

**December**

5 (Kim) Reinforcement Learning (Assignment 10)

7 Course Wrap-up

13 Assignment 9 Due

## Spring 2023

**Computational Statistics (Stat GR6104), Spring 2023**

**Time**: Th 2:30 - 4:00 pm

**Place**: Jerome L. Greene Science Center, L3-079

**Professor**: Liam Paninski; Email: **liam at stat dot columbia dot edu**. Hours by appointment.

This is a Ph.D.-level course in computational statistics. **Note:** instructor permission is required for students outside of the Statistics Ph.D. program to take this class.

See this page for additional course details.

**Advanced Theory, Spring 2023**

Please register via Courseworks; the course is also open to external guests.

**Time:** Wednesdays, 10:00-11:30 am

**Place:** Jerome L. Greene Science Center, L4-078 (will eventually be L5-084, but check email announcements)

For the zoom link contact **[email protected]**

**Schedule:**

**Information theory**

Jan 25 Information theory (Jeff Johnston) Material

Feb 01 Fisher information (Jeff Johnston)

Feb 08 Gaussian Information bottleneck (Rainer Engelken)

**Learning systems**

Feb 15 Expectation Maximization (Ji Xia)

Feb 22 Reinforcement learning I (Kaushik Lakshminarasimhan)

Mar 01 Reinforcement learning II (Kaushik Lakshminarasimhan)

Mar 08 No session (Cosyne)

Mar 15 No session (Spring break)

Mar 22 Feedforward architectures I (Samuel Muscinelli)

Mar 29 Feedforward architectures II (Samuel Muscinelli)

**Neural dynamics and computations**

Apr 05 Oscillations and Wilson-Cowan model (Patricia Cooney)

Apr 12 Mean-field theory and perturbative approaches (Agostina Palmigiano)

Apr 19 Low-rank neural networks I (Manuel Beiran)

Apr 26 Low-rank neural networks II (Manuel Beiran)

May 3 Single-neuron computations (Salomon Muller)

May 10 Student presentations

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Spring 2023:** Download schedule

**Class Assistants**:

Ines Aitsahalia ([email protected])

Elom Amematsro ([email protected])

Ching Fang ([email protected])

Christine Liu ([email protected])

Amol Pasarkar ([email protected])

**Faculty contact**: Prof. Ken Miller ([email protected])

**Time**: Tuesdays, Thursdays 12:10 - 1:25pm

**Place**: L5-084

**Webpage:** CourseWorks (announcements, assignments, readings)

**Credits:** 3

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.

**Registration**: Undergraduate and graduate students must register on SSOL. *If you’re only interested in attending a subset of lectures, register Pass/Fail and contact the TAs.*

**Neuroscience and Philosophy (GU4500), Spring 2023**

**Time**: Thursdays, 2:10-4:00

**Place**: Jerome L. Greene Science Center, L5-084

**Professor**: John Morrison, Email: [email protected], office hours

**Description: **This course is about the philosophical foundations of cognitive neuroscience. Cognitive neuroscientists often describe the brain as representing and inferring. It is their way of describing the overall function of the brain’s activity, an abstraction away from detailed neural recordings. But, because there are no settled definitions of representation and inference, there are often no objective grounds for these descriptions. As a result, they are often treated as casual glosses rather than as rigorous analyses. Philosophers have proposed a number of (somewhat) rigorous definitions, but they rarely say anything about the brain. The goal of this course is to survey the philosophical literature and consider which definitions, if any, might be useful to cognitive neuroscientists. I will begin each class with a description of a different definition from the philosophical literature. We will then rely on our collective expertise to assess its potential applications to the brain. This course is for graduate students in Neurobiology and Behavior. No prior background in philosophy will be assumed.

**Format: **Each lecture will begin with a summary of the books and articles listed below. But I don’t expect you to read any of them. Many of them are (unnecessarily!) dense and inaccessible. Plus, they total hundreds of pages. It’s my job to distill them into a few basic points. If you want to learn more, reach out and I will recommend particular chapters, papers, etc.

**Evaluation: **Graduate students in Neurobiology and Behavior should write a 15-page term paper applying one of the philosophical definitions to a particular experiment, preferably an experiment relevant to their own research. Other students should consult with the professor to identify an alternative form of evaluation. Undergraduates will be required to write at least two papers.

## Fall 2022

**Computational Neuroscience, Fall 2022**

**Faculty: **Larry Abbott, Sean Escola, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs: **Elom Amematsro, Ho Yin Chau, David Clark, Zhenrui Liao

**Meetings: **Tuesdays & Thursdays 2:00-3:30 (JLG L5-084)

**Text: **Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

Download Schedule

**September**

06 (Larry) Introduction to Course and to Computational Neuroscience

08 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

13 (Larry) Numerical Methods, Filtering (Assignment 1)

15 (Larry) The Hodgkin-Huxley Model

20 (Larry) Types of Neuron Models and Networks (Assignment 2)

21 Assignment 1 Due

22 (Stefano) Adaptation, Synapses, Synaptic Plasticity

27 (Sean) Generalized Linear Models

28 Assignment 2 Due

29 (Ken) Linear Algebra I (Assignment 3)

**October**

04 (Ken) Linear Algebra II

06 (Ken) PCA and Dimensionality Reduction

11 (Ken) Rate Networks/E-I networks I (Assignment 4)

12 Assignment 3 Due

13 (Ken) Rate Networks/E-I networks II

18 (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5)

19 Assignment 4 Due

20 (Ashok) Introduction to Probability, Encoding, Decoding

25 (Ashok) Decoding, Fisher Information I

26 Assignment 5 Due

27 (Ashok) Decoding, Fisher Information II (Assignment 6)

**November**

01 (Ashok) Information Theory

03 (Ashok) Optimization I (Assignment 7)

08 Holiday

09 Assignment 6 Due

10 (Ashok) Optimization II

15 (Stefano) The Perceptron (Assignment 8)

16 Assignment 7 Due

17 (Stefano) Multilayer Perceptrons and Mixed Selectivity

22 (Stefano) Deep Learning (Assignment 9)

23 Assignment 8 Due

24 Holiday

29 (Sean) Learning in Recurrent Networks

**December**

01 (Stefano) Continual Learning and Catastrophic Forgetting

06 (Stefano) Reinforcement Learning (Assignment 10)

07 Assignment 9 Due

08 (Larry) Course Wrap-up

14 Assignment 10 Due

## Spring 2022

**Computational Statistics (Stat GR6104), Spring 2022**

**Time**: Tu 2:10 - 3:40pm

**Place**: Zoom for a while, then JLG L7-119

**Professor**: Liam Paninski; Email: **liam at stat dot columbia dot edu**. Hours by appointment.

This is a Ph.D.-level course in computational statistics. **Note:** instructor permission is required for students outside of the Statistics Ph.D. program to take this class.

See this page for additional course details.

**Advanced Theory, Spring 2022**

**Time:** Tuesdays, 10:00 am

**Place:** Jerome L Greene Science Center, L7-119

For the zoom link contact [email protected]

**Schedule:**

**Methods for decoding and interpreting neural data**

Feb 01 Methods for calculating receptive fields (Ji Xia) Material

Feb 08 Information theory (Jeff Johnston)

Feb 15 cancelled

**Neural representations**

Feb 22 Geometry of abstractions (theory session) (Valeria Fascianelli) slides

Mar 01 Geometry of abstractions (hands-on session) (Lorenzo Posani)

**Network dynamics**

Mar 08 Solving very nonlinear problems with the Homotopy Analysis Method (Serena Di Santo) notes notebook

Mar 29 Forgetting in attractor networks (Samuel Muscinelli)

Apr 05 Mean-field models of network dynamics (Alessandro Sanzeni + Mario Dipoppa)

**Dynamics of learning**

Apr 12 Learning dynamics in feedforward neural networks (Manuel Beiran + Rainer Engelken)

Apr 19 Learning dynamics in recurrent neural networks (Manuel Beiran + Rainer Engelken)

**Causality**

Apr 26 Introduction to some causality issues in neuroscience (Laureline Logiaco)

May 3 Causality and latent variable models (Josh Glaser)

May 10 Student presentations

May 17 Bonus session: Many methods, one problem: modern inference techniques as applied to linear regression (Juri Minxha)

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Spring 2022:** Download schedule

**Teaching Assistants:**

Ching Fang [email protected]

Jack Lindsey [email protected]

Zhenrui Liao [email protected]

Amol Pasarkar [email protected]

Dan Tyulmankov [email protected]

**Recitation instructor: **David Clark [email protected]

**Faculty contact**: Prof. Ken Miller* ([email protected])

*(\*Please contact Prof. Miller to sign add/drop forms and other items that require faculty permission.)*

**Time**:

*Lectures:* Tues/Thurs 10:10 - 11:25a

*Office hours:* Thurs 11:30a - 12:30p

*Recitations:* Alternating Fridays 10 - 11a

**Location**: Zoom

**Webpage:** CourseWorks

**Credits:** 3

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.

## Fall 2021

**Introduction to Theoretical Neuroscience, Fall 2021**

**Faculty:** Larry Abbott, Sean Escola, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs: **Elom Amematsro, Ramin Khajeh, Minni Sun, Denis Turcu

**Meetings: **Tuesdays & Thursdays 2:00-3:30

**Text:** Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**September**

09 (Larry) Introduction to Course and to Computational Neuroscience

14 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

16 (Stefano) Adaptation, Synapses, Synaptic Plasticity

21 (Larry) The Hodgkin-Huxley Model (Assignment 1)

23 (Larry) Types of Neuron Models and Networks

28 (Larry) Numerical Methods, Filtering (Assignment 2)

29 Assignment 1 Due

30 (Sean) Generalized Linear Models

**October**

05 (Ken) Linear Algebra (Assignment 3)

06 Assignment 2 Due

07 (Ken) PCA and Dimensionality Reduction

12 (Ken) Rate Networks/E-I networks I (Assignment 4)

13 Assignment 3 Due

14 (Ken) Rate Networks/E-I networks II

19 (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5)

20 Assignment 4 Due

21 (Ashok) Introduction to Probability, Encoding, Decoding

26 (Ashok) Decoding, Fisher Information I

27 Assignment 5 Due

28 (Ashok) Decoding, Fisher Information II (Assignment 6)

**November**

02 Holiday

04 (Ashok) Information Theory

09 (Ashok) Optimization I (Assignment 7)

10 Assignment 6 Due

11 (Ashok) Optimization II

16 (Stefano) The Perceptron (Assignment 8)

17 Assignment 7 Due

18 (Stefano) Multilayer Perceptrons and Mixed Selectivity

23 Holiday

25 Holiday

30 (Stefano) Deep Learning (Assignment 9)

**December**

01 Assignment 8 Due

02 (Sean) Learning in Recurrent Networks

07 (Stefano) Continual Learning and Catastrophic Forgetting (Assignment 10)

08 Assignment 9 Due

09 (Stefano) Reinforcement Learning

15 Assignment 10 Due

## Spring 2021

**Introduction to Theoretical Neuroscience, Spring 2021**

**Faculty: **Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs:** Matteo Alleman, Elom Amematsro, Ramin Khajeh, Denis Turcu

**Meetings:** Tuesdays 2:00-3:30 & Thursdays 1:30-3:00

**Text:** Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**January**

12 (Larry) Introduction to the Course and to Theoretical Neuroscience

14 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

19 (Larry) Adaptation, Synapses, Spiking Networks (Assignment 1)

21 (Larry) Numerical Methods, Filtering

26 (Larry) The Hodgkin-Huxley Model (Assignment 2)

27 Assignment 1 Due

28 (Larry) Types of Neuron Models and Networks (Assignment 3)

**February**

02 (Ken) Linear Algebra I

03 Assignment 2 Due

04 (Ken) Linear Algebra II

09 (Ken) PCA and Dimensionality Reduction (Assignment 4)

10 Assignment 3 Due

11 (Ken) Rate Networks/E-I networks I

16 (Ken) Rate Networks/E-I networks II (Assignment 5)

17 Assignment 4 Due

18 (Ken) Unsupervised/Hebbian Learning, Developmental Models

23 (Ashok) Introduction to Probability, Encoding, Decoding (Assignment 6)

24 Assignment 5 Due

25 (Sean) GLMs

**March**

02 Spring Break

04 Spring Break

09 (Ashok) Decoding, Fisher Information I

10 Assignment 6 Due

11 (Ashok) Decoding, Fisher Information II

16 (Ashok) Information Theory (Assignment 7)

18 (Ashok) Optimization I

23 (Ashok) Optimization II (Assignment 8)

24 Assignment 7 Due

25 (Stefano) Perceptron

30 (Stefano) Multilayer Perceptrons and Mixed Selectivity (Assignment 9)

31 Assignment 8 Due

**April**

01 (Stefano) Deep Learning I (backpropagation)

06 (Stefano) Deep Learning II (convolutional networks) (Assignment 10)

07 Assignment 9 Due

08 (Stefano) Learning in Recurrent Networks

13 (Stefano) Continual Learning and Catastrophic Forgetting (Assignment 11)

14 Assignment 10 Due

15 (Stefano) Reinforcement Learning

21 Assignment 11 Due

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Faculty**: Ken Miller ([email protected])

**Instructor**: Dan Tyulmankov ([email protected])

**Teaching Assistant:** Amin Nejatbakhsh ([email protected]) (**Office Hours**: Thursdays 1-2pm)

**Time**: Tuesdays, Thursdays 8:40a-10:00a

**Place**: ~~JLGSC L5-084~~ Zoom (https://columbiauniversity.zoom.us/j/489220180)

**Webpage:** CourseWorks

**Credits:** 3 credits, **pass/fail only** (Register on SSOL. **Do not register for a grade.**)

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

## Fall 2020

**Special Virtual Edition**

**Meetings:** Wednesdays, 10:00 am

**Location:** Zoom, contact [email protected] for details

**Schedule**

07/15/2020 Time-dependent mean-field theory for mathematical streetfighters **(Rainer Engelken)**

07/22/2020 Predictive coding in balanced neural networks with noise, chaos, and delays, Article (Everyone)

07/29/2020 A solution to the learning dilemma for recurrent networks of spiking neurons, Article (Everyone)

08/05/2020 Canceled

08/12/2020 Artificial neural networks for neuroscientists: A primer, Article **(Robert Yang)**

08/19/2020 A mechanism for generating modular activity in the cerebral cortex **(Bettina Hein)**

08/26/2020 Dynamic representations in networked neural systems, Article **(Kaushik Lakshminarasimhan)**

09/02/2020 Network principles predict motor cortex population activity across movement speeds **(Shreya Saxena)**

09/09/2020 Modeling neurophysiological mechanisms of language production **(Serena Di Santo)**

09/16/2020 How single rate unit properties shape chaotic dynamics and signal transmission in random neural networks, Article **(Samuel Muscinelli)**

09/23/2020 Shaping dynamics with multiple populations in low-rank recurrent networks, Article **(Laureline Logiaco)**

09/30/2020 Deep Graph Pose: a semi-supervised deep graphical model for improved animal pose tracking, Article **(Anqi Wu)**

10/07/2020 Decoding and mixed selectivity, Article, Article, Article **(Fabio Stefanini)**

10/14/2020 Theory of gating in recurrent neural networks, Article, Article **(Kamesh Krishnamurthy)**

10/21/2020 Decentralized motion inference and registration of neuropixel data **(Erdem Varol)**

10/28/2020 Decision, interrupted **(NaYoung So)**

11/04/2020 Abstract rules implemented via neural dynamics **(Kenny Kay)**

11/11/2020 Canceled for Holiday

11/18/2020 Rodent paradigms for the study of volition (free will) **(Cat Mitelut)**

11/25/2020 Gaussian process inference **(Geoff Pleiss)**

12/02/2020 Canceled

12/09/2020 Structure and variability of optogenetic responses in multiple cell-type cortical circuits **(Agostina Palmigiano)**

12/16/2020 Manifold GPLVMs for discovering non-Euclidean latent structure in neural data, Article **(Josh Glaser)**

*Page Last Updated: 08/31/2023*