## Spring 2023

**Computational Statistics (Stat GR6104), Spring 2023**

**Time**: Th 2:30 - 4:00 pm
**Place**: Jerome L. Greene Science Center, L3-079
**Professor**: Liam Paninski; Email: **liam at stat dot columbia dot edu**. Hours by appointment.

This is a Ph.D.-level course in computational statistics. **Note:** students outside of the Statistics Ph.D. program must obtain instructor permission to take this class.

See this page for additional course details.

**Advanced Theory, Spring 2023**

Please register via Courseworks; the course is also open to external guests.

**Time:** Wednesdays, 10:00-11:30 am
**Place:** Jerome L. Greene Science Center, L4.078 (will eventually be L5.084, but check email announcements)

For the zoom link contact **[email protected]**

**Schedule:**

**Information theory**

Jan 25 Information theory (Jeff Johnston) Material

Feb 01 Fisher information (Jeff Johnston)

Feb 08 Gaussian Information bottleneck (Rainer Engelken)

**Learning systems**

Feb 15 Expectation Maximization (Ji Xia)

Feb 22 Reinforcement learning I (Kaushik Lakshminarasimhan)

Mar 01 Reinforcement learning II (Kaushik Lakshminarasimhan)

Mar 08 No session (Cosyne)

Mar 15 No session (Spring break)

Mar 22 Feedforward architectures I (Samuel Muscinelli)

Mar 29 Feedforward architectures II (Samuel Muscinelli)

**Neural dynamics and computations**

Apr 05 Oscillations and Wilson-Cowan model (Patricia Cooney)

Apr 12 Mean-field theory and perturbative approaches (Agostina Palmigiano)

Apr 19 Low-rank neural networks I (Manuel Beiran)

Apr 26 Low-rank neural networks II (Manuel Beiran)

May 3 Single-neuron computations (Salomon Muller)

May 10 Student presentations

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Spring 2023**: Download schedule

**Class Assistants**:

Ines Aitsahalia ([email protected])

Elom Amematsro ([email protected])

Ching Fang ([email protected])

Christine Liu ([email protected])

Amol Pasarkar ([email protected])

**Faculty contact**: Prof. Ken Miller ([email protected])

**Time**: Tuesdays, Thursdays 12:10 - 1:25pm
**Place**: L5-084
**Webpage:** CourseWorks (announcements, assignments, readings)
**Credits:** 3

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.

**Registration**: Undergraduate and graduate students must register on SSOL. *If you’re only interested in attending a subset of lectures, register Pass/Fail and contact the TAs.*

**Neuroscience and Philosophy (GU4500), Spring 2023**

**Time**: Thursdays, 2:10-4:00
**Place**: Jerome L. Greene Science Center, L5-084
**Professor**: John Morrison, Email: [email protected], office hours

**Description: **This course is about the philosophical foundations of cognitive neuroscience. Cognitive neuroscientists often describe the brain as representing and inferring. It is their way of describing the overall function of the brain’s activity, an abstraction away from detailed neural recordings. But, because there are no settled definitions of representation and inference, there are often no objective grounds for these descriptions. As a result, they are often treated as casual glosses rather than as rigorous analyses. Philosophers have proposed a number of (somewhat) rigorous definitions, but they rarely say anything about the brain. The goal of this course is to survey the philosophical literature and consider which definitions, if any, might be useful to cognitive neuroscientists. I will begin each class with a description of a different definition from the philosophical literature. We will then rely on our collective expertise to assess its potential applications to the brain. This course is for graduate students in Neurobiology and Behavior. No prior background in philosophy will be assumed.

**Format: **Each lecture will begin with a summary of the books and articles listed below. But I don’t expect you to read any of them. Many of them are (unnecessarily!) dense and inaccessible. Plus, they total hundreds of pages. It’s my job to distill them into a few basic points. If you want to learn more, reach out and I will recommend particular chapters, papers, etc.

**Evaluation: **Graduate students in Neurobiology and Behavior should write a 15-page term paper applying one of the philosophical definitions to a particular experiment, preferably an experiment relevant to their own research. Other students should consult with the professor to identify an alternative form of evaluation. Undergraduates will be required to write at least two papers.

## Fall 2022

**Computational Neuroscience, Fall 2022**

**Faculty: **Larry Abbott, Sean Escola, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs: **Elom Amematsro, Ho Yin Chau, David Clark, Zhenrui Liao

**Meetings: **Tuesdays & Thursdays 2:00-3:30 (JLG L5-084)

**Text: **Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

Download Schedule

**September**

06 (Larry) Introduction to Course and to Computational Neuroscience

08 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

13 (Larry) Numerical Methods, Filtering (Assignment 1)

15 (Larry) The Hodgkin-Huxley Model

20 (Larry) Types of Neuron Models and Networks (Assignment 2)

21 Assignment 1 Due

22 (Stefano) Adaptation, Synapses, Synaptic Plasticity

27 (Sean) Generalized Linear Models

28 Assignment 2 Due

29 (Ken) Linear Algebra I (Assignment 3)

**October**

04 (Ken) Linear Algebra II

06 (Ken) PCA and Dimensionality Reduction

11 (Ken) Rate Networks/E-I networks I (Assignment 4)

12 Assignment 3 Due

13 (Ken) Rate Networks/E-I networks II

18 (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5)

19 Assignment 4 Due

20 (Ashok) Introduction to Probability, Encoding, Decoding

25 (Ashok) Decoding, Fisher Information I

26 Assignment 5 Due

27 (Ashok) Decoding, Fisher Information II (Assignment 6)

**November**

01 (Ashok) Information Theory

03 (Ashok) Optimization I (Assignment 7)

08 Holiday

09 Assignment 6 Due

10 (Ashok) Optimization II

15 (Stefano) The Perceptron (Assignment 8)

16 Assignment 7 Due

17 (Stefano) Multilayer Perceptrons and Mixed Selectivity

22 (Stefano) Deep Learning (Assignment 9)

23 Assignment 8 Due

24 Holiday

29 (Sean) Learning in Recurrent Networks

**December**

01 (Stefano) Continual Learning and Catastrophic Forgetting

06 (Stefano) Reinforcement Learning (Assignment 10)

07 Assignment 9 Due

08 (Larry) Course Wrap up

14 Assignment 10 Due

## Spring 2022

**Computational Statistics (Stat GR6104), Spring 2022**

**Time**: Tu 2:10 - 3:40pm
**Place**: Zoom for a while, then JLG L7-119
**Professor**: Liam Paninski; Email: **liam at stat dot columbia dot edu**. Hours by appointment.

This is a Ph.D.-level course in computational statistics. **Note:** students outside of the Statistics Ph.D. program must obtain instructor permission to take this class.

See this page for additional course details.

**Advanced Theory, Spring 2022**

**Time:** Tuesdays, 10:00 am
**Place:** Jerome L Greene Science Center, L7-119

For the zoom link contact [email protected]

**Schedule:**

**Methods for decoding and interpreting neural data**

Feb 01 Methods on calculating receptive fields (Ji Xia) Material

Feb 08 Information theory (Jeff Johnston)

Feb 15 cancelled

**Neural representations**

Feb 22 Geometry of abstractions (theory session) (Valeria Fascianelli) slides

Mar 01 Geometry of abstractions (hands-on session) (Lorenzo Posani)

**Network dynamics**

Mar 08 Solving very nonlinear problems with Homotopy Analysis Method (Serena Di Santo) notes notebook

Mar 29 Forgetting in attractor networks (Samuel Muscinelli)

Apr 05 Mean-field models of network dynamics (Alessandro Sanzeni + Mario Dipoppa)

**Dynamics of learning**

Apr 12 Learning dynamics in feedforward neural networks (Manuel Beiran + Rainer Engelken)

Apr 19 Learning dynamics in recurrent neural networks (Manuel Beiran + Rainer Engelken)

**Causality**

Apr 26 Introduction to some causality issues in neuroscience (Laureline Logiaco)

May 3 Causality and latent variable models (Josh Glaser)

May 10 Student presentations

May 17 Bonus session: Many methods, one problem: modern inference techniques as applied to linear regression (Juri Minxha)

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

Spring 2022: Download schedule

**Teaching Assistants:**

Ching Fang [email protected]

Jack Lindsey [email protected]

Zhenrui Liao [email protected]

Amol Pasarkar [email protected]

Dan Tyulmankov [email protected]

**Recitation instructor: **David Clark [email protected]

**Faculty contact**: Prof. Ken Miller ([email protected])

*(Please contact Prof. Miller to sign add/drop forms and other items which require faculty permission)*

**Time**:
*Lectures:* Tues/Thurs 10:10 - 11:25a
*Office hours:* Thurs 11:30a - 12:30p
*Recitations:* Alternating Fridays 10 - 11a

**Location**: Zoom

**Webpage:** CourseWorks

**Credits:** 3

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.

## Fall 2021

**Introduction to Theoretical Neuroscience, Fall 2021**

**Faculty:** Larry Abbott, Sean Escola, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs: **Elom Amematsro, Ramin Khajeh, Minni Sun, Denis Turcu

**Meetings: **Tuesdays & Thursdays 2:00-3:30

**Text:** Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**September**

09 (Larry) Introduction to Course and to Computational Neuroscience

14 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

16 (Stefano) Adaptation, Synapses, Synaptic Plasticity

21 (Larry) The Hodgkin-Huxley Model (Assignment 1)

23 (Larry) Types of Neuron Models and Networks

28 (Larry) Numerical Methods, Filtering (Assignment 2)

29 Assignment 1 Due

30 (Sean) Generalized Linear Models

**October**

05 (Ken) Linear Algebra (Assignment 3)

06 Assignment 2 Due

07 (Ken) PCA and Dimensionality Reduction

12 (Ken) Rate Networks/E-I networks I (Assignment 4)

13 Assignment 3 Due

14 (Ken) Rate Networks/E-I networks II

19 (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5)

20 Assignment 4 Due

21 (Ashok) Introduction to Probability, Encoding, Decoding

26 (Ashok) Decoding, Fisher Information I

27 Assignment 5 Due

28 (Ashok) Decoding, Fisher Information II (Assignment 6)

**November**

02 Holiday

04 (Ashok) Information Theory

09 (Ashok) Optimization I (Assignment 7)

10 Assignment 6 Due

11 (Ashok) Optimization II

16 (Stefano) The Perceptron (Assignment 8)

17 Assignment 7 Due

18 (Stefano) Multilayer Perceptrons and Mixed Selectivity

23 Holiday

25 Holiday

30 (Stefano) Deep Learning (Assignment 9)

**December**

01 Assignment 8 Due

02 (Sean) Learning in Recurrent Networks

07 (Stefano) Continual Learning and Catastrophic Forgetting (Assignment 10)

08 Assignment 9 Due

09 (Stefano) Reinforcement Learning

15 Assignment 10 Due

## Spring 2021

**Introduction to Theoretical Neuroscience, Spring 2021**

**Faculty:** Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

**TAs:** Matteo Alleman, Elom Amematsro, Ramin Khajeh, Denis Turcu

**Meetings:** Tuesdays 2:00-3:30 & Thursdays 1:30-3:00
**Text:** Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

**January**

12 (Larry) Introduction to the Course and to Theoretical Neuroscience

14 (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model

19 (Larry) Adaptation, Synapses, Spiking Networks (Assignment 1)

21 (Larry) Numerical Methods, Filtering

26 (Larry) The Hodgkin-Huxley Model (Assignment 2)

27 Assignment 1 Due

28 (Larry) Types of Neuron Models and Networks (Assignment 3)

**February**

02 (Ken) Linear Algebra I

03 Assignment 2 Due

04 (Ken) Linear Algebra II

09 (Ken) PCA and Dimensionality Reduction (Assignment 4)

10 Assignment 3 Due

11 (Ken) Rate Networks/E-I networks I

16 (Ken) Rate Networks/E-I networks II (Assignment 5)

17 Assignment 4 Due

18 (Ken) Unsupervised/Hebbian Learning, Developmental Models

23 (Ashok) Introduction to Probability, Encoding, Decoding (Assignment 6)

24 Assignment 5 Due

25 (Sean) GLMs

**March**

02 Spring Break

04 Spring Break

09 (Ashok) Decoding, Fisher Information I

10 Assignment 6 Due

11 (Ashok) Decoding, Fisher Information II

16 (Ashok) Information Theory (Assignment 7)

18 (Ashok) Optimization I

23 (Ashok) Optimization II (Assignment 8)

24 Assignment 7 Due

25 (Stefano) Perceptron

30 (Stefano) Multilayer Perceptrons and Mixed Selectivity (Assignment 9)

31 Assignment 8 Due

**April**

01 (Stefano) Deep Learning I (backpropagation)

06 (Stefano) Deep Learning II (convolutional networks) (Assignment 10)

07 Assignment 9 Due

08 (Stefano) Learning in Recurrent Networks

13 (Stefano) Continual Learning and Catastrophic Forgetting (Assignment 11)

14 Assignment 10 Due

15 (Stefano) Reinforcement Learning

21 Assignment 11 Due

**Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)**

**Faculty**: Ken Miller ([email protected])

**Instructor**: Dan Tyulmankov ([email protected])

**Teaching Assistant:** Amin Nejatbakhsh ([email protected]) (**Office Hours**: Thursdays 1-2pm)

**Time**: Tuesdays, Thursdays 8:40a-10:00a

**Place**: ~~JLGSC L5-084~~ Zoom (https://columbiauniversity.zoom.us/j/489220180)

**Webpage:** CourseWorks

**Credits:** 3 credits, **pass/fail only** (Register on SSOL. **Do not register for a grade.**)

**Description**: An introduction to mathematical concepts used in theoretical neuroscience aimed to give a minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain a deeper exposure to mathematical concepts than offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

**Prerequisites**: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

## Fall 2020

**Special Virtual Edition**

**Meetings:** Wednesdays, 10:00 am

**Location:** Zoom, contact [email protected] for details

**Schedule**

07/15/2020 Time-dependent mean-field theory for mathematical streetfighters **(Rainer Engelken)**

07/22/2020 Predictive coding in balanced neural networks with noise, chaos, and delays, Article (Everyone)

07/29/2020 A solution to the learning dilemma for recurrent networks of spiking neurons, Article (Everyone)

08/05/2020 Canceled

08/12/2020 Artificial neural networks for neuroscientists: A primer, Article **(Robert Yang)**

08/19/2020 A mechanism for generating modular activity in the cerebral cortex **(Bettina Hein)**

08/26/2020 Dynamic representations in networked neural systems, Article **(Kaushik Lakshminarasimhan)**

09/02/2020 Network principles predict motor cortex population activity across movement speeds **(Shreya Saxena)**

09/09/2020 Modeling neurophysiological mechanisms of language production **(Serena Di Santo)**

09/16/2020 How single rate unit properties shape chaotic dynamics and signal transmission in random neural networks, Article **(Samuel Muscinelli)**

09/23/2020 Shaping dynamics with multiple populations in low-rank recurrent networks, Article **(Laureline Logiaco)**

09/30/2020 Deep Graph Pose: a semi-supervised deep graphical model for improved animal pose tracking, Article **(Anqi Wu)**

10/07/2020 Decoding and mixed selectivity, Article, Article, Article **(Fabio Stefanini)**

10/14/2020 Theory of gating in recurrent neural networks, Article, Article **(Kamesh Krishnamurthy)**

10/21/2020 Decentralized motion inference and registration of neuropixel data **(Erdem Varol)**

10/28/2020 Decision, interrupted **(NaYoung So)**

11/04/2020 Abstract rules implemented via neural dynamics **(Kenny Kay)**

11/11/2020 Canceled for Holiday

11/18/2020 Rodent paradigms for the study of volition (free will) **(Cat Mitelut)**

11/25/2020 Gaussian process inference **(Geoff Pleiss)**

12/02/2020 Canceled

12/09/2020 Structure and variability of optogenetic responses in multiple cell-type cortical circuits **(Agostina Palmigiano)**

12/16/2020 Manifold GPLVMs for discovering non-Euclidean latent structure in neural data, Article **(Josh Glaser)**

*Page Last Updated: 02/14/2023*