Mathematical Tools

Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)


Spring 2023

Class Assistants: Amol Pasarkar ([email protected]), Christine Liu ([email protected]), Elom Amematsro ([email protected]), Ines Aitsahalia ([email protected]), Ching Fang ([email protected]).


Faculty contact: Prof. Ken Miller ([email protected])

Time: Tuesdays, Thursdays 12:10 - 1:25pm
Place: L5-084
Webpage: CourseWorks (announcements, assignments, readings)

Description: An introduction to mathematical concepts used in theoretical neuroscience, aimed at providing the minimal requisite background for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who are interested in rapidly acquiring the vocabulary and basic mathematical skills for studying theoretical neuroscience, or who wish to gain deeper exposure to mathematical concepts than that offered by NBHV G4360. Topics include single- and multivariable calculus, linear algebra, differential equations, signals and systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

Prerequisites: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level.


  • Undergraduate and graduate students: Must register** on SSOL.

(**If you’re only interested in attending a subset of lectures, register Pass/Fail and contact Dan) 


Mathematical Tools for Theoretical Neuroscience (Spring 2021)

Mathematical Tools for Theoretical Neuroscience (Spring 2020)

Computational Statistics

Computational Statistics (Stat GR6104), Spring 2023

This is a Ph.D.-level course in computational statistics. A link to the most recent previous iteration of this course is here.

Note: instructor permission is required to take this class for students outside of the Statistics Ph.D. program.

Time: Th 2:30-4:00 pm
Place: JLG L3-079
Professor: Liam Paninski; Email: liam at stat dot columbia dot edu. Hours by appointment.

Course goals (partially adapted from the preface of Givens and Hoeting's book): Computation plays a central role in modern statistics and machine learning. This course aims to cover the topics needed to develop a broad working knowledge of modern computational statistics. We seek to develop a practical understanding of how and why existing methods work, enabling effective use of modern statistical methods. Achieving these goals requires familiarity with diverse topics in statistical computing, computational statistics, computer science, and numerical analysis. Our choice of topics reflects our view of what is central to this evolving field and of what will be interesting and useful. A key theme is scalability to problems of high dimensionality, which is of central interest in many recent applications.
Some important topics will be omitted because high-quality solutions are already available in most software. For example, the generation of pseudo-random numbers is a classic topic, but the methods built into standard software packages will suffice for our needs. On the other hand, we will spend a bit of time on some classical numerical linear algebra ideas, because choosing the right method for solving a linear equation (for example) can have a huge impact on the time it takes to solve a problem in practice, particularly if there is some special structure that we can exploit.
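As a minimal sketch of the kind of structure exploitation described above (illustrative only, not course code; assumes NumPy and SciPy are available): solving a tridiagonal system with a generic dense solver costs O(n^3), while a banded solver that knows the bandwidth costs O(n).

```python
import numpy as np
from scipy.linalg import solve, solve_banded

# Hypothetical example: solve A x = b where A is tridiagonal
# (the 1-D discrete Laplacian, a common special structure).
n = 2000
main = 2.0 * np.ones(n)      # main diagonal
off = -1.0 * np.ones(n - 1)  # off-diagonals
b = np.ones(n)

# Generic dense solve: ignores the structure, O(n^3).
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
x_dense = solve(A, b)

# Banded solve: matrix stored in LAPACK banded form,
# one row per diagonal (super, main, sub), O(n).
ab = np.zeros((3, n))
ab[0, 1:] = off   # superdiagonal
ab[1, :] = main   # main diagonal
ab[2, :-1] = off  # subdiagonal
x_banded = solve_banded((1, 1), ab, b)

assert np.allclose(x_dense, x_banded)  # same answer, very different cost
```

Timing the two calls for large n makes the point vividly: the banded solver's advantage grows without bound as the dimension increases.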

Audience: The course will be aimed at first- and second-year students in the Statistics Ph.D. program. Students from other departments or programs are welcome, space permitting; instructor permission required.

Background: The level of mathematics expected does not extend much beyond standard calculus and linear algebra. Breadth of mathematical training is more helpful than depth; we prefer to focus on the big picture of how algorithms work and to sweep under the rug some of the nitty-gritty numerical details. The expected level of statistics is equivalent to that obtained by a graduate student in their first year of study of the theory of statistics and probability. An understanding of maximum likelihood methods, Bayesian methods, elementary asymptotic theory, Markov chains, and linear models is most important.

Programming: With respect to computer programming, good students can learn as they go. We'll largely forgo language-specific examples, algorithms, and coding; I won't be teaching much programming per se, but will instead focus on the overarching ideas and techniques. For the exercises and projects, I recommend you choose a high-level, interactive package that permits the flexible design of graphical displays and includes supporting statistics and probability functions, e.g., R, Python, or MATLAB.

Evaluation: Final grades will be based on class participation and a student project.

Deterministic optimization
- Newton-Raphson, conjugate gradients, preconditioning, quasi-Newton methods, Fisher scoring, EM and its various derivatives
- Numerical recipes for linear algebra: matrix inverse, LU, Cholesky decompositions, low-rank updates, SVD, banded matrices, Toeplitz matrices and the FFT, Kronecker products (separable matrices), sparse matrix solvers
- Convex analysis: convex functions, duality, KKT conditions, interior point methods, projected gradients, augmented Lagrangian methods, convex relaxations
- Applications: support vector machines, splines, Gaussian processes, isotonic regression, LASSO and LARS regression

Graphical models: dynamic programming, hidden Markov models, forward-backward algorithm, Kalman filter, Markov random fields

Stochastic optimization: Robbins-Monro and Kiefer-Wolfowitz algorithms, simulated annealing, stochastic gradient methods

Deterministic integration: Gaussian quadrature, quasi-Monte Carlo. Application: expectation propagation

Monte Carlo methods
- Rejection sampling, importance sampling, variance reduction methods (Rao-Blackwellization, stratified sampling)
- MCMC methods: Gibbs sampling, Metropolis-Hastings, Langevin methods, Hamiltonian Monte Carlo, slice sampling. Implementation issues: burnin, monitoring convergence
- Sequential Monte Carlo (particle filtering)
- Variational and stochastic variational inference
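To give a flavor of the MCMC topics listed above, here is a minimal random-walk Metropolis-Hastings sketch (illustrative only, not course code): sampling from a standard normal target with a Gaussian proposal, including a burn-in period as noted in the implementation issues above.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Log-density of N(0, 1), up to an additive constant.
    return -0.5 * x**2

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x  # on rejection, the current state is repeated
    return samples

chain = metropolis_hastings(50_000)
burned = chain[5_000:]  # discard burn-in before computing summaries
print(burned.mean(), burned.var())  # should be near 0 and 1
```

The `step` parameter (a tuning choice, not part of the algorithm's definition) controls the proposal scale; tuning it badly gives either tiny moves or constant rejection, which is exactly the kind of convergence-monitoring issue the course covers.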

Givens and Hoeting (2005), Computational Statistics
Robert and Casella (2004), Monte Carlo Statistical Methods
Boyd and Vandenberghe (2004), Convex Optimization
Press et al., Numerical Recipes
Sun and Yuan (2006), Optimization Theory and Methods
Fletcher (2000), Practical Methods of Optimization
Searle (2006), Matrix Algebra Useful for Statistics
Spall (2003), Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control
Shewchuk (1994), An Introduction to the Conjugate Gradient Method Without the Agonizing Pain
Boyd et al. (2011), Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers

Neuroscience and Philosophy

Neuroscience and Philosophy (GU4500), Spring 2023

Time: Thursdays, 2:10-4:00
Place: L5-084, Jerome L Greene (Zuckerman Institute)
Professor: John Morrison, Email: [email protected], office hours  

Description: This course is about the philosophical foundations of cognitive neuroscience. Cognitive neuroscientists often describe the brain as representing and inferring. It is their way of describing the overall function of the brain’s activity, an abstraction away from detailed neural recordings. But, because there are no settled definitions of representation and inference, there are often no objective grounds for these descriptions. As a result, they are often treated as casual glosses rather than as rigorous analyses. Philosophers have proposed a number of (somewhat) rigorous definitions, but they rarely say anything about the brain. The goal of this course is to survey the philosophical literature and consider which definitions, if any, might be useful to cognitive neuroscientists. I will begin each class with a description of a different definition from the philosophical literature. We will then rely on our collective expertise to assess its potential applications to the brain. This course is designed primarily for graduate students in Neurobiology and Behavior. No prior background in philosophy will be assumed.

Format: Each lecture will begin with a summary of the books and articles listed below. But I don’t expect you to read any of them. Many of them are (unnecessarily!) dense and inaccessible. Plus, they total hundreds of pages. It’s my job to distill them into a few basic points. If you want to learn more, reach out and I will recommend particular chapters, papers, etc.

Evaluation: Graduate students in Neurobiology and Behavior should write a 15-page term paper applying one of the philosophical definitions to a particular experiment, preferably an experiment relevant to their own research. Other students should consult with the professor to identify an alternative form of evaluation. Undergraduates will be required to write at least two papers.

View Schedule

January 19th: Introduction

January 26th: Representation: Information Theories

-Artiga and Sebastián, “Informational theories of content and mental representation”
-DeWeese and Meister, “How to measure the information gained from one symbol”
-Scarantino, “Information as a probabilistic difference maker”
-Fodor, “Information and representation”
-Field, “‘Narrow’ aspects of intentionality and the information-theoretic approach to content”
-Roche and Shogenji, “Information and inaccuracy”
-Usher, “A statistical referential theory of content: Using information theory to account for

February 2nd: Representation: Causal Theories

-Buras, “An argument against causal theories of mental content”
-Dretske, Knowledge and the Flow of Information
-Dretske, Naturalizing the Mind
-Dretske, “Misrepresentation”
-Eliasmith, “A new perspective on representational problems”
-Fodor, A Theory of Content and Other Essays
-Fodor, Psychosemantics
-Rupert, “The Best Test Theory of Extension: First Principle(s)”
-Ryder, “SINBAD neurosemantics: A Theory of Mental Representation”
-Sober and Roche, “Disjunction and distality”
-Tye, Ten Problems of Consciousness
-Tye, Consciousness, Color, and Content

February 9th: No class

February 16th: Representation: Signaling Theories

-Birch, “Propositional content in signalling systems”
-Godfrey-Smith, Complexity and the Function of Mind in Nature
-Lewis, Convention
-Skyrms, Signals: Evolution, Learning, and Information
-Shea, Godfrey-Smith, Cao, “Content in simple signalling systems”
-Viera, “Representation without informative signaling”

February 23rd: Representation: Isomorphism Theories

-Clark, Sensory Qualities
-Cummins, Meaning and Mental Representation
-Cummins, Representations, Targets and Attitudes
-Ismael, The Situated Self
-Millikan, “Review of Cummins Representations, Targets and Attitudes”
-O’Brien and Opie, “Notes Toward a Structuralist Theory of Mental Representation”
-Newell, Unified Theories of Cognition
-Shagrir, “Structural Representations and the Brain”

March 2nd: Representation: Selection Theories

-Millikan, “Biosemantics”
-Millikan, “Pushmi-pullyu Representations”
-Millikan, Language, Thought and Other Biological Categories
-Millikan, White Queen Psychology and Other Essays for Alice
-Millikan, Beyond Concepts: Unicepts, Language, and Natural Information
-Millikan, “Neuroscience and Teleosemantics”
-Shea, On Millikan

March 9th: Representation: Selection Theories (Part II)

-Green, “Psychosemantics and The Rich/Thin Debate”
-Neander, A Mark of the Mental: A Defence of Informational Teleosemantics
-PPR symposium on A Mark of the Mental
-Neander, “Content for Cognitive Science”
-Nanay, “Teleosemantics Without Etiology”
-Schulte, “Perceiving the outside world”

March 23rd: Representation: Selection and Isomorphism Hybrid

-Shea, Representation in Cognitive Science
-PPR symposium on Representation in Cognitive Science

March 30th: Representation: Nihilists, Deflationists, Pragmatists, and Interpretationists

-Cao, “Putting representations to use”
-Chomsky, “Language and nature”
-Churchland, Patricia, Neurophilosophy
-Churchland, Paul, A Neurocomputational Perspective
-Dennett, The Intentional Stance
-Egan, “How to think about mental content”
-Egan, “The nature and function of content in computational models”
-Egan, “A deflationary account of mental representation”
-Horgan and Tienson, Connectionism and the Philosophy of Psychology
-Ramsey, Representation Reconsidered
-Rudder-Baker, “Instrumental intentionality”
-Sprevak, “Fictionalism about Neural Representations”
-Williams, “Representational scepticism”

April 6th: Representation: Spillover, my own view (“comparativism”), and optional presentations

April 13th and April 20th: Inference: Mapping, Mechanistic, and Teleological Theories

-Chalmers, “A Computational Foundation for the Study of Cognition” (and replies)
-Chalmers, “Does a Rock Implement Every Finite-State Automaton?”
-Chalmers, “On Implementing a Computation”
-Godfrey-Smith, “Triviality Arguments Against Functionalism”
-Maley, “Medium Independence and the Failure of the Mechanistic Account of Computation”
-Maudlin, “Computation and Consciousness”
-Putnam, Representation and Reality
-Piccinini, Physical Computation
-Rescorla, “A Theory of Computational Implementation”
-Shagrir, The Nature of Physical Computation
-Shagrir, “In Defense of the Semantic View of Computation”
-Sprevak, “Triviality arguments about computational implementation”
-Sprevak, “Computation, individuation, and the received view on representation”

April 27th: Inference: Spillover, my own view (“learning aptitude”), and optional presentations


Advanced Theory

Spring 2023

Meetings: Wednesdays, 10.00-11.30 am

Location: L4.078 (will eventually be L5.084, but check email announcements), Jerome L. Greene Science Center, Broadway, New York, NY

For the zoom link contact [email protected]

Registration: Please register via Courseworks; the course is also open to external guests.


Information theory

  • Jan 25 Information theory (Jeff Johnston) Material
  • Feb 01 Fisher information (Jeff Johnston)
  • Feb 08 Gaussian Information bottleneck (Rainer Engelken)


Learning systems

  • Feb 15 Expectation Maximization (Ji Xia)
  • Feb 22 Reinforcement learning I (Kaushik Lakshminarasimhan)
  • Mar 01 Reinforcement learning II (Kaushik Lakshminarasimhan)
  • Mar 08 No session (Cosyne)
  • Mar 15 No session (Spring break)
  • Mar 22 Feedforward architectures I (Samuel Muscinelli)
  • Mar 29 Feedforward architectures II (Samuel Muscinelli)


Neural dynamics and computations

  • Apr 05 Oscillations and Wilson-Cowan model (Patricia Cooney)
  • Apr 12 Mean-field theory and perturbative approaches (Agostina Palmigiano)
  • Apr 19 Low-rank neural networks I (Manuel Beiran)
  • Apr 26 Low-rank neural networks II (Manuel Beiran)
  • May 3 Single-neuron computations (Salomon Muller)
  • May 10 Student presentations

Advanced Theory Seminar (Spring 2022)
Advanced Theory Seminar (Summer/Fall 2020)
Advanced Theory Seminar (Spring 2020) website
Advanced Theory Seminar (Spring 2019) website

Intro to Theory

Computational Neuroscience, Fall 2022

Larry Abbott, Ken Miller, Ashok Litwin-Kumar, Stefano Fusi, Sean Escola

TAs: Elom Amematsro, Ho Yin Chau, David Clark, Zhenrui Liao

Meetings: Tuesdays & Thursdays 2:00-3:30 (JLG L5-084)

Text - Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)



September

6         (Larry) Introduction to Course and to Computational Neuroscience

8         (Larry)  Electrical Properties of Neurons, Integrate-and-Fire Model

13       (Larry) Numerical Methods, Filtering (Assignment 1)

15       (Larry) The Hodgkin-Huxley Model

20       (Larry) Types of Neuron Models and Networks (Assignment 2)    

21        Assignment 1 Due 

22       (Stefano) Adaptation, Synapses, Synaptic Plasticity

27       (Sean) Generalized Linear Models

28       Assignment 2 Due

29       (Ken) Linear Algebra I (Assignment 3)


October

4         (Ken) Linear Algebra II

6         (Ken) PCA and Dimensionality Reduction

11        (Ken) Rate Networks/E-I networks I (Assignment 4)             

12        Assignment 3 Due

13       (Ken) Rate Networks/E-I networks II

18       (Ken) Unsupervised/Hebbian Learning, Developmental Models (Assignment 5)

19        Assignment 4 Due

20       (Ashok) Introduction to Probability, Encoding, Decoding

25       (Ashok) Decoding, Fisher Information I

26        Assignment 5 Due

27        (Ashok) Decoding, Fisher Information II (Assignment 6)


November

1         (Ashok) Information Theory

3         (Ashok) Optimization I (Assignment 7)

8         Holiday

9         Assignment 6 Due

10       (Ashok) Optimization II

15       (Stefano) The Perceptron (Assignment 8)

16       Assignment 7 Due

17       (Stefano) Multilayer Perceptrons and Mixed Selectivity

22       (Stefano) Deep Learning (Assignment 9)

23       Assignment 8 Due

24       Holiday  

29      (Sean) Learning in Recurrent Networks        


December

1       (Stefano) Continual Learning and Catastrophic Forgetting

6       (Stefano) Reinforcement Learning (Assignment 10)

7        Assignment 9 Due

8        (Larry) Course Wrap up       

14      Assignment 10 Due


Introduction to Theoretical Neuroscience (Fall 2021)

Introduction to Theoretical Neuroscience (Spring 2021)

Introduction to Theoretical Neuroscience (Spring 2020)

Introduction to Theoretical Neuroscience (Spring 2019)

Introduction to Theoretical Neuroscience (Spring 2018)

Statistical analysis of neural data

Statistical analysis of neural data (GR8201), Fall 2021

This is a Ph.D.-level topics course in statistical analysis of neural data. Students from statistics, neuroscience, and engineering are all welcome to attend. A link to the last iteration of this course is here.

Time: F 10-11:30
Place: JLG L8-084
Professor: Liam Paninski; Office: Zoom. Email: [email protected]. Hours by appointment.


Prerequisite: A good working knowledge of basic statistical concepts (likelihood, Bayes' rule, Poisson processes, Markov chains, Gaussian random vectors), including especially linear-algebraic concepts related to regression and principal components analysis, is necessary. No previous experience with neural data is required.

Evaluation: Final grades will be based on class participation and a student project. Additional informal exercises will be suggested, but not required. The project can involve either the implementation and justification of a novel analysis technique, or a standard analysis applied to a novel data set. Students can work in pairs or alone (if you work in pairs, of course, the project has to be twice as impressive). See this page for some links to available datasets; or talk to other students in the class, many of whom have collected their own datasets.

Course goals: We will introduce a number of advanced statistical techniques relevant in neuroscience. Each technique will be illustrated via application to problems in neuroscience. The focus will be on the analysis of single and multiple spike train and calcium imaging data, with a few applications to analyzing intracellular voltage and dendritic imaging data. Note that this class will not focus on MRI or EEG data. A brief list of statistical concepts and corresponding neuroscience applications is below.

Page Last Updated: 01/23/2023