Courses

Advanced Theory

Special Virtual Edition, Summer/Fall 2020

Meetings: Wednesdays, 10:15 - 11:45 am

Location: Zoom, contact re2365@columbia.edu for details

Schedule:

7/15/2020     Time-dependent mean-field theory for mathematical streetfighters (Rainer Engelken)

7/22/2020     Predictive coding in balanced neural networks with noise, chaos, and delays, Article (Everyone) 

7/29/2020     A solution to the learning dilemma for recurrent networks of spiking neurons, Article (Everyone) 

8/5/2020       Canceled             

8/12/2020     Artificial neural networks for neuroscientists: A primer, Article (Robert Yang)

8/19/2020     A mechanism for generating modular activity in the cerebral cortex (Bettina Hein)

8/26/2020     Dynamic representations in networked neural systems, Article (Kaushik Lakshminarasimhan) 

9/2/2020       Network principles predict motor cortex population activity across movement speeds (Shreya Saxena)

9/9/2020       Modeling neurophysiological mechanisms of language production (Serena Di Santo)

9/16/2020     How single rate unit properties shape chaotic dynamics and signal transmission in random neural networks, Article (Samuel Muscinelli)

9/23/2020     Shaping dynamics with multiple populations in low-rank recurrent networks, Article (Laureline Logiaco)

9/30/2020     Deep Graph Pose: a semi-supervised deep graphical model for improved animal pose tracking, Article (Anqi Wu)

10/7/2020     Decoding and mixed selectivity, Article, Article, Article (Fabio Stefanini)

10/14/2020    Theory of gating in recurrent neural networks, Article, Article (Kamesh Krishnamurthy)

10/21/2020    Decentralized motion inference and registration of neuropixel data (Erdem Varol)

10/28/2020    Decision, interrupted (NaYoung So)

11/4/2020      Abstract rules implemented via neural dynamics (Kenny Kay)

11/11/2020     Canceled for Holiday

11/18/2020     Rodent paradigms for the study of volition (free will) (Cat Mitelut)

11/25/2020      Gaussian process inference (Geoff Pleiss)

12/2/2020        Neural mechanism of predicting sensory consequences of movements in the mammalian brain (Helen Hou)


Advanced Theory Seminar (Spring 2020) website
Advanced Theory Seminar (Spring 2019) website

Intro to Theory

List of recorded classes

Download Schedule

Spring 2020

Faculty: Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

TAs: Matteo Alleman, Dan Biderman, Salomon Muller, Amin Nejatbakhsh, Marjorie Xie

Meetings: Tuesdays & Thursdays, JLGSC L5-084, Lecture 2:00 - 3:30 pm

Text: Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

January
21      (Larry) Introduction to the Course and to Theoretical Neuroscience
23      (Larry) Mathematics Review: Notes
28      (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model (Assignment 1, neuron models)
30      (Larry) Adaptation, Synapses, Spiking Networks (Numerical methods)

February
4       (Larry) Numerical Methods, Filtering (Assignment 2)
5       Assignment 1 Due
6       (Larry) The Hodgkin-Huxley Model (I&F Model, White Noise, Synapses-Networks)
11      (Larry) Types of Neuron Models and Networks (Assignment 3, Poisson Spiking, Networks)
12      Assignment 2 Due
13      (Ashok) Linear Algebra I (Notes)
18      (Ashok) Linear Algebra II (Notes, Assignment 4, Solutions)
19      Assignment 3 Due
20      (Ashok) Introduction to Probability, Encoding, Decoding (Notes)
25      (Ashok) GLMs (Notes, Assignment 5)
26      Assignment 4 Due
27      COSYNE

March
3       COSYNE
5       (Ashok) Decoding, Fisher Information I (Notes)
10     (Ashok) Canceled
12     (Ashok) Information Theory (Notes, Assignment 6, google-1000-english.txt, Recitation Notes)
14     Assignment 5 Due
17     Spring Break
19     Spring Break
24     (Ken) Canceled: PCA and Dimensionality Reduction I
26     (Ken) PCA and Dimensionality Reduction II (Notes)
27     Assignment 6 Due
31     (Ken) Rate Networks/E-I networks I (Notes, Assignment 7, Codes)

April
2       (Ken) Rate Networks/E-I networks II (Notes)
7       (Ken) Unsupervised/Hebbian Learning, Developmental Models (Notes, Assignment 8, Ring-Model)
8       Assignment 7 Due
9       (Ken) Optimization (Notes)
14     (Ken) Optimization II (Notes, Assignment 9, Assignment 9 Addendum, Code, Gaussian Problem, Note on Lyapunov functions, Review of simple developmental models, News & Views on feature-map approach, MacKay and Miller, 1990, Miller and MacKay, 1994)
15     Assignment 8 Due
16     (Ashok) Optimization (Notes)
21     (Stefano) Perceptron (Notes, Assignment 10)
23     (Stefano) Multilayer Perceptrons and Mixed Selectivity (Notes)
28     (Stefano) Deep Learning I (backpropagation) (Assignment 11, Notes, Codes - please note more codes are available on CourseWorks)
30     (Stefano) Deep Learning II (convolutional networks) (Notes, Visualizing and Understanding Convolutional Networks, YaminsDiCarlo)

May
1       Assignments 9 & 10 Due
5       (Stefano) Learning in Recurrent Networks (Notes)
7       (Stefano) Continual Learning and Catastrophic Forgetting (Notes)
12     (Stefano) Reinforcement Learning (Notes)
14     Research Topic
15    Assignment 11 Due


Introduction to Theoretical Neuroscience (Spring 2019)

Introduction to Theoretical Neuroscience (Spring 2018)

Mathematical Tools

Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)

Download schedule

Spring 2020

Faculty: Ken Miller (kdm2103@columbia.edu)
Instructor: Dan Tyulmankov (dt2586@columbia.edu)
Teaching Assistant: Amin Nejatbakhsh (mn2822@columbia.edu)

Time: Tuesdays & Thursdays, 8:40 - 10:00 am
Place: JLGSC L5-084
Webpage: https://ctn.zuckermaninstitute.columbia.edu/courses
Credits: 3 credits, pass/fail only

Description: An introduction to the mathematical concepts used in theoretical neuroscience, intended to provide the minimal background required for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who want to rapidly acquire the vocabulary and basic mathematical skills needed to study theoretical neuroscience, or who wish to gain deeper exposure to mathematical concepts than NBHV G4360 offers. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

Prerequisites: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

Grading: 100% attendance-based

Readings and exercises: Lecture notes and optional practice exercises will be provided for each lecture, and supplementary readings will be assigned from various textbooks.

Page Last Updated: 11/17/2020