Courses

Advanced Theory

Advanced Theory Seminar
Spring 2020


See the course website for the updated schedule.

Organized by Rainer Engelken

Meetings: Wednesdays, 10:15 - 11:45am


Schedule:
Control Theory and Reinforcement Learning


1/29/2020    Examples of control theory in neuroscience (Bettina Hein, Laureline Logiaco) 

2/5/2020     Methods in control theory (Bettina Hein, Laureline Logiaco)

2/12/2020    Reinforcement learning (James Murray) 

2/19/2020    From optimal control theory to reinforcement learning (Samuel Muscinelli)

2/26/2020    COSYNE Break

3/4/2020      COSYNE Break

Latent Variable Models

3/11/2020    Hidden Markov Models (Matt Whiteway)

3/18/2020    Latent Dynamical Systems (Josh Glaser)

3/25/2020    Gaussian Processes (Rainer Engelken)

4/1/2020     Modeling of Behavior (Juri Minxha)

4/8/2020     Hackathon on Latent Variable Models

Miscellaneous

4/15/2020    Phenomenological Renormalization Group (Serena Di Santo)

4/22/2020    Additional and diverse perspectives in neuroscience (Laureline Logiaco, Matt Whiteway, Juri Minxha)

4/29/2020    Introduction to replica theory (SueYeon Chung)

5/6/2020     Classification and Geometry of General Perceptual Manifolds (SueYeon Chung)

 

Advanced Theory Seminar (Spring 2020) website
Advanced Theory Seminar (Spring 2019) website

Intro to Theory

List of recorded classes

Download Schedule

Spring 2020

Faculty: Larry Abbott, Stefano Fusi, Ashok Litwin-Kumar, Ken Miller

TAs: Matteo Alleman, Dan Biderman, Salomon Muller, Amin Nejatbakhsh, Marjorie Xie

Meetings: Tuesdays & Thursdays, JLGSC L5-084, Lecture 2:00 - 3:30pm

Text: Theoretical Neuroscience by P. Dayan and L.F. Abbott (MIT Press)

January
21      (Larry) Introduction to the Course and to Theoretical Neuroscience
23      (Larry) Mathematics Review: Notes
28      (Larry) Electrical Properties of Neurons, Integrate-and-Fire Model (Assignment 1, Neuron Models)
30      (Larry) Adaptation, Synapses, Spiking Networks (Numerical methods)

February
4       (Larry) Numerical Methods, Filtering (Assignment 2)
5       Assignment 1 Due
6       (Larry) The Hodgkin-Huxley Model (I&F Model, White Noise, Synapses-Networks)
11      (Larry) Types of Neuron Models and Networks (Assignment 3, Poisson Spiking, Networks)
12      Assignment 2 Due
13      (Ashok) Linear Algebra I (Notes)
18      (Ashok) Linear Algebra II (Notes, Assignment 4, Solutions)
19      Assignment 3 Due
20      (Ashok) Introduction to Probability, Encoding, Decoding (Notes)
25      (Ashok) GLMs (Notes, Assignment 5)
26      Assignment 4 Due
27      COSYNE

March
3       COSYNE
5       (Ashok) Decoding, Fisher Information I (Notes)
10     (Ashok) Canceled
12     (Ashok) Information Theory (Notes, Assignment 6, google-1000-english.txt, Recitation Notes)
14     Assignment 5 Due
17     Spring Break
19     Spring Break
24     (Ken) Canceled: PCA and Dimensionality Reduction I
26     (Ken) PCA and Dimensionality Reduction II (Notes)
27     Assignment 6 Due
31     (Ken) Rate Networks/E-I Networks I (Notes, Assignment 7, Codes)

April
2       (Ken) Rate Networks/E-I Networks II (Notes)
7       (Ken) Unsupervised/Hebbian Learning, Developmental Models (Notes, Assignment 8, Ring-Model)
8       Assignment 7 Due
9       (Ken) Optimization (Notes)
14     (Ken) Optimization II (Notes, Assignment 9, Assignment 9 Addendum, Code, Gaussian Problem, Note on Lyapunov functions, Review of simple developmental models, News & Views on feature-map approach, MacKay and Miller, 1990, Miller and MacKay, 1994)
15     Assignment 8 Due
16     (Ashok) Optimization (Notes)
21     (Stefano) Perceptron (Notes, Assignment 10)
23     (Stefano) Multilayer Perceptrons and Mixed Selectivity (Notes)
28     (Stefano) Deep Learning I (backpropagation) (Assignment 11, Notes, Codes; additional code is available on CourseWorks)
30     (Stefano) Deep Learning II (convolutional networks) (Notes, Visualizing and Understanding Convolutional Networks, YaminsDiCarlo)

May
1       Assignments 9 & 10 Due
5       (Stefano) Learning in Recurrent Networks (Notes)
7       (Stefano) Continual Learning and Catastrophic Forgetting (Notes)
12     (Stefano) Reinforcement Learning (Notes)
14     Research Topic
15     Assignment 11 Due

 

Introduction to Theoretical Neuroscience (Spring 2019)

Introduction to Theoretical Neuroscience (Spring 2018)

Mathematical Tools

Mathematical Tools for Theoretical Neuroscience (NBHV GU4359)

Download schedule

Spring 2020

Faculty: Ken Miller (kdm2103@columbia.edu)
Instructor: Dan Tyulmankov (dt2586@columbia.edu)
Teaching Assistant: Amin Nejatbakhsh (mn2822@columbia.edu)

Time: Tuesdays & Thursdays, 8:40 - 10:00am
Place: JLGSC L5-084
Webpage: https://ctn.zuckermaninstitute.columbia.edu/courses
Credits: 3, pass/fail only

Description: An introduction to the mathematical concepts used in theoretical neuroscience, intended to provide the minimal background required for NBHV G4360, Introduction to Theoretical Neuroscience. The target audience is students with limited mathematical background who want to rapidly acquire the vocabulary and basic mathematical skills needed to study theoretical neuroscience, or who wish to gain deeper exposure to these mathematical concepts than NBHV G4360 offers. Topics include single- and multivariable calculus, linear algebra, differential equations, dynamical systems, and probability. Examples and applications are drawn primarily from theoretical and computational neuroscience.

Prerequisites: Basic prior exposure to trigonometry, calculus, and vector/matrix operations at the high school level

Grading: 100% attendance-based

Readings and exercises: Lecture notes and optional practice exercises will be provided for each lecture, and supplementary readings will be assigned from various textbooks.

Page Last Updated: 7/16/2020