Cengiz and Jacob are teaching APMTH 226: Theory of Neural Computation again this fall.

Description: This course is an introduction to the theory of computation with biological and artificial neural networks. We will cover selected topics from theoretical neuroscience and deep learning theory, with an emphasis on topics at the research frontier. These include expressivity and generalization in deep learning models; the infinite-width limit of neural networks and kernel machines; deep learning dynamics; biologically plausible training of neural networks and models of synaptic plasticity; reinforcement learning in the brain; neural population codes; normative theories of sensory representations; computing with dynamics in recurrent neural networks; and attractor network models of memory and spatial maps. Concrete applications of these ideas to the brain will be discussed.

https://canvas.harvard.edu/courses/107176