The Harvard Machine Learning Foundations group invites applications for postdoctoral fellowships in a multidisciplinary study of the underpinnings of deep learning. Topics include representation learning, transfer learning, generalization, applications to scientific computing, equivariance, and connections between artificial neural networks and natural learning systems such as human and animal brains. We are looking for exceptional junior scientists to work collaboratively. Potential advisors include Demba Ba, Boaz Barak, Lucas Janson, Sham Kakade, and Cengiz Pehlevan. Candidates...
Cengiz and Jacob are teaching APMTH 226: Theory of Neural Computation again this fall.
Description: This course is an introduction to the theory of computation with biological and artificial neural networks. We will cover selected topics from theoretical neuroscience and deep learning theory, with an emphasis on topics at the research frontier. These topics include expressivity and generalization in deep learning models; the infinite-width limit of neural networks and kernel machines; deep learning dynamics; biologically plausible training of neural networks and models of...
Abdul passed his thesis defense with flying colors and became the first PhD graduate of our group! Congratulations, Dr. Canatar! Best of luck in your future endeavors!