CS590NN/690NN: Neural Networks in AI and Neuroscience (Spring 2026)
This is a project-based course focusing on the connection between neuroscience and AI. The brain is a robust, highly efficient adaptive controller that seamlessly combines electrical spikes with analog chemical signaling. It is capable of contextual perception, prediction, and generation, with activity spanning multiple scales and levels. Many of the large leaps in AI have come from understanding human cognition and animal behavior. The course provides essential background for research at the intersection of AI and neuroscience. We will discuss learning architectures beyond the well-known deep networks, such as Kohonen self-organizing networks, Hopfield memory networks, and networks whose activity changes with context; cover advanced ideas for improving AI, including lifelong learning and low-energy analog computing; and introduce neuroscientific findings that may shape future AI, such as neuromodulation, dendritic trees, diversity of components, sequence generation, and multi-objectivity. While the course includes some homework, a major part of the grade will come from active participation, a project, and a presentation.
The course requires successful completion of prior coursework in AI and machine learning (311, 389, or 589) or permission of the instructor. Delivery method: students may attend class in person or via Zoom; both are allowed. Presentations will be in person.
3 Credits
Class Hours: Mondays, 4:00–6:30 PM, LGRC A104A
Schedule

| Date | Topics |
| --- | --- |
| 1. Single Neuron Models | |
| February 2 | McCulloch & Pitts, Hubel & Wiesel, Learning and Adaptation, Perceptron |
| 2. Hierarchical Structures | |
| February 9 | Sensory Hierarchies (e.g. visual), Backpropagation Learning |
| February 16 | Holiday – Presidents’ Day |
| February 19 | Forward Propagation Learning, Generalized Hierarchies for Abstraction and Generalization |
| February 23 | Generalization Predictions in AI, Generating Neural Structures |
| 3. Recurrent and Lifelong Learning | |
| March 2 | Turing Capabilities and Super Turing Adaptivity |
| March 9 | Time Series Prediction |
| March 16 | Holiday – Spring Break |
| March 23a | Lifelong Learning, Adaptive Architectures, Neuromodulations for Continual Adaptivity |
| 4. Memory in Biology and Engineering | |
| March 23b | Stable Memories and Dynamical Systems, Hopfield Memories, Generalized Hopfield for Large Capacity |
| March 30 | Dynamic Energy Landscape and Sequence-Memories, Lifelong Updates of Memories, and Biophysics |
| 5. Unlearning | |
| April 6 | Problem Learning and Hardship of Unlearning, Reconsolidation and Relaxing Fears |
| April 13a | Unlearning in AI and Data Science |
| April 13b | Projects: Concepts and Presentations |
| April 20 | Holiday – Patriots’ Day |
| 6. Low-energy Adaptive Reinforcement | |
| April 24 | Reinforcement Learning (RL), Low-energy RL with Multi-Step-Size, Fast RL on Sequences, RL with Adaptive Goals |
| 7. Advanced Generators | |
| April 27 | Attentions, Sequences, State-space Modeling |
| May 4 | Diffusion Models and Brain’s Chemicals, Traveling Waves in Biology |
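As an illustrative taste of the first class (this is a sketch by the editor, not course-provided code), the classic perceptron learning rule from the February 2 session can be written in a few lines. The toy AND dataset and the learning-rate/epoch settings below are illustrative choices, not part of the syllabus:

```python
import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=20):
    """Classic perceptron rule on labels in {-1, +1}: update only on mistakes."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Misclassified (or on the boundary): w <- w + lr * y * x
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy problem: logical AND with {-1, +1} labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the mistake-driven updates eventually classify all four points correctly.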
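Similarly, as a preview of section 4 (again an editor's sketch under assumed conventions, not course material), a minimal Hopfield memory can be built with Hebbian storage and sign-threshold recall. The pattern size, number of flipped bits, and synchronous-update scheme are illustrative assumptions:

```python
import numpy as np

def hopfield_store(patterns):
    """Hebbian storage: sum of outer products of +/-1 patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=10):
    """Synchronous sign updates until a fixed point (or step budget) is reached."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one random +/-1 pattern and recover it from a corrupted probe
rng = np.random.default_rng(0)
pattern = np.where(rng.random(50) > 0.5, 1, -1)
W = hopfield_store(pattern[None, :])
probe = pattern.copy()
probe[:5] *= -1  # flip 5 of the 50 bits
recovered = hopfield_recall(W, probe)
```

With a single stored pattern and 10% corruption, the probe lies well inside the pattern's basin of attraction, so recall converges back to the stored memory in one update.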
There is no required textbook. However, the following recommended books cover the first few classes:
- “Neural Networks: A Comprehensive Foundation”, Simon Haykin
- “Introduction to the Theory of Neural Computation”, Hertz, Krogh, and Palmer
- (Fun historic read) “Parallel Distributed Processing”, Rumelhart and McClelland