Title: Physics of Learning and Computation in Natural and Artificial Neural Networks
Abstract: Neuroscience is experiencing accelerating advances in the scale and resolution of neural activity recordings from animals engaged in natural behaviors. At the same time, in machine learning, large-scale artificial neural network models are holding natural conversations and generating realistic videos. In this talk, I will show how we can leverage these technological advances to derive fundamental principles of neural learning and computation by integrating scientific methods from physics, neuroscience, and machine learning. This understanding, in turn, allows us to invent new algorithms that make AI models more reliable and efficient. I present three key contributions: (i) a new framework for studying compositional generalization in multimodal generative models, (ii) a generalization of Noether’s theorem from physics that explains how symmetry constrains the geometry of neural learning dynamics, and (iii) an interpretable-AI approach to decoding the neural code underlying retinal responses to natural scenes.