We investigate open problems in computational neuroscience, in particular synaptic plasticity, memory consolidation, continual learning, representational geometry, and efficient coding, using tools from theoretical physics, applied mathematics, machine learning, and data science. We analyze recordings of neural activity, develop targeted data-analysis methods, and build theoretical frameworks to shed light on the mechanisms of biological computation.
The central goal of our research is to understand learning and memory under resource constraints in the biological brain, which performs highly non-trivial computations with noisy and severely limited components such as neurons and synapses. By constructing normative mathematical models of brain circuits that can be tested against experimental data, we aim to clarify how these limitations are overcome and to uncover the underlying computational principles.