I am a PhD student in Applied Mathematics at Harvard, where I am supervised by Professor Cengiz Pehlevan. My work focuses on the mathematics of machine learning.
During my PhD, I’m interested in describing neural computation in the context of structured data and tasks, using tools from statistical physics and geometry. My current projects focus on the exact analysis of in-context learning via random matrix theory, and on the effects of low-dimensional structure on learning dynamics. My past research spans a range of applied mathematics areas, including early-universe cosmology, group theory, and ML4Physics.
I’m always happy to discuss my work and related topics; please reach out!
PhD in Applied Mathematics, 2023 - Present
Harvard University, School of Engineering and Applied Sciences
MSc in Theoretical Physics, 2022 - 2023
Perimeter Institute for Theoretical Physics
BA in Mathematics, 2018 - 2022
University of Cambridge, St John’s College
[Dec 2024] Presented a poster (corresponding paper) at the NeurIPS M3L workshop.
[Summer 2024] Gave a talk on in-context learning at the Kempner Institute, and presented posters at the DIMACS Modelling Randomness workshop and the Princeton ML Theory Summer School.
[July 2023] Started a research internship at Mila with Prof. Siamak Ravanbakhsh, working on equivariance and neural operators.
[June 2023] Defended my master’s thesis (here’s the working draft).