Research

I am currently working on modeling nonlinear dynamical systems using Koopman theory, and on continual learning (memory) for LLMs.

Other topics I have prior experience with and remain interested in studying:

  • probabilistic (Bayesian) inference and sampling, such as Markov chain Monte Carlo and sequential Monte Carlo methods
  • recurrent alternatives to transformers, such as state space models and linear attention variants
  • LLM memory, adaptation, test-time training, and reasoning
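
As a concrete example of the sampling methods in the first bullet, here is a minimal random-walk Metropolis sketch (a simple Markov chain Monte Carlo algorithm); the function name and parameters are illustrative, not from any specific project:

```python
import math
import random

def metropolis_hastings(log_prob, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + Gaussian noise and
    accept with probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_size)
        # Accept or reject using the log-probability ratio.
        if math.log(rng.random()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return samples

# Target: a standard normal, specified by its unnormalized log density.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

The chain's empirical mean and variance should approach 0 and 1, the moments of the standard normal target.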

I was fortunate to work on a variety of research topics during my Bachelor's degree at the University of Toronto. I worked on probabilistic inference for language model alignment with Prof. Roger Grosse, causal inference using normalizing flows with Prof. Rahul Krishnan, and optimal scaling for Markov chain Monte Carlo with Prof. Jeffrey Rosenthal. I also spent a summer at EPFL in Switzerland, where I worked on state space models for graphs with Prof. Volkan Cevher.