$ cat bio.md
I’m Matt, a final-year PhD candidate at Princeton University working with Prof. Peter Melchior in the Dynamical Learning Lab. My research focuses on stochastic optimization, representation learning, and dynamical systems, with the goal of using principles from physics to build more capable and trainable models.
Prior to Princeton, I developed large-scale simulations of complex physical systems at the Australian National University.
> I am on the job market for Fall 2027.
Long-term vision
To design architectures and optimization methods that enable models to develop a deep understanding of complex dynamical systems, supporting new scientific discoveries and laying the foundations for increasingly general, physics-inspired intelligence.
$ tail -n 3 news.log
| Oct 01, 2025 | Excited to share our latest paper “Dynamics of learning: Generating schedules from Latent ODEs”; see my blog post |
| Sep 23, 2025 | Congratulations to Columbia undergraduate Angelina Yan on the acceptance of her NeurIPS workshop paper “A novel approach to classification of ECG arrhythmia types with latent ODEs” |
| May 13, 2025 | Lucky to be one of four students from Princeton University nominated for the 2025 Google PhD Fellowship! This is the first nomination from the astrophysical sciences department at Princeton |
$ grep "Wiemann" papers.bib | head -4
* previously published as Matt L. Sampson
Path-minimizing latent ODEs for improved extrapolation and inference, Machine Learning: Science and Technology, Jun 2025
Score-matching neural networks for improved multi-band source separation, Astronomy and Computing, Oct 2024
Spotting Hallucinations in Inverse Problems with Data-Driven Priors, ICML ML4Astrophysics Workshop (Oral), Jul 2023