Inspired by lots of great discussions at this year's ICML (International Conference on Machine Learning) on Koopman/transfer operators. For everyone interested in this line of work, here is a short summary. All of it is joint work with amazing people, mainly from the Istituto Italiano di Tecnologia and École Polytechnique, to whom I am super grateful, most of all for the joy, enthusiasm, and passion in building this project together 😃
We started at #NeurIPS2022 by formalizing then-popular algorithms like Dynamic Mode Decomposition (in its many flavors: extended, projected, kernel, etc.) as a machine learning problem of vector-valued regression, and in the process developed a new RKHS estimator motivated by the need to properly estimate the spectral decomposition. ✍🏼
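To give a feel for this viewpoint, here is a minimal sketch of plain DMD posed as a least-squares (vector-valued) regression on toy linear data — everything here is illustrative, not the RKHS estimator from the paper:

```python
import numpy as np

# Toy linear system x_{t+1} = A_true @ x_t (A_true is illustrative).
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
X = np.empty((30, 2))
X[0] = rng.normal(size=2)
for t in range(29):
    X[t + 1] = A_true @ X[t]

# DMD as regression: minimize ||targets - inputs @ A.T||_F, where the
# inputs are states x_t and the targets are the next states x_{t+1}.
inputs, targets = X[:-1], X[1:]
A_hat, *_ = np.linalg.lstsq(inputs, targets, rcond=None)
A_hat = A_hat.T  # so that x_{t+1} ≈ A_hat @ x_t

# The eigenvalues of A_hat approximate the spectrum driving the dynamics.
eigvals = np.sort(np.linalg.eigvals(A_hat).real)
```

The spectral decomposition of the fitted operator is exactly what the estimator needs to get right, which is what motivated going beyond plain least squares.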
This led to a #NeurIPS2023 paper on minimax-optimal bounds for this reduced rank regression (RRR) estimator and the first finite-sample bounds for learning the leading eigenpairs of normal Koopman operators with kernel ridge, PCR, and RRR estimators. 🌌
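For context, here is a minimal sketch of vanilla (linear) reduced rank regression — the rank constraint is the key idea; the kernel version and the learning bounds are what the paper is about, and all names below are illustrative:

```python
import numpy as np

def reduced_rank_regression(X, Y, r):
    """Rank-r minimizer of ||Y - X @ B||_F (classic linear RRR)."""
    # Start from the ordinary least-squares solution.
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # Project the fitted values onto their top-r right singular directions.
    Y_hat = X @ B_ols
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:r].T @ Vt[:r]  # rank-r orthogonal projector
    return B_ols @ P

# Toy problem with a rank-1 ground truth (illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
B_true = rng.normal(size=(5, 1)) @ rng.normal(size=(1, 4))
Y = X @ B_true + 0.01 * rng.normal(size=(200, 4))

B_r = reduced_rank_regression(X, Y, r=1)
```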
In parallel, at the same #NeurIPS2023, we also introduced the Nyström version of RRR and proved that the same learning bounds are preserved while substantially improving sample and compute complexity. 📉
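To see where the compute savings come from, here is the Nyström idea in isolation (an illustrative sketch only — the paper's estimator combines it with RRR): approximate a large kernel matrix from a small set of m landmark points, replacing O(n²) kernel evaluations with O(nm):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.05):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 3))

# Pick m = 50 landmark points uniformly at random.
landmarks = X[rng.choice(500, size=50, replace=False)]

K_nm = rbf_kernel(X, landmarks)          # n x m
K_mm = rbf_kernel(landmarks, landmarks)  # m x m

# Nyström approximation: K ≈ K_nm @ pinv(K_mm) @ K_nm.T
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

K_full = rbf_kernel(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

When the kernel spectrum decays quickly, a handful of landmarks already gives a small relative error, which is what makes the sample/compute trade-off so favorable.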
Building on the lessons from this statistical learning theory, at #ICLR2024 we introduced a novel loss for deep learning that matches the generalization quality of kernel methods by learning finite-dimensional representations. 🕸️
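The general pattern, sketched below with a fixed toy dictionary standing in for a learned network (the paper's actual loss is not reproduced here), is: map the state into a finite-dimensional representation φ, then fit the operator linearly on the features:

```python
import numpy as np

def phi(x):
    """Toy feature map [x, x^2] — a stand-in for a learned representation."""
    return np.array([x, x ** 2])

# Scalar toy dynamics x_{t+1} = 0.9 * x_t: on these features the dynamics
# are exactly linear, since x^2 evolves with factor 0.9^2 = 0.81.
xs = 0.9 ** np.arange(20) * 1.0
feats = np.stack([phi(x) for x in xs])  # shape (20, 2)

# Fit the operator on feature space by least squares.
A_feat, *_ = np.linalg.lstsq(feats[:-1], feats[1:], rcond=None)
A_feat = A_feat.T
```

The whole game is choosing φ so that the dynamics really do become (approximately) linear on the features — which is what the learned loss optimizes for.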
While doing so, with our friends in robotics at #L4DC2024, we developed Koopman models that encode the morphological symmetries of rigid bodies in motion and showed the beautiful interplay of isotypic and spectral decompositions. 🦾
And so we arrived at this #ICML2024, showing how transfer-operator-based ML methods can achieve high-probability bounds for forecasting the distributions of geometrically ergodic processes over arbitrary forecasting horizons. ♾️
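As a cartoon of multi-horizon forecasting with an operator model (an illustrative linear toy, not the paper's distribution-forecasting setting): once a finite-dimensional operator A is fitted, the h-step forecast is just its h-th power applied to the current state:

```python
import numpy as np

# Toy linear dynamics (illustrative).
A_true = np.array([[0.7, 0.2],
                   [0.1, 0.6]])
rng = np.random.default_rng(3)
X = np.empty((20, 2))
X[0] = rng.normal(size=2)
for t in range(19):
    X[t + 1] = A_true @ X[t]

# Fit a one-step operator by least squares.
A_hat, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = A_hat.T

def forecast(x0, h):
    """h-step-ahead forecast via the h-th power of the learned operator."""
    return np.linalg.matrix_power(A_hat, h) @ x0

x10_pred = forecast(X[0], 10)
```

The hard part, of course, is controlling how one-step estimation errors compound under the matrix power — which is exactly what the high-probability bounds address.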
…and this is just the start 🙈 since we have just posted new papers on #arXiv on learning Itô diffusion processes, learning Langevin dynamics from biased data to speed up molecular simulations, and using transfer operators to learn conditional densities (and hence conditional quantiles and more UQ fun stuff), with a few more soon to come.
😃
Last but not least, we are also actively developing code implementing all of this in the #Kooplearn project. 💻
It’s been and continues to be a very joyful ride 🎢😃
If you're at ICML now and interested in knowing more, feel free to message me.