Two seminars on Tuesday 22nd of July

On 22 July 2025 there will be two consecutive talks in VR2-158.

Talk 1 - 10:00-11:00 by Boumediene Hamzi

Title: Bridging Machine Learning, Dynamical Systems, and Algorithmic Information Theory: Insights from Sparse Kernel Flows, Poincaré Normal Forms and PDE Simplification

Abstract: In this talk, we explore how Machine Learning (ML) and Algorithmic Information Theory (AIT), though emerging from different traditions, can mutually inform one another in the following directions:

- AIT for Kernel Methods: We investigate how AIT concepts inspire the design of kernels that integrate principles such as Kolmogorov complexity and Normalized Compression Distance (NCD). We propose a novel clustering method based on the Minimum Description Length (MDL) principle, implemented via K-means and Kernel Mean Embedding (KME). Additionally, we employ the Loss Rank Principle (LoRP) to learn optimal kernel parameters for Kernel Density Estimation (KDE), extending AIT-inspired techniques to flexible, nonparametric models.

- Kernel Methods for AIT: We also demonstrate how kernel methods can approximate AIT measures such as NCD and Algorithmic Mutual Information (AMI), offering new tools for compression-based analysis. In particular, we show that the Hilbert-Schmidt Independence Criterion (HSIC) can be interpreted as an approximation to AMI, providing a robust theoretical basis for clustering and dependence measurement. Finally, we illustrate how techniques from ML and Dynamical Systems (DS)—including Sparse Kernel Flows, Poincaré Normal Forms, and PDE Simplification—can be reformulated through the lens of AIT.
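To make the compression-based quantities above concrete, here is a minimal sketch of the Normalized Compression Distance, using zlib as a stand-in compressor (the papers do not prescribe this particular compressor or the exponential kernel shown in the comment; both are illustrative assumptions):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance with zlib as the compressor.

    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(s) is the compressed length of s, a practical proxy
    for the (uncomputable) Kolmogorov complexity K(s).
    """
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# A compression-based similarity kernel could then be defined,
# e.g. k(x, y) = exp(-ncd(x, y)) (an illustrative choice).
a = b"the quick brown fox jumps over the lazy dog" * 10
b = b"the quick brown fox jumps over the lazy dog" * 9 + b"a lazy cat"
c = bytes(range(256)) * 2
print(ncd(a, b))  # near-duplicate strings: small distance
print(ncd(a, c))  # unrelated strings: larger distance
```

Near-duplicate inputs compress well together, so their NCD is close to 0; unrelated inputs share little structure, pushing the distance toward 1.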

Our results suggest that kernel methods are not just flexible tools in ML—they can serve as conceptual bridges across AIT, ML, and DS, leading to more unified and interpretable approaches to unsupervised learning, the analysis of dynamical systems, and model discovery.
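The HSIC-as-AMI-approximation claim above can be illustrated with the standard biased empirical HSIC estimator, HSIC = tr(KHLH)/n². This is a generic sketch, not the authors' implementation; the Gaussian kernel, bandwidth, and test data are all illustrative assumptions:

```python
import numpy as np

def hsic(X: np.ndarray, Y: np.ndarray, sigma: float = 1.0) -> float:
    """Biased empirical HSIC with Gaussian kernels: tr(K H L H) / n^2."""
    n = X.shape[0]

    def gram(Z):
        sq = np.sum(Z**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T  # squared distances
        return np.exp(-d2 / (2.0 * sigma**2))

    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / n**2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dep = np.sin(3 * x) + 0.1 * rng.normal(size=(200, 1))  # depends on x
y_ind = rng.normal(size=(200, 1))                        # independent of x
print(hsic(x, y_dep), hsic(x, y_ind))  # dependent pair scores higher
```

A dependent pair yields a clearly larger HSIC value than an independent one, which is the behaviour one would expect from any mutual-information-like quantity.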

This work is based on the following papers:

1.    Boumediene Hamzi, Marcus Hutter, Houman Owhadi, Bridging Algorithmic Information Theory and Machine Learning: Clustering, Density Estimation, Kolmogorov Complexity-Based Kernels, and Kernel Learning in Unsupervised Learning.
2.    Boumediene Hamzi, Marcus Hutter, Houman Owhadi, Bridging Algorithmic Information Theory and Machine Learning: A New Approach to Kernel Learning.
3.    Jonghyeon Lee, Boumediene Hamzi, Yannis Kevrekidis, Houman Owhadi, Gaussian Processes Simplify Differential Equations.
4.    Lu Yang, Xiuwen Sun, Boumediene Hamzi, Houman Owhadi, Naiming Xie, Learning Dynamical Systems from Data: A Simple Cross-Validation Perspective, Part V: Sparse Kernel Flows for 132 Chaotic Dynamical Systems.


Talk 2 - 11:00-12:00 by Jonghyeon Lee

Title: Kernels Simplify Differential Equations

Abstract: Many nonlinear ordinary and partial differential equations are difficult or time-consuming to solve and analyse. It is unsurprising that transforming them into equations with 'simpler' behaviour is an active field of research; this includes mapping them to linear differential equations, either locally or globally, or approximating the solution with a relatively small number of basis functions that capture the essential elements of the behaviour. Kernel methods have considerable value in learning such transformations because their time complexity is typically linear in the number of collocation points and they have strong theoretical convergence results. In the first part of the talk, we introduce the idea of generalized kernel regression to learn the Cole-Hopf transformation, which maps the nonlinear Burgers equation to the linear heat equation, and a Poincaré normal form of the Hopf bifurcation of the Brusselator. We then move on to discussing the application of kernels to recovering the eigenfunctions of the Koopman operator, which lifts a nonlinear ODE to a linear dynamical system in infinite dimensions, with applications including Lyapunov functions and quasi-potential functions of stochastic systems. Finally, we conclude by proposing a new kernelized reduced-order model (KROM), which uses an empirical kernel matrix to solve nonlinear PDEs quickly.
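The Cole-Hopf transformation mentioned in the abstract can be verified symbolically: if φ solves the heat equation φ_t = ν φ_xx, then u = -2ν φ_x/φ solves the viscous Burgers equation u_t + u u_x = ν u_xx. The particular φ below is just one illustrative heat-equation solution, not taken from the talk:

```python
import sympy as sp

x, t = sp.symbols('x t')
nu, a = sp.symbols('nu a', positive=True)

# An illustrative solution of the heat equation phi_t = nu * phi_xx.
phi = 1 + sp.exp(a * x + a**2 * nu * t)
heat_residual = sp.diff(phi, t) - nu * sp.diff(phi, x, 2)
assert sp.simplify(heat_residual) == 0

# The Cole-Hopf transform u = -2*nu*phi_x/phi maps it to a Burgers solution:
u = -2 * nu * sp.diff(phi, x) / phi
burgers_residual = sp.diff(u, t) + u * sp.diff(u, x) - nu * sp.diff(u, x, 2)
print(sp.simplify(burgers_residual))  # the residual vanishes identically
```

The point of the talk is that such transformations need not be known in closed form: generalized kernel regression can learn them from data, with this symbolic identity serving as the ground truth in the Burgers case.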