AI4Science

Aurora -- A Foundation Model of the Atmosphere

Aurora leverages the strengths of the foundation modelling approach to produce operational forecasts for a wide variety of atmospheric prediction problems, including those with limited training data, heterogeneous variables, and extreme events.

Universal Physics Transformers -- A Framework For Efficiently Scaling Neural Operators

We introduce Universal Physics Transformers (UPTs), an efficient and unified learning paradigm for a wide range of spatio-temporal problems. UPTs operate without grid- or particle-based latent structures, enabling flexibility and scalability across meshes and particles.
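To make the grid- and particle-free idea concrete, here is a minimal sketch of an encode-process-decode operator that compresses an arbitrary point set into a fixed number of latent tokens and decodes at arbitrary query positions via cross-attention. All module names and sizes are illustrative assumptions, not the UPT reference implementation.

# Minimal sketch of a UPT-style encode-process-decode operator
# (hypothetical sizes and names; not the official UPT code).
import torch
import torch.nn as nn

class PointSetOperator(nn.Module):
    def __init__(self, dim=64, n_latent=16, n_heads=4):
        super().__init__()
        # Fixed number of learned latent tokens, independent of mesh/particle count.
        self.latent = nn.Parameter(torch.randn(n_latent, dim))
        self.embed = nn.Linear(3 + 1, dim)            # (x, y, z, field value) -> token
        self.encode = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.process = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, n_heads, dim_feedforward=4 * dim,
                                       batch_first=True),
            num_layers=2,
        )
        self.query_embed = nn.Linear(3, dim)          # output positions -> queries
        self.decode = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.head = nn.Linear(dim, 1)

    def forward(self, pos, val, query_pos):
        # pos: (B, N, 3) input positions, val: (B, N, 1) field values,
        # query_pos: (B, M, 3) arbitrary evaluation points.
        tokens = self.embed(torch.cat([pos, val], dim=-1))
        z = self.latent.expand(pos.shape[0], -1, -1)
        z, _ = self.encode(z, tokens, tokens)         # compress to latent tokens
        z = self.process(z)                           # dynamics in latent space
        q = self.query_embed(query_pos)
        out, _ = self.decode(q, z, z)                 # decode at the query points
        return self.head(out)

model = PointSetOperator()
pred = model(torch.randn(2, 500, 3), torch.randn(2, 500, 1), torch.randn(2, 200, 3))
print(pred.shape)   # torch.Size([2, 200, 1])

Because the latent representation has a fixed size, the same backbone can be driven by meshes or particle clouds of very different resolutions.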

Smoothed particle hydrodynamics (SPH) is omnipresent in modern engineering and scientific disciplines. SPH is a class of Lagrangian schemes that discretize fluid dynamics via finite material points that are tracked through the evolving velocity …

We identify particle clustering originating from tensile instabilities as one of the primary pitfalls. Based on these insights, we enhance both training and rollout inference of GNN-based simulators with varying components from standard SPH solvers, including pressure, viscous, and external force components.
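As a rough illustration of what enhancing rollout inference with SPH-style components can look like, the sketch below adds a simplified pressure-like relaxation after each learned update to counteract particle clustering. The relaxation form and all names are my own simplification, not the paper's solver components.

# Hedged sketch: one rollout step of a learned particle simulator with a
# simplified SPH-inspired relaxation added on top.
import numpy as np

def pressure_like_relaxation(pos, h=0.1, strength=1e-3):
    """Shift each particle away from neighbours closer than the smoothing
    length h, with a weight that decays linearly to zero at distance h."""
    shift = np.zeros_like(pos)
    for i in range(pos.shape[0]):
        d = pos - pos[i]                        # vectors to all other particles
        r = np.linalg.norm(d, axis=1)
        mask = (r > 0.0) & (r < h)              # clustered neighbours only
        if mask.any():
            w = 1.0 - r[mask] / h               # simple kernel-like weight
            shift[i] = -strength * (w[:, None] * d[mask] / r[mask][:, None]).sum(axis=0)
    return shift

def rollout_step(learned_model, pos, vel, dt=0.005):
    acc = learned_model(pos, vel)               # GNN-predicted accelerations
    vel = vel + dt * acc
    pos = pos + dt * vel
    pos = pos + pressure_like_relaxation(pos)   # anti-clustering correction
    return pos, vel

# Usage with a stand-in model (constant gravity); a trained GNN would go here.
pos, vel = np.random.rand(100, 2), np.zeros((100, 2))
pos, vel = rollout_step(lambda p, v: np.tile([0.0, -9.81], (p.shape[0], 1)), pos, vel)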

Data-Driven Simulations

I am firmly convinced that AI is on the cusp of disrupting simulation at industrial scale. I have therefore started a new group at JKU Linz with strong computer vision, simulation, and engineering components. My vision is shaped by experience from both university and industry.

PDE-Refiner -- Achieving Accurate Long Rollouts with Neural PDE Solvers

PDE-Refiner is an iterative refinement process that enables neural operator training for accurate and stable predictions over long time horizons. Published at NeurIPS 2023 (Spotlight).
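A schematic of such a refinement loop, assuming a model that takes the (possibly noised) current estimate, the previous state, and a step index: the first pass makes a one-shot prediction, and subsequent passes denoise at progressively smaller noise amplitudes, which sharpens the poorly predicted high-frequency content. The interface and noise schedule are illustrative assumptions, not the released code.

# Schematic PDE-Refiner-style prediction (illustrative interface and schedule).
import torch

def refined_prediction(model, u_prev, num_steps=3, min_noise=1e-3):
    # Step 0: ordinary one-shot next-step prediction, conditioned on u_prev.
    u_next = model(torch.zeros_like(u_prev), u_prev, step=0)
    # Steps 1..K: perturb with decreasing noise amplitudes and remove the
    # noise predicted by the model.
    for k in range(1, num_steps + 1):
        sigma = min_noise ** (k / num_steps)
        noised = u_next + sigma * torch.randn_like(u_next)
        predicted_noise = model(noised, u_prev, step=k)
        u_next = noised - sigma * predicted_noise
    return u_next

# Usage with a stand-in model; a real model would be a trained neural operator.
dummy = lambda x, cond, step: 0.5 * cond
u1 = refined_prediction(dummy, torch.randn(1, 64))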

Learning Lagrangian Fluid Mechanics with E(3)-Equivariant Graph Neural Networks

We apply E(3)-equivariant GNNs to two well-studied fluid-flow systems, namely the 3D decaying Taylor-Green vortex and 3D reverse Poiseuille flow. Published at GSI 2023.

Clifford Group Equivariant Neural Networks

We introduce a novel method to construct E(n)- and O(n)-equivariant neural networks using Clifford algebras. Published at NeurIPS 2023 (Oral).
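The algebraic ingredient can be seen in a toy example: the geometric product of two vectors splits into a rotation-invariant scalar part (the inner product) and an equivariant bivector part (the wedge product). This only illustrates the building block, not the paper's network layers.

# Toy illustration of the Clifford-algebra building block.
import numpy as np

def geometric_product_vectors(a, b):
    scalar = a @ b                                 # symmetric part: inner product
    bivector = np.outer(a, b) - np.outer(b, a)     # antisymmetric part: wedge
    return scalar, bivector

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)

# A rotation about the z-axis.
t = 0.7
R = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

s1, B1 = geometric_product_vectors(a, b)
s2, B2 = geometric_product_vectors(R @ a, R @ b)

print(np.isclose(s1, s2))               # scalar part is invariant under rotation
print(np.allclose(R @ B1 @ R.T, B2))    # bivector part transforms with the rotation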

Geometric Clifford Algebra Networks

We introduce Geometric Clifford Algebra Networks (GCANs) which parameterize combinations of learnable group actions. Published at ICML 2023.
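As a rough sketch of "combinations of learnable group actions", the layer below applies several learnable rotations to an input and linearly combines the results. The matrix-exponential parameterization is my own stand-in, not the paper's Clifford-algebra formulation.

# Minimal sketch of a group-action layer (illustrative parameterization).
import torch
import torch.nn as nn

class GroupActionLayer(nn.Module):
    def __init__(self, n_actions=4, dim=3):
        super().__init__()
        # Each action is a rotation R = exp(A - A^T) built from a learned matrix A.
        self.generators = nn.Parameter(0.01 * torch.randn(n_actions, dim, dim))
        self.weights = nn.Parameter(torch.randn(n_actions))

    def forward(self, x):                          # x: (B, dim)
        skew = self.generators - self.generators.transpose(-1, -2)
        rotations = torch.linalg.matrix_exp(skew)  # (n_actions, dim, dim), all in SO(dim)
        acted = torch.einsum("kij,bj->bki", rotations, x)
        return torch.einsum("k,bki->bi", self.weights, acted)

layer = GroupActionLayer()
print(layer(torch.randn(5, 3)).shape)   # torch.Size([5, 3])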

ClimaX -- A foundation model for weather and climate

We develop and demonstrate ClimaX, a flexible and generalizable deep learning model for weather and climate science that can be trained using heterogeneous datasets spanning different variables, spatio-temporal coverage, and physical groundings. Published at ICML 2023 (Spotlight).
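One way to picture training across heterogeneous variable sets is a variable-agnostic tokenizer in the spirit of ClimaX: each variable gets its own patch embedding and a learned variable embedding, and whichever variables a given dataset provides are aggregated per spatial patch by cross-attention. Sizes, names, and the aggregation details below are illustrative assumptions, not the released code.

# Hedged sketch of variable-agnostic tokenization for heterogeneous datasets.
import torch
import torch.nn as nn

class VariableTokenizer(nn.Module):
    def __init__(self, var_names, img_size=32, patch=8, dim=128):
        super().__init__()
        # One patch-embedding and one learned embedding per physical variable.
        self.embed = nn.ModuleDict({
            v: nn.Conv2d(1, dim, kernel_size=patch, stride=patch) for v in var_names
        })
        self.var_embed = nn.ParameterDict({
            v: nn.Parameter(torch.zeros(dim)) for v in var_names
        })
        self.var_query = nn.Parameter(torch.randn(1, 1, dim))
        self.agg = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, fields):
        # fields: dict of variable name -> (B, H, W) tensor; any subset of variables.
        tokens = []
        for name, x in fields.items():
            t = self.embed[name](x.unsqueeze(1))    # (B, dim, H/p, W/p)
            t = t.flatten(2).transpose(1, 2)        # (B, P, dim)
            tokens.append(t + self.var_embed[name])
        v = torch.stack(tokens, dim=2)              # (B, P, V, dim)
        B, P, V, D = v.shape
        v = v.reshape(B * P, V, D)
        q = self.var_query.expand(B * P, -1, -1)
        out, _ = self.agg(q, v, v)                  # aggregate variables per patch
        return out.reshape(B, P, D)                 # one token per spatial patch

tok = VariableTokenizer(["t2m", "u10", "v10"])
x = {"t2m": torch.randn(2, 32, 32), "u10": torch.randn(2, 32, 32)}
print(tok(x).shape)   # torch.Size([2, 16, 128])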