Neural Operators

Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators

We introduce Universal Physics Transformers (UPTs), an efficient and unified learning paradigm for a wide range of spatio-temporal problems. UPTs operate without grid- or particle-based latent structures, enabling flexibility and scalability across meshes and particles.
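At their core, UPTs encode an arbitrary discretization into a fixed-size latent space, propagate the dynamics there, and decode the latent state at any query point. The sketch below illustrates that encode/propagate/decode pattern; the module names, sizes, and perceiver-style attention layout are illustrative assumptions, not the paper's actual code.

```python
# Minimal sketch of a UPT-style encode/propagate/decode operator, assuming a
# perceiver-style pooling into a fixed number of latent tokens. All names and
# sizes are illustrative, not the paper's implementation.
import torch
import torch.nn as nn


class LatentOperator(nn.Module):
    def __init__(self, dim=64, n_latent=16, n_heads=4):
        super().__init__()
        self.latent = nn.Parameter(torch.randn(n_latent, dim))  # learned latent queries
        self.coord_embed = nn.Linear(2, dim)   # embed (x, y) positions
        self.feat_embed = nn.Linear(1, dim)    # embed scalar field values
        self.encode_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.propagate = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, n_heads, batch_first=True), num_layers=2
        )
        self.decode_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.readout = nn.Linear(dim, 1)

    def forward(self, coords, values, query_coords):
        # coords: (B, N, 2) arbitrary mesh/particle positions; values: (B, N, 1)
        tokens = self.coord_embed(coords) + self.feat_embed(values)
        q = self.latent.expand(coords.size(0), -1, -1)
        z, _ = self.encode_attn(q, tokens, tokens)  # compress to fixed-size latent
        z = self.propagate(z)                       # advance dynamics in latent space
        qq = self.coord_embed(query_coords)
        out, _ = self.decode_attn(qq, z, z)         # decode at arbitrary query points
        return self.readout(out)


model = LatentOperator()
u = model(torch.rand(2, 100, 2), torch.rand(2, 100, 1), torch.rand(2, 50, 2))
print(u.shape)  # torch.Size([2, 50, 1])
```

Because the latent size is fixed, the cost of propagating the dynamics does not grow with the number of input points or cells, which is what lets the approach scale across meshes and particles.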

PDE-Refiner: Achieving Accurate Long Rollouts with Neural PDE Solvers

PDE-Refiner is an iterative, denoising-diffusion-inspired refinement process that enables neural PDE solvers to make accurate and stable predictions over long rollout horizons. Published at NeurIPS 2023 (Spotlight).
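As a rough sketch of the idea: each rollout step starts from a one-step prediction, then repeatedly perturbs it with noise of decreasing amplitude while the model predicts (and removes) that noise. The `model` signature, conditioning interface, and noise schedule below are illustrative assumptions.

```python
# Hedged sketch of a PDE-Refiner-style inference step: refine an initial
# prediction by injecting noise at decreasing amplitudes and subtracting the
# model's noise estimate. Schedule and interfaces are placeholders.
import torch

def rollout_step(model, u_prev, sigmas=(0.1, 0.01, 0.001)):
    # Initial prediction of the next state, conditioned on the previous one.
    u = model(torch.zeros_like(u_prev), u_prev, step=0)
    for k, sigma in enumerate(sigmas, start=1):
        noisy = u + sigma * torch.randn_like(u)  # perturb current estimate
        eps_hat = model(noisy, u_prev, step=k)   # predict the injected noise
        u = noisy - sigma * eps_hat              # remove it -> refined estimate
    return u

# Dummy stand-in model so the sketch runs end to end.
model = lambda noisy, cond, step: torch.zeros_like(noisy)
u_next = rollout_step(model, torch.randn(1, 64))
```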

Towards Multi-spatiotemporal-scale Generalized PDE Modeling

We present PDEArena, a modern PyTorch Lightning-based deep learning framework for neural PDE modeling. Published in TMLR (07/2023).
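For flavor, the sketch below shows the kind of Lightning module such a framework is organized around; the class, method bodies, and data layout are hypothetical and do not reflect PDEArena's actual API.

```python
# Illustrative-only sketch of a Lightning module for one-step neural PDE
# training; names and data layout are hypothetical, not PDEArena's API.
import torch
import torch.nn as nn
import lightning as L


class PDEModule(L.LightningModule):
    def __init__(self, model: nn.Module, lr: float = 1e-3):
        super().__init__()
        self.model = model
        self.lr = lr

    def training_step(self, batch, batch_idx):
        u_t, u_next = batch  # current and next solution states
        loss = nn.functional.mse_loss(self.model(u_t), u_next)
        self.log("train/mse", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)
```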

Clifford Neural Layers for PDE Modeling

We introduce neural network layers whose operations act on multivectors: composite objects of scalars, vectors, and higher-order objects such as bivectors. Published at ICLR 2023.
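To make this concrete, here is a minimal sketch of the 2D geometric product at the heart of Clifford algebras, for multivectors laid out as [scalar, e1, e2, e12] with signature e1^2 = e2^2 = +1; the tensor layout and function name are illustrative, not the paper's code.

```python
# Minimal sketch of the 2D geometric (Clifford) product over multivectors
# [scalar, e1, e2, e12]; layers in this spirit mix channels following these
# product rules. Layout and naming are illustrative.
import torch

def geometric_product_2d(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # a, b: (..., 4) multivectors laid out as [scalar, e1, e2, e12].
    a0, a1, a2, a3 = a.unbind(-1)
    b0, b1, b2, b3 = b.unbind(-1)
    return torch.stack(
        [
            a0 * b0 + a1 * b1 + a2 * b2 - a3 * b3,  # scalar part (e12^2 = -1)
            a0 * b1 + a1 * b0 - a2 * b3 + a3 * b2,  # e1 component
            a0 * b2 + a2 * b0 + a1 * b3 - a3 * b1,  # e2 component
            a0 * b3 + a3 * b0 + a1 * b2 - a2 * b1,  # bivector (e12) component
        ],
        dim=-1,
    )

# Multiplying the basis vectors e1 and e2 yields the bivector e12:
e1 = torch.tensor([0.0, 1.0, 0.0, 0.0])
e2 = torch.tensor([0.0, 0.0, 1.0, 0.0])
print(geometric_product_2d(e1, e2))  # tensor([0., 0., 0., 1.])
```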

Lie Point Symmetry Data Augmentation for Neural PDE Solvers

We show how to use Lie point symmetries of PDEs to improve the sample complexity of neural PDE solvers. Published at ICML 2022 (Spotlight).
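As a minimal illustration of the idea: a periodic space translation is a Lie point symmetry of equations such as Burgers' and KdV, so a shifted solution field is again a valid solution and can augment the training set at no simulation cost. The array shapes below are illustrative assumptions.

```python
# Hedged sketch of symmetry-based augmentation via periodic space translation,
# one Lie point symmetry among several (translations, boosts, scalings).
import numpy as np

def translate_augment(u: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    # u: (n_t, n_x) solution trajectory on a periodic spatial grid.
    shift = rng.integers(u.shape[1])  # random spatial offset
    return np.roll(u, shift, axis=1)  # u(x, t) -> u(x - shift*dx, t)

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))[None, :].repeat(10, axis=0)
u_aug = translate_augment(u, rng)  # a new, equally valid training trajectory
```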