Johannes Brandstetter

Chief Researcher @ NXAI, Assistant Professor @ JKU Linz

Johannes Kepler University, Linz, Institute for Machine Learning

About me

I lead the group “AI for data-driven simulations” at the Institute for Machine Learning at the Johannes Kepler University (JKU) Linz. Additionally, I am Chief Researcher at NXAI - our new European AI hub in Linz (Austria).

I obtained my PhD after working for several years on the CMS experiment at CERN. During this time, I had the privilege of learning from brilliant minds from all around the world, and got the chance to co-author seminal papers in the realm of Higgs boson physics. In 2018, after completing my PhD, my career trajectory shifted towards machine learning, and I was fortunate to join the research group of “Mr LSTM” Sepp Hochreiter in Linz. Under Sepp’s mentorship, I delved into the intricacies of machine learning and modern deep learning over a span of 2.5 years.

From 2021 to 2023, I had the pleasure of spending three remarkable years in Amsterdam. Initially, I was part of the Amsterdam Machine Learning Lab led by Max Welling, and subsequently joined Microsoft Research for two years. During this period, I developed a profound passion for Geometric Deep Learning, particularly involving Geometric (Clifford) algebras, and for partial differential equations (PDEs), with a particular focus on developing neural surrogates for them. Most importantly, I pivoted towards large-scale PDEs, including weather and climate modeling, which culminated in Aurora.

My years in Amsterdam have shaped my research vision. I am firmly convinced that AI is on the cusp of disrupting simulations at industry scale. Every day, thousands upon thousands of compute hours are spent on turbulence modeling, simulations of fluid or air flows, heat transfer in materials, traffic flows, and much more. Many of these processes follow similar underlying patterns, yet each requires different and extremely specialized software to simulate. Even worse, for different parameter settings the costly simulations must be re-run at full length from scratch.

This is what I want to change! To that end, I have started a new group at JKU Linz with strong computer vision, numerical simulation, and engineering components. We want to advance data-driven simulations at industry scale, and establish Linz, the Austrian industry engine, as a center for doing so.

News

[Feb 2024] I have started as Chief Researcher at NXAI.

[Oct 2023] I have re-joined Sepp Hochreiter’s group, starting my own group “AI for data-driven simulations”.

NXAI

Mission statement of NXAI

With NXAI, we are building a European AI hub that combines world-class research, close ties to local universities, and strong industrial know-how and support. Our mission is to create a large-scale entrepreneurial framework that transforms the latest scientific developments into industry-ready applications.

At its core, NXAI is dedicated to independent research, which we believe is our greatest strength. However, we aspire to go further. We are building a pipeline that converts our research into high-impact industrial applications. Our goal is to become a nimble AI powerhouse at the forefront of the AI revolution in Europe.

Our research focus is on two key areas. The first is AI4Simulation, where we are building foundation models for industrial simulations. The second is Large Language Models (LLMs), highlighted by our flagship project xLSTM. In both domains, we have world-leading experts committed to unlocking the vast potential of downstream applications for engineering and industrial use cases.

More information can be found at our webpage https://www.nx-ai.com/.

Recent (selected) Publications

Aurora -- A Foundation Model of the Atmosphere
Aurora leverages the strengths of the foundation modelling approach to produce operational forecasts for a wide variety of atmospheric prediction problems, including those with limited training data, heterogeneous variables, and extreme events.
xLSTM -- Extended Long Short-Term Memory
How far do we get in language modeling when scaling LSTMs to billions of parameters, leveraging the latest techniques from modern LLMs, but mitigating known limitations of LSTMs?

Open Positions

[Nov 2023] Open positions can be found here.