Compressing simulation output with cINNs and continual learning

Willmann, A.; Schmerler, S.; Ebert, J.; Kesselheim, S.; Bussmann, M.; Chandrasekaran, S.; Debus, A.; Hoffmann, N.; Holsapple, K.; Juckeland, G.; Pausch, R.; Pöschel, F.; Schramm, U.; Steiniger, K.

The output of simulations can be extremely challenging to work with. For example, large-scale particle-in-cell simulations in plasma physics track trillions of particles over millions of time steps, producing petabytes of data. In this project, we develop methods to compress particle data by training conditional invertible neural networks (cINNs) on it. The particles can then be reconstructed by running the trained model in generative mode. This allows us to reach up to millionfold compression with a controlled loss of accuracy. The models can be conditioned not only on the temporal axis but also on other types of simulation output of smaller data volume, potentially leading to even higher compression factors.
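As a rough illustration of this approach, the sketch below shows a single conditional affine coupling block, the standard building unit of a cINN, together with the maximum-likelihood training objective and the generative (inverse) pass used for reconstruction. The PyTorch implementation, names, and dimensions are assumptions for illustration; the project's actual architecture is not specified in this abstract.

```python
# Minimal sketch of a cINN building block, assuming a PyTorch implementation.
# All names and dimensions are illustrative, not taken from the project code.
import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    """Affine coupling layer conditioned on auxiliary data (e.g. time)."""
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        # The subnet predicts scale and shift for the second half of the
        # input from the first half concatenated with the condition.
        self.subnet = nn.Sequential(
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, cond):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.subnet(torch.cat([x1, cond], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                   # bounded scales for stability
        z2 = x2 * torch.exp(s) + t          # invertible affine transform
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output, log|det J|

    def inverse(self, z, cond):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.subnet(torch.cat([z1, cond], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (z2 - t) * torch.exp(-s)       # exact inverse of the forward pass
        return torch.cat([z1, x2], dim=1)

def gaussian_nll(z, log_det):
    # Negative log-likelihood under a standard-normal latent (up to a
    # constant): the usual maximum-likelihood objective for normalizing flows.
    return (0.5 * z.pow(2).sum(dim=1) - log_det).mean()
```

Under this sketch, compression amounts to storing only the trained weights and the small conditioning data; "decompression" samples latent noise z ~ N(0, I) and runs inverse(z, cond) to regenerate a statistically faithful particle ensemble for the requested time step.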

To enable the neural network model to represent simulation data over long time spans, we apply methods from continual learning, treating the dataset produced by each new simulation time step as a new learning task. This approach also enables us to efficiently solve the inverse problem of reconstructing particle data from radiation data in a time-resolved manner, side-stepping demanding simulations.
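The abstract does not name the specific continual-learning method; rehearsal with a small replay buffer is one common way to mitigate catastrophic forgetting and is used below purely as an illustration. Each incoming time step is treated as a new task:

```python
# Illustrative continual-learning loop, building on the cINN sketch above.
# Replay-based rehearsal is an assumption; the project's actual method is
# not specified in this abstract.
import random
import torch

def train_continually(model, optimizer, time_step_loaders,
                      replay_capacity=10_000):
    replay = []  # (particles, condition) batches from earlier time steps
    for loader in time_step_loaders:        # one dataset per simulation step
        for particles, cond in loader:
            if replay:                      # rehearse an earlier batch
                px, pc = random.choice(replay)
                particles = torch.cat([particles, px])
                cond = torch.cat([cond, pc])
            z, log_det = model(particles, cond)
            loss = gaussian_nll(z, log_det)  # objective from the sketch above
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            # Keep a few samples from each step for future rehearsal.
            if len(replay) < replay_capacity:
                replay.append((particles[:32].detach(), cond[:32].detach()))
```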

  • Poster
    Helmholtz AI Conference 2023, 12.-14.06.2023, Hamburg, Germany

Permalink: https://www.hzdr.de/publications/Publ-38096