Helmholtz AI Consultant Team for Matter Research
Helmholtz AI aims to empower scientists to apply machine learning methods to their scientific problem domains. This goal will be achieved by fostering and stimulating collaborative interdisciplinary research projects; by leveraging similarities between data-driven solutions across domains; by integrating field-specific excellence with AI/ML expertise; by improving the quality, scalability and timely availability of emerging methods and tools; and by training the current and next generation of scientists in using AI methods and tools.
These goals will be pursued beyond institute and center boundaries through regular funding opportunities, collaboration-as-a-service offerings, and many more activities. At HZDR, a Young Investigator Group led by Dr. Nico Hoffmann and a Helmholtz AI Consultant Team led by Dr. Peter Steinbach have been established. Together they form the Helmholtz AI Local Unit that supports all scientists within the research field Matter of the German Helmholtz Association. This page introduces the Helmholtz AI Consultant Team.
"The Helmholtz AI consultant team's mission at HZDR is to advise scientists, primarily of the research field Matter, in the application of automated data processing and knowledge extraction methods. We want to disseminate state-of-the-art best practices in ML and data science. With this, we hope to boost our clients' data understanding to a globally competitive academic level. Within this mandate, we will try to advance methods and tooling in order to reduce the time investment on our side as well as on our clients' side."
You, your data and us
It is our task to aid scientists with their needs to process small and big data. For this, we offer free-of-charge in-person consulting as well as collaborative projects, so-called vouchers. A voucher is meant to guarantee fair and uniform processing of projects at HZDR and other Helmholtz centers across Germany. Therefore, a voucher must comply with the following criteria:
- It should describe a feasible goal that can be achieved with state-of-the-art machine learning methods and other assets of artificial intelligence.
- It should describe a project that can be completed within a period of two weeks up to six months.
- It should report and link to data that can be used to train and apply state-of-the-art machine learning methods.
- It should define the uncertainty bounds that current methods obtain and that a prospective AI solution should improve upon.
These vouchers are defined and created in collaboration with us. They are then submitted to a lightweight review process: the central administration of Helmholtz AI, other consultant teams, and the central Helmholtz Office in Berlin review each voucher and potentially approve it for action.
Questions or ideas? Contact us: firstname.lastname@example.org
|Name||Building/Room||Phone: +49 351 260 ....|
Ion-beam analysis spectrum evaluation
|Description:||Ion beam analysis is a method to analyse the near-surface elemental and isotopic composition of practically any target material with tomographic information. So far, the evaluation of ion-beam analysis spectra requires extensive manual input, strongly limiting the method's throughput. Simple chi-squared-based optimisation methods such as simplex, implemented in physics codes such as SimNRA, hardly allow finding the global minimum of the spectral evaluation because of the many parameters and non-linearities involved. The physics models, however, allow generating huge amounts of high-quality synthetic data via forward calculations. This opens up the possibility for deep learning algorithms to learn to evaluate even complex spectra and to assist the optimisation methods by providing close-to-final initial values, or even to replace the given tool for finding final results.|
|Expected Results (Goals):||
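The surrogate idea above can be sketched with a toy example: a small neural network is trained on synthetic spectra generated by a simplified forward model (here a single Gaussian peak, standing in for a real SimNRA forward calculation) to regress the underlying parameters, which could then serve as close-to-final initial values for a conventional chi-squared fit. The forward model, network size, and parameter ranges are purely illustrative assumptions, not part of the actual project.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 64)          # toy "energy channel" axis

def forward_model(amplitude, position):
    """Stand-in for a physics forward calculation (e.g. SimNRA):
    a single Gaussian peak parameterised by amplitude and position."""
    return amplitude * np.exp(-((grid - position) ** 2) / (2 * 0.05 ** 2))

# Generate synthetic training data by forward calculation.
n = 2000
amp = rng.uniform(0.5, 1.5, n)
pos = rng.uniform(0.2, 0.8, n)
X = np.stack([forward_model(a, p) for a, p in zip(amp, pos)])
T = np.stack([amp, pos], axis=1)          # regression targets

# A tiny one-hidden-layer MLP trained with full-batch gradient descent.
W1 = rng.normal(0, 1 / np.sqrt(64), (64, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 1 / np.sqrt(32), (32, 2)); b2 = np.zeros(2)

loss_history = []
lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)
    Y = H @ W2 + b2
    err = Y - T
    loss_history.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared-error loss.
    dY = 2 * err / n
    dW2 = H.T @ dY; db2 = dY.sum(0)
    dH = (dY @ W2.T) * (1 - H ** 2)
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# Use the network prediction as an initial guess for a downstream fit.
spectrum = forward_model(1.1, 0.4)
guess = np.tanh(spectrum @ W1 + b1) @ W2 + b2
print("initial guess (amplitude, position):", guess)
```

In a real setting, the predicted parameters would seed a chi-squared optimiser so that it starts close to the global minimum instead of searching the full non-linear parameter space.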
NTK-Based Parameter Annealing
Physics-informed neural networks (PINNs) are learning-based algorithms for solving partial differential equations (PDEs). Their training involves the minimization of a physics-informed loss function using stochastic gradient descent; this loss function integrates prior knowledge of the reformulated PDEs and is iteratively minimized. However, despite their empirical success, little is known about the behavior of PINNs during training, and they sometimes fail to converge for reasons that are not well understood. This project aims to investigate the behavior of PINNs using the Neural Tangent Kernel (NTK). The NTK captures the training dynamics of fully-connected neural networks under gradient descent in the infinite-width limit, and the limiting NTK can provide insights into the training dynamics of PINNs. In particular, the eigenvalues of the NTK can be used to anneal hyperparameters and thereby recalibrate the convergence rate of PINNs. In this project, the aim is to integrate NTK analysis into the loss functions of PINNs for solving different PDEs.
|Expected Results (Goals):||
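To make the annealing idea concrete, the following toy sketch computes the empirical NTK of a small one-hidden-layer network at two sets of points (standing in for PDE-residual and boundary collocation points) and derives loss-term weights from the NTK traces, in the spirit of the trace-balancing scheme proposed by Wang et al. for PINNs. For brevity, it uses the Jacobian of the network output itself rather than of the full PDE residual; the network, points, and weighting rule are illustrative assumptions, not the project's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny scalar network f(x) = sum_k v_k * tanh(w_k * x + b_k).
m = 50
w = rng.normal(0, 1, m); b = rng.normal(0, 1, m); v = rng.normal(0, 1 / np.sqrt(m), m)

def jacobian(x):
    """Jacobian of f at the points x w.r.t. all parameters (w, b, v),
    one row per input point, one column per parameter."""
    pre = np.outer(x, w) + b            # (n, m) pre-activations
    act = np.tanh(pre)
    sech2 = 1 - act ** 2                # tanh'(pre)
    dv = act                            # df/dv_k
    dw = v * sech2 * x[:, None]         # df/dw_k
    db = v * sech2                      # df/db_k
    return np.concatenate([dw, db, dv], axis=1)

# Empirical NTK K = J J^T at interior ("residual") and boundary points.
x_res = np.linspace(0.1, 0.9, 20)
x_bc = np.array([0.0, 1.0])
K_res = jacobian(x_res) @ jacobian(x_res).T
K_bc = jacobian(x_bc) @ jacobian(x_bc).T

# NTK eigenvalues govern how fast the corresponding loss components decay.
eig_res = np.linalg.eigvalsh(K_res)
eig_bc = np.linalg.eigvalsh(K_bc)

# Trace-balancing: slowly converging loss terms receive larger weights,
# recalibrating the effective convergence rates of the two terms.
total = np.trace(K_res) + np.trace(K_bc)
lam_res = total / np.trace(K_res)
lam_bc = total / np.trace(K_bc)
print("loss weights (residual, boundary):", lam_res, lam_bc)
```

During actual PINN training, these weights would be recomputed periodically from the current parameters and applied to the residual and boundary loss terms.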
Instance Segmentation in Videos of Bright-Field Microscopes
Here we explore the capabilities of algorithms designed for video instance segmentation (VIS) or multi-object tracking (MOT) for segmenting overlapping bubbles in video data.
To accurately detect bubbles in experimental studies, AI-assisted segmentation of overlapping bubbles in acquired images is becoming increasingly standard. Although there are several promising approaches for this task, they struggle at high levels of overlap and are often unable to reconstruct the hidden part of an occluded bubble. Moreover, such segmentation cannot detect fully occluded bubbles in a single image. The use of video sequences can help to solve these problems, since partially or completely occluded bubbles are often better visible in the frames just before or after the image in question.
|Expected Results (Goals):||
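A minimal building block of the tracking-based approach is linking detections across consecutive frames, so that a bubble occluded in one frame can inherit its identity from neighbouring frames. The sketch below uses greedy IoU matching of bounding boxes between two frames, a common baseline step in MOT pipelines; the boxes and threshold are made up for illustration and do not come from real bubble data.

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two boxes given as [x1, y1, x2, y2]."""
    x1 = max(a[0], b[0]); y1 = max(a[1], b[1])
    x2 = min(a[2], b[2]); y2 = min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def link_frames(prev_boxes, next_boxes, threshold=0.3):
    """Greedily match detections between two frames by descending IoU.
    Returns a list of (prev_index, next_index) pairs."""
    pairs = [(iou(p, q), i, j)
             for i, p in enumerate(prev_boxes)
             for j, q in enumerate(next_boxes)]
    pairs.sort(reverse=True)
    matches, used_p, used_n = [], set(), set()
    for score, i, j in pairs:
        if score < threshold:
            break  # remaining pairs overlap too little to be the same bubble
        if i not in used_p and j not in used_n:
            matches.append((i, j))
            used_p.add(i); used_n.add(j)
    return matches

# Two bubbles detected in frame t, slightly shifted in frame t+1.
frame_t = [np.array([10, 10, 30, 30]), np.array([40, 40, 60, 60])]
frame_t1 = [np.array([42, 41, 62, 61]), np.array([12, 11, 32, 31])]
print(link_frames(frame_t, frame_t1))
```

Detections in frame t+1 that remain unmatched would start new tracks, while unmatched tracks from frame t can be kept alive for a few frames to bridge full occlusions.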