Overview of current SHK positions in the Data Management and HPC group
The Working Group "Data Management and HPC"
For state-of-the-art research data management, it is recommended to consider the data life cycle first. Our long-term goal is to document the entire data life cycle according to the FAIR principles (Findable, Accessible, Interoperable and Reusable). A similar principle also applies to software.
In particular, the steps from the creation of the research data to the publication of data and results should be included in the final publication. We developed a concept for the entire research project lifecycle which contains all essential components, with a focus on (meta)data exchange. Descriptions of the individual components provided by our working group can be found in our Top-Level Service Strategy.
We are developing a concept of a data management lifecycle for our scientists at HZDR. This includes the integration of existing data sources, the documentation of experiments, the gathering of metadata, the management of primary data and the integration of its analysis, as well as the establishment of complete data provenance with integrated workflows.
We provide support for existing or planned projects concerning the topics presented above. Electronic documentation is often the first point of contact with this subject. Automated data acquisition and interfaces with analysis programs are also common main topics. We support the setup and initialization of a working data acquisition system, with the goal that the involved scientists are able to continue and optimize the projects by themselves afterwards. Our group remains available for further questions. Our services include:
Support in optimizing HPC applications and workloads,
Infrastructure for managing a project life cycle with our HZDR infrastructure,
Documentation of experiments (Lab Documentation System),
Automated inbound data transfers into our systems from multiple data sources,
Establishment of workflows related to the FAIR principles,
Support in archiving of research data, workflows and the scientific publication itself.
Services for the Topic "Data Management & Analysis"
"Data Management & Analysis" is a new research topic in the research program "Matter & Technology" of the Helmholtz research field "Matter". The Computational Science department hosts a small group which supports the domain scientists in the Institute of Radiation Physics working on the same topic. The group is responsible for maintaining mission-critical software components as well as interfacing them with the software solutions of HZDR and of the research field.
Software co-design for high-performance, platform-independent components
Performance analysis and support with the optimization of existing applications
After each measurement, the user would like to have the post-processed file available as soon as possible. The experiments are controlled with LabVIEW, and the post-processing should be initiated automatically on the cluster.
Expected Results (Goals):
After each measurement, the post-processing "workflow" is initiated, runs on the cluster, and the resulting file is available on bigdata.
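The trigger step described above can be sketched as a small watcher that submits a batch job for each newly written measurement file. This is a minimal sketch only: the directory paths, the `*.dat` file pattern, and the `post_process.sh` batch script are hypothetical names, and a Slurm cluster (with `sbatch` available) is assumed; the actual LabVIEW-side hand-off and cluster setup are not specified in the text.

```python
# Minimal sketch of an automated post-processing trigger.
# All paths, patterns, and script names are illustrative assumptions.
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/measurements/raw")    # hypothetical LabVIEW output directory
RESULT_DIR = Path("/bigdata/processed")  # hypothetical target directory on bigdata


def build_submit_command(raw_file: Path) -> list[str]:
    """Build the sbatch call that post-processes one measurement file."""
    out_file = RESULT_DIR / (raw_file.stem + "_processed.h5")
    return [
        "sbatch",
        f"--job-name=post_{raw_file.stem}",
        "post_process.sh",  # hypothetical batch script doing the post-processing
        str(raw_file),
        str(out_file),
    ]


def watch(poll_seconds: float = 5.0) -> None:
    """Poll the watch directory and submit one job per new measurement file."""
    seen: set[Path] = set()
    while True:
        for raw_file in sorted(WATCH_DIR.glob("*.dat")):
            if raw_file not in seen:
                subprocess.run(build_submit_command(raw_file), check=True)
                seen.add(raw_file)
        time.sleep(poll_seconds)
```

In practice the polling loop could be replaced by filesystem notifications or by having LabVIEW call the submission step directly after closing each file; the sketch only illustrates the intended "measurement finished → job submitted → result on bigdata" chain.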