Overview of current SHK positions in the Data Management and HPC group
The Working Group "Data Management and HPC"
For state-of-the-art research data management it is recommended to consider the data life cycle first. Our long-term goal is to document the entire data life cycle in accordance with the FAIR principles (Findable, Accessible, Interoperable and Reusable). Similar principles also apply to software.
In particular, the steps from the creation of the research data to the publication of data and results should be included in the final publication. We developed a concept for the entire research project lifecycle which contains all essential components, with a focus on (meta)data exchange. Descriptions of the individual components provided by our working group can be found in our Top-Level Service Strategy.
Mission
We are developing a concept of a data management lifecycle for our scientists at HZDR. This implies the integration of existing data sources, documentation of experiments, gathering of metadata, management and integration of the analysis of primary data, as well as establishing complete data provenance with integrated workflows.
Support
We provide support for existing or planned projects concerning the topics presented above. The electronic documentation is often the first point of contact with this subject. Automated data acquisition and interfaces to analysis programs are also common main topics. We support the establishment and initialization of an executable version of the data acquisition. The goal is always that the scientists involved are able to continue and optimize the projects by themselves afterwards. Our group remains available for further questions.
Support in optimizing HPC applications and workloads,
Infrastructure for managing a project life cycle with our HZDR infrastructure,
Documentation of experiments (Lab Documentation System),
Automated inbound data transfers into our systems from multiple data sources,
Establishment of workflows related to the FAIR principles,
Support in archiving of research data, workflows and the scientific publication itself.
Services for the Topic "Data Management & Analysis"
"Data Management & Analysis" is a new research topic in the research program "Matter & Technology" of the Helmholtz research field "Matter". The Computational Science department hosts a small group which supports the domain scientists in the Institute of Radiation Physics working on the same topic. The group is responsible for maintaining mission-critical software components as well as interfacing them with the software solutions of HZDR and of the research field.
Software co-design for high-performance, platform-independent components
Performance analysis and support with the optimization of existing applications
After each measurement the user would like to have the post-processed file available as soon as possible. The experiments are controlled with LabVIEW and the post-processing should be initiated automatically on the cluster (a minimal trigger sketch is shown after the goals below).
Expected Results (Goals):
After each measurement the post-processing "workflow" is initiated, runs on the cluster, and the resulting file is available on bigdata
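The following is a minimal sketch of such a trigger, assuming the LabVIEW control PC drops each raw file into a directory that is also visible on the cluster and that jobs are submitted via Slurm; the watched path and the batch script name (postprocess.sh) are hypothetical placeholders.

```python
# Minimal sketch: watch an incoming directory and submit a Slurm job for
# every new raw file. Paths and the job script name are placeholders.
import subprocess
import time
from pathlib import Path

RAW_DIR = Path("/bigdata/experiment/raw")   # hypothetical incoming directory
seen = set()

def submit_postprocessing(raw_file: Path) -> None:
    # Hand the new raw file to a batch script that writes the
    # post-processed result back to bigdata.
    subprocess.run(["sbatch", "postprocess.sh", str(raw_file)], check=True)

if __name__ == "__main__":
    while True:                              # simple polling watcher
        for raw_file in RAW_DIR.glob("*.raw"):
            if raw_file not in seen:
                seen.add(raw_file)
                submit_postprocessing(raw_file)
        time.sleep(30)
```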
Provide an ETL Workflow for Turbulent Fluid Dynamics Simulations
Description:
Create an ETL workflow based on Celery with PostgreSQL and optional Elasticsearch integration (a minimal task sketch is shown after the goals below).
Expected Results (Goals):
Python-based ETL workflow for our FWCC PostgreSQL database
Setting up a Celery workflow environment
Integration of the workflow into our Celery infrastructure
Visualization and administration of the workflows using Flower or Airflow
Connect Jupyter Notebooks on Hemera to the PostgreSQL database
Setting up Open Distro for Elasticsearch (Elasticsearch + Kibana + LDAP)
Synchronize the PostgreSQL database with Elasticsearch using Logstash
Visualize the data with Kibana
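A minimal sketch of such a Celery-based ETL chain is shown below. It assumes a running message broker (e.g. Redis), the psycopg2 and elasticsearch (7.x-style) Python packages, and uses placeholder connection strings, table and index names.

```python
# Minimal ETL sketch with Celery: extract from PostgreSQL, transform the
# rows, load them into Elasticsearch. All names below are placeholders.
from celery import Celery, chain
import psycopg2
from elasticsearch import Elasticsearch

app = Celery("etl", broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def extract(run_id: int) -> list:
    # Pull the raw simulation records from the PostgreSQL database.
    with psycopg2.connect("dbname=fwcc user=etl host=localhost") as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT id, value FROM simulations WHERE run = %s",
                        (run_id,))
            return cur.fetchall()

@app.task
def transform(rows: list) -> list:
    # Reshape the rows into documents suitable for indexing.
    return [{"id": row[0], "value": row[1]} for row in rows]

@app.task
def load(docs: list) -> int:
    # Index the documents so they can later be visualized in Kibana.
    es = Elasticsearch("http://localhost:9200")
    for doc in docs:
        es.index(index="simulations", id=doc["id"], body=doc)
    return len(docs)

if __name__ == "__main__":
    # Compose the three steps into one workflow run.
    chain(extract.s(42), transform.s(), load.s()).delay()
```

Flower can then be pointed at the same broker to monitor and administer these tasks.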
Owner:
Oliver Knodel
Customer:
(FWDC, Thomas Ziegenhein)
Automated GitLab CI job for the bitstream creation on Hemera
Description:
Create a CI job to automate the FPGA bitstream creation after every commit (with a special tag) to provide a valid bitstream and to bring the GitLab project to the next level (a minimal build-step sketch is shown after the goals below).
Expected Results (Goals):
Validated project sources to enable a bitstream build based on the data provided in the GitLab repository.
Creation of a reproducible (command-line-based) FPGA development pipeline with the necessary tools/dependencies on Hemera.
Automated GitLab HPC Runner producing valid bitstreams as artifacts.
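Since the pipeline has to run non-interactively on Hemera, a small wrapper script called from the CI job can drive the vendor tools in batch mode. The sketch below assumes Vivado is available in the runner's environment and that the repository ships a Tcl build script; the script name (build.tcl) and the output directory are hypothetical.

```python
# Minimal sketch of the build step a GitLab CI job could call on Hemera.
# Assumes the Xilinx Vivado tools are on PATH; file names are placeholders.
import subprocess
import sys
from pathlib import Path

def build_bitstream(tcl_script: str = "build.tcl",
                    output_dir: str = "build") -> Path:
    # Run Vivado non-interactively; the Tcl script is expected to
    # synthesize, implement and write the bitstream into output_dir.
    subprocess.run(["vivado", "-mode", "batch", "-source", tcl_script],
                   check=True)
    bitstreams = list(Path(output_dir).glob("*.bit"))
    if not bitstreams:
        sys.exit("no bitstream produced - fail the CI job")
    return bitstreams[0]

if __name__ == "__main__":
    # The returned .bit file can be declared as a job artifact in GitLab.
    print(f"bitstream written to {build_bitstream()}")
```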
Provide a Toolflow for FPGA-DAQ Development using High-Level Synthesis
Description:
Create a service which generates FPGA designs from OpenCL code using the High-Level Synthesis (HLS) tools from Xilinx on Hemera and implement first data acquisition cores (a minimal batch toolflow sketch is shown after the goals below).
Expected Results (Goals):
set up the toolflow on Hemera
implement first cores in pure C or OpenCL
document the project in GitLab and use CI for code validation
validate the core using SW/HW co-simulation
optimize the code using directives and create different solutions on the provided FPGA (FWKK)
create the hardware design and deploy it on the ELBE-FPGA
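A possible starting point for the batch toolflow is a small driver that generates an HLS Tcl script and runs it non-interactively. The sketch below assumes vitis_hls (or the older vivado_hls) is available on Hemera; the top function, source files and FPGA part number are hypothetical placeholders.

```python
# Minimal sketch of driving the Xilinx HLS flow in batch mode on Hemera.
# Project, source and part names are placeholders.
import subprocess
import textwrap

HLS_TCL = textwrap.dedent("""
    open_project daq_core_prj
    set_top daq_core
    add_files daq_core.cpp
    add_files -tb daq_core_tb.cpp
    open_solution solution1
    set_part xczu9eg-ffvb1156-2-e
    create_clock -period 10
    csim_design
    csynth_design
    cosim_design
    export_design -format ip_catalog
    exit
""")

def run_hls(tcl_path: str = "run_hls.tcl") -> None:
    # Write the batch script and hand it to the HLS tool non-interactively;
    # csim/csynth/cosim cover the validation steps listed above.
    with open(tcl_path, "w") as script:
        script.write(HLS_TCL)
    subprocess.run(["vitis_hls", "-f", tcl_path], check=True)

if __name__ == "__main__":
    run_hls()
```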
Owner:
Oliver Knodel
Customer:
ELBE Experiment (FWKK, Andreas Wagner)
Provide project IDs with and without proposals
Description:
Create a service which validates proposal IDs or provides an "HZDR-ID" for non-proposal projects (a minimal API sketch is shown after the goals below)
Expected Results (Goals):
set up the DMS Guidance System (web frontend and API)
mirror the GATE database using OAuth and cURL
provide a validation function for user + proposal ID requests
provide a new "HZDR-ID" and validation for non-proposal projects
provide additional information for validated IDs
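A minimal sketch of the validation part of such a service is shown below, using Flask for the API; the mirrored GATE proposal IDs are represented by a hard-coded set, and the endpoint names and the "HZDR-ID" scheme are hypothetical.

```python
# Minimal API sketch: validate proposal IDs against a mirrored set and
# issue an "HZDR-ID" for non-proposal projects. All names are placeholders.
import uuid
from flask import Flask, jsonify

app = Flask(__name__)

# In the real service this set would be filled by mirroring the GATE
# database; here it is a hard-coded stand-in.
GATE_PROPOSALS = {"20201234", "20215678"}
HZDR_IDS = set()

@app.route("/validate/<proposal_id>")
def validate(proposal_id):
    # A proposal ID is valid if it exists in the mirrored GATE data.
    return jsonify(valid=proposal_id in GATE_PROPOSALS)

@app.route("/hzdr-id", methods=["POST"])
def new_hzdr_id():
    # Projects without a proposal receive a locally generated "HZDR-ID".
    hzdr_id = "HZDR-" + uuid.uuid4().hex[:8]
    HZDR_IDS.add(hzdr_id)
    return jsonify(hzdr_id=hzdr_id)

if __name__ == "__main__":
    app.run()
```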
Owner:
Oliver Knodel
Customer:
Laser group collecting laboratory environmental sensor data (FWKT)
Make data sets available in a consistent and useful way.
Expected Results (Goals):
Prepare data sets of KLOE05, KLOE08, KLOE10, KLOE12 and the updated sets of KLOE17 for upload to the HEPData repository using the hepdata_lib Python library (a minimal sketch follows below)
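The following is a minimal sketch of preparing a single table with hepdata_lib, assuming the KLOE values are already available as Python lists; the table name, variable names, units and numbers are placeholders.

```python
# Minimal sketch of a HEPData submission with hepdata_lib.
# Table and variable names, units and the numbers below are placeholders.
from hepdata_lib import Submission, Table, Variable

submission = Submission()

table = Table("KLOE05 example table")          # one table per data set
table.description = "Example table prepared from the KLOE05 data set."

x = Variable("sqrt(s)", is_independent=True, is_binned=False, units="MeV")
x.values = [1000, 1010, 1020]                  # placeholder energies

y = Variable("cross section", is_independent=False, is_binned=False, units="nb")
y.values = [1.2, 3.4, 5.6]                     # placeholder measurements

table.add_variable(x)
table.add_variable(y)
submission.add_table(table)

# Writes the YAML files and the archive that can be uploaded to HEPData.
submission.create_files("hepdata_submission")
```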