We investigate high-energy-density phenomena in matter under extreme conditions, i.e., under intense electromagnetic fields and at high temperatures and pressures. In this context, we develop surrogate models for materials science that push multi-scale materials modeling to scales unattainable with first-principles algorithms while maintaining their accuracy. Most notably, we develop surrogates for density functional theory, which accounts for one of the largest shares of the world's scientific computing workload due to its prevalence in the physical, biological, and materials sciences.
What is the data science project’s research question? The goal of this project is to implement a surrogate model – a so-called average-atom model – that will enable the data-efficient simulation of matter under extreme conditions.
What data will be worked on? High-fidelity simulation data from density functional theory will be used to validate the surrogate model. Validation data consists of electronic structure output for a given atomic configuration tabulated on a grid of temperatures and pressures.
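Handling such tabulated validation data could look like the following sketch. The row layout, column order, and units here are illustrative assumptions, not the project's actual data schema; the error metric (root-mean-square deviation over the grid) is likewise just one reasonable choice.

```python
import numpy as np

def load_reference_grid(rows):
    # Assumed layout: each row is (temperature, pressure, reference value
    # from DFT), with rows ordered temperature-major, pressure-minor.
    data = np.asarray(rows, dtype=float)
    temperatures = np.unique(data[:, 0])
    pressures = np.unique(data[:, 1])
    grid = data[:, 2].reshape(len(temperatures), len(pressures))
    return temperatures, pressures, grid

def validation_error(surrogate, temperatures, pressures, reference):
    # Root-mean-square deviation of the surrogate from the DFT reference
    # over the full (temperature, pressure) grid.
    T, P = np.meshgrid(temperatures, pressures, indexing="ij")
    predicted = surrogate(T, P)
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))
```

A surrogate that reproduces the reference exactly would yield an error of zero; in practice one would track this deviation across the grid as the model is refined.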
What tasks will this project involve? The tasks include developing an open-source average-atom code. The student's primary tasks are to (1) implement the differential equations that define the model and (2) test the code against experimental and high-fidelity simulation data. A legacy code will be provided as a blueprint for the development. The code should be written in Python.
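To illustrate task (1), here is a minimal sketch of solving a radial eigenvalue problem by shooting. It uses the bare radial Schrödinger equation in Hartree atomic units as a stand-in for the actual average-atom equations (which additionally involve a self-consistent potential and finite-temperature occupations); the function names and the choice of SciPy's `solve_ivp` plus bisection are illustrative, not a prescription.

```python
import numpy as np
from scipy.integrate import solve_ivp

def radial_rhs(r, y, E, Z, l):
    # u''(r) = (l(l+1)/r^2 - 2Z/r - 2E) u(r)  in Hartree atomic units.
    u, du = y
    return [du, (l * (l + 1) / r**2 - 2.0 * Z / r - 2.0 * E) * u]

def u_near_origin(E, Z=1.0, l=0, r_max=30.0, r_min=1e-4):
    # Integrate inward from large r, starting from the decaying Coulomb
    # asymptotic u ~ r^(Z/k) exp(-k r); a bound-state energy makes u
    # vanish at the origin.
    k = np.sqrt(-2.0 * E)
    u0, du0 = 1.0, (Z / (k * r_max) - k) * 1.0
    sol = solve_ivp(radial_rhs, (r_max, r_min), [u0, du0],
                    args=(E, Z, l), rtol=1e-9, atol=1e-12)
    return sol.y[0, -1]

def find_bound_state(E_lo, E_hi, tol=1e-8):
    # Bisection on the boundary mismatch; assumes [E_lo, E_hi] brackets
    # exactly one eigenvalue (i.e., the mismatch changes sign once).
    f_lo = u_near_origin(E_lo)
    for _ in range(100):
        E_mid = 0.5 * (E_lo + E_hi)
        if f_lo * u_near_origin(E_mid) <= 0.0:
            E_hi = E_mid
        else:
            E_lo, f_lo = E_mid, u_near_origin(E_mid)
        if E_hi - E_lo < tol:
            break
    return 0.5 * (E_lo + E_hi)
```

For Z = 1 and a bracket around the hydrogen ground state, `find_bound_state(-0.6, -0.4)` should recover an energy close to the known value of -0.5 Ha, which is the kind of check against reference data that task (2) calls for.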
What makes this project interesting to work on? This surrogate model will allow us to model phenomena in matter under extreme conditions that are otherwise unattainable with standard algorithms. For example, it will enable computationally efficient, first-principles quantum-mechanical simulations of electron dynamics in a material at high temperature exposed to a high-intensity laser. The surrogate model will also be relevant for modeling the heating of fuel capsules in inertial confinement fusion.
What is the expected outcome? Contribution to a research paper, contribution to software development
What infrastructure, programs and tools will be used? Can they be used remotely? The Center for Advanced Systems Understanding at Helmholtz-Zentrum Dresden-Rossendorf will provide its high-performance computing resources, totaling 214 compute nodes (40 CPU cores per node, 36 of the nodes equipped with GPUs). Our data management plan is guided by the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. Data will be stored on repositories that follow the FAIR guidelines (such as RODARE or Zenodo), ensuring open access to the data via assigned unique identifiers. The software will be developed under git version control hosted at gitlab.hzdr.de. Remote access to all of these resources will be provided.
What skills are necessary for this project? Scientific computation, Computational models, Computer simulations, High-performance computing
Is the data open source? Yes
Interested candidates should be at the Master's level. Attila Cangi is looking for one visiting scientist to work on the project together with the team.