The people who make the usage of HPC resources and services possible are a team of GWDG employees spanning several working groups, consisting largely of members of the AG Computing. The people listed on this page are ordered alphabetically.
Bernhard Bandow obtained his diploma in physics at the Technische Universität Berlin, working on MD simulations of systems with confined geometry. He subsequently received his PhD in physical chemistry from the Christian-Albrechts-Universität zu Kiel in 2007 on the global geometry optimization of water clusters employing genetic algorithms. After a postdoc stay at the German Institute for Rubber Technology (DIK) in Hannover, he moved in 2008 to the Computing Center of the Leibniz Universität Hannover and the North German Supercomputing Alliance (HLRN). Starting in 2011 he worked at the computing center of the Max Planck Institute for Solar System Research in Göttingen. In 2019 he joined the GWDG as HPC Coordinator of the Göttingen Campus Institute for Dynamics of Biological Networks (CIDBN).
Competencies:
Campus Institute for Dynamics of Biological Networks
Mr. Johannes Biermann joined the Computing WG in the Digital Humanities (DH) area on February 14, 2022, with the goal of establishing HPC usage in this discipline. Previously, he carried out various projects in the DH context at the SUB Göttingen. Johannes Biermann studied "Information Technology - Operational Information Systems" at the Duale Hochschule Baden-Württemberg Stuttgart and then worked as an eBusiness specialist in a private company. In 2013 he completed his master's degree at the State Academy of Fine Arts in Stuttgart in the field of "Conservation of New Media and Digital Information".
Christian Boehme has been with GWDG since 2003, where he introduced and operated the resource management for GWDG's first Linux-based HPC cluster. He has coordinated the planning and operation of several HPC systems, including the NHR system "Emmy" and the University of Göttingen's Modular Data Center (MDC), and has coordinated national research projects on HPC-as-a-Service and performance monitoring. Previous career stages were the University of Bochum with Dominik Marx, the University of Strasbourg with Georges Wipff, and the University of Marburg, where he obtained his PhD in Computational Chemistry with Gernot Frenking.
Alexander Goldmann studied Media Management (B.A.) at the Ostfalia University of Applied Sciences in Salzgitter. He then completed a traineeship as a PR consultant in the Berlin advertising agency "Zum goldenen Hirschen". Subsequently, as a community manager he was responsible for the development and maintenance of the community in the Berlin coworking space of St. Oberholz as well as for its entire marketing and PR. As part of this activity, he gained experience in looking after various community members. In addition, he was responsible for the planning and implementation of events, training courses, and workshops for both internal and external participants.
Christoph Hottenroth works as a technical employee and IT system administrator in the team that operates the new DLR supercomputer "CARO". Christoph previously worked for a long time as an IT system administrator in the field of Windows administration, covering all areas from hardware and virtualization to networks and specialized software administration. His particular focus so far has been the Microsoft Exchange area.
Nils Kanning studied physics at the University of Göttingen and obtained a PhD in mathematical physics at the Humboldt University of Berlin. He continued his research in the field of integrable models as a postdoc at the Ludwig Maximilian University of Munich. At GWDG, he is now part of the team operating and supporting the DLR HPC system "Caro". As part of this role, he is concerned with research collaborations and public relations for the system.
Azat Khuziyakhmetov studied Applied Mathematics and Computer Science at Moscow State University and continued with a Master's degree in the Internet Technologies and Information Systems (ITIS) program at the University of Göttingen, finishing with the Master's thesis "Anomaly detection of GPU utilization with neural networks". He has worked as a software developer and administrator. At the University of Göttingen he was involved in teaching the courses "Algorithms for Programming Contests" and "Parallel Computing". At GWDG he was involved in the ProfiT-HPC project. He currently works in the DLR team and administers multiple HPC clusters.
Competencies:
DLR
administration
monitoring
Open topics for work and projects

Topic | Professor | Type
Comparing performance of Remote Visualisation techniques | Prof. Julian Kunkel | BSc, MSc
Recommendation System for performance monitoring and analysis in HPC | |
Christian Köhler studied Physics at the University of Göttingen and finished his diploma in 2011 with the thesis "String-localized fields and point-localized currents in massless Wigner representations with infinite spin". In 2015 he finished his PhD thesis "On the localization properties of quantum fields with zero mass and infinite spin". He joined GWDG for software development in the INF ADIR project and transitioned to the HPC team in the context of starting operation of the HLRN-IV system "Emmy". Since then, he advises SCC and HLRN users on scientific applications and serves in the office of HLRN's administrative board.
Sebastian Krey studied statistics at the Technical University of Dortmund with a minor in physics and a major in technometrics, completing his studies in 2008 with the diploma thesis "SVM based sound classification". After that he was a scholarship holder of the graduate college "Statistical Modeling" and a research assistant in the DFG research group 1511 "Protection and control systems for reliable and safe electrical energy transmission". From 2015 to 2019 he was a research associate at the Institute for Data Science, Engineering, and Analytics at the Technical University of Cologne, where he worked on various projects on statistics and machine learning methods and brought his experience in applied mathematics and statistics into the mathematics and data science education of the engineering and computer science degree programmes.
Dr. Kunkel is a Professor in High-Performance Computing at the University of Göttingen, a Deputy Head of the GWDG, and leader of the working group Computing. Previously, he was a Lecturer at the Computer Science Department at the University of Reading and a postdoc in the research department of the German Climate Computing Center (DKRZ). Julian became interested in the topic of HPC storage in 2003, during his computer science studies. Besides his main goal of providing efficient and performance-portable I/O, his HPC-related interests include data reduction techniques, performance analysis of parallel applications and parallel I/O, management of cluster systems, cost-efficiency considerations, and the software engineering of scientific software. He is a founding member of the IO500 benchmarking effort, the Virtual Institute for I/O, and the HPC Certification Forum. He is committed to excellence in research and teaching.
Competencies:
High-Performance Data Analytics
Data Management
Data-driven workflows
Parallel file systems
Application of machine learning methods
Performance portability
Data reduction techniques
Management of cluster systems
Performance analysis of parallel applications and parallel I/O
Software engineering of scientific software
Personalized teaching
As part of his doctoral project at the HU Berlin, Tino Meisel worked on basic research in the field of optoelectronics and epitaxy on semiconductors. He has already used MATLAB and Wolfram Mathematica, applications common in the HPC area, as well as the Python programming language for a data science project analyzing SARS-CoV-2 time series.
Marcus Merz has gained experience in various areas of technology through his professional career. Through his studies of technical computer science and his work, he has acquired knowledge at all levels, from the development of hardware designs with FPGAs to firmware, driver, and Linux development for embedded systems. In addition, he has experience in the construction, integration, and operation of networks and corresponding components in the medical field, including the construction and servicing of a real-time network and control system for a particle accelerator used in cancer therapy. In all these areas he was also responsible for the technical and administrative support of his colleagues and customers.
Mrs. Stefanie Mühlhausen works as a research assistant in the research group Computing (AG C). She supports the team in scientific activities and in teaching. Ms. Mühlhausen studied biology and applied computer science at the Georg-August-Universität Göttingen with a focus on bioinformatics and obtained her doctorate at the Max Planck Institute for Biophysical Chemistry, working on characteristics of eukaryotic genome evolution. After her PhD, Ms. Mühlhausen did research at the Milner Centre for Evolution in Bath, UK, and worked as a Data Scientist in industry. Most recently she worked as a research associate at the Institute of Computer Science at the University of Göttingen in the spin-off project "Genometation". Ms. Mühlhausen has many years of experience with computing on HPC systems.
Hendrik Nolte studied Physics at the University of Göttingen and finished his Master's degree with the thesis "Visualization and Analysis of Multidimensional Photoelectron Spectroscopy Data". He joined GWDG in 2019 to support the general development of an in-house data lake solution.
Competencies:
Data Lakes
Secure Workflow
Publications
2023
Secure HPC: A workflow providing a secure partition on an HPC system
(Hendrik Nolte, Nicolai Spicher, Andrew Russel, Tim Ehlers, Sebastian Krey, Dagmar Krefting, Julian Kunkel),
2023-01-01
DOI
2022
Realising Data-Centric Scientific Workflows with Provenance-Capturing on Data Lakes
(Hendrik Nolte, Philipp Wieder),
2022-01-01
DOI
Toward data lakes as central building blocks for data management and analysis
(Philipp Wieder, Hendrik Nolte),
2022-01-01
DOI
BibTeX: Secure HPC: A workflow providing a secure partition on an HPC system
@article{2_133248,
author = {Hendrik Nolte and Nicolai Spicher and Andrew Russel and Tim Ehlers and Sebastian Krey and Dagmar Krefting and Julian Kunkel},
doi = {10.1016/j.future.2022.12.019},
grolink = {https://resolver.sub.uni-goettingen.de/purl?gro-2/133248},
month = {01},
title = {Secure HPC: A workflow providing a secure partition on an HPC system},
type = {article},
year = {2023},
}
BibTeX: Toward data lakes as central building blocks for data management and analysis
@article{2_114449,
abstract = {"Data lakes are a fundamental building block for many industrial data analysis solutions and becoming increasingly popular in research. Often associated with big data use cases, data lakes are, for example, used as central data management systems of research institutions or as the core entity of machine learning pipelines. The basic underlying idea of retaining data in its native format within a data lake facilitates a large range of use cases and improves data reusability, especially when compared to the schema-on-write approach applied in data warehouses, where data is transformed prior to the actual storage to fit a predefined schema. Storing such massive amounts of raw data, however, has its very own challenges, spanning from the general data modeling, and indexing for concise querying to the integration of suitable and scalable compute capabilities. In this contribution, influential papers of the last decade have been selected to provide a comprehensive overview of developments and obtained results. The papers are analyzed with regard to the applicability of their input to data lakes that serve as central data management systems of research institutions. To achieve this, contributions to data lake architectures, metadata models, data provenance, workflow support, and FAIR principles are investigated. Last, but not least, these capabilities are mapped onto the requirements of two common research personae to identify open challenges. With that, potential research topics are determined, which have to be tackled toward the applicability of data lakes as central building blocks for research data management."},
author = {Philipp Wieder and Hendrik Nolte},
doi = {10.3389/fdata.2022.945720},
grolink = {https://resolver.sub.uni-goettingen.de/purl?gro-2/114449},
month = {01},
title = {Toward data lakes as central building blocks for data management and analysis},
type = {article},
year = {2022},
}
BibTeX: Realising Data-Centric Scientific Workflows with Provenance-Capturing on Data Lakes
@article{2_121151,
author = {Hendrik Nolte and Philipp Wieder},
doi = {10.1162/dint_a_00141},
grolink = {https://resolver.sub.uni-goettingen.de/purl?gro-2/121151},
month = {01},
title = {Realising Data-Centric Scientific Workflows with Provenance-Capturing on Data Lakes},
type = {article},
year = {2022},
}
Open topics for work and projects

Topic | Professor | Type
Development of a provenance aware ad-hoc interface for a data lake | Prof. Julian Kunkel | BSc, MSc
Semantic classification of metadata attributes in a data lake using machine learning | |
Martin Leandro Paleico oversees various aspects of the GWDG's bioinformatics offerings. Dr. Paleico studied Chemistry at the University of Buenos Aires and received his PhD in Computational and Theoretical Chemistry from the University of Göttingen in 2021 with the thesis "Neural Network Potential Simulations of Copper Supported on Zinc Oxide Surfaces". His interests are in chemistry, biology, programming, machine learning, and system administration.
Trevor began his studies in Tunisia at the University of Gabès, where he earned his bachelor's degree in mathematics. He continued his mathematics studies at the University of Göttingen with a focus on stochastics, successfully completing his master's degree. During his studies he worked as a student assistant at Fraunhofer IEE and at the Institute for Mathematical Stochastics (IMS) in Göttingen, where he gained experience in various fields such as data science, machine learning, and Python programming.
Mr. Vogt obtained his "Master of Science" degree in applied computer science at the University of Göttingen. As part of his master's thesis "Enhancing the visual hull method" in the field of "Computer Vision", he gained significant experience in the field of high-performance computing, especially with GPUs. During his studies at the University of Göttingen, he was also employed as a student assistant at the university and was entrusted with system administration and software development tasks. Mr. Vogt has many years of experience with software development and administration, both from his studies and from personal projects.
Artur Wachtel works as a research assistant in the "Computing" working group (AG C). There he supports the team that operates the new DLR supercomputer "CARO". After studying physics at the University of Göttingen, Mr. Wachtel received his doctorate in physics from the University of Luxembourg on the thermodynamics of chemical reaction networks. In the following years he did research in statistical physics and theoretical biophysics at the Universities of Luxembourg and Yale. Mr. Wachtel also has many years of experience in Linux system administration and, since his doctorate, also experience with computing on HPC systems.