The Big Data Analytics group in Jülich develops software and methodology to build a microscopic resolution 3D model of the human brain, which includes detailed information about distributions of neuronal cells and nerve fibers, and microstructurally defined 3D maps of brain areas. This requires microscopic imaging and analysis of large quantities of histological brain sections at high throughput, leading to image datasets at the Petabyte scale. At the intersection of neuroinformatics, computer vision and artificial intelligence (AI), our research addresses
– Data and workflow management for high throughput microscopic imaging,
– Machine learning and computer vision algorithms for biomedical image analysis on high performance computers,
– Software development for structured access to large image datasets and interactive 3D exploration of high-resolution brain atlases over the web.
What is the data science project’s research question? The main topic is the segmentation of gray and white matter in ultra-high-resolution 2D histological images. The goal is to develop a curvature-based segmentation, e.g. an active contour approach, to better separate gray and white matter from the background.
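As a rough illustration of the kind of curvature-based active contour method meant here, the sketch below applies the morphological Chan–Vese level-set segmentation from scikit-image to a small synthetic image. This is a hedged, minimal example, not the project’s actual pipeline; the synthetic disk image stands in for a histology tile, and all parameter values are illustrative.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic stand-in for a grayscale histology tile:
# a bright disk ("tissue") on a darker background, plus noise.
yy, xx = np.mgrid[0:128, 0:128]
image = ((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2).astype(float)
image += 0.1 * np.random.default_rng(0).standard_normal(image.shape)

# Morphological Chan-Vese evolves a level set toward region
# boundaries; the `smoothing` steps act as a curvature regularizer.
mask = morphological_chan_vese(image, 100,
                               init_level_set="checkerboard",
                               smoothing=2)

print(mask.shape)  # binary foreground/background labeling of the tile
```

In practice such a contour evolution would run on (tiles of) the real microscopic scans, with the curvature term tuned to the smooth gray/white matter boundaries seen in histology.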
What data will be worked on? Ultra-high-resolution microscopic scans of human brain tissue.
What tasks will this project involve? Learn about and get familiar with
– processing and analysis of high-resolution image data,
– scalable algorithms for processing and analysis of microscopic scans of human brain tissue.
In doing so, the guest will gain practical experience with big-data processing workflows on high-performance computing systems.
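To give a flavor of what scalable processing of such data involves: whole-brain sections at micrometer resolution are far too large to hold in memory at once, so analysis typically proceeds tile by tile. The helper below is a minimal, illustrative sketch (not project code); the tile and overlap sizes are assumptions.

```python
import numpy as np

def iter_tiles(image, tile=1024, overlap=64):
    """Yield overlapping tiles of a large 2D section.

    A small overlap between neighboring tiles helps avoid
    boundary artifacts when results are stitched back together.
    """
    h, w = image.shape[:2]
    step = tile - overlap
    for y in range(0, h, step):
        for x in range(0, w, step):
            yield (y, x), image[y:y + tile, x:x + tile]

# Usage: compute a per-tile statistic over a mock section.
section = np.zeros((2500, 3000), dtype=np.uint8)
means = [t.mean() for _, t in iter_tiles(section)]
print(len(means))  # number of tiles processed
```

On the HPC systems, such per-tile work would be distributed across many workers rather than run in a single loop.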
What makes this project interesting to work on? We provide a high-performance workstation and access to Jülich’s HPC systems. From the first day, the guest will be part of a highly motivated, dynamic and young research team, in a friendly intercultural environment with occasional presentations of ongoing research by other students and colleagues. We are happy to welcome the guest on site, but we will also support remote participation as far as possible.
What is the expected outcome? Contributions to a research paper and to software development.
What infrastructure, programs and tools will be used? Can they be used remotely? The guest will be part of the local Helmholtz AI unit in Jülich. Our data management and image analysis workflows run in a distributed environment which includes Jülich’s modular high-performance computing systems. Image acquisition of whole-brain sections at micrometer resolution is performed in a high-throughput microscopy facility. Outputs of our research are typically integrated with the human brain atlas hosted in the European research infrastructure EBRAINS, to which we contribute key developments. Most of our software is implemented in Python. Image analysis and machine learning approaches use common deep learning frameworks, typically PyTorch or TensorFlow.
What skills are necessary for this project? Data analytics / statistics, Scientific computation, Computational models, Visualization, High-performance computing, Computer Vision and Image Processing/Analysis
Is the data open source? The 20 micron BigBrain dataset is openly available. The 1 micron dataset is currently used internally only, but will be released later.
Interested candidates should be at Master’s level. Susanne Wenzel is looking for one visiting scientist to work on the project, with Timo Dickscheid (email@example.com) as supervisor.