• Scientific computing
• Parallel & distributed HPC methods
• Numerical modelling
• Python / C / Fortran
Inti has a background in astrophysics and has published on topics ranging from numerical cosmology, galaxy dynamics and stellar cluster formation to the evolution of planetary systems.
He obtained his PhD from Leiden University. After his PhD, Inti did a postdoc at CMU in Pittsburgh on cosmological structure formation. Back from the US, he joined the development of the Astrophysical Multipurpose Software Environment (AMUSE). He developed this research-grade software package to facilitate coupled multi-physics and multi-scale numerical simulations. AMUSE is used by students and researchers worldwide to formulate and conduct numerical experiments in astrophysics.
His interest in the development of efficient and easy to use methods for coupled simulations led him to head a cross-disciplinary effort to transplant the technology developed as part of the AMUSE project to the oceanographic and climate science domains. This effort, funded by the Netherlands eScience Center, resulted in the development of the OMUSE coupling framework.
Inti's main expertise lies in the field of scientific computing, with extensive knowledge about parallel HPC methods, distributed computing and numerical simulation methods.
• Molecular Simulations
• Energy Research
• Quantum Chemistry
• Numerical methods
• Programming languages: Python, C, Fortran
Nicolas obtained a PhD in Nanoscience from University Paul Sabatier, Toulouse, France. During his PhD he studied new solutions for molecular quantum computing, or, in simple words, teaching single molecules how to count. In 2010, he joined the Theoretical Chemistry Department of Northwestern University, where he studied how energy and electric charges propagate in biomolecules such as DNA and light-harvesting complexes. In 2013 he moved to the Chemical Engineering Department of Delft University of Technology, where he studied new ideas for solar energy research and molecular electronics.
Nicolas has developed different scientific software packages for various applications ranging from quantum transport to the calculation of electronic structure and electronic dynamics of molecular systems. He also has a keen interest in scientific visualization and illustration, statistical analysis of scientific data and the application of deep learning techniques to molecular science.
Nicolas joined the eScience Center in August 2017 as a research engineer.
Bram started his computational life with a master's in aerodynamics and CFD in Delft. Among other things, he worked on Riemann solvers (at TNO TASS) and integral boundary layer solvers (at ECN). At the same time he held a job at a satellite operator (SES Newskies), automating the testing of pseudo-code.
For his PhD, Bram worked on solving extremely anisotropic diffusion equations. In the meantime he ran a paid web service for which he processed and enriched domain names, and he completed a pre-master in information law.
After his studies he went into the telecom industry as a data analyst/scientist, which sparked his interest in machine learning. Over the last year, Bram worked at the UvA on the personalized communication project (http://personalised-communication.net/), where he co-developed the Robin plugin, the data pipeline and the analysis library.
Bram is still involved in work on recommender systems at the UvA, specifically on the development of a framework for diversity enhancement. As a side project, he is involved in developing an ML library that uses genetic data to predict the most effective chemotherapy.
• Semantic Web
• Recommender Systems
• Linked Data
• Data Analytics
For the last five years, Valentina has been a PhD student affiliated with the Web & Media Group (VUA Computer Science Department), supervised by prof. Guus Schreiber and prof. Lora Aroyo.
The focus of her PhD is on the use of Semantic Web technologies to enhance Recommender Systems. In particular, Valentina investigated how to improve the description of the items to recommend with information gathered from different Linked Data sources.
The objective of Valentina's research is to improve the serendipity of recommendations. During the last year, she has also been employed by Huygens ING as a scientific programmer, where she developed a tool to enrich historical data with Linked Open Data sources.
• Machine learning
• Image processing
• Statistical modelling
Wouter is a researcher in artificial intelligence and statistics. He obtained a Master of Science degree in Neuroscience from Maastricht University in 2013, with a focus on how the brain processes information. For his master's thesis he interned at the Werner Reichardt Centre for Integrative Neuroscience in Tübingen, Germany. During his time there he worked on developing a computational model of how animal brains guide eye movements so that they gather as much information as quickly as possible.
In 2013, he started his PhD in Artificial Intelligence at Delft University of Technology, where he worked on systems that learn to make decisions. In particular, he focused on domain adaptation in machine learning, which is concerned with data from different populations. The aim is to learn from one population and generalize specifically to a target population. His research output consists of theoretical conditions for targeted generalization, domain-adaptive classification algorithms, robust parameter estimators and analyses of sampling variance. In 2016, he visited Cornell University in the U.S. to work on the combination of domain adaptation and causal inference.
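A classical way to approach the covariate-shift setting described above is importance weighting: reweighting source samples by how likely they are under the target population. A minimal sketch, assuming simple one-dimensional Gaussian kernel density estimates (the function and variable names are illustrative, not taken from Wouter's own software):

```python
import numpy as np

def importance_weights(x_source, x_target, bandwidth=1.0):
    """Estimate p_target(x) / p_source(x) at the source points
    using simple Gaussian kernel density estimates; these ratios
    can be used to reweight a loss trained on source data."""
    def kde(points, x):
        # Gaussian kernel density estimate of `points`, evaluated at `x`
        d = x[:, None] - points[None, :]
        return np.exp(-0.5 * (d / bandwidth) ** 2).mean(axis=1)
    return kde(x_target, x_source) / kde(x_source, x_source)
```

Source points that lie where the target population is dense receive larger weights, steering the learner towards the target distribution.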
Wouter is generally interested in mathematical models of intelligent behavior, probabilistic graphical models, statistical inference and probability theory.
• Data analysis
• Physics & astrophysics
• Scientific methods
• Machine learning
Ben did his master's in physics and astrophysics at the UvA (cum laude) and his PhD in Leuven (Belgium). He worked on modelling the interaction between the radiation field and collections of solid-state particles, in order to interpret infrared observations of stellar outflows and planet-forming disks.
After his PhD he did two postdocs, one at Stockholm University and a second at the European Space Agency (ESTEC, the Netherlands). During those years he continued working on radiative transfer modelling and infrared observations, and also focused on making and analyzing laboratory infrared spectra of meteoritic minerals. The work of Ben and his colleagues led to multiple international publications, including in Nature.
Ben is interested in a range of scientific disciplines, data analysis and visualization, and he looks forward to developing his skills in machine learning techniques. He uses Python for most of his work and also has experience with, among others, SQL, Java and Fortran.
• Numerical algorithm implementation
• Functional programming
Felipe did his PhD thesis on computational photochemistry, specifically on the dynamics of molecules in excited states. Since then Felipe has been developing software for scientific applications from a functional programming perspective.
He has already collaborated with the eScience Center over the past two years, as a postdoc on the Computational Chemistry made Easy project.
Felipe is particularly focused on the implementation of numerical algorithms using NumPy and on quantum chemistry workflows using Noodles.
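A minimal sketch of the kind of NumPy-based numerical routine meant here, a composite trapezoidal rule for one-dimensional integration (illustrative only; this is not code from Felipe's own packages):

```python
import numpy as np

def trapezoid(f, a, b, n=1000):
    """Integrate f over [a, b] with the composite trapezoidal rule,
    evaluating f on a whole grid at once (vectorized with NumPy)."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    # endpoints count half, interior points count once
    return h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)
```

For instance, `trapezoid(np.sin, 0.0, np.pi)` approximates the exact value 2 to within about 1e-6 for the default grid.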
• Mathematical Modelling
• Evolutionary Biology
• Game Theory
Laurens studied Theoretical Physics and Economics and did some research in Evolutionary Biology. His main interests lie in Mathematical Modelling and trying to understand 'mechanisms', in particular using Game Theory.
Laurens has previously worked as a software engineer and as a consultant, but he missed the academic challenges at these jobs. The diverse projects at the eScience Center are more interesting and fun!
• Statistical modelling
• Machine learning
• Astronomy and planetary science
Yifat has a bachelor's degree in Physics from Ben-Gurion University (Israel) and a master's in Physics from Tel Aviv University (Israel), where she studied gravitational lensing in galaxy clusters. She received her PhD in Planetary Science (Tel Aviv University), where she focused on detection methods for transiting extrasolar planets. In the course of her research she became interested in statistics and algorithms and developed innovative methods for predictive modeling. These methods can be used to increase the yield of transiting planets from low-cadence surveys.
After moving to the Netherlands, Yifat took a postdoc position in the exoplanets group at the Institute of Astronomy of the University of Amsterdam, where she worked on statistical analysis of planetary atmospheres. Aside from planets, she is interested in machine learning, Bayesian inference, and algorithms.
Yifat joined the Netherlands eScience Center in summer 2017.
• Scientific Programming
• Efficient Computing
• Python, C
Bouwe studied at the University of Amsterdam, where he obtained a BSc in physics, an MSc in astronomy and an MSc in theoretical physics.
His astronomy master’s project was on estimating and modelling gravitational waves emitted by neutron star thermonuclear bursts, while his physics master’s project was a theoretical study on extending the standard model of particle physics with right-handed neutrinos and a heavy gauge boson.
After graduating, he started working in the space industry, where he developed software for processing image data acquired by Earth observation satellites for KNMI and ESA.
Among other things, he implemented improvements to the algorithms used in the data processors, increasing their accuracy and dramatically reducing processing time. He is most experienced in Python and C, but very interested in learning new languages and programming concepts. His main research interest is applying advanced computational techniques to solve real-world problems, e.g. in medical science, environmental science, or developing safer and greener technology.
Tom obtained his BSc and MSc in physics (track Particle Physics) at the University of Amsterdam. For his thesis he worked on technology behind proton radiography, a method that can possibly improve the accuracy of proton radiation treatment of tumors.
Tom has always had an interest in computer programming and enjoys using his skills to solve practical problems. He is especially familiar with web standards and joined the eScience Center to help in projects with web related problems.
• Computational modeling
• Knowledge engineering
• Linked data
• Environmental science
Martine studied at Wageningen University, where she obtained an MSc in both Biology and Environmental Science. After graduating in 2005 she worked at Radboud University Nijmegen as a junior researcher, modeling the accumulation and behavior of toxic chemicals in the environment. In 2007 she joined the PBL Netherlands Environmental Assessment Agency. At PBL she supported colleagues in developing, using and testing various types of computational models on, for example, sustainable development, flood risk and the Dutch energy system.
In 2011 she started her PhD research at the Computer Science department of the Vrije Universiteit Amsterdam. During this research she developed methods for the automatic interpretation of spreadsheets that are used in natural science research.
Martine joined the Netherlands eScience Center in summer 2017. In her view, science is about sharing knowledge. At the eScience Center she aims to contribute to transparency in research software and methods, so that these can be understood and reused by other persons than the original developers.
• Fluid Mechanics
• Computational Fluid Dynamics
Yang obtained his bachelor's degree from Dalian University of Technology, China, majoring in Naval Architecture and Ocean Engineering, where he developed a strong interest in fluid mechanics and numerical modeling. He extended this interest into the field of Computational Fluid Dynamics (CFD) and received his master's degree from Delft University of Technology, the Netherlands. He completed his master's thesis at the Maritime Research Institute Netherlands (MARIN), on the study of Vortex Induced Motion (VIM) of offshore platforms through CFD.
In 2017, Yang joined the Netherlands eScience Center. As a PhD candidate, Yang works on the Blue Action project under the EU Horizon 2020 Work Programme, which aims to gain more understanding of Arctic weather and climate. Yang mainly focuses on atmospheric and oceanic energy transport towards the Arctic region. Access to high-resolution numerical climate models and the latest reanalysis datasets enables him to study the Arctic climate and help improve weather prediction.
Meanwhile, he also works at Wageningen University (WUR).
• Programming languages
• Functional programming
• Randomized testing
Atze studied at the Vrije Universiteit (VU) in Amsterdam, where he got a BSc in Computer Science and an MSc in Theoretical Computer Science (cum laude). During his studies he got interested in programming languages, functional programming and compilers, among other things. He joined the Software Analysis and Transformation team at Centrum voor Wiskunde en Informatica (CWI), where he got a PhD on programming language topics under supervision of Prof. Paul Klint.
He then moved to Gothenburg, Sweden for a post-doc at Chalmers University of Technology where he continued to work on (functional) programming languages, mainly topics relating to the programming language Haskell. Aside from programming languages and programming languages-research, he is knowledgeable in algorithms, computer graphics, randomized testing and recently, he delved into the fascinating area of machine learning.
Atze joined the Netherlands eScience Center as an eScience Research Engineer in March 2017.
• Computer vision
• Image processing
• Medical image analysis
• Algorithm validation
• Grand challenges
Adriënne has a bachelor's degree in computer science, and a master's degree (cum laude) and PhD in biomedical image science.
Her PhD research at the Image Sciences Institute (ISI) at the UMC Utrecht focused on image processing to reduce the X-ray radiation dose in computed tomography (CT) scans while maintaining image quality. She developed three noise reduction methods (one for 3D and two for 4D data) to improve the image quality of CT scans acquired with a low radiation dose, as well as a method to derive vascular information from cerebral 4D CT perfusion (CTP) scans that has the potential to replace the additional CT angiography (CTA) scan.
As a post-doc in the biomedical image analysis group (BIGR) at Erasmus MC (Rotterdam), she focused on noise reduction in 3D XperCT scans acquired with a C-arm CBCT system, and on compressed sensing. She then returned to the ISI for post-doc research on quantitative analysis of MR brain scans for cerebrovascular disease management.
During her post-doc research she developed an interest in grand challenges (https://grand-challenge.org/All_Challenges/), which are open scientific competitions that use evaluation data and metrics to rank the performance of algorithms with respect to an objective. She organized the MICCAI (http://www.miccai2013.org/) grand challenge workshop on MR brain image segmentation (MRBrainS) in Nagoya (Japan), and set up and maintains the open MRBrainS challenge evaluation framework (http://mrbrains13.isi.uu.nl/). At IEEE ISBI 2015 (http://biomedicalimaging.org/2015/) in New York, she organized the challenge workshop on neonatal and adult MR brain image segmentation (neatbrains15.isi.uu.nl). She is chair of the challenge workshops at IEEE ISBI '16 and '17, and co-organized the tutorial on designing benchmarks and challenges for measuring algorithm performance in biomedical image analysis at IEEE ISBI '16. At the NFBIA Summer School 2015, she gave a workshop on designing challenges in biomedical image analysis.
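A typical evaluation metric in segmentation challenges of this kind is the Dice overlap between a submitted segmentation and the reference. A minimal sketch (illustrative only, not the MRBrainS evaluation code itself):

```python
import numpy as np

def dice(seg, truth):
    """Dice overlap between two binary segmentation masks:
    2 * |A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1."""
    seg = np.asarray(seg).astype(bool)
    truth = np.asarray(truth).astype(bool)
    inter = np.logical_and(seg, truth).sum()
    return 2.0 * inter / (seg.sum() + truth.sum())
```

Ranking submissions by such a metric on held-out evaluation data is what makes a challenge's comparison of algorithms objective.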
She was initiator and organizer of the ImagO colloquium series on medical imaging for the PhD programme of the graduate school of life sciences in the UMC Utrecht and was in the program committee of the international workshop on machine learning in medical imaging (MLMI) '15 and '16 at MICCAI.
In October 2016, Adriënne joined the Netherlands eScience Center as an eScience coordinator.
Her research currently focuses on representation learning and on designing a theoretical framework for grand challenges in biomedical image analysis.
Tom holds a PhD in Communication Science. He has studied Communication Science (MSc) and Journalism (MA) at the University of Amsterdam. The emergence and impact of new ICT and media technologies has always been a key theme in his work. After working as a television and newspaper journalist, he has been a PhD candidate and a lecturer at the University of Amsterdam and has been involved in various research projects in the field of social media, journalism and politics. Between 2012 and 2016, Tom worked as a researcher and consultant at the department Strategy & Policy of the Information Society of TNO. He has been involved in national and international research projects related to technology acceleration, smart industry, entrepreneurship, big data, privacy and telecommunication. Furthermore, Tom is the co-author of the Handboek Nieuwe Media (Handbook New Media) and is a mentor for the tech accelerator Startupbootcamp Amsterdam.
• Monte Carlo simulations
• Tomographic image reconstruction
• Proton therapy and radiation detection technologies
Faruk has a bachelor's degree in Engineering Physics and an MSc in Experimental Particle Physics. During his MSc he worked for the CERN ATLAS TRT group, where he was involved in software development as well as detector R&D efforts.
After completing his MSc, Faruk started his PhD at Ghent University, where he worked on the ENVISION project, which aims to develop novel imaging systems for hadron therapy. Faruk is finishing his PhD at the Center for Advanced Radiation Technology (KVI-CART) of the University of Groningen. During his PhD, Faruk has been involved in Monte Carlo simulations of proton therapy and related TOF-PET imaging, and he participated in proton therapy experiments at the cyclotron facility of KVI-CART.
• Financial administration
• Management accounting
Noura studied business economics at the Hogeschool van Amsterdam, finishing her bachelor's in 2014. During her studies she did internships at the Belastingdienst and the Gemeente Amsterdam.
In February 2016 Noura joined the Netherlands eScience Center as Assistant Operations.
• High-performance computing
• Monte Carlo simulation
Gijs van den Oord studied theoretical physics and mathematics at Utrecht University. In his master thesis, he investigated the relation between super-membranes and matrix models in string theory. After that he started a PhD in particle phenomenology at the Radboud University and Nikhef, where he developed expertise in high-performance computing, numerical simulation and the Monte Carlo method. With his C++ code he was able to simulate weak boson scattering in Higgsless models at the Large Hadron Collider.
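In its simplest form, the Monte Carlo method mentioned here estimates a quantity from random samples; a toy sketch estimating π from points in the unit square (purely illustrative, unrelated to Gijs's particle physics code):

```python
import numpy as np

def estimate_pi(n, seed=0):
    """Monte Carlo estimate of pi: the fraction of uniform random
    points in the unit square that land inside the quarter unit
    circle approaches pi/4 as n grows."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n, 2))                     # n points in [0,1)^2
    inside = (pts ** 2).sum(axis=1) < 1.0        # x^2 + y^2 < 1
    return 4.0 * inside.mean()
```

The statistical error shrinks like 1/sqrt(n), which is why physics applications such as event generation demand both clever sampling and raw computing power.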
After graduating, Gijs worked as a consultant in scientific software development. He helped create the DeltaShell framework at Deltares, embedding hydrological computational codes in object-oriented wrappers to facilitate visualization and coupling. There he also contributed to D-Flow Flexible Mesh, a shallow-water equation solver on unstructured grids.
Recently Gijs has started working on Primavera, a project with KNMI to study the EC-Earth climate model at high resolution, and a project with CWI and KNMI that aims to couple cloud-resolving large-eddy simulations to global atmospheric climate codes.
• Machine learning
• Big Data Analytics
Dafne studied Computer Science and Mathematics at Utrecht University. During her master studies, she focused on Machine Learning and Data Analytics. Her master thesis was on recognizing product names using Conditional Random Fields, which she developed during her internship at VigLink in San Francisco.
After graduation, Dafne worked at ING as a Data Scientist, where she further developed her Machine Learning skills and applied them to different business problems. She also became interested in distributed Machine Learning for big data sets, using tools like Spark.
Karima completed training in Sociaal-juridische Dienstverlening (social-legal services) at Frans Hals College. She has over 15 years of work experience at different companies in different disciplines.
In February 2016, Karima joined the Netherlands eScience Center as a Secretary.
• Evolutionary Algorithms
• Machine Learning
Berend did his BSc in Computer Science and MSc in Technical Artificial Intelligence at VU University Amsterdam. He did his PhD on the artificial evolution of robot organisms; the idea behind his thesis was the creation of real-world objects that evolve, in this case robot organisms.
In particular he aimed to allow robots to evolve their bodies and minds to adapt to their environment and tasks.
For his research he developed, tested and compared algorithms for such systems in simulation. He then developed a complete ecosystem for robot organisms in which the bodies and minds of these robot organisms evolved using these algorithms.
• High-performance programming
• C++ programming
• Bayesian methods
• Databases and data models
• System architecture
Lourens studied Computer Science at the University of Twente in The Netherlands, where he received an MSc (Hons.) in Databases and Information Systems in 2007.
He then joined the Computational Geo-Ecology group of the University of Amsterdam as a scientific programmer on the NDFF-EcoGRID project (part of the Virtual Laboratory for e-Science programme). Lourens led the design of the data model and system architecture of the Dutch National Database of Flora and Fauna (NDFF). The NDFF is a repository for species observations that is now used by volunteer and professional observers to record their observations, by governments to ensure compliance with EU nature protection directives, by scientists for species distribution modelling, and by companies operating in the Dutch landscape to assess potential risks to nature at a time when they can still be easily and inexpensively mitigated.
Following this, Lourens continued at UvA as a PhD candidate in Computational Geo-Ecology, working on the incorporation of dispersal limitations into species distribution models, and fitting such models to observation data using Bayesian techniques. He also worked with colleagues on processing high-resolution global forest cover data into statistics on forest loss and fragmentation, with the goal of investigating at which scales ecosystem services are most affected by these changes.
While working on these projects, Lourens discovered a love for the methodological and technical aspects of doing research, and in February 2016 he joined the eScience Center.
• Graph algorithms and network analysis
• High-performance computing
• Physics (climatology)
• Formal modelling and large-scale simulations
• Scientific/Object-Oriented/Functional programming
Rena holds a double MSc, in Applied Mathematics from Baku State University and in Computer Science from KTH, Sweden. In 2011 she received her PhD in Theoretical Computer Science from the VU Amsterdam. Her research focused on the (formal) modelling and analysis of large-scale stochastic systems. She has worked as a postdoctoral fellow and an Assistant Professor at the VU, and as a research visitor at NICTA Sydney and the University of Melbourne, on a variety of interdisciplinary projects related to large-scale complex systems.
Rena joined the Netherlands eScience Center in 2016; she coordinates several climate science and physics projects.
• Bio-molecular databases
• Unsupervised learning (clustering)
• Distributed/parallel computing
• Web/UI development
• Scientific and statistical computing: Python, R, C/C++
Arnold graduated in Molecular Biology (2004, Comenius University in Bratislava, Slovakia), with a specialization in bioinformatics. In the last year of his masters, he received a scholarship to study at the Wageningen University and Research Centre (WUR), the Netherlands. He developed bioinformatics tools to efficiently map the genetic changes and to visualize epidemiologically relevant biomarkers in newly sequenced genomes of White spot syndrome (shrimp) virus isolates.
In the same year, Arnold started his PhD research at the Laboratory of Bioinformatics, WUR, on improving orthology-based detection in fully sequenced genomes and the integration of protein orthology/family resources. In close collaboration with SURFsara, he used the Dutch Life Science Grid to compute a comprehensive map of corresponding (orthologous) proteins across the available proteomes from all three domains of life. Analyzing this large data set required a different approach from that used by existing tools, namely memory-efficient (out-of-core) graph heuristics, which he implemented in the netclust open-source software.
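The clustering idea behind such tools can be sketched as single-linkage grouping over a similarity network: connect two proteins whenever their similarity clears a threshold, then read off the connected components. A toy in-memory union-find version (netclust itself works out-of-core on far larger networks; names here are illustrative):

```python
def cluster(edges, threshold):
    """Single-linkage clustering: connected components of the graph
    formed by similarity edges at or above `threshold`, found with
    a union-find structure (path halving)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b, sim in edges:
        find(a)
        find(b)                      # register both nodes
        if sim >= threshold:
            parent[find(a)] = find(b)  # merge their components

    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())
```

For example, with edges `[("p1","p2",0.9), ("p2","p3",0.4), ("p3","p4",0.8)]` and threshold 0.5, the result is two clusters: `{p1, p2}` and `{p3, p4}`.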
In 2009 Arnold obtained his doctorate degree in Bioinformatics and continued, as post-doc in the same lab, in the development of the multi-parametric version of this tool called multi-netclust, which enabled analysis of combined data networks (e.g., protein similarity networks) from different sources. In 2011 Arnold moved to Switzerland where he joined the Department of Ecology and Evolution, University of Lausanne and Swiss Institute of Bioinformatics, to work on an e-Science project (Grid-enabled Selectome) aimed at speeding up the detection of positive selection in animal genomes using the Swiss Multi Science Grid (SMSCG).
In 2012 Arnold joined the Department of Genetics at the Erasmus Medical Center in Rotterdam, where he focused on computational aspects of the semi-quantitative mass-spectrometry(MS)-based proteomics, in particular on reliable detection of cellular responses (pathways) upon exposure of mammalian cells to different non-ionizing electromagnetic fields. To facilitate the MS data handling and statistical analyses, Arnold developed PIQMIe, a freely available proteomics web server.
Arnold’s work has been centered around the development of efficient algorithms and user-friendly (web-based) tools for scalable molecular data mining using distributed/parallel computing infrastructures. In 2015 Arnold joined the Netherlands eScience Center where he works on the application of semantic web technologies on biomedical data integration and knowledge discovery.
See Arnold's list of publications.
• Sensor data
• Human movement registration
• Signal processing
• R programming
• Measurement error reduction
Vincent graduated in Human Kinetic Technology (BEng) at The Hague University of Applied Sciences and in Human Movement Sciences (MSc, cum laude) at VU University Amsterdam. In 2008 Vincent moved to England to complete a PhD in Epidemiology at the MRC Epidemiology Unit of the University of Cambridge. He then did a post-doc at the Institute of Cellular Medicine at Newcastle University.
The central theme of Vincent’s work has been the development of scientific software and algorithms to process data from wearable movement sensors. Vincent pioneered the analysis of data collected with wrist-mounted high-resolution accelerometers, which have been used since 2007 in population research on daily physical activity and sleep. Over the years Vincent has published on various methodological issues relating to this topic. Further, Vincent has co-authored publications on the first large-scale implementation of the technology by scientists in Brazil and the United Kingdom. He released his code as open-source software in the R package GGIR.
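A core step in processing such accelerometer data is summarizing the tri-axial signal into a movement metric. A Python sketch of one common metric of this kind, ENMO (Euclidean Norm Minus One), offered for illustration only; GGIR itself is written in R:

```python
import numpy as np

def enmo(acc):
    """ENMO: vector magnitude of tri-axial acceleration (in units of
    g, one row per sample) with the 1 g of gravity subtracted and
    negative values clipped to zero, leaving a movement signal."""
    mag = np.sqrt((acc ** 2).sum(axis=1))
    return np.clip(mag - 1.0, 0.0, None)
```

A stationary wrist at any orientation yields a magnitude near 1 g and hence an ENMO near zero, while movement raises the metric above it.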
Vincent joined the Netherlands eScience Center in 2015.
Ronald joined the Netherlands eScience Center in 2015 to work as an eScience research engineer on the ERA-URBAN project. The project is aimed at developing an environmental re-analysis specifically on the scale of the urban environment, a long-term archive of urban energy and water balances at very high resolution (~100m).
Ronald studied aerospace engineering at the Delft University of Technology. During his masters program he specialized in data analysis, remote sensing and Earth systems. He worked on the detection of anthropogenic changes in terrestrial water storage using a combination of hydrological models and remote sensing data.
After his master's he joined the Royal Netherlands Meteorological Institute (KNMI) to do a PhD. During his PhD research, he assessed the ability of climate models to correctly simulate precipitation using statistical techniques. Ronald defended his PhD thesis, “Assessment of uncertainties in simulated European precipitation”, in March 2015.
Ronald is also a core developer for the Arch Linux project, a lightweight and flexible Linux distribution that tries to Keep It Simple.
• Web Science
• Linked Data / Semantic Web
• Optimized Data Handling
• Big Data Analytics
Willem received his PhD at VU University Amsterdam and TNO for work on Ontology Alignment for Information Integration. His main research topics in the past 10 years are semantics, augmented sense making, visual analytics, information integration, and text mining.
He is a co-organizer of the LISC and DeRiVE workshop series; the Ontology Alignment Evaluation Initiative (OAEI); and the Linked Science Tutorial series about improving the speed, efficiency and transparency of Web research.
He has developed the Simple Event Model (SEM), an OWL ontology for the description of event data; spatiotemporal indexing for SWI-Prolog (awarded with a best paper award at the EKAW 2010 conference), and the SPARQL package for the R statistical programming language.
Willem was Chief Data Scientist at SynerScope B.V. where, as a member of the management team, he helped bring in a Series A investment. He developed various interactive visual analytics techniques for free text and brought them to market in the fields of logistics and financial services.
Since 2013 he has been a guest researcher at the Web & Media group of the VU University Amsterdam, where he was previously an Assistant Professor, and he lectures at various higher education institutions, such as TIAS, the Amsterdam Business School, Hogeschool InHolland, TU Delft and the University of Amsterdam.
• Geometric algorithms
• Numerical methods
Johan studied astrophysics at the University of Groningen, Kapteyn Astronomical Institute. He is finishing his PhD thesis on the dynamics of the large-scale structure of the Universe. In this work he used tessellation techniques from computational geometry to model and visualise the complex patterns that we find throughout the Universe.
• Research coordination
• Research communication
• Climate research
• Physical oceanography
Wilco Hazeleger has served as Director of the Netherlands eScience Center, a research center that connects digital technologies with applications in all scientific domains, since 2014. He holds a chair in Climate Dynamics at Wageningen University.
At Wageningen University and Reading University, Wilco studied meteorology. He received his PhD in 1999 in physical oceanography from Utrecht University, after which he went to Columbia University in New York to conduct research on decadal climate variability.
In 2002 Wilco started working at the Royal Netherlands Meteorological Institute (KNMI) on climate dynamics, climate scenarios and the development of global Earth system models. He initiated and led the EC-Earth project, a European Earth system modelling consortium that develops a state-of-the-art Earth system model based on ECMWF's numerical weather prediction model. Until 2014 Wilco led climate research divisions at KNMI. In 2013 he served as Acting Director of a research department on Climate and Seismology Research at KNMI.
Wilco has (co)-authored over 100 refereed publications. He serves on a number of international and national science committees on meteorology, climate and data science, including the SRG of the UK Met Office and the advisory committee of the Swedish eScience Center, and he leads the Big Data national science initiative in the Netherlands.
• Radio Astronomy
• High Performance Computing
• Distributed Computing
• Accelerators (GPUs)
Rob received his PhD at VU University Amsterdam for work on "Efficient Java-Centric Grid Computing". Rob has designed and implemented the Manta, Ibis, Satin, and JavaGAT systems (now standardized in OGF as SAGA) and worked on the EU GridLab project, and the Dutch Virtual Labs for eScience project.
From 2009, Rob was a researcher at ASTRON, the Netherlands Institute for Radio Astronomy, where he designed and developed software for the real-time data processing of the LOFAR software telescope, the largest radio telescope in the world. This software runs on an IBM Blue Gene/P supercomputer. Rob performed research on radio astronomy algorithms and pipelines for LOFAR and the exascale SKA telescope. Rob’s latest research focused on the use of many-core architectures such as GPUs for radio astronomy.
Since 2011, he has been an assistant professor at VU University Amsterdam, where he teaches many-core technology and initiated the first and only CUDA Teaching Center in the Netherlands. Rob’s research interests include high performance computing, parallel and distributed algorithms, green computing, networks, programming languages, and compiler construction.
In May 2012, Rob joined the Netherlands eScience Center. Rob’s current position is Director eScience Technology. He is responsible for eScience Technology development in all projects, project leader of our eScience technology platform, and manager of the eScience Engineers.