The wiki is deprecated and due to be decommissioned by the end of September 2022. The content is being migrated to other platforms; new updates will be ignored and lost. If needed, you can get in touch with the EGI SDIS team at operations @ egi.eu.

{{DeprecatedAndMovedTo|new_location=https://confluence.egi.eu/display/EGIBG/Community+Managers}}
{{EGI_Activity_groups_menubar}}
[[Category:Distributed Competence Centres]]
{{TOC_right}}


= Introduction =

A Distributed Competence Centre (DCC) exists across the NGIs, projects, user communities and technology providers of the EGI Collaboration. The DCC includes user-support personnel and technical assets that can be accessed by research communities to support their research activities with distributed computing services from EGI.


The DCC works as a distributed team of experts run under the coordination of EGI.eu.

What is the process behind the DCC?
*New user communities contact the DCC to request support, or potential communities are identified and contacted by the DCC with an offer to support their use of EGI services.
*Experts from the DCC are appointed to capture, refine and document the requirements of the new community.
*Small projects are defined within the DCC to address the requirements of the community, within a maximum six-month timeframe (further information about such short projects: [https://wiki.egi.eu/wiki/Virtual_Team_Projects Virtual Team projects]). The projects can, for example:
**Develop a data and compute model that makes use of the EGI solutions
**Identify the technologies needed to address the user requirements, and bring these into EGI
**Integrate science domain-specific application services with EGI using a high-level framework


= DCC Members  =

Three types of experts are involved in the DCC (see also the table below):
#'''Scientific communities''' with expertise in the models and algorithms of a certain scientific domain (for example life sciences), and in services and application porting for that domain.
#'''National Grid Initiatives''' (NGIs) or specific institutes from certain NGIs with expertise in user support, application porting, analysis of data and compute model requirements, and development and deployment of distributed application services.
#'''Technology providers''' who develop software technologies that can simplify the development, porting, integration and/or operation of scientific application services on EGI. Technology providers can help in the technical analysis of requirements, suggest technical solutions and participate in the implementation on EGI.


= How do I become a partner of the DCC?  =

The implementation of the DCC concept is supported by the H2020 EGI-Engage project between March 2015 and August 2017. EGI-Engage [[EGI-Engage:Competence centres|WP6 Knowledge Commons]] includes 8 Competence Centres that together form the DCC:

* [[CC-ELIXIR|TASK SA2.3 ELIXIR]]
* [[CC-BBMRI|TASK SA2.4 BBMRI]]
* [[CC-MoBrain|TASK SA2.5 MoBrain]]
* [[CC-DARIAH|TASK SA2.6 DARIAH]]
* [[CC-LifeWatch|TASK SA2.7 LifeWatch]]
* [[CC-EISCAT_3D|TASK SA2.8 EISCAT_3D]]
* [[CC-EPOS|TASK SA2.9 EPOS]]
* [[CC-Disaster_Mitigation|TASK SA2.10 Disaster Mitigation]]

These Competence Centres are open for any partner to join. Please read more about the scientific and technical objectives of the Competence Centres and contact the respective centre leader to request membership. Alternatively, contact ucst (at) egi.eu with information about the area of expertise you can contribute to the DCC and the type of member you want to be (NGI/institution, user/scientific community, technology provider).

= Partners and assets  =

This page is a web-based registry of the skills and technical assets of the DCC. The registry is not meant to be an exhaustive and fully up-to-date catalogue of the assets of the DCC, but one that gives a good picture of the skills and interests of the various partners.

The registry is maintained in a distributed fashion by the contributors. If you need access to edit this wiki page, please email the EGI.eu User Community Support Team at ucst@egi.eu.

{| width="100%" cellspacing="5" cellpadding="5" border="0" class="wikitable sortable"
|-
! List of skills and assets
! Partner (institute or NGI or project or ...)
! Contacts
! Country
|-
|
Earth Science
 
*Solutions to access data from ES data centres (e.g. access to ESGF, done with IPSL, France; automatic ES data discovery with OpenSearch; use of iRODS, see the sketch below)
*Administrative support for resource providers in the ES VO (SCAI, running gLite, UNICORE and Globus and acting as ROC in DE, and CNRS IPGP are service providers)
*Also available: a catch-all ES VO for single ES users, groups, etc.
*Experience with ES data formats and different portal technologies
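To illustrate the iRODS-based data access mentioned above, here is a minimal sketch using the python-irodsclient package; the host, zone, credentials and paths are placeholders rather than real Earth Science endpoints.

<pre>
# Minimal sketch (assumes python-irodsclient is installed; connection
# details and paths are placeholders, not real DCC endpoints).
from irods.session import iRODSSession

with iRODSSession(host='irods.example.org', port=1247, user='alice',
                  password='secret', zone='exampleZone') as session:
    # Copy one data object from the iRODS catalogue to the local disk
    obj = session.data_objects.get('/exampleZone/home/alice/sample.nc',
                                   '/tmp/sample.nc')
    print(obj.name, obj.size)
</pre>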
 
| User community: Earth Science
Partners: CNRS IPSL (Sébastien Denvil), Fraunhofer SCAI (Horst Schwichtenberg, André Gemünd)
 
|
Sébastien Denvil
 
sebastien.denvil@ipsl.jussieu.fr
 
<br>
 
Horst Schwichtenberg
 
horst.schwichtenberg@scai.fraunhofer.de
 
<br>
 
André Gemünd
 
andre.gemuend@scai.fraunhofer.de
 
| International collaboration
|-
|
Life-Sciences
 
*Medical imaging: neuroimaging, cardiovascular imaging, radiotherapy simulation
*Bioinformatics: genomics, proteomics, biobanking, transcriptomics
*High-level services for life scientists: grid application porting, execution and monitoring, grid workflow management, biomedical data management and sharing
*user interfaces: scientific gateways, browsers
*VO administration and operations, services operating and development
 
| User community: Life Science Grid Community
Partners: I3S, CREATIS, AMC
 
|
Johan Montagnat<br>
 
johan.montagnat@cnrs.fr<br>
 
<br>
 
Franck Michel<br>
 
franck.michel@cnrs.fr<br>
 
<br>
 
Tristan Glatard<br>
 
tristan.glatard@creatis.insa-lyon.fr
 
<br>
 
Silvia Olabarriaga
 
s.d.olabarriaga@amc.nl
 
| International collaboration
|-
|
*Science-wise: Macromolecular structure determination and analysis, molecular dynamics simulations, macromolecular docking, small molecule docking, structural modelling (possibly also biophysical studies, which are more experimental)
 
*ICT-wise: Application porting to grid, Scientific Web interfaces, Virtualization
 
| User community: WeNMR
Partners: CIRMMP/IT, INFN/IT, Uni of Utrecht/NL
 
|
prof. dr. Alexandre Bonvin<br>
 
a.m.j.j.bonvin@uu.nl
 
<br>
 
Dr. Antonio Rosato
 
rosato@cerm.unifi.it<br>
 
<br>
 
Marco Verlato<br>
 
Marco.Verlato@pd.infn.it<br>
 
| International collaboration
|-
|
*Science-wise: Electronic structure calculations, nuclei dynamics and statistical treatments, molecular dynamics simulations, assemblage of multiscale simulations, material-based molecular modelling, Quantum Chemistry/Molecular Dynamics standard data formats
*Community-wise: Quality-based collaborative credit economy, research-based distributed learning
*ICT-wise: Application porting to Grid/HPC infrastructures, cross-platform scientific workflows, web-enabled Grid applications
 
| User Community: Computational Chemistry
Partners: Uni of Perugia
 
|
<span class="st">Antonio Lagana</span>
 
<span class="st">lagana05@gmail.com</span>
 
| International collaboration
|-
|
*Parallel Computing: MPI
*Satellite Image Processing: GRASS GIS
*Hydrological modelling: SWAT
*Bioinformatics: Gromacs, NAMD
*Weather Forecasting: WRF
*Linear Algebra: BLAS, SCALAPACK
*Virtualization
 
| NGI: ArmNGI
|
Dr. Hrachya Astsatryan<br>
 
hrach@sci.am
 
| Armenia
|-
|
*Work with scientific groups to capture e-infrastructure requirements of the following communities: environmental modelling and environmental protection, climate change impact, financial mathematics, and modelling of semiconductor devices; support their use of pan-European grid and cloud resources.
*Supporting communities in developing their data and compute models: combined use of grid and cloud resources for data-intensive applications; stochastic modelling and processing of scientific data using Monte Carlo methods.
*Application porting and testing: compute-intensive parallel applications using MPI, OpenMP, OpenCL and CUDA that fit grid clusters such as ours.
*Testing of new services: we have sufficient infrastructure to allow for testing.
*Test infrastructures: we can provide access to advanced GPGPU and Xeon Phi grid resources.
*Compute services: we provide access to HPC grid clusters equipped with InfiniBand and to servers with GPU and Xeon Phi accelerators.
*MPI: extensive knowledge of development, profiling and testing of MPI-based applications and of combined usage of MPI with other technologies such as OpenMP, OpenCL and CUDA (see the sketch below).
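As a small illustration of the MPI and Monte Carlo expertise listed above (a generic sketch, not a Bulgarian NGI tool), the following estimates pi with a Monte Carlo method distributed over MPI ranks; it assumes mpi4py and an MPI runtime are available.

<pre>
# Minimal MPI Monte Carlo sketch (assumes mpi4py and an MPI runtime;
# run e.g. with: mpiexec -n 4 python mc_pi.py)
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

samples = 1_000_000                      # samples drawn by each rank
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
           for _ in range(samples))

# Combine the partial counts on rank 0
total = comm.reduce(hits, op=MPI.SUM, root=0)
if rank == 0:
    print("pi is approximately", 4.0 * total / (samples * size))
</pre>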
 
| NGI: Bulgarian Grid Infrastructure
|
Aneta Karaivanova<br>
 
anet@parallel.bas.bg<br>
 
<br>
 
Todor Gurov<br>
 
gurov@parallel.bas.bg
 
| Bulgaria
|-
|
*interviewing scientific groups to capture e-infrastructure requirements, with special emphasis on bio and medical sciences (e.g. ELIXIR, BBMRI, Instruct);
*supporting communities in developing their data and compute models;
*test infrastructures;
*compute services, esp. in virtualized (cloud) environment;
*user authentication and authorization;
*MPI;
*portal preparation for research infrastructures and ESFRI projects
 
| NGI: CESNET NGI_CZ
|
Prof. RNDr. Ludek Matyska<br>
 
ludek@ics.muni.cz<br>
 
<br>
 
Ivana Křenková<br>
 
krenkova@ics.muni.cz<br>
 
<br>
 
RNDr. Michal Procházka<br>
 
michalp@ics.muni.cz<br>
 
<br>
 
Miroslav Ruda<br>
 
ruda@ics.muni.cz
 
| Czech Republic
|-
|
*Virtualization (cloud: OpenStack, see the sketch below)
*Grid (administration, monitoring, porting, user support)
*General HPC services (computing, storage)
*HPC computing (porting, MPI, CUDA, ...)
*Bioinformatics support (ELIXIR, BBMRI, ...)
*Electronic structure calculation (GPAW development)
*Multiphysical simulations (Elmer development)
*Data analysis (Chipster development)
*Training (software usage, grid, cloud, HPC, ...)
*Authentication and Authorisation
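As an illustration of OpenStack-based virtualization in general (a generic sketch, not CSC tooling), the following assumes the openstacksdk package and a cloud entry named "mycloud" in clouds.yaml or the environment:

<pre>
# Minimal openstacksdk sketch (assumes openstacksdk is installed and a
# "mycloud" entry exists in clouds.yaml; the name is a placeholder).
import openstack

conn = openstack.connect(cloud='mycloud')

# List the images and servers visible to this project
for image in conn.image.images():
    print('image :', image.name)
for server in conn.compute.servers():
    print('server:', server.name, server.status)
</pre>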
 
| NGI Finland/CSC
|
Jura Tarus
 
Jura.Tarus@csc.fi
 
| Finland
|-
|  
*Dissemination about France Grilles and EGI and the services they offer.
*Interviewing scientific groups to capture e-infrastructure requirements, e.g. INRA engineers and researchers who are not currently e-infrastructure users (biodiversity).
*Supporting communities in developing their data and compute models, e.g. a French group managing and processing financial data that is new to distributed infrastructures.
*Organisation of Grid and HPC regional computing centre days with user experience talks and panels, all webcast.
*DIRAC instance service and DIRAC tutorials (see the submission sketch below).
*iRODS instance service and iRODS tutorials.
*EGI central operations portal.
*Expertise in cloud federation, cloud installation and cloud usage.
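To give a flavour of the DIRAC service mentioned above, here is a hedged sketch of what submitting a simple job through the DIRAC Python API typically looks like; it assumes a configured DIRAC client with a valid proxy, the job name and executable are placeholders, and initialization details vary between DIRAC releases.

<pre>
# Hedged sketch of a DIRAC job submission (assumes a configured DIRAC
# client and a valid proxy; bootstrap details differ between releases).
from DIRAC.Core.Base import Script
Script.parseCommandLine(ignoreErrors=True)   # standard DIRAC client bootstrap

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName('dirac_demo_job')                          # placeholder name
job.setExecutable('/bin/echo', arguments='Hello from DIRAC')
job.setCPUTime(600)

result = Dirac().submitJob(job)
print(result)   # S_OK/S_ERROR dictionary containing the job ID on success
</pre>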
 
| NGI France Grilles
|
Genevieve Romier
 
genevieve.romier@idgrilles.fr
 
| France
<br>
 
|-
|
*interviewing scientific groups to capture e-infrastructure requirements;
*supporting communities in developing their data and compute models;
*application porting and testing (define your area of expertise); testing of new services;
*test infrastructures;
*data management services;
*compute services;
*user authentication and authorization;
*virtualization;
*MPI;
*coordinating Virtual Team projects;
*providing software and development effort for integrating applications with EGI portals or workflow engines.
*AppDB
*Community Software repository
 
| NGI: NGI_GRNET
|
Kostas Koumantaros
 
kkoum@grnet.gr
 
<br>Christos Kanellopoulos<br>
 
skanct@grnet.gr
 
| Greece
|-
|
*Application porting to Grids / Clouds and also to Desktop Grids (both with volunteer and private resources) with special focus on workflows
 
*Providing software (gUSE/WS-PGRADE) and development effort for integrating applications with EGI-enabled science gateways and workflow engines for Grid and Cloud environments
 
*Virtualisation: on-demand deployment of complex computing infrastructures and services
 
*Coordinating Virtual Team projects (e.g. Science Gateway and Desktop Grid related ones)
 
*Capturing new requirements
 
| Institute: MTA SZTAKI
|
Robert Lovas
 
robert.lovas@sztaki.mta.hu
 
| Hungary
|-
| We at NGI_IL have been working not only with academia but also with life-science industries. We have rich experience with what it takes to adapt the kind of research done in industry (SMEs) to working on the Grid, and how to overcome the myriad problems that occur in the process. We will be happy to share this knowledge with other NGIs.
*Porting of scientific application over grid/cloud infrastructure
*User support
*Web interface
*E-Identity Federation
*VOMS
*Virtualization
*Monitoring
*Parallel computing
*Networking
*X.509 (see the certificate inspection sketch below)
*Operations
*Software programming and provisioning
*Netapp, EMC, Cisco, Linux admin and networking
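Much of the authentication work listed above is built on X.509 credentials. As a generic illustration (not NGI_IL tooling), the sketch below inspects a PEM certificate with the Python cryptography package; the file path is a placeholder.

<pre>
# Hedged sketch: inspect an X.509 certificate (e.g. a user certificate)
# with the "cryptography" package. The path is a placeholder.
from cryptography import x509

with open('/tmp/usercert.pem', 'rb') as handle:
    cert = x509.load_pem_x509_certificate(handle.read())

print('subject:', cert.subject.rfc4514_string())
print('issuer :', cert.issuer.rfc4514_string())
print('expires:', cert.not_valid_after)
</pre>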
 
| NGI: Israel Grid Infrastructure
|  
Zivan Yoash
 
zivan.yoash@isragrid.org.il
 
| Israel
|-
|
*GRID: multi-year experience in the creation of computing models based on distributed infrastructure using the gLite middleware. Most of the experience was gained in porting applications in the HEP and COMPCHEM domains to the IGI/EGI infrastructures, but we also have experience with bioinformatics applications for NGS and biodiversity. We are acquiring experience in porting meteorological models to the Grid environment, exploiting the improved support for parallel jobs reached in our NGI in the last few years.
*GRID: multi-year experience in administering, configuring and running all the gLite core services. Many of those components are also developed by NGI_IT staff (VOMS, WMS, CREAM, the DGAS accounting system), so highly qualified development skills are also available.
*GRID/HPC: experience in running small and medium-sized HPC jobs in the Grid infrastructure. In collaboration with COMPCHEM we are acquiring experience in building computing models that address mixed HTC/HPC workflows distributed to different infrastructures (e.g. EGI/PRACE or EGI/XSEDE).
*Virtualization/Cloud: we have know-how and experience related to the WNoDeS framework, used to run scientific workloads through virtualization technologies in medium to large scale data centres. In the past few years we have also acquired significant know-how and experience with OpenStack, both in scientific environments and in national initiatives involving SMEs and targeting cloud-based solutions and services for public administrations.
*HIGH-LEVEL WEB INTERFACES: we gained experience in building grid/cloud front ends based on Liferay and WS-PGRADE; we developed tools such as the IGI Portal (https://portal.italiangrid.it) and the Catania Science Gateways (http://www.catania-science-gateways.it) based on those technologies.
*APPLICATIONS: expertise in, and offer of, important gridified applications in several domains (some of them available through the IGI Portal): CMMST: Venus, Crystal, QuantumEspresso, NAMD, Gaussian09; Earth science modelling: WRF, NEMO; Genomics: BLAST; CFD: ANSYS, OpenFOAM; HEP: FLUKA. We are acquiring experience in handling licensed applications in distributed systems (e.g. ANSYS).
*TRAINING: multi-year experience in organizing and running tutorials, training events and application porting schools.
 
| NGI: Italian Grid Infrastructure
| dcc-it@lists.italiangrid.it<br>
| Italy
|-
|
*interviewing scientific groups to capture e-infrastructure requirements;
*supporting communities in developing their data and compute models;
*application porting and testing (GAMESS, GAUSSIAN, CRYSTAL and applications created by Lithuanian scientists for their own use);
*testing of new services;
*test infrastructures;
*data management services;
*compute services;
*virtualization;
*providing software and development effort for integrating applications with EGI portals or workflow engines.
 
| Institute: Vilnius University
|  
Jelena Tamuliene
 
Jelena.Tamuliene@tfai.vu.lt
 
<br>Rolandas Naujikas<br>
 
Rolandas.Naujikas@mif.vu.lt
 
<br>
 
Eduardas Kutka
 
Eduardas.Kutka@mif.vu.lt
 
| Lithuania
|-
|
At present, RENAM's specialists can contribute to:
 
*interviewing scientific groups to capture e-infrastructure requirements;
*supporting communities in developing their data and compute models;
*coordinating Virtual Team projects;
*testing of new services;
*test infrastructures;
*virtualization;
 
Specialists are currently being trained in:
 
*application porting and testing (Fortran, C++ from MS Visual Studio to Linux);
*user authentication and authorization;
*OpenMP; MPI;
 
| NGI: RENAM
| egidcclist@lists.renam.md<br><br>
| Moldova
|-
|
*MOOC development, training and education on High Performance Computing;
*Data-intensive science and visualization;
*Requirements engineering in relation to HPC infrastructure;
*Supporting communities in developing their data and compute models and strategies;
*Application porting and testing (MapReduce, NoSQL, MPI, OpenMP, CUDA);
*Testing of new NoSQL databases;
*Data management services providing Persistent Identifier Services;
*User authentication and authorization;
*Virtualization and HPC cloud computing;
*MPI
 
| Institute: SURFsara
|
To be chosen
 
| Netherlands
|-
|
Skills of the Polish NGI:

*Interviewing scientific groups to capture e-infrastructure requirements (with great success for ESFRI projects).
*Supporting communities in developing their data and compute models (CTA and EPOS are currently active).
*Application porting and testing (vast experience in chemistry, biology, astrophysics, earth science and many other areas).
*Testing of new services and domain infrastructures; deployment of domain services to the PL-Grid infrastructure.
*Services and expertise for building web portals and science gateways; we also develop a workflow engine (Kepler).
*PL-Grid is also a provider of the QosCosGrid grid middleware.
 
| NGI: PL-Grid
|
Mariusz Sterzel
 
m.sterzel@cyfronet.pl
 
| Poland
|-
|
*supporting communities in developing their data and compute models
*testing of new services
*test infrastructures
*data management services
*compute services
*Parallel computing
*virtualization and cloud computing
*application porting (Physics, Civil Engineering, life Sciences, ...)
 
| NGI: Portugal Grid Infrastructure
|
Gonçalo Borges
 
goncalo@lip.pt
 
<br>Jorge Gomes
 
jorge@lip.pt
 
<br>João Pina
 
jpina@lip.pt
 
| Portugal
|-
|  
*Grid technologies<br>Deployment, maintenance, testing, profiling and tuning of available Grid technologies (core services). This includes experience in monitoring the performance of deployed technologies.
*Grid porting<br>Grid porting, debugging, profiling and tuning of serial and parallel (MPI, OpenMP, OpenMP/MPI, GPU, Xeon Phi) applications in the areas of physics, chemistry, engineering and materials science, as well as utilisation of advanced Grid scheduler features.
*Workflow technologies<br>Deployment, maintenance and customisation of gUSE workflow technologies, including complex workflow creation, portlet creation and development of custom interfaces that envelop a generic workflow.
*High-level interfaces<br>Development and deployment of RESTful interfaces on top of Grid-ported applications (see the sketch below).
*Libraries and application tools<br>Experience with high-performance software libraries (LAPACK, BLAS, FFTW3, SPRNG, Intel MKL, ScaLAPACK, etc.) and application tools (MPICH, MPICH2, OpenMPI, gcc, gfortran, Intel compilers, Portland Group compilers, NAMD, CPMD, Firefly, AutoDock Vina, OpenEye, etc.).
*Training and education<br>Organization and realization of both introductory and advanced training events on Grid technology and the Grid porting process.
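As a sketch of the "RESTful interface on top of a Grid-ported application" pattern mentioned above (illustrative only, using Flask; submit_to_grid is a hypothetical stand-in for the real submission layer, not an AEGIS component):

<pre>
# Sketch of a RESTful front end over a ported application (assumes Flask;
# submit_to_grid() is a hypothetical stand-in for the submission layer).
import uuid
from flask import Flask, jsonify, request

app = Flask(__name__)
JOBS = {}                         # in-memory job registry (demo only)

def submit_to_grid(parameters):
    """Hypothetical helper that would hand the job to the grid middleware."""
    return {'status': 'SUBMITTED', 'parameters': parameters}

@app.route('/jobs', methods=['POST'])
def create_job():
    job_id = str(uuid.uuid4())
    JOBS[job_id] = submit_to_grid(request.get_json(force=True))
    return jsonify({'id': job_id}), 201

@app.route('/jobs/<job_id>', methods=['GET'])
def get_job(job_id):
    return jsonify(JOBS.get(job_id, {'status': 'UNKNOWN'}))

if __name__ == '__main__':
    app.run(port=8080)
</pre>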
 
| NGI: AEGIS
|
<span class="st">Antun Balaz</span>
 
<span class="st">antun@ipb.ac.rs</span>
 
<br>
 
<br>
 
<span class="st">ngi_aegis-grid-management@ipb.ac.rs</span>
 
| Serbia
|-
|
*Porting applications to HPC clusters, Grids and to GPU accelerators.
*Providing support for running applications on HPC clusters and Grids.
*Providing testing infrastructures for Grid and Cloud based on virtualized environments.
*Slovak NGI can contribute to deployment support and training of Grid and Cloud compute services.
*Slovak NGI has long term experience with virtualized environments and can contribute to support and training.
*Providing support for running complex parallel MPI and OpenMP applications on HPC clusters and Grids.
*Slovak NGI took part in VTP MPI within EGI: https://wiki.egi.eu/wiki/VT_MPI_within_EGI
*Slovak NGI coordinated 2 VTPs aimed at interviewing scientific groups and capturing e-infrastructure requirements:
**VTP Fire and Smoke Simulation (21/12/2011 - 30/06/2012) https://wiki.egi.eu/wiki/VT_Fire_Simulation, user and application support provided by Slovak NGI; VTP final report including 3 (Slovakia, Spain, Portugal) scientific groups requirements: https://documents.egi.eu/public/ShowDocument?docid=1341; Fire Simulation using the grid-enabled FDS system gFDS in EGI AppDB: http://appdb.egi.eu/store/software/gfds
**VTP SPEEch on the griD SPEED (7/03/2011 - 14/05/2013) https://wiki.egi.eu/wiki/VT_SPEED, user and application support provided by Slovak NGI; VTP final report: https://documents.egi.eu/public/ShowDocument?docid=1777
*Slovak NGI started communication with hydropedology scientific group from Slovakia with the aim to gather the requirements of this community.
*Slovak NGI started communication with nanotechnology scientific groups (Slovakia, Germany, Bulgaria, Spain, Poland) with the aims to gather the requirements of this community, specify use cases, create a VRC, and prepare a proposal of EU project “Cloud-Based High Performance Computing for Nanoscale Simulations”.
 
| NGI: SlovakGrid
|
Dr. Ladislav Hluchy
 
hluchy.ui@savba.sk
 
| Slovakia
|-
|
*Support to the deployment of the LIFEWATCH ICT-Core Services
**See the presentation from Jesus Marco at https://indico.egi.eu/indico/contributionDisplay.py?sessionId=1&contribId=5&confId=1893
*Permanent contact with scientific groups to capture e-infrastructure requirements: HEP, Astronomy & Astrophysics, Statistical Physics, Materials & Quantum Chemistry, Ecosystems, Meteorology, Genetics, Engineering, Social Complex Systems (econophysics, etc.)
*Supporting communities in developing their data and compute models; application porting and testing (see the areas above) for HTC, HPC and supercomputers, and large data set support
*Test infrastructures;
*data management services; in particular data preservation
*user authentication and authorization in Cloud; as well as virtualization (under OpenStack)
*MPI
*coordinating Virtual Team projects
*providing software and development effort for integrating applications with EGI portals or workflow engines. In particular for data integration and modelling of ecosystems
 
| NGI: IberGrid
|
Jesus Marco de Lucas
 
marco@ifca.unican.es
 
| Spain
|-
|
*Interviewing scientific groups to capture e-infrastructure requirements
*Supporting communities in developing their data and compute models
*Application porting (focus on HTC use cases, data analysis, model calibration)
*Expertise in porting scientific use cases to cloud infrastructures (private, public, hybrid)
*Expertise in cloud infrastructure provisioning
*Expertise in cloud-bursting for local IT centers
*Data lifecycle management (although this tends to be community specific)
 
| NGI: Swiss Grid Infrastructure
|
Sergio Maffioletti
 
sergio.maffioletti@gc3.uzh.ch
 
| Switzerland
|-
|
*Interviewing scientific groups to identify e-infrastructure requirements, especially in the bio-medical sciences.
*Supporting communities in developing data and compute models, as has already been done for seismology.
*Testing of new services: we are experienced in administering, configuring and running most of the EMI/gLite core services.
*Test infrastructures: we can provide access to different types of compute nodes, including GPGPU resources.
*Virtualization and Cloud Computing
 
| NGI: TUBITAK ULAKBIM
| grid-teknik@ulakbim.gov.tr
| Turkey
|-
|
*Providing accounting solutions and expertise, including for new resource types
*Expertise in security (operational, policy, software vulnerability)
*Expertise in service discovery/repositories
*GPU and associated technologies
*Clouds and federated clouds
*Life sciences
*Training marketplace solutions for new projects and communities
| NGI: NGI_UK
|
Tamas Kiss
 
T.Kiss@westminster.ac.uk
 
| United Kingdom
|-
|
Skills and assets:
 
*interviewing scientific groups to capture e-infrastructure requirements;
*supporting communities in developing their data and compute models;
*application porting and testing (Map Reduce, MPI, OpenMP, CUDA);
*automatic complex information processing systems development;
*testing of new services;
*coordinating Virtual Team projects;
*engineering services;
*compute services;
*cloud resources.
 
| NGI_BY: National Academy of Sciences Center of Competence in Grid-technologies
|
Serge A. Salamanka
 
salamanka@newman.bas-net.by
 
| Belarus
|-
| Genesis II project and the GFFS
| Technology Provider: University of Virginia
|
Andrew Grimshaw
 
grimshaw@virginia.edu
 
| USA
|-
| DIRAC
| Technology Provider: Universitat de Barcelona, CPPM/CNRS
|
Ricardo Graciani Diaz
 
graciani@ecm.ub.es
 
<br>
 
Andrei Tsaregorodtsev
 
atsareg@in2p3.fr
 
<br>
 
Adrián Casajús
 
adria@ecm.ub.edu
 
<br>
 
Víctor Méndez
 
vmendez@caos.uab.es
 
<br>
 
Elisa Heymann
 
elisa.heymann@uab.es
 
| Spain, France.&nbsp;
|-
| Digital preservation and digital curation services
| Technology Providers: FTK-Forschungsinstitut für Telekommunikation e.V, M. Hemmje, R. Riestra, H. U. Heidbrink
|
Matthias Hemmje
 
Matthias.Hemmje@FernUni-Hagen.de
 
<br>
 
Ruben Riestra
 
ruben.riestra@grupoinmark.com
 
<br>
 
Hans-Ulrich Heidbrink
 
hans-ulrich.heidbrink@incontec.de
 
| Alliance for Persistent Archives (APA), INCONTEC
|-
| SAGA
| Technology Provider: RUTGERS School of Engineering S. Jha
|
Shantenu Jha
 
shantenu.jha@rutgers.edu
 
| USA
|-
| SCI-BUS
SCI-BUS develops a gateway technology (based on WS-PGRADE/gUSE) and a gateway customization methodology that enable effective development of customized science gateways for diverse user communities. Currently 27 SCI-BUS gateways are operated in Europe (http://www.sci-bus.eu/science-gateways) and numerous other gateways are under construction. SCI-BUS operates an Application and User Support Service that supports various user communities in creating and operating science gateways and in porting applications to various DCIs. SCI-BUS is strongly associated with EGI, and the majority of SCI-BUS gateways utilize EGI resources and technology.
 
| Technology Provider: SCI-BUS European project: http://www.sci-bus.eu/
|
Peter Kacsuk
 
kacsuk@sztaki.hu
 
<br>
 
Tamas Kiss
 
T.Kiss@westminster.ac.uk
 
| International collaboration<br>
|-
|
The following EGI-InSPIRE partners have ARC technology experts available for DCC activities:

*University of Copenhagen (UCPH) - Denmark
*Arnes, SLING - Slovenia
*Uninett Sigma AS (SIGMA) - Norway

If needed, experts from members of the NorduGrid collaboration not affiliated with EGI-InSPIRE are also available.
 
| Technology provider: ARC
NGI: NorduGrid/ARC. Partners: to be finalised by the NorduGrid board
 
|
Balazs Konya
 
balazs.konya@hep.lu.se
 
| International collaboration
|-
|
*Work with scientific groups to capture e-infrastructure requirements and support their use of pan-European grid and cloud resources in the following communities: computational chemistry, environmental modelling and environmental protection, climate change impact, bioinformatics, coding and cryptography.
*Supporting communities in developing their data and compute models for the use of grid and cloud resources for data-intensive applications.
*Application porting and testing: parallel applications using MPI and OpenMP, parametric jobs, workflow jobs.
*Testing of new services: we have sufficient infrastructure to allow for testing.
*Test infrastructures: we are part of the FedCloud testbed and can deploy different cloud instances for testing.
*Compute services: we provide access to HPC grid clusters equipped with InfiniBand; extensive knowledge of development, profiling and testing of MPI-based applications and of combined usage of MPI with other technologies like OpenMP.
| NGI: Macedonian NGI_MARGI
| Boro Jakimovski<br>boro.jakimovski@finki.ukim.mk<br><br>Vladimir Trajkovik<br>vladimir.trajkovik@finki.ukim.mk<br>
| Macedonia
|}

[[Category:Community_Engagement]]

= Archive: DCC between 2010-2014 =

A simplified version of the DCC concept was implemented in EGI through the EGI-InSPIRE FP7 project between 2010 and 2014. Further information about this implementation can be found on this archive page: [[EGI-InSPIRE DCC]]
