FedCloudDIRAC

General Information

  • Status: Use case 1 finished, Use case 2 in progress, Use case 3 in progress
  • Start Date: Use case 1 July 2012, Use case 2 April 2013, Use case 3 September 2013
  • End Date: Use case 1 March 2013
  • EGI.eu contact: Gergely Sipos / gergely.sipos@egi.eu
  • External contact: Víctor Méndez / vmendez@pic.es

Short Description

The DIRAC interware project provides a framework for building ready-to-use distributed computing systems. It has proven to be a useful tool for large international scientific collaborations, integrating their computing activities and distributed computing resources (grids, clouds and HTC clusters) in a single system. For cloud resources, DIRAC is currently integrated with Amazon EC2, OpenNebula, OpenStack and CloudStack. Monte Carlo (MC) simulation campaigns have been run at large scale for the Belle II project, providing over 10,000 CPU days on Amazon. Until this use case in the Federated Clouds Task Force (Fedcloud-tf), all such deployments had used a single cloud at a time.

This work integrates the resources provided by the multiple private clouds of the EGI Federated Cloud together with additional WLCG resources, and provides high-level scientific services on top of them using the DIRAC framework. A new design based on a federated hybrid cloud architecture (Rafhyc) has been adopted. Initial integration and scaling tests demonstrate that the architecture is able to manage federated hybrid cloud IaaS and to provide eScience SaaS on top of it. The solution has been adopted by LHCb DIRAC for LHCb computing on federated clouds, which uses the cloud endpoints just like any other computing resource.
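
As an illustration of how workloads reach these resources, the minimal sketch below submits a test job through the DIRAC Python client API. It assumes an installed and configured DIRAC client with a valid proxy; the job name, executable and CPU-time value are illustrative only.

  from DIRAC.Core.Base import Script
  Script.parseCommandLine(ignoreErrors=True)

  from DIRAC.Interfaces.API.Dirac import Dirac
  from DIRAC.Interfaces.API.Job import Job

  # Describe a trivial test job; DIRAC decides where it runs, so a
  # cloud-provisioned VM is used like any other worker node.
  job = Job()
  job.setName('FedCloud_test')
  job.setExecutable('/bin/echo', arguments='Hello from a federated cloud VM')
  job.setCPUTime(3600)  # requested CPU time in seconds

  dirac = Dirac()
  result = dirac.submitJob(job)  # older DIRAC releases expose this as submit()
  if result['OK']:
      print('Submitted job %s' % result['Value'])
  else:
      print('Submission failed: %s' % result['Message'])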

Use Case

Use Case 1: Running LHCb Monte Carlo simulation jobs using IaaS resources in a federated manner, for integration and scaling tests,
including multiple IaaS providers based on OpenStack, OpenNebula and CloudStack (finished).
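
For integration tests of this kind it is useful to check which cloud endpoints a DIRAC installation actually knows about. The sketch below queries the DIRAC Configuration Service for such a list; the configuration path and option name are assumptions made for illustration and may differ from the exact VMDIRAC schema.

  from DIRAC.Core.Base import Script
  Script.parseCommandLine(ignoreErrors=True)

  from DIRAC import gConfig

  # Assumed location of the cloud endpoint descriptions in the Configuration
  # Service; the real VMDIRAC layout may use a different section name.
  ENDPOINT_PATH = '/Resources/VirtualMachines/CloudEndpoints'

  result = gConfig.getSections(ENDPOINT_PATH)
  if result['OK']:
      for endpoint in result['Value']:
          # 'CloudDriver' (e.g. openstack, opennebula, cloudstack) is also an
          # assumed option name, used here only to illustrate the idea.
          driver = gConfig.getValue('%s/%s/CloudDriver' % (ENDPOINT_PATH, endpoint), 'unknown')
          print('%s (driver: %s)' % (endpoint, driver))
  else:
      print('Could not read cloud endpoints: %s' % result['Message'])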

Use Case 2: Integrating the DIRAC extensions LHCb-DIRAC and VMDIRAC (federated cloud) to reach production level for LHCb computing,
including federated services for proxy authentication and VM monitoring,
and considering future developments of federated services for accounting and information systems.
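
Since Use Case 2 relies on federated proxy authentication, a common preliminary step is to verify that a valid proxy is in place before contacting the cloud endpoints. The sketch below does this with the DIRAC security helpers; it assumes a proxy has already been created (for example with dirac-proxy-init), and the dictionary keys printed may vary slightly between DIRAC versions.

  from DIRAC.Core.Base import Script
  Script.parseCommandLine(ignoreErrors=True)

  from DIRAC.Core.Security.ProxyInfo import getProxyInfo

  result = getProxyInfo()
  if result['OK']:
      info = result['Value']
      # .get() is used because the exact set of keys can differ between releases.
      print('Identity:  %s' % info.get('identity'))
      print('Group:     %s' % info.get('group'))
      print('Time left: %s seconds' % info.get('secondsLeft'))
  else:
      print('No valid proxy found: %s' % result['Message'])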
