The wiki is deprecated and due to be decommissioned by the end of September 2022.
The content is being migrated to other platforms; new updates will be ignored and lost.
If needed, you can get in touch with the EGI SDIS team using operations @ egi.eu.

FedCloudGIPSY
Latest revision as of 16:32, 7 May 2015


General Information

  • Status: Test & Integration
  • Start Date: 06/10/2014
  • End Date: -
  • EGI.eu contact: Diego Scardaci / diego.scardaci@egi.eu, Enol Fernandez / enol.fernandez@egi.eu
  • External contact: Susana Sanchez Exposito / sse@iaa.es, Daniele Lezzi / daniele.lezzi@bsc.es

Short Description

This project aims to integrate calibration, analysis and modelling pipelines for radio-astronomy data into a cloud infrastructure. It is developed jointly by users of the LOFAR (www.lofar.org) radio telescope and members of the AMIGA4GAS project.

A cloud infrastructure like the EGI Federated Cloud provides:

  • flexibility to develop innovative processing pipelines;
  • a powerful frame for parallel processing pipelines and workflows;
  • the advantage of the elastic on-demand resource consumption.

Use Case

Modelling the kinematics of galaxies. Within the AMIGA4GAS project, we are building a set of analysis web services, based on tasks of GIPSY (the Groningen Image Processing System), that can be used as modules in workflows for modelling galaxies. We are testing different infrastructures to find the one that is best suited to running specific GIPSY tasks and that does not require a high level of computing resources. Currently, some of the web services submit GIPSY tasks to Ibergrid and others to a supercomputer cluster. We will build the analysis services that run on a cloud system. The final goal is to make it easier for astronomers to launch their workflows on heterogeneous distributed computing infrastructures.
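
The backend selection described above (lightweight GIPSY tasks on cloud VMs, heavier work on Ibergrid or a supercomputer cluster) could be sketched as follows. This is a minimal illustration, not the actual AMIGA4GAS service logic: the task names, size threshold, and backend labels are all assumptions.

```python
# Hypothetical sketch of backend selection for one GIPSY analysis task.
# Task names, the 8 GB threshold, and backend labels are illustrative
# assumptions, not taken from the actual AMIGA4GAS web services.

def choose_backend(task_name: str, cube_size_gb: float) -> str:
    """Pick an execution backend for a single GIPSY analysis task."""
    HEAVY_TASKS = {"galmod"}           # tasks assumed to need a cluster
    if task_name in HEAVY_TASKS:
        return "supercomputer"
    if cube_size_gb > 8:               # beyond what one cloud VM handles
        return "ibergrid"
    return "cloud"                     # elastic, on-demand Federated Cloud VM

print(choose_backend("rotcur", 2.0))   # -> cloud
print(choose_backend("galmod", 2.0))   # -> supercomputer
print(choose_backend("rotcur", 12.0))  # -> ibergrid
```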

The community would like to adopt the COMPSs high-level tool to port the application on the EGI Federated Cloud.
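
A COMPSs port would typically wrap each GIPSY step as a PyCOMPSs task so the runtime can schedule it on Federated Cloud VMs. The sketch below shows only the wrapping pattern; the function name, file names, and the mocked task body are assumptions, and a try/except fallback lets it run without a COMPSs installation.

```python
# Sketch of wrapping a GIPSY modelling step as a PyCOMPSs task so COMPSs
# can schedule it on cloud VMs. Names and the mocked body are illustrative.
try:
    from pycompss.api.task import task
    from pycompss.api.parameter import FILE_IN, FILE_OUT
except ImportError:
    # No COMPSs runtime available: use a no-op stand-in for @task so the
    # sketch still runs as plain Python.
    FILE_IN = FILE_OUT = None
    def task(**kwargs):
        def wrap(fn):
            return fn
        return wrap

@task(cube=FILE_IN, model=FILE_OUT)
def fit_kinematic_model(cube, model):
    """Run one (mocked) GIPSY modelling step on a data cube."""
    # A real service would invoke the GIPSY task here (e.g. via a
    # subprocess call); we just record the step in the output file.
    with open(model, "w") as out:
        out.write(f"model derived from {cube}\n")

if __name__ == "__main__":
    fit_kinematic_model("galaxy.fits", "galaxy_model.txt")
```

With a real COMPSs runtime, the `FILE_IN`/`FILE_OUT` annotations tell the scheduler which files to stage to and from each VM, so independent task invocations can run in parallel.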

Requirements

  • Storage: 50 GB
  • Memory: GIPSY can analyse data cubes of up to 8 GB. We would like to test the performance of GIPSY with 4 GB, 6 GB and 8 GB RAM instances.
  • CPU: one per instance.
  • Number of instances: how many times an analysis web service is executed as part of a workflow depends on the inputs of that workflow. Sometimes several analysis-service calls are grouped for execution on one VM instance; other times each call needs its own instance.
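
The sizing rule in the last bullet could be estimated as below. This is a hypothetical helper for illustration only: the call representation (name plus a groupable flag) and the packing factor are assumptions, not part of the actual services.

```python
# Illustrative helper for the sizing rule above: groupable service calls
# share a VM instance, the rest each need a dedicated one. The call
# representation and group_size are assumptions made for this sketch.

def count_instances(calls: list[tuple[str, bool]], group_size: int = 4) -> int:
    """Estimate VM instances needed for a workflow's service calls.

    calls: (task_name, groupable) pairs; groupable calls are packed
    group_size per instance, the others get one instance each.
    """
    groupable = sum(1 for _, g in calls if g)
    dedicated = len(calls) - groupable
    packed = -(-groupable // group_size)      # ceiling division
    return packed + dedicated

calls = [("rotcur", True)] * 6 + [("galmod", False)] * 2
print(count_instances(calls))  # 2 packed instances + 2 dedicated -> 4
```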

Additional Files