
EGI-InSPIRE:Ibergrid-QR11





Quarterly Report Number: QR11
NGI Name: NGI_IBERGRID
Partner Name: LIP & CSIC
Author: Esteban Freire (CESGA), Alvaro Simón (CESGA)


1. MEETINGS AND DISSEMINATION


1.1. CONFERENCES/WORKSHOPS ORGANISED

Date: 7-9 Nov 2012
Location: Lisbon, Portugal
Title: IBERGRID 2012, 6th Iberian Grid Infrastructure Conference
Participants: 50
Outcome (short report & Indico URL): The 2012 IBERGRID conference was organized by LIP in Lisbon, Portugal. The main topics of the IBERGRID 2012 conference were: infrastructures, services and operations; innovation in the provision of IT services (virtualization and cloud computing); data management and storage systems; IT management and green computing; EGI and WLCG grid computing activities; digital repositories and preservation; community-oriented services; users and applications; and technology transfer to society. This is the annual meeting gathering IBERGRID operators and user communities to reassess past activities, debate problems and define joint strategies. Conference URL: http://www.ibergrid.eu/2012

1.2. OTHER CONFERENCES/WORKSHOPS ATTENDED

Date: 4-5 Nov 2012
Location: CERN
Title: CMS Offline and Computing Week
Participants: 1
Outcome (short report & Indico URL): IFAE: follow-up of the CMS computing activities and their impact on the Tier-1 operations and plans, https://indico.cern.ch/conferenceDisplay.py?confId=171869
Date: 7-9 Nov 2012
Location: Lisbon, Portugal
Title: IBERGRID 2012, 6th Iberian Grid Infrastructure Conference
Participants: 50
Outcome: Conference programme URL: http://www.ibergrid.eu/2012/index.php?option=2. Contributions by IBERGRID partners:
  • BIFI-UNIZAR
  1. A. Giner et al, Hadoop Cloud SaaS access via WS-PGRADE adaptation
  • CAFPE-GRANADA
  1. Julio Lozano-Bahilo, Pierre Auger Collaboration on Grid
  • CESGA:
  1. A. Simon et al, New deployments on EGI Verification and Staged Rollout processes
  2. A. Simon et al, EGI Fedcloud Task Force
  • CIEMAT
  1. A. Delgado et al, Synchronization and Versioning of Cluster Configuration with Csync
  2. M. Cárdena Montes et al, New Computational Developments in Cosmology
  • IFCA
  1. P. Orviz et al, Production change management using Puppet, Git, Jenkins and Gerrit
  2. E. Fernandez et al, IberCloud: federated access to virtualized resources
  3. E. Fernandez, IberCloud Symposium: Theory and practice
  • IFIC
  1. M. Kaci et al, Response of the Iberian Grid Computing Resources to the ATLAS activities during the LHC data-taking
  2. M. Kaci et al, Data Management and Data Processing within a Grid Computing Model for AGATA
  • IFISC-GRID
  1. A. Tugores and P. Colet, Poster: Integration of Web4Grid with intranet at CSIC
  2. A. Tugores and P. Colet, Poster: Efficient file management
  • LIP
  1. G. Borges et al, IBERGRID: Deploying User Oriented Services
  2. G. Borges et al, IBERGRID Infrastructure Status
  3. J. Gomes et al, Clustering TopBDII systems with dynamic round-robin DNS
  4. G. Borges et al, Operations Management discussions
  • PIC
  1. V. Méndez, Running in Federated Clouds with DIRAC
  • UNICAN
  1. C. Blanco et al, WRF4SG: A Scientific Gateway for the Weather Research and Forecasting model community
  • UB
  1. R. Graciani, Dirac Tutorial
  • UMINHO-CP
  1. V. Oliveira, Even Bigger Data: Preparing for the LHC/ATLAS Upgrade
  • UPV
  1. I. Blanquer et al, Requirements of Scientific Applications in Cloud Offerings
  2. M. Caballer et al, Towards SLA-driven Management of Cloud Infrastructures to Elastically Execute Scientific Applications
  • USC
  1. V. Fernandez, User access to CVMFS software repositories on Ibergrid
Date: 16 Nov 2012
Location: Madrid (Spain)
Title: BigData Spain 2012
Participants: 1
Outcome: IFISC-GRID: programme available at http://www.bigdataspain.org/en/
Date: 28-29 Nov 2012
Location: Bilbao (Spain)
Title: RedIris Network
Participants: ~20
Outcome: Annual workshop organised by the NREN. Follow-up of technical issues related to the network; programme available at http://www.rediris.es/jt/jt2012/
  • IFAE: Presentation from PIC on the plans from LHC to exploit the new high performance network infrastructure Rediris-Nova through LHCONE
  • IFIC
  • RedIRIS
  • USC
Date: 13-14 December 2012
Location: CERN
Title: LHCONE Point-to-Point Service Workshop
Participants: 1
Outcome: IFAE: network workshop focused on the technical details of the deployment of a dedicated high-performance network infrastructure to connect Tier-2s and Tier-1s, https://indico.cern.ch/conferenceDisplay.py?confId=215393
Date: 28-30 Jan 2013
Location: Amsterdam
Title: Evolving EGI Workshop and co-located e-FISCAL Workshop
Participants: 1
Outcome: CESGA: programme available at https://indico.egi.eu/indico/conferenceTimeTable.py?confId=1252#20130130


1.3. PUBLICATIONS

Each entry below gives the publication title, the journal / proceedings title, the journal reference (volume, issue, pages), and the authors.
Superconducting Vortex Lattice Configurations on Periodic Potentials: Simulation and Experiment
Journal / Proceedings: Journal of Superconductivity and Novel Magnetism
Reference: 2012, Vol. 25, pp. 2127–2130, DOI: 10.1007/s10948-012-1636-8
Authors:
  1. M. Rodríguez-Pascual
  2. A. Gómez
  3. R. Mayo-García
  4. D. Pérez de Lara
  5. E.M. González
  6. A.J. Rubio-Montero
  7. J.L. Vicent
Dimensioning storage and computing clusters for efficient high throughput computing
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 042040
Authors:
  1. E. Accion
  2. A. Bria
  3. G. Bernabeu
  4. M. Caubet
  5. M. Delfino
  6. X. Espinal
  7. G. Merino
  8. F. Lopez
  9. F. Martinez
  10. E. Planas
Monitoring techniques and alarm procedures for CMS Services and Sites in WLCG
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 042041
Authors:
  1. J. Molina-Perez
  2. D. Bonacorsi
  3. O. Gutsche
  4. A. Sciabà
  5. J. Flix
  6. P. Kreuzer
  7. E. Fajardo
  8. T. Boccali
  9. M. Klute
  10. D. Gomes
  11. R. Kaselis
  12. R Du
  13. N Magini
  14. I Butenas
  15. W Wang
CMS Data Transfer operations after the first years of LHC collisions
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 042033
Authors:
  1. R. Kaselis
  2. S. Piperov
  3. N. Magini
  4. J. Flix
  5. O. Gutsche
  6. P. Kreuzer
  7. M. Yang
  8. S. Liu
  9. N. Ratnikova
  10. A. Sartirana
  11. D. Bonacorsi
  12. J. Letts
Providing global WLCG transfer monitoring
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032005
Authors:
  1. J. Andreeva
  2. D. Dieguez Arias
  3. S. Campana
  4. J. Flix
  5. O. Keeble
  6. N. Magini
  7. Z. Molnar
  8. D. Oleynik
  9. A. Petrosyan
  10. G. Ro
  11. P. Saiz
  12. M. Salichos
  13. D. Tuckett
  14. A. Uzhinsky
  15. T. Wildish
Service monitoring in the LHC experiments
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032010
Authors:
  1. Fernando Barreiro Megino
  2. Vincent Bernardoff
  3. Diego da Silva Gomes
  4. Alessandro di Girolamo
  5. José Flix
  6. Peter Kreuzer
  7. Stefan Roiser
CMS resource utilization and limitations on the grid after the first two years of LHC collisions
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032012
Authors:
  1. Giuseppe Bagliesi
  2. Kenneth Bloom
  3. Daniele Bonacorsi
  4. Chris Brew
  5. Ian Fisk
  6. Jose Flix
  7. Peter Kreuzer
  8. Andrea Sciaba
Performance studies and improvements of CMS distributed data transfers
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032040
Authors:
  1. D. Bonacorsi
  2. J. Flix
  3. R. Kaselis
  4. J. Letts
  5. N. Magini
  6. A Sartirana
Towards higher reliability of CMS computing facilities
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032041
Authors:
  1. G. Bagliesi
  2. K. Bloom
  3. C. Brew
  4. J. Flix
  5. P. Kreuzer
  6. A. Sciabà
The benefits and challenges of sharing glidein factory operations across nine time zones between OSG and CMS
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032103
Authors:
  1. I. Sfiligoi
  2. J. M. Dost
  3. M. Zvada
  4. I. Butenas
  5. B. Holzman
  6. F. Wuerthwein
  7. P. Kreuzer
  8. S. W. Teige
  9. R. Quick
  10. J. M. Hernández
  11. J. Flix
Automating ATLAS Computing Operations using the Site Status Board
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032072
Authors:
  1. Andreeva J.
  2. Borrego Iglesias C.
  3. Campana S.
  4. Di Girolamo A.
  5. Dzhunov I.
  6. Espinal Curull X.
  7. Gayazov S.
  8. Magradze E
  9. Nowotka M. M.
  10. Rinaldi L.
  11. Saiz P.
  12. Schovancova J.
  13. Stewart G. A.
  14. Wright M.
Major Changes to the LHCb Grid Computing Model in Year 2 of LHC Data
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032092
Authors:
  1. L. Arrabito
  2. V. Bernardoff
  3. D. Bouvet
  4. M. Cattaneo
  5. Charpentier
  6. P. Clarke
  7. J Closier
  8. P. Franchini
  9. R. Graciani
  10. E. Lanciotti
  11. V. Mendez
  12. S. Perazzini
  13. R. Nandkumar
  14. D. Remenska
  15. S. Roiser
  16. V. Romanovskiy
  17. R. Santinelli
  18. F. Stagni
  19. A. Tsaregorodtsev
  20. M. Ubeda Garcia
  21. A. Vedaee
  22. A Zhelezov
Status of the DIRAC Project
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032107
Authors:
  1. A. Casajus
  2. K. Ciba
  3. V. Fernandez
  4. R. Graciani
  5. V. Hamar
  6. V. Méndez
  7. S. Poss
  8. M. Sapunov
  9. F. Stagni
  10. A. Tsaregorodtsev
  11. M. Ubeda
The Integration of CloudStack and OCCI/OpenNebula with DIRAC
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 032075
Authors:
  1. Víctor Méndez Muñoz
  2. Víctor Fernández Albor
  3. Ricardo Graciani Diaz
  4. Adriàn Casajús Ramo
  5. Tomás Fernández Pena
  6. Gonzalo Merino Arévalo
  7. Juan José Saborido Silva
Trying to predict the future – resource planning and allocation in CMS
Journal / Proceedings: Journal of Physics: Conference Series / International Conference on Computing in High Energy and Nuclear Physics 2012 (CHEP2012)
Reference: 2012 J. Phys.: Conf. Ser. 396 042035
Authors:
  1. Kenneth Bloom
  2. Ian Fisk
  3. Peter Kreuzer
  4. Gonzalo Merino

2. ACTIVITY REPORT

2.1. Progress Summary

  1. Follow up on security issues at Iberian sites according to what is reported in the security dashboard.
  2. Follow up on the monthly availability/reliability (A/R) report for Iberian sites failing the threshold (see the sketch after this list).
  3. Follow up on the status of Iberian sites in GSTAT.
  4. Follow up with sites that were publishing accounting data with unknown UserDNs.
  5. Follow up with sites that had to republish accounting data, and coordinate with the Accounting Repository staff on the republishing of large amounts of data.
  6. Follow up on sites' UMD upgrades within IBERGRID. NGI GGUS tickets were opened asking sites for upgrade plans; tickets were reviewed on a weekly basis.
  7. Follow up on all questions from Iberian sites regarding the upgrade to UMD (doubts, issues, questions).
  8. Dissemination of the UMD upgrade calendar and of the policies imposed to follow up the implementation of that calendar.
  9. Provide information about the Ibergrid NGI documentation, http://ibergrid.lip.pt/
  10. Provide information about the federation of NGI services and central coordination for Portugal and Spain, https://wiki.egi.eu/wiki/Operations_Surveys#Federation_of_NGI_services_and_central_coordination
  11. Coordinate the migration of important regional services (R-Nagios, VOMS, LFC, WMS and Top-BDIIs).
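
The A/R follow-up in item 2 relies on the monthly availability and reliability figures computed for each site. The short Python sketch below is only an illustration of that check, not an NGI tool: the site names and hour counts are invented, the 70%/75% targets are an assumption, and the formulas follow the usual EGI definitions (availability excludes UNKNOWN time; reliability additionally excludes scheduled downtime).

    # Illustration only: flag sites whose monthly A/R falls below assumed targets.
    # Usual EGI definitions:
    #   availability = UP / (TOTAL - UNKNOWN)
    #   reliability  = UP / (TOTAL - UNKNOWN - SCHEDULED_DOWNTIME)
    AVAILABILITY_TARGET = 0.70  # assumed threshold
    RELIABILITY_TARGET = 0.75   # assumed threshold

    # Hours per status for one month; site names and figures are made up.
    SITES = {
        "SITE-A": {"up": 700.0, "unknown": 10.0, "sched_down": 20.0, "total": 744.0},
        "SITE-B": {"up": 480.0, "unknown": 0.0, "sched_down": 0.0, "total": 744.0},
    }

    def availability(h):
        known = h["total"] - h["unknown"]
        return h["up"] / known if known > 0 else 0.0

    def reliability(h):
        known = h["total"] - h["unknown"] - h["sched_down"]
        return h["up"] / known if known > 0 else 0.0

    for site, hours in sorted(SITES.items()):
        a, r = availability(hours), reliability(hours)
        status = "follow up" if a < AVAILABILITY_TARGET or r < RELIABILITY_TARGET else "ok"
        print(f"{site}: A={a:.1%} R={r:.1%} -> {status}")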


2.2. Main Achievements

  1. GGUS/Ibergrid-RT integration finished: the Ibergrid helpdesk was launched into production on 30 January, after all testing and integration with GGUS had been successfully completed.
  2. Moved from EVO to the SeeVogh application for organizing the Ibergrid operations meetings. These meetings are currently held every Monday morning.
  3. A consistent 100% A/R, six months in a row, for the TopBDII service, following the implementation of the TopBDII high-availability (HA) mechanism (a client-side illustration follows this list).
  4. Major UMD upgrades of services operated by the NGI were carried out successfully with minimal interference to user activity (including R-NAGIOS).
  5. All the Ibergrid sites were following the calendar for the UMD migration, and all tickets regarding unsupported gLite middleware were closed.
  6. Solved GGUS ticket #90450, "NGI_IBERGRID - core services grouping action": the "NGI_IBERGRID_SERVICES" service group was created in GOCDB, https://goc.egi.eu/portal/index.php?Page_Type=View_Object&object_id=120331&grid_id=0
  7. Solved the issue with the Nagios WN probes on the Scientific Linux 6 OS: the "grid-monitoring-probes-org.sam" package was updated on the Ibergrid regional Nagios; more information can be found at https://tomtools.cern.ch/jira/browse/SAM-2999. One Ibergrid site was affected by this issue and, as a consequence, got 0% in the A/R reports for September and October.
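
The TopBDII HA mechanism mentioned in item 3 groups several TopBDII instances behind a dynamic round-robin DNS alias (cf. the LIP contribution "Clustering TopBDII systems with dynamic round-robin DNS" listed in section 1.2). The Python sketch below only illustrates the client-side effect of such an alias and is not the NGI's actual implementation: the alias name is hypothetical, and the code simply tries each address published for the alias on the standard top-level BDII LDAP port (2170) until one accepts a connection.

    # Illustration only: a round-robin DNS alias resolves to several TopBDII
    # instances; a client can fall back to the next address when one is down.
    import socket

    BDII_ALIAS = "topbdii.example.org"  # hypothetical round-robin alias
    BDII_PORT = 2170                    # standard top-level BDII LDAP port

    def first_responding_bdii(alias, port, timeout=5.0):
        """Return the address of the first instance behind the alias that accepts a TCP connection."""
        for family, socktype, proto, _name, sockaddr in socket.getaddrinfo(
                alias, port, proto=socket.IPPROTO_TCP):
            try:
                with socket.socket(family, socktype, proto) as sock:
                    sock.settimeout(timeout)
                    sock.connect(sockaddr)
                    return sockaddr  # this instance answers; an LDAP query would go here
            except OSError:
                continue  # try the next address published for the alias
        raise RuntimeError(f"no TopBDII instance behind {alias} is reachable")

    if __name__ == "__main__":
        print(first_responding_bdii(BDII_ALIAS, BDII_PORT))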


2.3. Issues and mitigation

Issue description: There was an issue related to the fetch-crl cron job on the Ibergrid regional Nagios: it was not able to generate a new proxy and submit the Nagios probes for 9 hours on 31/12/2012.
Mitigation description: Recomputation of A/R was requested and performed in