
UMDQualityCriteria


All the software included in the Unified Middleware Distribution (UMD) must meet a set of Quality Criteria defined by EGI. The Quality Criteria can be classified into generic criteria, i.e. criteria which should hold for any component of the UMD, and specific criteria, i.e. criteria valid for a particular component only.

Validation of Criteria

In order to be verified, the quality criteria are specified as a set of tests. Those tests must ensure the correctness, completeness and security of each service. Software providers must include with each component a complete test plan that covers all the quality criteria. The test plan is composed of:

  • General description of the test plan.
  • A Test Suite following the template described in the next section for each of the test cases included in:
    • Generic criteria.
    • Specific criteria applicable to the component.
    • Bugs detected by EGI.

Template for test validation

For each test specified in the criteria, there must be a report. The report should contain several sections:

Description

Description of the test objectives. Include any references to specific bugs that are tested.

Set up

  • A description of the steps required prior to the execution of the test and, if relevant, of any differences between this set-up and the standard installation procedure for the software. Tests should be written in such a way that they can be integrated into different frameworks; manual testing should be avoided.
  • Tests must be fully configurable, either through options to the test script or through a configuration file (key/value pairs or exported environment variables). There must be no hardcoded paths or similar.
  • If the test has prerequisites (e.g. a valid proxy), this must be documented and a check at the beginning of the test must be implemented.
  • Detailed documentation on how to run the test, allowing the validation team to repeat the test if necessary.
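
As a minimal sketch only, the skeleton below illustrates the points above: configuration comes from script options or the environment rather than from hardcoded values, and a prerequisite (a valid proxy) is checked at the beginning of the test. The option name, the TEST_ENDPOINT and X509_USER_PROXY variables and the output format are assumptions made for this example, not part of the criteria.

 #!/usr/bin/env python
 # Minimal test skeleton (illustrative sketch, not a mandated structure).
 # The option name, environment variables and output format below are
 # hypothetical examples, not requirements of the UMD Quality Criteria.
 import argparse
 import os
 import sys

 def main():
     # No hardcoded paths: everything comes from options or the environment.
     parser = argparse.ArgumentParser(description="Example UMD validation test")
     parser.add_argument("--endpoint",
                         default=os.environ.get("TEST_ENDPOINT"),
                         help="service endpoint under test")
     args = parser.parse_args()

     # Prerequisite check at the beginning of the test (here: a valid proxy).
     proxy = os.environ.get("X509_USER_PROXY")
     if not proxy or not os.path.isfile(proxy):
         print("test-setup: FAIL (no valid proxy found)")
         return 2
     if not args.endpoint:
         print("test-setup: FAIL (no endpoint configured)")
         return 2

     # ... the actual test logic against args.endpoint would go here ...
     print("test-setup: PASS")
     return 0

 if __name__ == "__main__":
     sys.exit(main())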

Results

Tests should have a simple output (text or simple HTML) that allows a post-processing step to be added, so that the results can be transformed for use with a particular presentation framework.
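
For instance, if each test case emits one "name: PASS" or "name: FAIL" line (an assumed output format, not one prescribed by the criteria), a short post-processing step can turn the plain-text results into HTML:

 # Illustrative post-processing sketch: converts assumed "name: PASS|FAIL"
 # result lines into a minimal HTML table. The input format is an example
 # only; the criteria merely require output simple enough to post-process.
 import sys

 def to_html(lines):
     rows = []
     for line in lines:
         if ":" not in line:
             continue  # skip anything that is not a result line
         name, _, status = line.partition(":")
         rows.append("<tr><td>%s</td><td>%s</td></tr>"
                     % (name.strip(), status.strip()))
     return "<table>\n%s\n</table>" % "\n".join(rows)

 if __name__ == "__main__":
     print(to_html(sys.stdin.readlines()))

Describe the test results for the following cases: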

Correct input/Normal workflow

Testing the software as it should work, with correct input. Describe the different input combinations used for testing and when the test should pass or fail. Include a description of the output observed when running the test (most of the time a copy and paste of the commands or a screenshot should be enough) that is reproducible by the validation team.

Error workflow - erroneous input

Testing the error codes: providing wrong input to the software should return the proper error messages. Describe the different input combinations used for testing and when the test should pass or fail. Include a description of the output observed when running the test (most of the time a copy and paste of the commands or a screenshot should be enough) that is reproducible by the validation team.
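
A minimal sketch of how both workflows could be driven from a single script, assuming a hypothetical command-line client named umd-client whose exit status is 0 on success and non-zero on erroneous input (the command name and its arguments are invented for this illustration):

 # Sketch: exercise the normal and the error workflow of a hypothetical
 # "umd-client" command. The command name and its arguments are invented
 # for illustration; real tests would target the component under review.
 import subprocess
 import sys

 def run_case(name, cmd, expect_success):
     result = subprocess.run(cmd, capture_output=True, text=True)
     ok = (result.returncode == 0) == expect_success
     print("%s: %s" % (name, "PASS" if ok else "FAIL"))
     return ok

 def main():
     cases = [
         # Correct input / normal workflow: the command should succeed.
         ("normal-workflow", ["umd-client", "--input", "valid.dat"], True),
         # Erroneous input: the command should fail with a proper error code.
         ("error-workflow", ["umd-client", "--input", "no-such-file"], False),
     ]
     return 0 if all(run_case(*c) for c in cases) else 1

 if __name__ == "__main__":
     sys.exit(main())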

Untested features

Describe in this section any features of the software that lack a test.

Generic acceptance criteria

EGI-InSPIRE:UMDQualityCriteria:Generic

Specific acceptance criteria

OLD: There is a template for the creation of new criteria at EGI-InSPIRE:UMDQualityCriteria:Template

Defined Capabilities

These capabilities are identified and defined in the current UMD RoadMap:

Functional Capabilities

Security Capabilities

(OLD: security services)

Operational Capabilities

  • Monitoring (Nagios)
  • Accounting

Identified Capabilities

These capabilities are identified in the UMD RoadMap but are still awaiting EGI Community input for their definition:

Functional Capabilities

  • File Encryption/Decryption
Sensitive data needs to be stored securely. Before being stored in a remote file store, a file may need to be encrypted, and then decrypted on retrieval before use. The capability should also provide solutions relating to the storage of the keys needed to perform these tasks (see the sketch after this list).
  • Metadata Catalogue
The metadata catalogue is used to store and query information relating to the data (files, databases, etc.) stored within the production infrastructure.
  • Database Access
Many communities are moving to the use of structured data stored in relational databases. These need to be accessible for controlled use by remote users, like any other e-Infrastructure resource.
  • File Transfer Scheduling
The bandwidth linking resource sites is a resource that needs to be managed in the same way compute resources at a site are accessed through a job scheduler. By being able to schedule wide area data transfers, requests can be prioritised and managed. This would include the capability to monitor and restart transfers as required.
  • Remote Instrumentation
Instruments are data sources frequently encountered within e-Infrastructures. As part of a distributed computing architecture, providing remote access to manage and monitor these instruments is becoming increasingly important within some communities.
  • Workflow
The ability to define, initiate, manage and monitor a workflow is a key capability across many user communities. It is also a capability that can be deployed by a user or a user community (i.e. it does not need to be a service provided as part of the core infrastructure) but the various workflow systems may have requirements that need to be supported within the core infrastructure.
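
As a minimal sketch of the encrypt-before-store, decrypt-after-retrieval pattern described under File Encryption/Decryption above, using symmetric encryption from the third-party Python cryptography package (an assumed tool choice, not one prescribed by the roadmap); the safe storage of the key itself is precisely the open issue the capability mentions:

 # Sketch of encrypt-before-store / decrypt-on-retrieval using the
 # third-party "cryptography" package (pip install cryptography). The
 # library choice is an assumption; the roadmap does not prescribe one.
 from cryptography.fernet import Fernet

 # Key management (generation, distribution, safe storage) is exactly
 # the open issue the capability description points at.
 key = Fernet.generate_key()
 f = Fernet(key)

 plaintext = b"sensitive data destined for a remote file store"
 blob = f.encrypt(plaintext)   # ciphertext: safe to store remotely
 restored = f.decrypt(blob)    # decrypt on retrieval, before use
 assert restored == plaintext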

Operational Capabilities

  • Virtual Image Management
As virtual machine images become the default approach to providing the environment for both jobs and services, increased effort is needed on building the trust model around the distribution of images. Resource providers will need a mechanism for images to be distributed, cached and trusted for execution on their sites.
  • Virtual Machine Management
The core functionality is for authorized users to manage the virtual machine life-cycle and configuration on a remote site (i.e. start, stop, pause, etc.) Machine images would be selected from a trusted repository at the site that would be configured according to site policy. Together this would allow site managers to determine both who could control the virtual machines running on their sites and who generated the images used on their site.
  • Messaging
Within distributed systems, a message ‘bus’ provides a reliable mechanism for data items to be sent between producers and (multiple) consumers. Such a capability, once established, can be reused by many different software services.
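
As an illustration of the producer/consumer pattern described above, the sketch below uses the third-party pika client with an AMQP broker such as RabbitMQ; the broker, queue name and client library are assumptions, as the capability does not prescribe a particular message bus.

 # Illustrative producer/consumer sketch using the third-party "pika"
 # AMQP client against a broker such as RabbitMQ. The broker, queue name
 # and client library are assumptions; the capability does not prescribe
 # a particular message bus.
 import pika

 connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
 channel = connection.channel()
 channel.queue_declare(queue="accounting-records")

 # Producer: publish a data item onto the bus.
 channel.basic_publish(exchange="",
                       routing_key="accounting-records",
                       body="record-1")

 # Consumer: fetch one item back (real consumers would subscribe instead).
 method, properties, body = channel.basic_get(queue="accounting-records",
                                              auto_ack=True)
 print("received:", body)
 connection.close()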