All software included in the Unified Middleware Distribution (UMD) must meet a set of Quality Criteria defined by EGI. The Quality Criteria can be classified into generic criteria, i.e. criteria which should hold for any component of the UMD, and specific criteria, i.e. criteria valid for a particular component only.
Validation of Criteria
In order to be verified, the quality criteria are specified as a set of tests. These tests must ensure the correctness, completeness and security of each service. Software providers must include with each component a complete test plan that covers all the quality criteria. The test plan consists of:
- General description of the test plan.
- A Test Suite following the template described in the next section for each of the test cases included in:
- Generic criteria.
- Specific criteria applicable to the component.
- Bugs detected by EGI
Template for test validation
For each test specified in the criteria, there must be a report. The report should contain several sections:
- A description of the test objectives, including references to any specific bugs that are tested.
- A description of any set-up steps required prior to executing the test and, where relevant, how this set-up differs from the standard installation procedure for the software. Tests should be written so that they can be integrated into different frameworks; manual testing should be avoided.
- Tests must be fully configurable, either through options to the test script or through a configuration file (key/value pairs) or environment variables. There must be no hardcoded paths or similar values.
- If the test has prerequisites (e.g. a valid proxy), these must be documented and checked at the beginning of the test.
- Detailed documentation on how to run the test, that allows the validation team to repeat the test if necessary.
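The configurability, prerequisite and output requirements above can be sketched as a minimal test script. All option names, environment variables and the `X509_USER_PROXY` proxy check below are illustrative assumptions, not part of the EGI criteria:

```python
"""Hypothetical validation-test sketch: fully configurable, with an
up-front prerequisite check and simple, parseable text output."""
import argparse
import os


def parse_args(argv):
    # Everything is configurable via command-line option or environment
    # variable: no hardcoded paths or endpoints.
    parser = argparse.ArgumentParser(description="example UMD validation test")
    parser.add_argument("--endpoint",
                        default=os.environ.get("TEST_ENDPOINT"),
                        help="service endpoint under test (or $TEST_ENDPOINT)")
    parser.add_argument("--workdir",
                        default=os.environ.get("TEST_WORKDIR", "."),
                        help="scratch directory (or $TEST_WORKDIR)")
    return parser.parse_args(argv)


def check_prerequisites(args):
    # Prerequisites are documented and verified before the test body runs.
    if args.endpoint is None:
        return "no endpoint configured (use --endpoint or $TEST_ENDPOINT)"
    if not os.environ.get("X509_USER_PROXY"):
        return "no proxy found ($X509_USER_PROXY unset)"
    return None


def main(argv):
    args = parse_args(argv)
    problem = check_prerequisites(args)
    if problem is not None:
        print(f"SKIP: {problem}")   # simple text output, easy to post-process
        return 2
    # ... the actual test logic against args.endpoint would go here ...
    print(f"PASS: reached {args.endpoint}")
    return 0
```

A wrapper such as `sys.exit(main(sys.argv[1:]))` would turn the return value into the script's exit code, so a framework can distinguish a pass (0) from skipped prerequisites (2).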
Tests should produce simple output (plain text or simple HTML) so that a post-processing step can transform the results for use with a particular presentation framework. The report must also describe the test results for the following cases:
Correct input/Normal workflow
Testing the software as it should work, with correct input. Describe the different input combinations used for testing and when the test should pass or fail. Include a description of the output observed when running the test (in most cases a copy-and-paste of the commands or a screenshot is enough) that is reproducible by the validation team.
Error workflow - erroneous input
Testing the error codes: providing wrong input to the software should return the proper error messages. Describe the different input combinations used for testing and when the test should pass or fail. Include a description of the output observed when running the test (in most cases a copy-and-paste of the commands or a screenshot is enough) that is reproducible by the validation team.
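Both workflows can be exercised with the same small harness. The sketch below uses the Python interpreter itself as a stand-in for the command under test (an assumption made only to keep the example self-contained); each case records the exit code and the verbatim output for the report:

```python
import subprocess
import sys


def run_case(argv):
    """Run one test case and capture its exit code and combined output."""
    proc = subprocess.run(argv, capture_output=True, text=True)
    return proc.returncode, proc.stdout + proc.stderr


# Correct input / normal workflow: valid input must succeed (exit code 0).
rc_ok, out_ok = run_case([sys.executable, "-c", "print('result: 42')"])

# Error workflow: erroneous input must fail with a nonzero exit code and
# a meaningful error message, which the report records verbatim.
rc_err, out_err = run_case([sys.executable, "-c", "raise SystemExit('bad input')"])

print(f"normal workflow: rc={rc_ok}, output={out_ok.strip()!r}")
print(f"error workflow:  rc={rc_err}, output={out_err.strip()!r}")
```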
Non tested features
Describe in this section any features of the software that are not covered by a test.
Generic acceptance criteria
Specific acceptance criteria
There is a template for the creation of new criteria at EGI-InSPIRE:UMDQualityCriteria:Template.
Current criteria for the different components, classified by UMD Capabilities:
- Functional Capabilities
- Information Discovery EGI-InSPIRE:UMDQualityCriteria:InformationDiscovery
- Compute EGI-InSPIRE:UMDQualityCriteria:ComputingServices
- Compute Job Scheduling
- File Access EGI-InSPIRE:UMDQualityCriteria:FileAccess (previously EGI-InSPIRE:UMDQualityCriteria:StorageServices)
- File Encryption/Decryption
- Database Access
- Metadata Catalogue
- File Transfer
- File Transfer Scheduling
- Parallel Job
- Remote Instrumentation
- Security Capabilities
- Operational Capabilities
- Virtual Image Management
- Virtual Machine Management
- Monitoring (Nagios) EGI-InSPIRE:UMDQualityCriteria:Monitoring