
= UMD Quality Criteria =


All the software included in the Unified Middleware Distribution (UMD) must meet a set of Quality Criteria defined by EGI. The Quality Criteria can be classified into generic criteria, i.e. criteria which should hold for any component of the UMD, and specific criteria, i.e. criteria valid for a particular component only.


Any Technology Provider (TP) willing to have their software verified by the SA2 team should read the criteria carefully.


== QC Documents ==


The Quality Criteria (QC) are written following the classification of capabilities defined in the [https://documents.egi.eu/secure/ShowDocument?docid=272 UMD RoadMap], with one document for each type of capability. Note that a software component may cover QC specified in more than one of those documents.


== QC Roadmap ==


The definition of the QC is a continuous process driven by the requirements of the user and operations communities; however, the verification of new software releases is done against fixed releases of the QC documents. A new major version of the QC will be released at least every 6 months, although more frequent releases may occur if necessary.


Each new release of the QC will be announced on this page and on the TCB mailing list. Along with the announcement of each release of the QC specification, a date for its use in verification will be specified. Verification is expected to use new releases of the QC within 2 to 4 weeks of the release date.


Each QC release will include documentation describing:
* the major changes introduced in the version.
* the criteria to be deprecated in the next release of the QC, if any.


As soon as a major update of the QC is released, the next version will be made available as a draft in the DocDB. This draft will allow the TPs to plan their testing efforts.


=== Current QC Specification ===


The current QC Specification can be found in the [https://documents.egi.eu/public/ShowDocument?docid=240 QC document] in the EGI DocDB. Check the [[EGI-InSPIRE:UMDQualityCriteria:QC1 | release notes]].


=== Future Releases ===


{| class="wikitable sortable" style="border:1px solid black" cellspacing="0" cellpadding="3" border="1"
|- style="background:Lightgray"  align="left"
! DocDB link
! Expected Release Date
! Expected Verification Date
! More information
|-
| [https://documents.egi.eu/public/ShowDocument?docid=240 240]
| 10.02.2011
| 10.02.2011
| [[EGI-InSPIRE:UMDQualityCriteria:QC1 | release notes]]
|-
| -
| 01.08.2011
| 15.08.2011
| -
|}


=== Past QC releases ===


There are no past QC releases.

== Validation of Criteria ==

In order to be verified, the quality criteria are specified as a set of tests. These tests must ensure the correctness, completeness and security of each service. Software providers must include with each component a complete test plan that covers all the quality criteria. The test plan is composed of:
* A general description of the test plan.
* A test suite following the template described in the next section for each of the test cases included in:
** Generic criteria.
** Specific criteria applicable to the component.
** Bugs detected by EGI.

=== Template for test validation ===

For each test specified in the criteria, there must be a report. The report should contain the following sections; a minimal sketch of a test script that satisfies these requirements is shown at the end of this section.

==== Description ====
A description of the test objectives. Include any references to specific bugs that are tested.

==== Set up ====
* A description of the steps required prior to the execution of the test and, if relevant, of the differences between this set-up and the standard installation procedure for the software. Tests should be written in such a way that they can be integrated into different frameworks. Manual testing should be avoided.
* Tests must be fully configurable, either through options to the test script or through a configuration file (key/value pairs or exported environment variables). There must be no hardcoded paths.
* If the test has prerequisites (e.g. a valid proxy), this must be documented and a check at the beginning of the test must be implemented.
* Detailed documentation on how to run the test, so that the validation team can repeat the test if necessary.

==== Results ====
Tests should have a simple output (plain text or simple HTML) that allows a post-processing step, so that the results can be transformed for use with a particular presentation framework. Describe the test results for the following cases:

===== Correct input/Normal workflow =====
Testing the software as it should work, with correct input. Describe the different input combinations used for testing and state when the test should pass or fail. Include a description of the output observed when running the test (most of the time a copy-and-paste of the commands or a screenshot is enough) that is reproducible by the validation team.

===== Error workflow - erroneous input =====
Testing the error codes: providing wrong input to the software should return the proper error messages. Describe the different input combinations used for testing and state when the test should pass or fail. Include a description of the output observed when running the test (most of the time a copy-and-paste of the commands or a screenshot is enough) that is reproducible by the validation team.

==== Non tested features ====
Describe in this section any features of the software that lack a test.
 
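To make the template above concrete, the following is a minimal sketch of a test script that satisfies the set-up and results requirements. It is only an illustration: the service-ping command, the SERVICE_HOST variable and the proxy location are assumptions made for the example, not names defined by the criteria.

<pre>
#!/usr/bin/env python
"""Minimal sketch of a QC test script following the template above.

SERVICE_HOST, the service-ping command and the proxy location are
illustrative assumptions, not names fixed by the criteria.
"""
import os
import subprocess
import sys


def fail(message):
    # Simple, parseable text output so a post-processing step can
    # transform the results for a presentation framework.
    print("RESULT: FAIL - %s" % message)
    sys.exit(1)


def main():
    # Fully configurable: no hardcoded paths, all parameters come from
    # exported environment variables (command-line options would also do).
    host = os.environ.get("SERVICE_HOST")
    if not host:
        fail("SERVICE_HOST is not set")

    # Prerequisite check at the beginning of the test (e.g. a valid proxy).
    proxy = os.environ.get("X509_USER_PROXY")
    if not proxy or not os.path.isfile(proxy):
        fail("no proxy found; set X509_USER_PROXY to a valid proxy file")

    # Correct input / normal workflow: the test should pass.
    if subprocess.call(["service-ping", host]) != 0:
        fail("service-ping failed for correct input")

    # Error workflow / erroneous input: a proper error is expected.
    if subprocess.call(["service-ping", "no-such-host.invalid"]) == 0:
        fail("service-ping did not report an error for erroneous input")

    print("RESULT: PASS")


if __name__ == "__main__":
    main()
</pre>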
== Generic acceptance criteria ==

=== Documentation ===
Services in UMD must include comprehensive documentation, written in a uniform and clear style, which covers all of the following items:
* Functional description of the software.
* User documentation, including complete man pages of the commands and user guides.
* Complete API documentation (if there is an API).
* Administrator documentation that includes the installation procedure; detailed configuration of the service; procedures for starting, stopping and querying the service; ports (or port ranges) used and expected connections to those ports; and cron jobs needed by the service.
* A list of the processes that are expected to run, giving the typical load of the service, and a description of how state information is managed and where debugging information can be found (e.g. log files, and any files or databases containing service information).
* Notes on the testing procedure and the expected test results.
 
'''Verification''': existence of the documentation with all the required items.
 
=== Source Code Quality and Availability ===
The source code of each component of the UMD middleware should follow a coherent and clear programming style that aids the readability of the code and eases maintenance, testing, debugging, fixing, modification and portability of the software. Open source components must make their source code and license publicly available along with the binaries.
 
'''Verification''': for Open Source components, availability of the code and license. Source code quality metrics are desirable.
 
=== Management, Monitoring, Traceability ===
 
All the services must include tools related to:
* Starting, stopping, suspending, listing and querying the status of all the service daemons.
* Checking the responsiveness of all the service components or daemons.
* Checking the correctness of the behavior of the service components (the expected actions after a request are taken).
* Tracing all the user actions in the system (e.g. by generating logs).
 
Ideally, these tools should also be available remotely, allowing operators to react in a timely manner to problems in the infrastructure. All services should follow a uniform interface for remote management and monitoring. Monitoring should also be easy to integrate into existing monitoring systems such as Nagios.
 
'''Verification''':
The test suite must include test cases for:
* starting, stopping, suspending, and querying the status of the service.
* checking the responsiveness of the service (the expected ports are open and the expected answers to commands are received); a minimal probe sketch is shown after this list.
* checking the correctness of the service behavior (the expected actions after a request are taken).
* tracking user actions in the system (generation of logs and accounting information).
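Probes that follow the Nagios plugin exit-code convention (0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN) can be reused directly by monitoring systems such as Nagios. Below is a minimal sketch of a responsiveness check; the host and port are passed as arguments, since which ports to probe depends on the component being verified.

<pre>
#!/usr/bin/env python
"""Sketch of a Nagios-style responsiveness probe.

Exit codes follow the Nagios plugin convention:
0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN.
"""
import socket
import sys


def main():
    if len(sys.argv) != 3 or not sys.argv[2].isdigit():
        print("UNKNOWN - usage: check_service <host> <port>")
        return 3
    host, port = sys.argv[1], int(sys.argv[2])
    try:
        # The expected port must be open: try a TCP connection.
        connection = socket.create_connection((host, port), timeout=10)
        connection.close()
    except socket.error as error:
        print("CRITICAL - cannot connect to %s:%d (%s)" % (host, port, error))
        return 2
    print("OK - %s:%d accepts connections" % (host, port))
    return 0


if __name__ == "__main__":
    sys.exit(main())
</pre>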
 
=== Configuration ===
Tools for the automatic or semi-automatic configuration of the services must be provided with the software. These tools should allow the unassisted configuration of the services for the most common use cases while remaining customizable for advanced users. Complete manual configuration must always be possible.
 
'''Verification''': the test suite must include the configuration mechanisms and tools. YAIM is considered the preferred tool. A sketch of such a test case is shown below.
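The sketch below runs the component's configuration tool unassisted and then checks that the service reports a healthy status. CONFIG_CMD and STATUS_CMD are placeholders introduced for the example; for many components the configuration command would be a YAIM invocation, but the exact commands depend on the component being verified.

<pre>
#!/usr/bin/env python
"""Sketch of a configuration test case.

CONFIG_CMD and STATUS_CMD are placeholders: the real commands (for
example a YAIM invocation and the service status command) depend on
the component being verified.
"""
import os
import subprocess
import sys


def main():
    config_cmd = os.environ.get("CONFIG_CMD", "").split()
    status_cmd = os.environ.get("STATUS_CMD", "").split()
    if not config_cmd or not status_cmd:
        print("RESULT: FAIL - CONFIG_CMD and STATUS_CMD must be set")
        return 1

    # Unassisted configuration for the common case: the tool must run
    # to completion without manual intervention.
    if subprocess.call(config_cmd) != 0:
        print("RESULT: FAIL - configuration tool reported an error")
        return 1

    # After configuration the service must be up and report a healthy
    # status.
    if subprocess.call(status_cmd) != 0:
        print("RESULT: FAIL - service unhealthy after configuration")
        return 1

    print("RESULT: PASS")
    return 0


if __name__ == "__main__":
    sys.exit(main())
</pre>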
 
== Specific acceptance criteria ==
 
There is a template for the creation of new criteria at [[EGI-InSPIRE:UMDQualityCriteria:Template]].
 
The specific acceptance criteria of the UMD are classified according to the following areas:
* Security Services [[EGI-InSPIRE:UMDQualityCriteria:SecurityServices]]
* Computing Services: [[EGI-InSPIRE:UMDQualityCriteria:ComputingServices]]
* Data Services: [[EGI-InSPIRE:UMDQualityCriteria:StorageServices]]
* Information Services: [[EGI-InSPIRE:UMDQualityCriteria:InformationServices]]
