
Revision as of 11:41, 26 July 2010

UMD Quality Criteria

All software included in the Unified Middleware Distribution (UMD) must meet a set of Quality Criteria defined by EGI. The Quality Criteria are classified into generic criteria, i.e. criteria which should hold for any component of the UMD, and specific criteria, i.e. criteria valid for a particular component only.

Test Plan

In order to be verified, the quality criteria are specified as a set of tests. These tests must ensure the correctness, completeness and security of each service. Software providers must include with each component a complete test plan that covers all the quality criteria. The test plan is composed of:

  • General description of the test plan.
  • A Test Suite with documentation for each of the test cases (objective of the test, how to run it, expected output, possible errors, pass/fail criteria) included in:
    • Generic criteria.
    • Specific criteria applicable to the component.
  • Test results for all the specified tests.

In the case of revision releases, the test plan must cover bugs fixed in the release.
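As a minimal sketch of how one entry in such a test suite could be documented and automated, the test case below carries its objective, expected output and pass/fail criteria in its docstring. The service reply format and field names are hypothetical, and the reply is stubbed for illustration.

```python
import unittest


def parse_status_reply(reply: str) -> dict:
    """Parse a 'key: value' status reply into a dict (helper for the test)."""
    fields = {}
    for line in reply.strip().splitlines():
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip()
    return fields


class StatusQueryTest(unittest.TestCase):
    """Objective: the status query must report the service state.
    How to run: python -m unittest <this module>
    Expected output: a reply containing a 'state' field set to 'running'.
    Possible errors: missing field, unexpected state value.
    Pass/fail criteria: the test passes only if both assertions hold.
    """

    def test_status_reply_reports_running_state(self):
        reply = "state: running\nuptime: 3600"  # stubbed reply for illustration
        fields = parse_status_reply(reply)
        self.assertIn("state", fields)
        self.assertEqual(fields["state"], "running")
```

A real test case would obtain the reply from the running service rather than a stub, but the documentation fields required by the test plan stay the same.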

Generic acceptance criteria


Documentation

Services in UMD must include comprehensive documentation written in a uniform and clear style, covering all of the following items:

  • Functional description of the software.
  • User documentation, including complete man pages of the commands and user guides.
  • Complete API documentation (if there is an API).
  • Administrator documentation that includes the installation procedure; detailed configuration of the service; procedures for starting, stopping and querying the service; ports (or port ranges) used and expected connections to those ports; and cron jobs needed by the service.
  • List of processes that are expected to run, together with the typical load of the service. Description of how state information is managed and where debugging information is kept (e.g. log files, and any files or databases containing service information).
  • Notes on the testing procedure and expected test results.

Verification: existence of the documentation with all the required items.

Source Code Quality and Availability

The source code of each component of the UMD middleware should follow a coherent and clear programming style that improves the readability of the code and eases maintenance, testing, debugging, fixing, modification and portability of the software. Open source components must make their source code and license publicly available together with the binaries.

Verification: for Open Source components, availability of the code and license. Source code quality metrics are desirable.

Management, Monitoring, Traceability

All services must include tools for:

  • Starting, stopping, suspending, listing and querying the status of all the service daemons.
  • Checking the responsiveness of all the service components or daemons.
  • Checking the correctness of the service components' behavior (the expected actions are taken after a request).
  • Tracing all the user actions in the system (e.g. by generating logs).

Ideally, these tools should also be available remotely, allowing operators to react to problems in the infrastructure in a timely manner. All services should follow a uniform interface for remote management and monitoring. Monitoring should also integrate easily with existing monitoring systems such as Nagios.
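A responsiveness check of the kind described above can be sketched as a Nagios-compatible probe: it tests that the service port accepts TCP connections and exits with the standard Nagios codes (0 = OK, 2 = CRITICAL). The host name and port used in the usage comment are purely illustrative.

```python
import socket

OK, CRITICAL = 0, 2


def check_port(host: str, port: int, timeout: float = 5.0) -> int:
    """Return a Nagios exit code depending on whether the port answers."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            print(f"OK - {host}:{port} accepts connections")
            return OK
    except OSError as err:
        print(f"CRITICAL - {host}:{port} unreachable ({err})")
        return CRITICAL


# Usage (hypothetical host and port):
#   sys.exit(check_port("bdii.example.org", 2170))
```

A production probe would also check that the service gives the expected answer to commands, not just that the port is open, but the exit-code convention is what lets Nagios consume the result.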

Verification: the test suite must include test cases for:

  • starting, stopping, suspending, and querying the status of the service
  • checking the responsiveness of the service (expected ports open and expected answers to received commands)
  • checking the correctness of the service behavior (the expected actions are taken after a request)
  • tracking of user actions in the system (generation of logs and accounting information)


Configuration

Tools for the automatic or semi-automatic configuration of the services must be provided with the software. These tools should allow unassisted configuration of the services for the most common use cases while remaining customizable for advanced users. Complete manual configuration must always be possible.
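The "unassisted by default, customizable for advanced users" requirement can be sketched as a defaults-plus-overrides scheme: built-in defaults cover the common case, and an optional file of key=value pairs overrides them. The file format, keys and default values here are hypothetical, not part of any UMD specification.

```python
from pathlib import Path
from typing import Optional

# Hypothetical defaults covering the common, unassisted case.
DEFAULTS = {"port": "2170", "log_level": "INFO", "state_dir": "/var/lib/svc"}


def load_config(path: Optional[str] = None) -> dict:
    """Start from the defaults, then apply any key=value overrides from path."""
    config = dict(DEFAULTS)
    if path and Path(path).exists():
        for line in Path(path).read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#"):  # skip blanks and comments
                key, _, value = line.partition("=")
                config[key.strip()] = value.strip()
    return config
```

Calling `load_config()` with no arguments yields a working common-case configuration, while an advanced user can override any subset of keys in a file; full manual configuration simply means providing every key.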

Verification: the test suite must include the configuration mechanisms and tools. YAIM is considered the preferred tool.

Specific acceptance criteria

The specific acceptance criteria of the UMD are classified according to the following areas:

* Security Services: [[EGI-InSPIRE:UMDQualityCriteria:SecurityServices]]
* Computing Services: [[EGI-InSPIRE:UMDQualityCriteria:ComputingServices]]
* Data Services: [[EGI-InSPIRE:UMDQualityCriteria:StorageServices]]
* Information Services: [[EGI-InSPIRE:UMDQualityCriteria:InformationServices]]