= Generic acceptance criteria =
== Documentation ==
Services in UMD must include comprehensive documentation, written in a uniform and clear style, covering all of the following items:
* Functional description of the software.
* User documentation, including complete man pages of the commands and user guides.
* Complete API documentation (if there is an API).
* Administrator documentation that includes the installation procedure; detailed configuration of the service; procedures for starting, stopping and querying the service; ports (or port ranges) used and expected connections to those ports; and cron jobs needed by the service.
* List of processes that are expected to run, with the typical load of the service; how state information is managed; and debugging information (e.g. log files and any other files or databases containing service information).
* Notes on the testing procedure and expected tests results.
'''Verification''': existence of the documentation with all the required items.
== Testing ==
UMD software must support testing of its features and functionality. The tests must ensure the correctness, completeness, security and scalability of each service. Error situations and input validation must be covered by the test suite. The software provider must define a test plan for each service and provide a test suite and the results of running it against each new release. Interoperability with the rest of the UMD software must be explicitly tested. A global test suite for the UMD distribution will guarantee the correct behavior of the complete set of services in a controlled environment.
'''Verification''': existence of the test plan, test suite and the results of the test plan.
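As an illustration of what a test-suite entry covering both correctness and error situations could look like, the sketch below exercises a hypothetical input-validation helper (the <code>parse_port_range</code> function is invented for this example; a real UMD test plan would target the actual service commands and APIs):

```python
import unittest


def parse_port_range(spec):
    """Hypothetical helper: parse a "low-high" port range into a tuple.

    Raises ValueError on malformed input, as a service accepting such a
    configuration value would be expected to do.
    """
    low, sep, high = spec.partition("-")
    if not sep or not low.isdigit() or not high.isdigit():
        raise ValueError("expected a range of the form low-high")
    low, high = int(low), int(high)
    if not (0 < low <= high <= 65535):
        raise ValueError("ports must be within 1-65535 and low <= high")
    return low, high


class PortRangeTest(unittest.TestCase):
    def test_correctness(self):
        # A valid range is parsed into its numeric bounds.
        self.assertEqual(parse_port_range("20000-25000"), (20000, 25000))

    def test_error_situations(self):
        # Malformed and out-of-range inputs must be rejected, not accepted.
        for bad in ("", "20000", "abc-def", "70000-80000", "25000-20000"):
            with self.assertRaises(ValueError):
                parse_port_range(bad)
```

Cases like these can be run with <code>python -m unittest</code> and their results attached to each release, as the criterion requires.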
== Configuration ==
Tools for the automatic or semi-automatic configuration of the services must be provided with the software. These tools should allow unassisted configuration of the services for the most common use cases while remaining customizable for advanced use cases. Complete manual configuration must always be possible.
'''Verification''': the test suite must include the configuration mechanisms and tools. YAIM is considered the preferred tool.
== Management, Monitoring, Traceability ==
All the services must include tools (and tests for them) related to:
* Starting, stopping, suspending, listing and querying the status of all the service daemons.
* Checking the responsiveness of all the service components or daemons (expected ports open and expected answers to commands received).
* Checking the correctness of the service components' behavior (the expected actions after a request are taken).
* Tracing all the user actions in the system (e.g. by generating logs).
Ideally, these tools should also be available remotely, allowing operators to react in a timely manner to problems in the infrastructure. All the services should follow a uniform interface for remote management and monitoring.
Monitoring information should also be easy to integrate into existing monitoring systems such as Nagios.
'''Verification''': the test suite must include tests of this functionality.
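As a sketch of the responsiveness check described above, the probe below (an assumed implementation, not part of the UMD criteria themselves) verifies that a daemon accepts TCP connections on an expected port, following the standard Nagios plugin exit-code convention (0 = OK, 2 = CRITICAL):

```python
import socket

# Standard Nagios plugin exit codes.
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3


def check_tcp(host, port, timeout=3.0):
    """Return a (status, message) pair suitable for a Nagios-style probe."""
    try:
        # A successful TCP connect means a daemon is listening on the port.
        with socket.create_connection((host, port), timeout=timeout):
            return OK, "OK - %s:%d accepts connections" % (host, port)
    except OSError as exc:
        return CRITICAL, "CRITICAL - %s:%d unreachable (%s)" % (host, port, exc)
```

A wrapper script would print the message and exit with the status, so the same probe serves both local tests and a Nagios check; a real service check would additionally verify protocol-level answers to commands, as the criterion asks.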
== Interoperability ==
UMD services should be interoperable with each other. All the services should assure their correct functionality with the rest of the UMD middleware for a given major release before entering the distribution. Ideally, backward compatibility between major releases should be maintained. Interoperability with other middleware distributions is also recommended; therefore, compliance with existing and future standards should be a priority for all the distributed software.
'''Verification''': interoperability test of the service, assuring correct behavior within the UMD environment.
== Source Code Quality and Availability ==
The source code of each component of the UMD middleware should follow a coherent and clear programming style that improves the readability of the code and eases maintenance, testing, debugging, fixing, modification and portability of the software. Open source components must make their source code and license publicly available along with the binaries.
'''Verification''': for open source components, availability of the code and license. Source code quality metrics are desirable.
= Specific acceptance criteria =
This section will detail the specific acceptance criteria for each of the services that are part of the UMD.
* Computing Services: [[EGI-InSPIRE:UMDQualityCriteria:ComputingServices]]
* Storage Services: [[EGI-InSPIRE:UMDQualityCriteria:StorageServices]]
* Information Services: [[EGI-InSPIRE:UMDQualityCriteria:InformationServices]]
* Data Management Services: [[EGI-InSPIRE:UMDQualityCriteria:DataManagementServices]]
* Workload Management Services: [[EGI-InSPIRE:UMDQualityCriteria:WorkloadManagementServices]]
= UMD Quality Criteria =
All the software included in the Unified Middleware Distribution (UMD) must meet a set of Quality Criteria defined by EGI. The Quality Criteria can be classified into generic criteria, i.e. criteria which should hold for any component of the UMD, and specific criteria, i.e. criteria valid for a particular component only.
Any Technology Provider (TP) willing to have their software verified by the SA2 team should read these criteria carefully.
The Quality Criteria (QC) are written following the classification of capabilities defined in the UMD Roadmap, with one document for each type of capability. Take into account that a software component may be covered by QC specified in more than one of those documents.
QC definition is a continuous process driven by the requirements of the user and operations communities; however, the verification of new software releases will be done against fixed releases of the QC documents. At least every 6 months a new major version of the QC will be released, although more frequent releases may occur if necessary.
Each new release of the QC will be announced on this page and on the TCB mailing list. Along with the announcement of each release of the QC specification, a date for its use in verification will be specified. It is expected that verification will use new releases of the QC within 2 to 4 weeks from the release date.
Each QC release will contain documentation with:
* the major changes introduced in the version;
* the criteria to be deprecated in the next release of the QC, if any.
As soon as a major update of the QC is released, the next version will be made available as a draft in the DocDB. This draft will allow TPs to plan their testing efforts.
== Current QC Specification ==
The current QC specification can be found in the QC document in the EGI DocDB.
Check the release notes.
{| class="wikitable"
! Expected Release Date !! Expected Verification Date !!
|-
| 10.02.2011 || 10.02.2011 || release notes
|-
| 01.08.2011 || 15.08.2011 ||
|}
== Past QC releases ==
There are no past QC releases.