EGI Verifier Guideline QC5
The main objective of this guideline is to help new SA2 verifiers complete the Verification process successfully.
When a new product is available, the TP has to follow the Software Provisioning Process. Once the software is correctly uploaded to the repository, the release enters the verification phase. The TP must provide all the necessary information to the verifier (QCV) so that the QCV can assess that the TP has tested the quality of the software in advance. Depending on the type of release, the QCV will take different actions.
The verification process starts when the following pre-conditions are met:
- RT ticket is in state Open.
- The RolloutProgress is set to Unverified.
- CommunicationStatus is OK.
- Owner is set to nobody.
If all these conditions are met, the verification process may be started.
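The four preconditions above can be expressed as a single check. The sketch below is purely illustrative: the `ticket` dict and the helper are hypothetical, and the field names simply mirror the RT fields named in the list (this is not a real RT API).

```python
# Hypothetical sketch: the verification preconditions as one predicate.
# Field names mirror the RT fields listed above; nothing here talks to RT.

def ready_for_verification(ticket: dict) -> bool:
    """Return True if an RT ticket meets all four verification preconditions."""
    return (
        ticket.get("Status") == "Open"
        and ticket.get("RolloutProgress") == "Unverified"
        and ticket.get("CommunicationStatus") == "OK"
        and ticket.get("Owner") == "nobody"
    )

ticket = {
    "Status": "Open",
    "RolloutProgress": "Unverified",
    "CommunicationStatus": "OK",
    "Owner": "nobody",
}
print(ready_for_verification(ticket))  # True
```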
Once the verification ticket meets the preconditions described above, the QCV must perform the following steps:
- Set the RT ticket Owner to the current QCV.
- Set UMDRelease to the appropriate UMD Release (currently 1) and save the state.
- Change RolloutProgress to In Verification to start the actual verification.
Each product has a specific template that includes all QC that the product must comply with. Templates are available at Verification Reports and Executive Summary templates and are created according to the QC products mapping. These documents are updated whenever new UMD Quality Criteria are released.
Make sure that you are using the correct documents; in case of doubt, consult the EGI Quality Criteria Dissemination and EGI Quality Criteria Verification pages, which contain the latest information.
Verifiers must fill in two documents when the verification is finished: the Executive Summary (QC_Verification -Executive_Summary _v?.doc) and the specific product template (QC_Verification_Template - <PRODUCT_NAME>_v?.xls). **These documents should be converted to PDF and concatenated into one final PDF Verification Report**. The version number represents the QC version used to create these files. The fields to fill in the QC_Verification_Template excel file are:
- Y, when the product meets the criteria.
- N, when the product does not meet the criteria.
- NA, when the criteria are not applicable to the product (e.g. API documentation for products without a public API).
- TP, when the criteria were tested by the Technology Provider and the verifier trusts the results of the tests.
- VLD, when the criteria were tested by the validation team.
- Comments: include here any relevant comments or links to further information for the specified criteria.
Level of Testing
The verification process is different for UMD Revision, Minor or Major releases.
- Major releases (may not be backwards compatible):
- Verifier MUST actively assess all assigned QCs (executing specific tests, reading available documentation, etc).
- Test the new functionalities.
- Product installation from scratch (or upgrade, if it's supported by the product).
- Minor releases (backwards compatible):
- Verifiers only check QCs affected by the update changes (described in the TP release notes).
- Verifiers must fill these QCs into the QC_Verification_Template excel file; the remaining QCs should be left blank or annotated with the comment: "Minor release, this QC was already verified.".
- Package update installation and verification.
- Product installation from scratch.
- Revision releases (backwards compatible):
- Verifiers only check the new package installation and upgrades; no further testing is needed.
- Optionally, verifiers can verify the new bug fixes.
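The per-release testing policy above can be summarised in a small lookup. This is an illustrative sketch only; the `required_checks` helper and its strings are hypothetical paraphrases of the lists above, not part of any tool.

```python
# Hypothetical sketch of the per-release-type testing policy described above.

def required_checks(release_type: str) -> list[str]:
    """Map a UMD release type to the checks the verifier must perform."""
    policy = {
        "major": [
            "actively assess all assigned QCs",
            "test the new functionalities",
            "install from scratch (or upgrade if supported)",
        ],
        "minor": [
            "check QCs affected by the update (see TP release notes)",
            "package update installation and verification",
            "install from scratch",
        ],
        "revision": [
            "check new package installation and upgrades",
            "optionally verify the new bug fixes",
        ],
    }
    return policy[release_type.lower()]

print(required_checks("Revision")[0])  # check new package installation and upgrades
```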
If the Verifier finds problems or issues, either they are clarified within the ticket by the verification team or, if the problem requires the interaction of the Technology Provider (e.g. missing documentation), a GGUS ticket should be opened.
- All new GGUS tickets should be created with the default priority; the short description field must begin with "Verification", followed by a description of the issue.
If the verifier finds a critical or show-stopper issue, the GGUS ticket must be set to top priority.
- The ticket body should describe the criticality of the issue: whether or not it is a show stopper, whether possible workarounds exist, etc.
- The DMSU support unit will then route it to the technology providers as they see fit.
- For critical or show-stopper issues, the Verifier should change the RT ticket RolloutProgress to the Waiting for Response status.
- All GGUS ticket links must be included in the RelatedGGUSTickets RT field. Verifiers should also include links to all the created GGUS tickets as a reply in the verification RT ticket.
- Once the problem is solved, the RolloutProgress must be changed back to In Verification.
- Each GGUS ticket link must be included in the Verification Executive Summary.
There are two kinds of RT comments or replies (all comments are visible to watchers and TPs):
- Comment: A background communication.
- Reply: A public response. GGUS and TP-opened tickets must be included as a Reply.
Verifiers Mailing list
For any verification doubts or internal questions, a Verifiers mailing list is available: sw-rel-qc(at)mailman.egi.eu
All verifiers are included in this mailing list through EGI SSO.
TP Tickets Response Time
The SLA with EMI states the following response times for GGUS tickets:
Note, these are not resolution times!
|Priority||Response time|
|Top Priority||4 hours|
|Urgent||2 working days|
|Less Urgent||15 working days|
EMI SLA, pages 13 and 14, available at https://documents.egi.eu/document/461
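The SLA response times above can be kept as a simple lookup when tracking deadlines. This is an illustrative sketch; the `SLA_RESPONSE` table is hypothetical, and working days are approximated as calendar days here for simplicity.

```python
from datetime import timedelta

# Illustrative lookup of the EMI SLA *response* times (not resolution times).
# NOTE: "working days" are approximated as calendar days in this sketch.
SLA_RESPONSE = {
    "Top Priority": timedelta(hours=4),
    "Urgent": timedelta(days=2),        # 2 working days
    "Less Urgent": timedelta(days=15),  # 15 working days
}

print(SLA_RESPONSE["Top Priority"])  # 4:00:00
```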
QC tests are Mandatory (M) or Optional (O).
- A product is REJECTED if it fails the installation or configuration process.
- A product is REJECTED if it fails ANY Mandatory QC.
- A product is VERIFIED if it passes ALL assigned QCs.
- A product is still VERIFIED even if it fails an Optional QC.
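The accept/reject rules above amount to a small decision function. The sketch below is hypothetical: each QC result is modelled as a `(mandatory, passed)` pair and `install_ok` stands for the installation/configuration check; none of these names come from a real tool.

```python
# Hypothetical sketch of the VERIFIED/REJECTED decision rules listed above.
# qc_results: list of (mandatory: bool, passed: bool) pairs.

def verification_outcome(install_ok: bool, qc_results: list[tuple[bool, bool]]) -> str:
    if not install_ok:
        return "REJECTED"  # fails the installation or configuration process
    if any(mandatory and not passed for mandatory, passed in qc_results):
        return "REJECTED"  # fails a Mandatory QC
    return "VERIFIED"      # Optional-QC failures do not block acceptance

# A failed Optional QC (False, False) still yields VERIFIED:
print(verification_outcome(True, [(True, True), (False, False)]))  # VERIFIED
```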
Note that mandatory/optional mixes two aspects of the verification: a) the verifier's obligation to execute a QC test (mandatory vs. optional), and b) the verification outcome when a fault is detected (critical vs. non-critical).
Assuming the following definition of CRITICAL/NON-CRITICAL:
- A CRITICAL Quality Criterion will cause immediate and unconditional rejection of the tested product if a failure of any type is detected during testing.
- A NON-CRITICAL Quality Criterion may still lead to acceptance of the tested product if a documented(!) workaround to avoid or mitigate the failure exists.
And using the definition of MANDATORY/OPTIONAL:
- A MANDATORY Quality Criterion must be assessed or verified for any software release supplied by a Technology Provider.
- An OPTIONAL Quality Criterion may be assessed or tested by the verifier.
With the definition of ASSESSING/TESTING as:
- A verifier is ASSESSING a Quality Criterion by analysing documentation (release notes, known bugs list, documented changes, test results, etc.) supplied by the Technology Provider, without actual independent test execution.
- A Quality Criterion is TESTED if the verifier independently executes the test associated with the Quality Criterion, using the same sources of documentation but a test infrastructure independent from the Technology Provider.
These aspects provide a matrix guide for verifiers to produce reliable and fair outcomes:
'''1) MANDATORY and CRITICAL QC''': '''MUST''' be assessed by the verifier, and '''MUST''' be tested by the verifier, for any software release (whether major, minor or revision).
'''2) MANDATORY and NON-CRITICAL QC''': '''MUST''' be assessed for minor and revision releases, '''MAY''' be tested for minor releases (depending on other circumstances) and '''MUST''' be tested for major releases.
'''3) OPTIONAL and NON-CRITICAL QC''': '''MAY''' be assessed for any release, and '''MAY''' be tested for any release, if time permits.
'''4) OPTIONAL and CRITICAL QC''': This combination does not make sense.
For cases 1-3, the criticality determines the outcome of the verification based on detected failures.
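The matrix above can be encoded as a function from (mandatory, critical, release type) to the verifier's obligations. This is an illustrative sketch of the four cases; the `qc_obligation` helper and the "MUST"/"MAY" return values are hypothetical encodings, and the revision-release testing obligation for case 2 (left unstated above) is assumed to be "MAY".

```python
# Hypothetical encoding of the MANDATORY/OPTIONAL x CRITICAL/NON-CRITICAL matrix.
# Returns a pair (assess, test) of obligations, each "MUST" or "MAY".
# release is "major", "minor" or "revision".

def qc_obligation(mandatory: bool, critical: bool, release: str) -> tuple[str, str]:
    if not mandatory and critical:
        # Case 4: OPTIONAL + CRITICAL does not make sense.
        raise ValueError("OPTIONAL and CRITICAL QC: combination does not make sense")
    if mandatory and critical:
        return ("MUST", "MUST")  # Case 1: any release type
    if mandatory:
        # Case 2: non-critical; testing is only mandatory for major releases.
        # (Assumption: revision releases get "MAY", as the text leaves them open.)
        test = "MUST" if release == "major" else "MAY"
        return ("MUST", test)
    return ("MAY", "MAY")        # Case 3: optional, non-critical

print(qc_obligation(True, False, "minor"))  # ('MUST', 'MAY')
```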
When the Verification template is finished, an Executive Summary is created (use QC_Verification -Executive_Summary _v?.doc as a template). These templates are available at Verification Reports and Executive Summary templates. This document includes:
- A summary of tests failed and passed (critical and non-critical) and the verifier's comments (tests executed, opened GGUS tickets, installed packages, etc.).
- A list of comments for other teams involved in the Software Provisioning process (SR, QC, ...).
This Executive Summary should be concatenated into a single PDF with the verification templates and published as the final Verification Report.
Publication of Verification Report
In order to store the verification report, a new space must be created in the DocumentDB. The new document must contain the following information:
- Title: QC Verification Report: <PRODUCT_NAME>-<VERSION>.
- The File field is used to upload the Verification Report PDF. The Verification Report should be named following the nomenclature: QC_Verification--Report_<PRODUCT_NAME>_<VERSION>.pdf.
- Keywords field: space-separated keyword names; must include Quality, Criteria, Verification, etc.
- Media type must be set to Document.
- Status must be set to FINAL.
- View must be set to public.
- Modify should be set to inspire-SA2.
- The TOPICS space field must be set to Software Provisioning Verification Reports and WP5.
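The field values above can be assembled programmatically before filling in the DocumentDB form. The sketch below is purely illustrative: the `report_metadata` helper and its dict keys simply mirror the field names in the list and do not interact with the real DocumentDB.

```python
# Illustrative helper assembling the DocumentDB fields listed above.
# Nothing here talks to the real DocumentDB; the keys mirror the field names.

def report_metadata(product: str, version: str) -> dict:
    return {
        "Title": f"QC Verification Report: {product}-{version}",
        "File": f"QC_Verification--Report_{product}_{version}.pdf",
        "Keywords": "Quality Criteria Verification",  # space-separated keywords
        "Media type": "Document",
        "Status": "FINAL",
        "View": "public",
        "Modify": "inspire-SA2",
        "Topics": "Software Provisioning Verification Reports and WP5",
    }

# Hypothetical product name/version, for illustration only:
print(report_metadata("someproduct", "1.0.0")["File"])
# QC_Verification--Report_someproduct_1.0.0.pdf
```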
The generated DocumentDB link must be included in the QualityCriteriaVerificationReport field of the verification RT ticket.
End of Verification (final steps)
Once the report is stored in the doc DB and linked in the RT ticket, the result of the verification must be set by changing the RolloutProgress field of the ticket.
- If the product is accepted then change this field to the "StageRollout" value. This will cause the Rollout teams to continue with the software provisioning process.
- If the product is not accepted, then change the value to "Rejected", causing the process to stop.
- If the rejection involves two or more RT tickets, the verifier MUST fill the QualityCriteriaVerificationReport field in each affected RT ticket.
- Special case for Documentation QCs: if the product does not meet a mandatory Documentation QC, the verifier must set the "Failed against mandatory documentation QC" field to true.
- When the process is finished (the product is accepted or rejected), the verifier must fill in the "Time Worked" RT field to record the actual hours/minutes spent on the verification process.
UMD release schedule
- t-2 weeks: Freeze the release date. The next regular UMD release (i.e. excluding emergency releases!) will be published on that date, no matter what.
- t-1 week: Freeze the contents of the upcoming UMD release. Any product in the Software Provisioning process that did *not* get accepted by StagedRollout by that day will be pushed to the next release (or even later). Technically, on that day, all products with an RT ticket in state "UMDStore" in the "sw-rel" queue will be taken into the UMD Composer, all at once (see below).
- t-1 week: Compose the UMD release. The products that made the cut-off date will be taken into the next UMD release in the UMD Composer. Verification and StagedRollout assemble the necessary documentation, such as Known issues, Installation notes, etc., for the release's Wiki page, and summaries for the UMD release itself that will be published in the repository later on. Two days are considered sufficient, particularly when the relevant information is pre-assembled in a temporary Wiki space.
- t-3 days: A formal test release is prepared, including all selected products and including(!) the release documentation (release notes, etc.). The point is to have a complete set for final reviews. Further test releases are prepared as required and documented.
- t: The UMD Release is published for production. The aim is to publish UMD releases on a Monday, or in the first half of the week up to and including Wednesday.
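The schedule above is a fixed set of offsets from the publication date t, so the milestone dates can be derived mechanically. The sketch below is illustrative; the `umd_schedule` helper and its label strings are hypothetical.

```python
from datetime import date, timedelta

# Illustrative computation of the UMD milestone dates for a planned
# publication date t, following the t-2w / t-1w / t-3d schedule above.

def umd_schedule(t: date) -> dict[str, date]:
    return {
        "freeze release date": t - timedelta(weeks=2),
        "freeze contents / compose release": t - timedelta(weeks=1),
        "formal test release": t - timedelta(days=3),
        "publication": t,
    }

# Example with a hypothetical Monday publication date:
print(umd_schedule(date(2012, 7, 2))["freeze release date"])  # 2012-06-18
```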