EGI Verifier Guideline QC5

The main objective of this guideline is to help new SA2 verifiers complete the verification process successfully.

Verification Preconditions

When a new product is available, the TP has to follow the Software Provisioning Process. Once the software is correctly uploaded to the repository, the release enters the verification phase. The TP must provide all the necessary information to the verifier (QCV) so that the QCV can confirm that the TP has tested the quality of the software in advance. Depending on the type of release, different actions will be taken by the QCV.

The verification process starts when the following pre-conditions are met:

  1. RT ticket is in state Open.
  2. The RolloutProgress is set to Unverified.
  3. CommunicationStatus is OK.
  4. Owner is set to nobody.

If these conditions are met, the verification process may start.
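
A minimal sketch of how these checks could be expressed, assuming the relevant RT ticket fields are available as a plain dictionary (how they are fetched, e.g. through the RT web interface or its REST interface, is left open):

def preconditions_met(ticket: dict) -> bool:
    """Return True if the verification ticket is ready to be picked up."""
    return (
        ticket.get("Status", "").lower() == "open"
        and ticket.get("RolloutProgress") == "Unverified"
        and ticket.get("CommunicationStatus") == "OK"
        and ticket.get("Owner") == "nobody"
    )

# Example ticket snapshot (hypothetical values) that meets all preconditions.
print(preconditions_met({
    "Status": "open",
    "RolloutProgress": "Unverified",
    "CommunicationStatus": "OK",
    "Owner": "nobody",
}))  # True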

Verification Start

Once the verification ticket meets the preconditions described above, the QCV must perform the following steps:

  1. Set the RT ticket Owner to the current QCV.
  2. Set UMDRelease to the appropriate UMD Release (currently 1) and save the state.
  3. Change RolloutProgress to In Verification to start the actual verification.
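
These three steps can be scripted; the sketch below assumes the EGI RT instance exposes the standard REST 1.0 interface, accepts user/pass parameters, and names the custom fields exactly UMDRelease and RolloutProgress. The base URL and credentials are placeholders; adjust to the real setup.

import requests

RT_URL = "https://rt.egi.eu"                        # assumed base URL
CREDS = {"user": "my-verifier", "pass": "secret"}   # placeholder credentials

def start_verification(ticket_id: int, verifier: str, umd_release: str = "1") -> None:
    # REST 1.0 takes a "content" blob of "Field: value" lines;
    # custom fields are addressed as CF.{Name}.
    content = "\n".join([
        f"Owner: {verifier}",
        f"CF.{{UMDRelease}}: {umd_release}",
        "CF.{RolloutProgress}: In Verification",
    ])
    resp = requests.post(f"{RT_URL}/REST/1.0/ticket/{ticket_id}/edit",
                         data={**CREDS, "content": content})
    resp.raise_for_status()
    print(resp.text)  # RT reports success or failure in the response body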

Verification Template

Each product has a specific template that includes all the QCs the product must comply with. Templates are available at Verification Reports and Executive Summary templates and are created according to the QC products mapping. These documents are updated whenever new UMD Quality Criteria are released.

Warning:
Make sure that you are using the correct documents. In case of doubt, consult the EGI Quality Criteria Dissemination and EGI Quality Criteria Verification pages, which contain the latest information.


The fields to fill in the template are:

  • Accepted:
    • Y, when the product meets the criteria
    • N, when the product does not meet the criteria
    • NA, when the criteria are not applicable to the product (e.g. API documentation for products without a public API)
  • Tested:
    • TP, when the criteria were tested by the Technology Provider and the validator trusts the results of the tests
    • VLD, when the criteria were tested by the validation team
  • Comments: include here any relevant comments or links to more information for the specified criteria.
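
As an illustration, one row of the template could be represented and sanity-checked as below; the criterion identifier and values are hypothetical:

ACCEPTED_VALUES = {"Y", "N", "NA"}
TESTED_VALUES = {"TP", "VLD"}

def template_row(criterion: str, accepted: str, tested: str, comments: str = "") -> dict:
    """Build one verification-template row, rejecting values outside the allowed sets."""
    if accepted not in ACCEPTED_VALUES:
        raise ValueError(f"{criterion}: Accepted must be one of {sorted(ACCEPTED_VALUES)}")
    if tested not in TESTED_VALUES:
        raise ValueError(f"{criterion}: Tested must be TP or VLD")
    return {"Criterion": criterion, "Accepted": accepted,
            "Tested": tested, "Comments": comments}

# Hypothetical example: a documentation criterion tested by the validation team.
print(template_row("GENERIC_DOC_1", "Y", "VLD", "Release notes reviewed"))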

Criteria Verification

Level of Testing

The product under verification must be installed and tested in the EGI Verification Testbed; however, the verification process differs for UMD major and minor releases:

  • Major releases (may not be backwards compatible):
    • The Verifier MUST actively assess all assigned QCs (executing specific tests, reading the available documentation, etc.).
    • Product installation from scratch (or upgrade, if it is supported by the product).
  • Minor releases (backwards compatible):
    • Verifiers only check the QCs affected by the update changes.
    • Package update installation and verification.
    • Product installation from scratch.


Handling Issues

If the Verifier finds problems or issues, they are either clarified within the ticket by the verification team or, if the problem requires interaction with the Technology Provider (e.g. missing documentation), a GGUS ticket should be opened.

  1. If a GGUS ticket is opened, it will be assigned to the DMSU support unit, which will then route it to the technology providers as it sees fit.
  2. When opening a GGUS ticket, let DMSU decide on the priority, but describe the criticality of the issue in the ticket body: whether it is a show stopper, whether there are possible workarounds, etc.
  3. The Verifier should change the RT ticket RolloutProgress to the Waiting for Response status and add a link to the GGUS ticket in the verification RT ticket. Verifiers must include all created GGUS tickets as a Reply in the product RT ticket.
  4. Once the problem is solved, the RolloutProgress must be changed back to In Verification.
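
A hedged sketch of steps 3 and 4 through RT's REST 1.0 interface, under the same assumptions as the earlier sketch (base URL, credentials and custom-field names are placeholders; the GGUS ticket URL passed in is whatever ticket was actually opened):

import requests

RT_URL = "https://rt.egi.eu"                        # assumed base URL
CREDS = {"user": "my-verifier", "pass": "secret"}   # placeholder credentials

def record_ggus_ticket(ticket_id: int, ggus_url: str) -> None:
    # Public Reply ("correspond" in RT terms), so watchers see the GGUS link.
    reply = "\n".join([
        f"id: {ticket_id}",
        "Action: correspond",
        f"Text: Opened GGUS ticket {ggus_url} for this issue.",
    ])
    requests.post(f"{RT_URL}/REST/1.0/ticket/{ticket_id}/comment",
                  data={**CREDS, "content": reply}).raise_for_status()
    # Put the verification on hold until the Technology Provider answers.
    requests.post(f"{RT_URL}/REST/1.0/ticket/{ticket_id}/edit",
                  data={**CREDS, "content": "CF.{RolloutProgress}: Waiting for Response"}
                  ).raise_for_status()

def resume_verification(ticket_id: int) -> None:
    # Step 4: switch the ticket back once the problem is solved.
    requests.post(f"{RT_URL}/REST/1.0/ticket/{ticket_id}/edit",
                  data={**CREDS, "content": "CF.{RolloutProgress}: In Verification"}
                  ).raise_for_status()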


RT Comments

There are two kinds of RT comments or replies:

  • Comment: A background communication (not visible to watchers).
  • Reply: A public response (visible to watchers). Tickets opened with GGUS or the TP must be included as a Reply.


TP Tickets Response Time

  • EMI

The SLA with EMI states the following response times for GGUS tickets:

Note: these are not resolution times!

Priority      Response Time
Top Priority  4 hours
Urgent        2 working days
Less Urgent   15 working days

EMI SLA, pages 13 and 14, available at https://documents.egi.eu/document/461
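
The table can be turned into a small reminder helper; the sketch below is not part of the SLA and naively treats working days as Monday to Friday, ignoring holidays:

from datetime import datetime, timedelta

RESPONSE_TIMES = {                 # priority -> (amount, unit), from the table above
    "Top Priority": (4, "hours"),
    "Urgent": (2, "working days"),
    "Less Urgent": (15, "working days"),
}

def add_working_days(start: datetime, days: int) -> datetime:
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days -= 1
    return current

def response_due(priority: str, opened: datetime) -> datetime:
    amount, unit = RESPONSE_TIMES[priority]
    if unit == "hours":
        return opened + timedelta(hours=amount)
    return add_working_days(opened, amount)

# An Urgent ticket opened on a Friday morning is due a response on Tuesday.
print(response_due("Urgent", datetime(2011, 7, 22, 9, 0)))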


Product Acceptance

QC tests are Mandatory (M) or Optional (O).

  • A product is REJECTED if it fails the installation or configuration process.
  • A product is REJECTED if it fails ANY Mandatory QC.
  • A product is VERIFIED if it passes ALL assigned QCs.
  • A product is still VERIFIED if it fails ANY Optional QC (as long as no Mandatory QC fails).
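
A minimal sketch of these rules (the QC identifiers are hypothetical; results maps each assigned QC to a pass flag, and mandatory_qcs lists which of them are Mandatory):

def verification_outcome(install_ok: bool, results: dict, mandatory_qcs: set) -> str:
    if not install_ok:
        return "REJECTED"          # failed installation or configuration
    failed = {qc for qc, passed in results.items() if not passed}
    if failed & mandatory_qcs:
        return "REJECTED"          # any Mandatory QC failure rejects the product
    return "VERIFIED"              # Optional QC failures alone do not block

# One Optional QC fails while all Mandatory ones pass: still VERIFIED.
print(verification_outcome(True,
                           {"QC_DOC_1": True, "QC_SEC_2": True, "QC_OPT_9": False},
                           {"QC_DOC_1", "QC_SEC_2"}))
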
I have the impression that mandatory/optional mixes two aspects of the verification:
a) the verifier's obligation to execute a QC test (mandatory vs. optional), and
b) the verification outcome when a fault is detected (critical vs. non-critical).

Assuming the following definition of CRITICAL/NON-CRITICAL:
A CRITICAL Quality Criterion will cause immediate and unconditional rejection of the tested product if a failure of any type is detected during testing.
A NON-CRITICAL Quality Criterion may still lead to acceptance of the tested product if a documented(!) workaround to avoid or mitigate the failure exists.

And using the definition of MANDATORY/OPTIONAL:
A MANDATORY Quality Criterion must be assessed or verified for any software release supplied by a Technology Provider. An OPTIONAL Quality Criterion may be assessed or tested by the verifier.

With the definition of ASSESSING/TESTING as:
"A verifier is ASSESSING a Quality Criterion by analysing documentation (release notes, known bugs list, documented changes, test results, etc.) supplied by the Technology Provider without actual independent test execution. A Quality Criterion is TESTED if the verifier independently executes the test associated with the Quality Criterion using the same sources of documentation, but on a test infrastructure independent from the Technology Provider."

Those aspects may provide you with a matrix guide for verifiers to produce reliable and fair outcomes:
1) MANDATORY and CRITICAL QC: MUST be assessed by the verifier, and MUST be tested by the verifier for any software release (whether major, minor or revision release).
2) MANDATORY and NON-CRITICAL QC: MUST be assessed for minor and revision releases, MAY be tested for minor releases (depending on other circumstances) and MUST be tested for major releases.
3) OPTIONAL and NON-CRITICAL QC: MAY be assessed for any release, and MAY be tested for any release, if time permits.
4) OPTIONAL and CRITICAL QC: this combination does not make sense.

For all cases 1-3, the criticality determines the outcome of the verification based on the detected failures.
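
One way to encode cases 1-3 as a lookup. This is an interpretation, not an agreed policy: the text does not spell out testing of revision releases for case 2, so MAY is assumed there, and case 4 is deliberately absent.

# (mandatory, critical) -> {release type: (assessing, testing)}
MATRIX = {
    (True, True): {r: ("MUST", "MUST") for r in ("major", "minor", "revision")},
    (True, False): {"major": ("MUST", "MUST"),
                    "minor": ("MUST", "MAY"),
                    "revision": ("MUST", "MAY")},   # revision testing assumed MAY
    (False, False): {r: ("MAY", "MAY") for r in ("major", "minor", "revision")},
}

def obligations(mandatory: bool, critical: bool, release: str) -> tuple:
    return MATRIX[(mandatory, critical)][release]

print(obligations(True, False, "minor"))   # ('MUST', 'MAY')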

Verification Summary

When the Verification template is finished, an Executive Summary is created, following the template available at Verification Reports and Executive Summary templates. This document includes:

  • A summary of tests failed and passed (critical and non-critical).
  • A list of comments for other teams involved in the Software Provisioning process (SR, QC, ...).

Publication of Verification Reports

In order to store the verification reports, a new space must be created in the DocumentDB. The new document must include the following information:

  • Title: QC Verification Report: <PRODUCT_NAME>-<VERSION>.
  • Files to include:
    • The first file field is used to upload the Executive Summary document. Executive Summary file names should follow this nomenclature: QC_Verification--Executive_Summary_<PRODUCT_NAME>_<VERSION>.doc.
    • The second file field is used to upload the QC Verification report. Verification report file names should follow this nomenclature: QC_Verification--Report_<PRODUCT_NAME>_<VERSION>.doc.
  • Keywords field: space-separated keyword names; must include Quality, Criteria, Verification, etc.
  • Media type must be set to Document.
  • Status must be set to FINAL.
  • View must be set to public.
  • Modify should be set to inspire-SA2.
  • TOPICS space field must be set to Software Provisioning Verification Reports and WP5.

The generated DocumentDB link must be included in the QualityCriteriaVerificationReport field of the verification RT ticket.
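
A small illustration (not an official tool) that builds the document title and file names following the nomenclature above; the product name and version are hypothetical:

def report_names(product: str, version: str) -> dict:
    return {
        "title": f"QC Verification Report: {product}-{version}",
        "executive_summary": f"QC_Verification--Executive_Summary_{product}_{version}.doc",
        "verification_report": f"QC_Verification--Report_{product}_{version}.doc",
    }

# Hypothetical product and version, only to show the resulting names.
for key, value in report_names("SOME-PRODUCT", "1.2.0").items():
    print(f"{key}: {value}")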

End of Verification

Once the reports are stored in the DocumentDB and linked in the RT ticket, the result of the verification must be set by changing the RolloutProgress field of the ticket. If the product is accepted, change this field to the "StageRollout" value; this will cause the Rollout teams to continue with the software provisioning process. If the product is not accepted, change the value to "Rejected", causing the process to stop.

UMD release schedule

  • t-2 weeks: Freeze the release date. The next regular UMD release (i.e. excluding emergency releases!) will be published on that date, no matter what.


  • t-1 week: Freeze the contents of the upcoming UMD release. Any product in the Software Provisioning process that did *not* get accepted by StagedRollout by that day will be pushed to the next release (or even later). Technically, on that day, all products with an RT ticket in state "UMDStore" in the "sw-rel" queue will be taken into the UMD Composer, all at once (see below).


  • t-1 week: Compose the UMD release. The products that made the cut-off date will be taken into the next UMD release in the UMD Composer. Verification and StagedRollout assemble the necessary documentation, such as known issues, installation notes, etc. for the release's Wiki page, and summaries for the UMD release itself that will be published in the repository later on. Two days are considered enough, particularly when the relevant information is pre-assembled in a temporary Wiki space, etc.


  • t-3 days: A formal test release is prepared, including all selected products and including(!) the release documentation (release notes, etc.). The point is to have a complete set for final reviews. Further test releases are prepared as required and documented.


  • t: The UMD Release is published for production. The aim is to publish UMD releases on a Monday, or in the first half of the week up to and including Wednesdays.
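
A minimal sketch that derives the milestones above from a planned publication date t; the example date is hypothetical (a Monday):

from datetime import date, timedelta

def umd_schedule(t: date) -> dict:
    return {
        "freeze release date (t-2 weeks)": t - timedelta(weeks=2),
        "freeze contents / start composing (t-1 week)": t - timedelta(weeks=1),
        "formal test release (t-3 days)": t - timedelta(days=3),
        "publication (t)": t,
    }

for milestone, day in umd_schedule(date(2011, 8, 1)).items():
    print(f"{milestone}: {day}")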