Activity: Define Assessment and Traceability Needs

Purpose
  • To define the assessment strategy for the test effort
  • To define traceability and coverage requirements

Frequency: This activity is typically conducted multiple times per iteration.
Role: Test Analyst

Steps

Identify assessment and traceability requirements

Purpose: To understand the deliverables for the software assessment process and elicit the associated requirements.

Review the Iteration Plan and identify specific assessment needs for this forthcoming body of work. Ask stakeholders what they require from both assessment and traceability.

Also, consider whether the test effort will be formally audited either during or at the conclusion of the testing effort. Formal audit requirements may necessitate the retention of additional documentation and records as proof that sufficient testing has been undertaken.

Consider constraints

Purpose: To identify the constraints that will affect the ability (or the necessity) to implement the requirements.

While there is typically an unending list of "wants" you might be tempted to treat as requirements for your traceability and assessment strategy, it's important to focus on the most important "needs": those that a) provide essential information to the project team, and b) can actually be tracked and measured. It is unlikely that you will have enough resources available for your strategy to cater for more than what is essentially needed.

Sub-topics:

Acceptable quality level

It's important to identify what level of quality will be considered "good enough", and develop an appropriate assessment strategy. Note that quality dimensions often wax and wane in importance, and quality levels rise and fall in the eyes of the stakeholders, throughout the project lifecycle.

Review the QA Plan and the Software Development Plan, and interview the important stakeholders directly, to determine what they consider an acceptable quality level.
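
For illustration only, here is a minimal Python sketch of how an agreed "good enough" level might be recorded per quality dimension and checked against measured values. The dimensions, thresholds, and measurements are hypothetical, not prescribed by RUP.

    # Hypothetical "good enough" levels per quality dimension; a real project
    # would agree these values with its stakeholders.
    ACCEPTABLE_QUALITY = {
        "functional_test_pass_rate": 0.95,      # fraction of executed tests passing
        "requirements_covered_by_tests": 0.90,  # fraction of requirements with tests
    }

    # Invented measurements from the current iteration.
    measured = {
        "functional_test_pass_rate": 0.97,
        "requirements_covered_by_tests": 0.88,
    }

    # Report any dimension that falls short of the agreed level.
    shortfalls = [dim for dim, floor in ACCEPTABLE_QUALITY.items()
                  if measured.get(dim, 0.0) < floor]
    print(f"Dimensions below the agreed level: {shortfalls}")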

Process and tool enablement

While you can probably imagine a world of effortless traceability and assessment at a low level of granularity, the reality is that it's difficult and usually uneconomic to implement such approaches. Even with sophisticated tool support, it can be difficult and time-consuming to manage low-level approaches to traceability; without supporting tools, it is almost impossible. The software engineering process itself may place constraints on traceability: for example, if traceability from tests to motivating requirements is desired, but the requirements themselves are not being carefully managed, it may be impossible to implement this traceability.

Consider the constraints and limitations of both your software engineering process and tools, and choose an appropriate, workable traceability and assessment approach accordingly.
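
As a concrete illustration of a lightweight, workable approach, the following Python sketch records traceability from requirements to the tests that cover them using a plain mapping. All identifiers are hypothetical; real projects would draw them from their requirements and test repositories.

    # Hypothetical requirement-to-test traceability, simple enough to maintain
    # without sophisticated tool support.
    traceability = {
        "REQ-001": ["TC-101", "TC-102"],
        "REQ-002": ["TC-103"],
        "REQ-003": [],  # no motivating tests yet: a traceability gap
    }

    # Requirements with no associated tests are immediate candidates for review.
    untested = [req for req, tests in traceability.items() if not tests]
    print(f"Requirements without tests: {untested}")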

Consider possible strategies

Purpose: To identify and outline one or more strategies that will facilitate the required assessment process.

Now that you have a better understanding of the assessment and traceability requirements, and of the constraints placed on them by the desired quality level and the available process and tool support, you need to consider the potential assessment or evaluation strategies you could employ. For a more detailed treatment of possible strategies, we suggest you read Cem Kaner's paper "Measurement of the Extent of Testing", October 2000.

Sub-topics:

Test Coverage Analysis

There are many different approaches to test coverage, and no one coverage measure alone provides all the information necessary to form an assessment of the extent or completeness of the test effort. Note that different coverage strategies take more or less effort to implement, and within any particular measurement category, there will usually be a depth of coverage analysis beyond which it becomes uneconomic to record more detailed information.

Some categories of test coverage measurement include: Requirements, Source Code, Product Claims, and Standards. We recommend you consider incorporating more than one coverage category in your test assessment strategy. In most cases, test coverage refers to the planning and implementation of specific tests in the first instance. However, test coverage metrics and their analysis are also useful to consider in conjunction with test results or defect analysis.
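
To make the Requirements category concrete, here is a small Python sketch that computes both planned coverage (tests implemented) and executed coverage (tests run) against a set of requirements. The identifiers and records are invented for illustration.

    # Hypothetical requirements and test records.
    requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}
    tests = {
        "TC-101": {"covers": "REQ-001", "run": True},
        "TC-102": {"covers": "REQ-001", "run": False},
        "TC-103": {"covers": "REQ-002", "run": True},
    }

    # Coverage by planned tests versus coverage by tests actually executed.
    planned = {t["covers"] for t in tests.values()}
    executed = {t["covers"] for t in tests.values() if t["run"]}
    print(f"Planned coverage:  {len(planned) / len(requirements):.0%}")
    print(f"Executed coverage: {len(executed) / len(requirements):.0%}")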

Test Results Analysis

A common approach to test results analysis is simply to report the number of results that were positive or negative as a percentage of the total number of tests run. Our opinion, and the opinion of other practitioners in the test community, is that this is a simplistic and incomplete approach to analyzing test results.

Instead, we recommend you analyze your test results in terms of their relative trend over time and, within each test cycle, consider the relative distribution of test failures across different dimensions, such as the functional area being tested, the type of quality risks being explored, the relative complexity of the tests, and the test resources applied to each functional area. This information gives a far more meaningful picture of the state of the test effort than a raw pass percentage.
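
The following Python sketch illustrates one such analysis: counting failures per test cycle and functional area rather than quoting a single pass percentage. The result records are invented for illustration.

    from collections import Counter

    # Hypothetical per-test results across two test cycles.
    results = [
        {"cycle": 1, "area": "ordering", "passed": False},
        {"cycle": 1, "area": "ordering", "passed": True},
        {"cycle": 1, "area": "billing",  "passed": False},
        {"cycle": 2, "area": "ordering", "passed": True},
        {"cycle": 2, "area": "billing",  "passed": False},
    ]

    # Distribution of failures by cycle and functional area.
    failures = Counter((r["cycle"], r["area"]) for r in results if not r["passed"])
    for (cycle, area), count in sorted(failures.items()):
        print(f"cycle {cycle}, {area}: {count} failure(s)")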

Defect Analysis

While defects themselves are obviously related to the results of the test effort, the analysis of defect data alone does not provide any useful information about the progress of the test effort or the completeness or thoroughness of that effort. Nevertheless, a mistake made by some test teams and project managers is to use the current defect count to measure the progress of testing or as a gauge of the quality of the developed software. Our opinion, and the opinion of other practitioners in the test community, is that this is a meaningless approach.

Instead, we recommend you analyze the relative trend of the defect count over time to provide a measure of relative stability. For example, assuming the test effort remains relatively constant, you would typically expect the rate of new defect discovery, measured against a regular time period, to follow a "bell curve" over the course of the iteration: an increasing discovery rate that peaks and then tails off toward the end of the iteration. However, you'll need to provide this information in conjunction with an analysis of other defect metrics, such as defect resolution rates (including an analysis of the resolution type), the distribution of defects by severity, and the distribution of defects by functional area.
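
As a simple illustration of trend analysis, this Python sketch buckets new-defect discovery dates into calendar weeks so the discovery-rate curve can be inspected over time. The dates are invented.

    from collections import Counter
    from datetime import date

    # Hypothetical dates on which new defects were discovered.
    discovered = [
        date(2001, 3, 5), date(2001, 3, 6), date(2001, 3, 13),
        date(2001, 3, 14), date(2001, 3, 15), date(2001, 3, 26),
    ]

    # New defects per ISO calendar week; plotted over an iteration, this is
    # the curve you would compare against the expected "bell curve" shape.
    per_week = Counter(d.isocalendar()[1] for d in discovered)
    for week in sorted(per_week):
        print(f"week {week}: {per_week[week]} new defect(s)")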

With sophisticated tool support, you can perform complex analysis of defect data relatively easily; without appropriate tool support it is a much more difficult proposition.

Discuss possible strategies with stakeholders

Purpose: To gather feedback through initial stakeholder review and adjust the strategies as necessary.

Present the possible strategies to the various stakeholders. Typically you'd expect this to include a group drawn from the following roles: Project Manager, Software Architect, Development Manager, System Analyst, Configuration & Change Manager, Deployment Manager, and Customer Representative. Each of these roles has a stake in how quality is assessed.

Depending on the culture of the project, you should choose an appropriate format to present the possible strategies. This may range from one or more informal meetings to a formal presentation or workshop session.

Define and agree on the assessment strategy

Purpose: To gain stakeholder agreement on the strategy that will be used.

Take the feedback you receive from the discussions and refine the candidate strategies into a single assessment strategy that addresses the needs of all stakeholders.

Present the assessment strategy for final agreement and approval.

Define tool requirements

Purpose: To define the supporting tool configuration requirements that will enable the assessment process.

As mentioned previously, with sophisticated tool support you can perform complex analysis of measurement data relatively easily; without appropriate tool support it is a much more difficult proposition.

For each of the following categories, consider what tool support you will need: Coverage and Traceability, and Defect Analysis.

Evaluate and verify your results

Purpose: To verify that the activity has been completed appropriately and that the resulting artifacts are acceptable.

Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and that you did not simply consume vast quantities of paper. You should evaluate whether your work is of appropriate quality, and that it is complete enough to be useful to those team members who will make subsequent use of it as input to their work. Where possible, use the checklists provided in RUP to verify that quality and completeness are "good enough".

Have the people performing the downstream activities that rely on your work as input take part in reviewing your interim work. Do this while you still have time available to take action to address their concerns. You should also evaluate your work against the key input artifacts to make sure you have represented them accurately and sufficiently. It may be useful to have the author of the input artifact review your work on this basis.

Try to remember that RUP is an iterative process and that in many cases artifacts evolve over time. As such, it is not usually necessary, and is often counterproductive, to fully form an artifact that will only be partially used or will not be used at all in immediately subsequent work. This is because there is a high probability that the situation surrounding the artifact will change, and the assumptions made when the artifact was created will be proven incorrect, before the artifact is used, resulting in wasted effort and costly rework. Also avoid the trap of spending too many cycles on presentation to the detriment of content value. In project environments where presentation has importance and economic value as a project deliverable, you might want to consider using an administrative resource to perform presentation tasks.


