STEP: Systematic Test and Evaluation Process
===========
Overview
===========
The major techniques employed in evaluation are analysis, review, and test. STEP focuses on testing as the most complex of the three, but stresses overall coordination and planning of all aspects of evaluation as a key to success. It emphasizes the prevention potential of testing, with defect detection and demonstration of capability as secondary goals.
STEP Architecture:
Step 1 Plan the Strategy (The manager should focus on this at the beginning of a project.)
P1 Establish the master test plan.
P2 Develop the detailed test plans.
Step 2 Acquire the Testware (Analyst and Technician)
A1 Inventory the test objectives (requirements-based, design-based, and implementation-based).
A2 Design the tests (architecture and environment, requirements-based, design-based, and implementation-based).
A3 Implement the plans and designs.
Step 3 Measure the Behavior (Reviewer)
M1 Execute the tests.
M2 Check the adequacy of the test set.
M3 Evaluate the software and testing process.
Work Products of STEP
The work products follow IEEE Std. 829-1998, Standard for Software Test Documentation, which provides a template for each of the following test documents:
1. Test Plan
Used for the master test plan and level-specific test plans.
2. Test Design Specification
Used at each test level to specify the test set architecture and coverage traces.
3. Test Case Specification
Used as needed to describe test cases or automated scripts.
4. Test Procedure Specification
Used to specify the steps for executing a set of test cases.
5. Test Log
Used as needed to record the execution of test procedures.
6. Test Incident Report
Used to describe anomalies that occur during testing or in production. These anomalies may be in the requirements, design, code, documentation, or the test cases themselves. Incidents may later be classified as defects or enhancements.
7. Test Summary Report
Used to report completion of testing at a level or of a major test objective within a level.
Roles and Responsibilities
Role         Description of Responsibilities
Manager      Communicate, plan, and coordinate.
Analyst      Plan, inventory, design, and evaluate.
Technician   Implement, execute, and check.
Reviewer     Examine and evaluate.
===========
Risk Analysis
===========
A latent defect is an existing defect that has not yet caused a failure because the exact set of conditions has never been met.
A masked defect is an existing defect that has not yet caused a failure because another defect has prevented that part of the code from being executed.
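A minimal, hypothetical Python sketch of the two definitions (the functions, data, and defects are invented purely for illustration):

    PRICES = {"apple": 1.00, "pear": 2.00}

    def average_price(items):
        # Latent defect: division by zero, but only when 'items' is
        # empty -- a condition no test or production run has triggered
        # yet, so the defect has never caused a failure.
        return sum(PRICES[i] for i in items) / len(items)

    def order_total(items, coupon_percent):
        # Masking defect: the key is capitalized by mistake, so this
        # line raises KeyError on every call and execution never
        # reaches the return statement below.
        subtotal = sum(PRICES[i.title()] for i in items)
        # Masked defect: the discount is added instead of subtracted,
        # but it cannot cause a failure until the KeyError above is
        # fixed and this line finally executes.
        return subtotal * (1 + coupon_percent / 100)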
Risk involves the probability or likelihood of an event occurring and the negative consequences or impact of that event.
Risk Management is the process of controlling risk and monitoring the effectiveness of the control mechanisms.
Risk Analysis is the process of identifying, estimating, and evaluating risk.
Risk Analysis can be separated into two key activities:
* software risk analysis
* analysis of planning risks and contingencies
Software Risk Analysis
Why?
The purpose of a software risk analysis is to determine what to test, the testing priority, and the depth of testing.
Who?
Ideally, the risk analysis should be done by an interdisciplinary team of experts.
When?
A risk analysis should be done as early as possible in the software lifecycle.
How?
Step 1: Form a Brainstorming Team
Include users (or pseudo-users such as business analysts), developers, testers, marketers, customer service representatives, support personnel, and anyone else who has knowledge of the business and/or product and is willing and able to participate.
The purpose of Part One of a brainstorming session is to increase the number of ideas that the group generates. As a general rule of thumb:
* Do not allow criticism or debate.
* Let your imagination soar.
* Shoot for quantity.
* Mutate and combine ideas.
The purpose of Part Two of the brainstorming session is to reduce the list of ideas to a workable size. As a general rule of thumb, the methods for doing this include:
* Voting with campaign speeches
* Blending ideas
* Applying criteria
* Using scoring or ranking systems
Step 2: Compile a List of Features
In addition to concrete features, quality attributes to consider include:
* accessibility
* availability
* compatibility
* maintainability
* performance
* reliability
* scalability
* security
* usability
Step 3: Determine the Likelihood
Assign an indicator for the relative likelihood of failure, e.g., H (high), M (medium), or L (low).
Step 4: Determine the Impact
The users are particularly important in assigning values for impact, since the impact is usually driven by business issues rather than by the technical nature of the system.
Especially in larger systems, many users may only be experts in one particular area of functionality, while experienced testers often have a much broader view. It is this broad view that is most useful in determining the relative impact of failure.
Step 5: Assign Numerical Values
In this step of the risk analysis, the brainstorming team should assign numerical values to H, M, and L for both likelihood and impact (for example, H = 3, M = 2, and L = 1).
Step 6: Compute the Risk Priority
The values assigned to the likelihood of failure and the impact of failure should be added together.
The overall risk priority is a relative value for the potential impact of failure of a feature or attribute of the software weighted by the likelihood of it failing.
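As a minimal sketch of Steps 5 and 6 in Python, assuming the H = 3, M = 2, L = 1 mapping above (the feature names and ratings are invented for illustration):

    SCORE = {"H": 3, "M": 2, "L": 1}

    # (feature, likelihood of failure, impact of failure)
    features = [
        ("login",          "M", "H"),
        ("report export",  "H", "L"),
        ("password reset", "L", "H"),
    ]

    # Step 6: risk priority = likelihood value + impact value,
    # e.g. login: 2 + 3 = 5.
    priorities = {
        name: SCORE[likelihood] + SCORE[impact]
        for name, likelihood, impact in features
    }
    print(priorities)  # {'login': 5, 'report export': 4, 'password reset': 4}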
Step 7: Review/Modify the Values
Step 8: Prioritize the Features
Step 9: Determine the "Cut Line"
Features that fall below the cut line receive little or no testing effort.
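Continuing the sketch from Step 6, prioritizing and applying a cut line reduce to sorting and slicing (the priority values and cut-line score are the invented ones from above):

    # Risk priorities from the Step 6 sketch (invented values).
    priorities = {"login": 5, "report export": 4, "password reset": 4}

    # Step 8: rank features from highest to lowest risk priority.
    ranked = sorted(priorities.items(), key=lambda kv: kv[1], reverse=True)

    # Step 9: features at or above the cut line get tested in priority
    # order; those below it receive little or no testing effort.
    CUT_LINE = 5
    to_test = [name for name, score in ranked if score >= CUT_LINE]
    deferred = [name for name, score in ranked if score < CUT_LINE]
    print(to_test)   # ['login']
    print(deferred)  # ['report export', 'password reset']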
Step 10: Consider Mitigation
Risk mitigation helps reduce the likelihood of a failure, but does not affect the impact.
Planning Risks and Contingencies
The only possible contingencies are:
* reduce the scope
* delay implementation
* add resources
* reduce quality processes