Object-Oriented Software Engineering
Using UML, Patterns, and Java
Chapter 11: Testing
These slides have been altered by Mohammad Ariful Hyder for CSE 321 using additional resources. (See references)
Outline
Terminology
Types of errors
Dealing with errors
Quality assurance vs. Testing
Component Testing
  Unit testing
  Integration testing
Testing Strategy
Design Patterns & Testing
Terminology
Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior. IEEE: the ability of a system or component to perform its required functions under stated conditions for a specified period of time; it is often expressed as a probability.
Failure: Any deviation of the observed behavior from the specified behavior.
Error: The system is in a state such that further processing by the system will lead to a failure. Examples: stress or overload errors, capacity or boundary errors, timing errors, throughput or performance errors.
Fault (bug): The mechanical or algorithmic cause of an error.
Quality: IEEE: the degree to which a system, component, or process meets customer or user needs or expectations.
There are many different types of errors and many different ways to deal with them.
Examples of Faults and Errors
Faults in the interface specification:
  Mismatch between what the client needs and what the server offers
  Mismatch between requirements and implementation
Algorithmic faults:
  Missing initialization
  Branching errors (too soon, too late)
  Missing test for nil
Mechanical faults (very hard to find):
  Documentation does not match actual conditions or operating procedures
Errors:
  Stress or overload errors
  Capacity or boundary errors
  Timing errors
  Throughput or performance errors
Dealing with Errors
Verification:
  Assumes a hypothetical environment that does not match the real environment
  The proof might itself be buggy (omits important constraints; simply wrong)
Modular redundancy:
  Expensive
Declaring a bug to be a "feature":
  Bad practice
Patching:
  A patch (sometimes called a "fix") is a quick-repair job for a piece of programming. During a software product's beta test distribution or try-out period, and later after the product is formally released, problems (called bugs) will almost invariably be found. A patch is the immediate solution provided to users; it can sometimes be downloaded from the software maker's Web site. The patch is not necessarily the best solution for the problem, and the product developers often find a better solution to provide when they package the product for its next release. A patch is usually developed and distributed as a replacement for, or an insertion in, compiled code (that is, in a binary file or object module). In larger operating systems, a special program is provided to manage and keep track of the installation of patches.
  Slows down performance
Testing (this lecture):
  Testing is never good enough
Another View on How to Deal with Errors
Error prevention (before the system is released):
  Use good programming methodology to reduce complexity
  Use version control to prevent inconsistent versions of the system
  Apply verification to prevent algorithmic bugs
Error detection (while the system is running):
  Testing: create failures in a planned way
  Debugging: start with an unplanned failure
  Monitoring: deliver information about the system's state; find performance bugs
Error recovery (recover from a failure once the system is released):
  Database systems (atomic transactions)
  Modular redundancy
  Recovery blocks
Some Observations
It is impossible to completely test any nontrivial module or any system:
  Theoretical limitations: the halting problem
  Practical limitations: prohibitive in time and cost
"Testing can only show the presence of bugs, not their absence." (Dijkstra)
Successful test cases:
  In unit and integration testing: a successful test case is one that finds a previously undiscovered error.
  In acceptance testing: a successful test case is one that shows a requirement is correctly implemented.
Testing takes creativity
Testing is often viewed as dirty work.
To develop an effective test, one must have:
Detailed understanding of the system
Knowledge of the testing techniques
Skill to apply these techniques in an effective and efficient manner
Testing is done best by independent testers
We often develop a certain mental attitude that the program should
work in a certain way when in fact it does not.
Programmers often stick to the data set that makes the program
work - "Don’t mess up my code!"
A program often does not work when tried by somebody else. Don't
let this be the end-user.
Testing Activities
  Subsystem code → Unit test → Tested subsystem (each subsystem is unit-tested separately)
  Tested subsystems → Integration test (based on the system design document) → Integrated subsystems
  Integrated subsystems → Functional test (based on the requirements analysis document and the user manual) → Functioning system
All tests by developer
Testing Activities continued
  Functioning system → Performance test (against the global requirements) → Validated system [tests by developer]
  Validated system → Acceptance test (against the client's understanding of the requirements) → Accepted system [tests by client]
  Accepted system → Installation test (in the user environment) → Usable system [tests by client]
  Usable system → System in use [tests (?) by user, against the user's understanding]
Fault Handling Techniques
Fault Avoidance:
  Design Methodology
  Configuration Management
  Verification
Fault Detection:
  Reviews
  Debugging:
    Correctness Debugging
    Performance Debugging
  Testing:
    Unit Testing
    Integration Testing
    System Testing
Fault Tolerance:
  Atomic Transactions
  Modular Redundancy
Quality Assurance encompasses Testing
Quality Assurance:
  Usability Testing:
    Scenario Testing
    Prototype Testing
    Product Testing
  Fault Avoidance:
    Verification
    Configuration Management
    Reviews:
      Walkthrough
      Inspection
  Fault Detection:
    Debugging:
      Correctness Debugging
      Performance Debugging
    Testing:
      Unit Testing
      Integration Testing
      System Testing
  Fault Tolerance:
    Atomic Transactions
    Modular Redundancy
Types of Testing
Unit Testing:
  Performed on an individual method or class
  Carried out by developers
  Goal: Confirm that the method or class is correctly coded and carries out the intended functionality
  Sometimes also applied to a whole subsystem
Integration Testing:
  Groups of subsystems (collections of classes), and eventually the entire system
  Carried out by developers
  Goal: Test the interfaces among the subsystems
System Testing
System Testing:
  The entire system
  Carried out by developers
  Goal: Determine if the system meets the requirements (functional and global)
Acceptance Testing:
  Evaluates the system delivered by developers
  Carried out by the client; may involve executing typical transactions on site on a trial basis
  Goal: Demonstrate that the system meets customer requirements and is ready to use
Implementation (coding) and testing go hand in hand.
Unit Testing
Informal:
  Incremental coding
Static analysis:
  Hand execution: reading the source code (this is called desk checking)
  Walkthrough (informal presentation to others)
  Code inspection (formal presentation to others)
  Automated tools checking for
    syntactic and semantic errors
    departure from coding standards
Dynamic analysis:
  Black-box testing (test the input/output behavior)
  White-box or glass-box testing (test the internal logic of the subsystem or object)
  Data-structure based testing (data types determine test cases)
Black-box Testing
Focus: I/O behavior. If for any given input, we can predict the
output, then the module passes the test.
Almost always impossible to generate all possible inputs
("test cases")
Goal: Reduce number of test cases by equivalence partitioning:
Divide input conditions into equivalence classes
Choose test cases for each equivalence class.
Example: If an object is supposed to accept a negative number,
testing one negative number is enough
Black-box Testing (Continued)
Selection of equivalence classes (no rules, only guidelines):
Input is valid across a range of values. Select test cases from 3 equivalence classes:
  Below the range
  Within the range
  Above the range
Input is valid if it is from a discrete set. Select test cases from 2 equivalence classes:
  Valid set member
  Invalid set member
Input is valid if it is the Boolean value true or false. Select:
  a true value
  a false value
Input is valid if it is an exact value of a particular data type. Select:
  the value
  any other value of that type
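As an illustration of the range guideline above, here is a minimal JUnit 5 sketch. The AgeValidator class and its 0..120 range are hypothetical, invented for this example; each test draws one value from one equivalence class.

  import org.junit.jupiter.api.Test;
  import static org.junit.jupiter.api.Assertions.*;

  // Hypothetical unit under test: accepts ages in the range 0..120.
  class AgeValidator {
      static boolean isValid(int age) { return age >= 0 && age <= 120; }
  }

  class AgeValidatorTest {
      @Test void belowRangeIsRejected()  { assertFalse(AgeValidator.isValid(-5));  } // class: below the range
      @Test void withinRangeIsAccepted() { assertTrue(AgeValidator.isValid(42));   } // class: within the range
      @Test void aboveRangeIsRejected()  { assertFalse(AgeValidator.isValid(200)); } // class: above the range
  }

One test per class suffices under the equivalence assumption; boundary value analysis (next slides) refines this choice.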
More on Equivalence Classes
Example
The specifications for a DBMS state that the product must handle any number of records between 1 and 16,383 (2^14 - 1)
If the system can handle 34 records and 14,870 records,
then it probably will work fine for 8,252 records
Basic Idea
If the system works for any one test case in the range
(1..16,383), then it will probably work for any other test
case in the range
Range (1..16,383) constitutes an equivalence class
Equivalence Classes and Boundary Value Analysis
Principle of Equivalence Classes
Any one member of an equivalence class is as good a test case as any
other member of the equivalence class
Range Example: (1..16,383)
Defines three different equivalence classes:
Equivalence Class 1: Fewer than 1 record
Equivalence Class 2: Between 1 and 16,383 records
Equivalence Class 3: More than 16,383 records
Principle of Boundary Value Analysis
Select test cases on or just to one side of the boundary of
equivalence classes
This greatly increases the probability of detecting a fault
More on Database Example
Test case 1: 0 records (member of equivalence class 1 and adjacent to boundary value)
Test case 2: 1 record (boundary value)
Test case 3: 2 records (adjacent to boundary value)
Test case 4: 723 records (member of equivalence class 2)
Test case 5: 16,382 records (adjacent to boundary value)
Test case 6: 16,383 records (boundary value)
Test case 7: 16,384 records (member of equivalence class 3 and adjacent to boundary value)
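The seven test cases translate directly into a table-driven JUnit 5 test. The Database.canHandle facade below is a hypothetical stand-in for the real DBMS interface:

  import org.junit.jupiter.params.ParameterizedTest;
  import org.junit.jupiter.params.provider.CsvSource;
  import static org.junit.jupiter.api.Assertions.assertEquals;

  // Hypothetical facade: the DBMS must handle 1..16,383 records.
  class Database {
      static boolean canHandle(int records) { return records >= 1 && records <= 16_383; }
  }

  class DatabaseBoundaryTest {
      @ParameterizedTest
      @CsvSource({
          "0,     false",  // test case 1: class 1, adjacent to boundary
          "1,     true",   // test case 2: boundary value
          "2,     true",   // test case 3: adjacent to boundary
          "723,   true",   // test case 4: member of class 2
          "16382, true",   // test case 5: adjacent to boundary
          "16383, true",   // test case 6: boundary value
          "16384, false"   // test case 7: class 3, adjacent to boundary
      })
      void recordCountAtAndAroundTheBoundaries(int records, boolean expected) {
          assertEquals(expected, Database.canHandle(records));
      }
  }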
Equivalence Testing of Output Specifications
We also need to perform equivalence testing of the output
specifications
Example:
In 2003, the minimum Social Security (OASDI) deduction from any
one paycheck was $0.00, and the maximum was $5,394.00
Test cases must include input data that should result in deductions
of exactly $0.00 and exactly $5,394.00
Also, test data that might result in deductions of less than $0.00 or
more than $5,394.00
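A sketch of output-oriented test cases for this example. The Payroll class is hypothetical; the 6.2% rate and the $87,000 gross pay that produces exactly $5,394.00 are assumptions chosen to be consistent with the figures above:

  import org.junit.jupiter.api.Test;
  import static org.junit.jupiter.api.Assertions.assertEquals;

  // Hypothetical 2003 OASDI rule: 6.2% of gross pay, clamped to [0.00, 5394.00].
  class Payroll {
      static double oasdiDeduction(double grossPay) {
          double deduction = 0.062 * grossPay;
          return Math.min(Math.max(deduction, 0.00), 5394.00);
      }
  }

  class PayrollOutputTest {
      @Test void minimumOutputBoundary() {
          assertEquals(0.00, Payroll.oasdiDeduction(0.00), 0.001);            // output exactly $0.00
      }
      @Test void maximumOutputBoundary() {
          assertEquals(5394.00, Payroll.oasdiDeduction(87_000.00), 0.001);    // output exactly $5,394.00
      }
      @Test void inputThatWouldExceedTheMaximum() {
          assertEquals(5394.00, Payroll.oasdiDeduction(1_000_000.00), 0.001); // must not exceed the cap
      }
  }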
Overall Strategy
Use equivalence classes together with boundary value analysis to test both input specifications and output specifications.
This approach generates a small set of test data with the potential of uncovering a large number of faults.
White-box Testing
Focus: Thoroughness (Coverage). Every statement in the
component is executed at least once.
Four types of white-box testing
Statement Testing
Branch Testing
Path Testing
Loop Testing
White-box Testing (Continued)
Statement Testing:
Every statement should be tested at least once.
Branch Testing (Conditional Testing):
Make sure that each possible outcome from a condition is tested
at least once
if (i == TRUE) printf("YES\n"); else printf("NO\n");
Test cases: 1) i = TRUE; 2) i = FALSE
Path Testing:
Make sure all paths in the program are executed.
This is infeasible. (Combinatorial explosion)
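Returning to the branch-testing example above: the same two-outcome test in Java, for a hypothetical yesOrNo helper, with one test case per outcome of the condition:

  import org.junit.jupiter.api.Test;
  import static org.junit.jupiter.api.Assertions.assertEquals;

  class Answer {
      static String yesOrNo(boolean i) { return i ? "YES" : "NO"; }
  }

  class AnswerBranchTest {
      @Test void trueBranch()  { assertEquals("YES", Answer.yesOrNo(true));  } // test case 1: i = true
      @Test void falseBranch() { assertEquals("NO",  Answer.yesOrNo(false)); } // test case 2: i = false
  }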
White-box Testing Example
FindMean(FILE ScoreFile)
{  float SumOfScores = 0.0; int NumberOfScores = 0;
   float Mean = 0.0; float Score;
   Read(ScoreFile, Score);   /* Read in and sum the scores */
   while (!EOF(ScoreFile)) {
      if (Score > 0.0) {
         SumOfScores = SumOfScores + Score;
         NumberOfScores++;
      }
      Read(ScoreFile, Score);
   }
   /* Compute the mean and print the result */
   if (NumberOfScores > 0) {
      Mean = SumOfScores / NumberOfScores;
      printf("The mean score is %f\n", Mean);
   } else
      printf("No scores found in file\n");
}
White-box Testing: Path Testing
Draw a flow graph (see the example below)
Use McCabe's complexity measure
Definition (three equivalent formulations):
  Number of predicates (expressions returning a T/F value) + 1
  #Edges – #Nodes + 2
  Number of regions in the flow graph
Purpose:
  To determine a basis (the test cases)
  To limit the number of test cases
White-box Testing Example: Determining the Paths
FindMean(FILE ScoreFile)
{  float SumOfScores = 0.0;
   int NumberOfScores = 0;                    /* node 1 */
   float Mean = 0.0; float Score;
   Read(ScoreFile, Score);
   while (!EOF(ScoreFile)) {                  /* node 2 */
      if (Score > 0.0) {                      /* node 3 */
         SumOfScores = SumOfScores + Score;   /* node 4 */
         NumberOfScores++;
      }                                       /* node 5 */
      Read(ScoreFile, Score);                 /* node 6 */
   }
   /* Compute the mean and print the result */
   if (NumberOfScores > 0) {                  /* node 7 */
      Mean = SumOfScores / NumberOfScores;    /* node 8 */
      printf("The mean score is %f\n", Mean);
   } else
      printf("No scores found in file\n");    /* node 9 */
}
Constructing the Logic Flow Diagram
Edges of the flow graph (T/F mark predicate outcomes):
  Start → 1 → 2
  2 →(T) 3, 2 →(F) 7   (while predicate)
  3 →(T) 4, 3 →(F) 5   (Score > 0.0)
  4 → 5, 5 → 6, 6 → 2
  7 →(T) 8, 7 →(F) 9   (NumberOfScores > 0)
  8 → Exit, 9 → Exit
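As a sanity check on the #Edges – #Nodes + 2 formula from the previous slide, this small self-contained Java sketch encodes the edges listed above (with Exit numbered 10) and evaluates it:

  import java.util.List;

  public class McCabeDemo {
      record Edge(int from, int to) {}

      public static void main(String[] args) {
          // The FindMean flow graph: nodes 1..9 plus Exit = 10.
          List<Edge> edges = List.of(
              new Edge(1, 2),
              new Edge(2, 3), new Edge(2, 7),   // predicate at node 2 (while)
              new Edge(3, 4), new Edge(3, 5),   // predicate at node 3 (Score > 0.0)
              new Edge(4, 5), new Edge(5, 6), new Edge(6, 2),
              new Edge(7, 8), new Edge(7, 9),   // predicate at node 7 (NumberOfScores > 0)
              new Edge(8, 10), new Edge(9, 10)
          );
          int nodes = 10;
          int cyclomatic = edges.size() - nodes + 2;   // 12 - 10 + 2
          System.out.println("V(G) = " + cyclomatic);  // prints V(G) = 4, i.e. predicates (3) + 1
      }
  }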
Determine a Basis (using the FindMean example and flow graph above)
Determine McCabe's complexity measure:
  The easiest formula to evaluate is to read the code:
    Number of predicates (3) + 1 = 4
  Check your answer by determining the number of regions:
    Number of closed regions (3) + number of outside regions (always 1) = 4
Determine a basis (a linearly independent set of paths); it will contain 4 paths.
Use the following technique: take the shortest path, then add a single predicate to obtain the next path.
Basis:
  Path 1: 1, 2, 7, 9, Exit (Exit could be numbered 10)  // at end of file immediately after the first read
  Path 2: 1, 2, 3, 5, 6, 2, 7, 9, Exit  // reads a value that is negative or zero, then reaches end of file
  Path 3: 1, 2, 3, 4, 5, 6, 2, 7, 8, Exit  // reads a single positive value, so computes the mean
  Path 4: 1, 2, 7, 8, Exit  // infeasible path
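A sketch of unit tests for the three feasible basis paths, assuming a hypothetical Java port of FindMean that takes a list of scores instead of a file and returns an empty result when no positive score is found:

  import java.util.List;
  import java.util.OptionalDouble;
  import org.junit.jupiter.api.Test;
  import static org.junit.jupiter.api.Assertions.*;

  // Hypothetical port: mean of the positive scores, empty if there are none.
  class Stats {
      static OptionalDouble findMean(List<Double> scores) {
          double sum = 0.0;
          int count = 0;
          for (double score : scores) {
              if (score > 0.0) { sum += score; count++; }
          }
          return count > 0 ? OptionalDouble.of(sum / count) : OptionalDouble.empty();
      }
  }

  class FindMeanPathTest {
      @Test void path1_emptyInput() {                 // 1, 2, 7, 9, Exit
          assertTrue(Stats.findMean(List.of()).isEmpty());
      }
      @Test void path2_onlyNonPositiveScores() {      // 1, 2, 3, 5, 6, 2, 7, 9, Exit
          assertTrue(Stats.findMean(List.of(-3.0)).isEmpty());
      }
      @Test void path3_singlePositiveScore() {        // 1, 2, 3, 4, 5, 6, 2, 7, 8, Exit
          assertEquals(5.0, Stats.findMean(List.of(5.0)).getAsDouble(), 1e-9);
      }
      // Path 4 (1, 2, 7, 8, Exit) is infeasible: NumberOfScores > 0 requires entering the loop.
  }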
Finding the Test Cases
Edge labels and the conditions that cover them:
  a: 1 → 2 (covered by any data)
  b: 2 → 3 (file must contain at least one value)
  c: 2 → 7 (at end of file)
  d: 3 → 4 (positive score)
  e: 4 → 5
  f: 3 → 5 (negative score)
  g: 5 → 6
  h: 6 → 2 (reached through g if either e or f is reached)
  i: 7 → 8 (NumberOfScores > 0)
  j: 7 → 9 (NumberOfScores = 0)
  k: 8 → Exit
  l: 9 → Exit
Another Example: Binary search flow graph
[Flow graph figure for a binary search routine, not reproduced here; nodes are numbered 1 to 14.]
Independent paths
1, 2, 3, 4, 5, 14
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 14
1, 2, 3, 4, 5, 6, 7, 11, 12, 5, …
1, 2, 3, 4, 6, 7, 2, 11, 13, 5, …
Test cases should be derived so that all of these paths are
executed
A dynamic program analyser may be used to check that paths
have been executed
Comparison of White-box & Black-box Testing
White-box testing:
  A potentially infinite number of paths has to be tested
  White-box testing often tests what is done, instead of what should be done
  Cannot detect missing use cases
Black-box testing:
  Potential combinatorial explosion of test cases (valid & invalid data)
  Often not clear whether the selected test cases uncover a particular error
  Does not discover extraneous use cases ("features")
Both types of testing are needed.
White-box testing and black-box testing are the extreme ends of a testing continuum. Any choice of test case lies in between and depends on the following:
  Number of possible logical paths
  Nature of the input data
  Amount of computation
  Complexity of algorithms and data structures
The 4 Testing Steps
1. Select what has to be measured:
  Analysis: completeness of requirements
  Design: tested for cohesion
  Implementation: code tests
2. Decide how the testing is done:
  Code inspection
  Proofs (Design by Contract)
  Black-box testing, white-box testing
  Select an integration testing strategy (big bang, bottom up, top down, sandwich)
3. Develop the test cases:
  A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or about the attribute being measured
4. Create the test oracle:
  An oracle contains the predicted results for a set of test cases
  The test oracle has to be written down before the actual testing takes place
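A minimal illustration of step 4, with made-up names: the oracle is a table of predicted results written down before execution, so comparing against actual output later is mechanical:

  import java.util.Map;

  public class OracleDemo {
      // Hypothetical unit under test: pass/fail classifier for scores 0..100.
      static String classify(int score) { return score >= 60 ? "PASS" : "FAIL"; }

      public static void main(String[] args) {
          // The test oracle: input -> predicted result, fixed before any test runs.
          Map<Integer, String> oracle = Map.of(
              0, "FAIL", 59, "FAIL", 60, "PASS", 100, "PASS"  // boundary-oriented picks
          );
          oracle.forEach((input, predicted) -> {
              String actual = classify(input);
              System.out.printf("classify(%d): predicted %s, actual %s -> %s%n",
                  input, predicted, actual, predicted.equals(actual) ? "OK" : "DEVIATION");
          });
      }
  }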
Review: Design Principles
Cohesion: The interaction (or relatedness) among items within a module.
  Goal: high
  A module should have one purpose so that it is more likely to be reusable and maintainable.
  Example: In a class, the data and methods should be highly related.
Coupling: The interaction among modules.
  Goal: low
  Limit the coupling between two modules to promote reuse.
  Example: The number of modules called by a calling module should be about 7±2.
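An illustrative (entirely made-up) example of these two goals: the first class mixes rendering, persistence, and notification in one place (low cohesion, and every caller is coupled to all of it); splitting it yields classes with one purpose each:

  // Low cohesion: three unrelated responsibilities in one class.
  class ReportManager {
      String renderHtml(String data)         { return "<p>" + data + "</p>"; }
      void saveToDatabase(String data)       { /* persistence logic */ }
      void emailTo(String addr, String data) { /* notification logic */ }
  }

  // Higher cohesion: one purpose per class; callers couple only to what they use.
  class ReportRenderer {
      String renderHtml(String data) { return "<p>" + data + "</p>"; }
  }
  class ReportStore {
      void save(String data) { /* persistence logic */ }
  }
  class ReportMailer {
      void emailTo(String addr, String data) { /* notification logic */ }
  }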
Guidance for Test Case Selection
Use analysis knowledge about functional requirements (black-box testing):
  Use cases
  Expected input data
  Invalid input data
Use design knowledge about system structure, algorithms, and data structures (white-box testing):
  Control structures: test branches, loops, ...
  Data structures: test record fields, arrays, ...
Use implementation knowledge about algorithms, for example:
  Force division by zero
  Use a sequence of test cases for an interrupt handler
Unit-testing Heuristics
1. Create unit tests as soon as the object design is completed:
  Black-box test: test the use cases & functional model
  White-box test: test the dynamic model
  Data-structure test: test the object model
2. Develop the test cases:
  Goal: find the minimal number of test cases to cover as many paths as possible
3. Cross-check the test cases to eliminate duplicates:
  Don't waste your time!
4. Desk check your source code:
  Reduces testing time
5. Create a test harness:
  Test drivers and test stubs are needed for integration testing (see the sketch after this list)
6. Describe the test oracle:
  Often the result of the first successfully executed test
7. Execute the test cases:
  Don't forget regression testing: re-execute the test cases every time a change is made
8. Compare the results of the test with the test oracle:
  Automate as much as possible
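A sketch of the test harness from heuristic 5, with hypothetical names: the stub gives the unit under test a canned collaborator that is not yet implemented, and the driver feeds inputs and checks outputs:

  // Collaborator that is not yet implemented.
  interface TaxService {
      double rateFor(String state);
  }

  // Unit under test.
  class PriceCalculator {
      private final TaxService taxService;
      PriceCalculator(TaxService taxService) { this.taxService = taxService; }
      double total(double net, String state) { return net * (1 + taxService.rateFor(state)); }
  }

  // Test stub: returns a canned answer instead of real tax logic.
  class TaxServiceStub implements TaxService {
      public double rateFor(String state) { return 0.10; }
  }

  // Test driver: exercises the unit and compares against the expected result.
  public class PriceCalculatorDriver {
      public static void main(String[] args) {
          PriceCalculator calc = new PriceCalculator(new TaxServiceStub());
          double total = calc.total(100.0, "OH");
          System.out.println(Math.abs(total - 110.0) < 1e-9 ? "PASS" : "FAIL: " + total);
      }
  }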