Event Driven Automated Testing
for GUI-based Java Programs
Presented by
Yanhong Sun
Thesis Advisor: Dr. Edward L. Jones
April 28, 2003
Outline
 Introduction
 Related Work
   Test strategy / test plan
   Test case generation
   Test execution
   Test verification
 Goals of My Thesis
   Implementing the initial AGUIJ
   AGUIJ features
   Case study
   Future work
 References
Introduction
 Grading GUI-based Java programs by hand is redundant, boring, and time-intensive work
 The Automated Grader for GUI-based Java programs (AGUIJ) is a fast and efficient technique for checking programming assignments submitted by numerous students
Introduction (Cont.)
 All programs have interfaces
   Text interface
   Graphical user interface
 Applications with text interfaces are driven by data
   Automated grading can be readily accomplished with the help of I/O redirection
 Currently, I am trying to extend the automated grading service to GUI-based Java programs
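For text-interface programs, the I/O redirection mentioned above can be automated directly from Java. A minimal sketch (the class name `Student` and the file names are illustrative assumptions, not part of AGUIJ):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Sketch of redirection-based grading for a text-interface program.
// Builds the equivalent of "java Student < in.txt > out.txt" and then
// compares the captured output against the expected output.
public class RedirectGrader {

    // Configure a process whose stdin/stdout are redirected to files.
    public static ProcessBuilder gradeCommand(String mainClass, File in, File out) {
        ProcessBuilder pb = new ProcessBuilder("java", mainClass);
        pb.redirectInput(in);    // the test-case input replaces System.in
        pb.redirectOutput(out);  // the actual output is captured for comparison
        return pb;
    }

    // Verification step: does the actual output match the expected output?
    public static boolean sameOutput(File expected, File actual) {
        try {
            return Files.readAllLines(expected.toPath())
                        .equals(Files.readAllLines(actual.toPath()));
        } catch (IOException e) {
            return false; // an unreadable output file counts as a mismatch
        }
    }
}
```

Calling `pb.start().waitFor()` would then run one test case end to end.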
Introduction (Cont.)
 GUIs are event-driven
 The types of input are more varied: button clicks, mouse movements, etc.
 It is not as easy to reassign the input stream with the help of I/O redirection
 A special tool is needed to simulate the input and capture the output
Related Work
 Testing life cycle:
   Specification → Analysis → test strategy / plan
   Design → test cases
   Implementation → test script, data, driver
   Execution → test results, problem report
   Evaluation
Figure 1 Testing life cycle
Related Work (Cont.)
 Test strategy / test plan
   The most well-known coverage criteria include statement coverage, branch coverage, and path coverage
   Since the input to a GUI consists of a sequence of events, such criteria cannot be applied to GUI testing
   Memon, Soffa, and Pollack [6] presented new coverage criteria. These criteria use events and event sequences to specify a measure of test adequacy for GUI testing
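To make the idea of event-sequence coverage concrete, here is a toy sketch (my own illustration, not the algorithm of [6]): for length-2 coverage, a test suite must exercise every ordered pair of GUI events.

```java
import java.util.ArrayList;
import java.util.List;

// Toy illustration of event-sequence coverage: enumerate all ordered
// event pairs that a test suite must exercise for length-2 coverage.
// The event names used in any caller are made up for illustration.
public class EventPairs {
    public static List<String> pairs(List<String> events) {
        List<String> result = new ArrayList<>();
        for (String first : events)
            for (String second : events)
                result.add(first + " -> " + second);
        return result;
    }
}
```

For n events there are n² ordered pairs, which is why event-based criteria grow much faster than statement or branch counts.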
Related Work (Cont.)
 Test case generation
   One of the main difficulties in testing GUI-based applications lies in describing the input and output.
   Memon, Pollack, and Soffa [8] present PATHS (Planning Assisted Tester for graphical user interface Systems), a new technique to automatically generate test cases for GUIs.
     Given: a set of operators, an initial state, and a goal state
     Produces: a sequence of the operators that will transform the initial state into the goal state
Related Work (Cont.)

Chen and Subramamiam [7] present a prototype VESP,
a Visual Environment for manipulating test Specification
of GUI-based application in Java.
Testers can easily modify the test specification

Ostrand, Anodide, Foster and Goradia [9] have
implemented an experimental test development
environment (TDE) which replaces the low-level scripting
language with a high-level scenario language and
provide a visual model for modifying and creating
variations of recorded sequences
9
Related Work (Cont.)
 Test execution
   Almost all testing tools for GUI-based applications are based on the capture/replay technique
   The capture/replay technique is well suited to regression testing
   Various commercial capture/replay tools exist:
     XRunner [10] -- an automated tool for X Window applications
     JavaStar [11] -- SunTest's GUI testing tool; accesses the Java AWT components
     Marathon [12] -- allows the tester to record and play scripts against a Java Swing UI
Related Work (Cont.)
 Verification
   The traditional verification tool is a test oracle, a separate program that generates expected results for a test case and compares them with actual results.
   Most existing automation tools cannot efficiently compare graphic objects.
   Takahashi [13] develops a technique that aids automatic behavior verification for verifying graphic objects
Expected Goals of My Thesis
 In summary, the anticipated goals of my thesis are:
   Implementing the initial AGUIJ to explore the feasibility of automated testing for GUI-based Java programs
   Making AGUIJ work for multiple frames
   Implementing the Test Generator Engine, which generates the Jemmy test engine source file automatically
   Implementing the format of the test specification (grading plan), which can be read by the Test Generator Engine
Implementing the initial AGUIJ
 At the very beginning, Dr. Jones assigned me the idea of constructing an automated grader for GUI-based Java programs
 Through study and research, I found that almost all testing tools for GUI-based applications are based on the capture/replay technique
   Pros: techniques are available and free to use
   Cons: i) needs oracle support
         ii) cannot start automated testing until the tester has manually tested a perfect program
Implementing the initial AGUIJ (Cont.)
 Then another technique, called Jemmy, caught my attention.
 Jemmy [1] is a Java™ library that is used to create automated tests for Java GUI applications
 Jemmy contains methods to reproduce all user actions that can be performed on Swing/AWT components (e.g., button pushing, text typing).
   Pros: i) can perform test execution and test verification in one program
         ii) can start automated testing right away
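As a sketch of what a hand-written Jemmy test engine looks like (the application class, frame title, and component indices are assumptions for illustration; running this requires the Jemmy library on the classpath and a display):

```java
import org.netbeans.jemmy.ClassReference;
import org.netbeans.jemmy.operators.JButtonOperator;
import org.netbeans.jemmy.operators.JFrameOperator;
import org.netbeans.jemmy.operators.JTextFieldOperator;

public class SampleTestEngine {
    public static void main(String[] args) throws Exception {
        // Launch the student's program inside the test JVM
        new ClassReference("FAMUCurrencyExchange").startApplication();

        // Attach to the application frame by its title (assumed here)
        JFrameOperator frame = new JFrameOperator("Currency Exchange");

        // Reproduce the user's actions: type an amount, push Convert
        new JTextFieldOperator(frame, 0).typeText("200.0");
        new JButtonOperator(frame, "Convert").push();

        // Verification: read the result field's content for the checker
        String actual = new JTextFieldOperator(frame, 1).getText();
        System.out.println("actual output: " + actual);
    }
}
```

This is exactly the kind of file the planned Test Generator Engine would produce automatically from a grading plan.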
Implementing the initial AGUIJ (Cont.)
   Specification → design test cases → Jemmy test engine (Java) → compile → Jemmy test engine (class)
   Student programs → compile → class file(s), or "not compile"
   Jemmy test engine (class) runs against the class file(s) → grading log files
Figure 2 Automated Grading Service
AGUIJ Features
 Test modes (position-based or object-based)
   In position-based mode, the selection of an item on the screen is determined by the program based on the position of the item within the window
   In object-based mode, the test software sends the event not to a given position in the window, but to a specific object.
 AGUIJ uses the object-based test mode
AGUIJ Features (Cont.)
 Output checking
   Bitmap comparison compares graphical images bit by bit
   Content-based comparison ignores the presentation and compares the underlying data itself
 AGUIJ uses a content-based comparison checker to verify whether the program runs correctly
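Content-based checking can be as simple as parsing the text a component displays and comparing the underlying value instead of pixels. A minimal sketch (the class, method name, and tolerance are my own, not AGUIJ's actual code):

```java
// Sketch of a content-based output checker: ignore presentation
// (whitespace, rendering) and compare the underlying value.
public class ContentChecker {
    public static boolean matches(String actual, String expected, double tolerance) {
        try {
            // Numeric content: compare the parsed values within a tolerance
            double a = Double.parseDouble(actual.trim());
            double e = Double.parseDouble(expected.trim());
            return Math.abs(a - e) <= tolerance;
        } catch (NumberFormatException ex) {
            // Non-numeric content: fall back to exact text comparison
            return actual.trim().equals(expected.trim());
        }
    }
}
```

A bitmap comparison would flag "21640.34" rendered in a different font as a failure; this checker would not.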
Case Study
 I used the initial AGUIJ to grade the Java GUI programs; the outcome was promising

   Automated Grading Service      Manual Grading Service
   Compiling:       7 mins        About 4-5 hours without
   Running:        19 mins        any interruption
   Total:    about 30 mins
Table 1 Time consumed in grading
 Correctness ratio: 92.4%
Future Work
 Although the initial AGUIJ goes some way toward automating GUI testing, there are still pitfalls
   Writing the Jemmy test engine source file is not a fully automated step
   The service's productivity is therefore lower than it could be
   The service can deal with only a single frame
 Future work should involve generalizing AGUIJ to achieve more automation
Future Work (Cont.)
 Working with multiple frames
   I will extend the automated grading service to work for GUI-based Java programs with multiple frames
 Implementing the Test Generator Engine
   I will develop the Test Generator Engine, which can transform a formatted grading plan into a Jemmy test engine source file.
Future Work (Cont.)
 Implementing the format of the grading plan
   The grading plan must meet the following criteria:
   • Easy to write
   • Easy to understand
   • Amenable to automated generation of the Jemmy test engine source file
   The following information is needed to generate the Jemmy test engine source file:
   • The name of the class file to be tested
   • The titles of the frames
   • The components in the GUI
   • The test cases
Future Work (Cont.)
 I intend to use an HTML-like format to implement the grading plan.

<class name>
FAMUCurrencyExchange.class
</class name>
<components>
type = JComboBox; order = 1
type = JComboBox; order = 2
type = JTextField; order = 1
type = JTextField; order = 2
type = JButton; name = Convert
</components>
<test cases>
number = 1
<action>
type = JComboBox; order = 1; name = Currency Type 1; action = select item 2; item name = U.S.Dollar
type = JTextField; order = 1; action = type 200.0
type = JComboBox; order = 2; name = Currency Type 2; action = select item 4; item name = Japanese Yen
type = JButton; action = push
</action>
<expect>
type = JTextField; order = 2; content = 21640.34 or 21620
</expect>
<deduction>
deduction = 25
</deduction>
</test cases>
Table 2 A sample formatted grading plan (HTML style): the class name, the components in the GUI, and the test cases (user inputs, expected outputs, deductions)
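The Test Generator Engine would begin by parsing lines of this `key = value; key = value` form. A minimal sketch (the class and method names are my own, not part of the planned engine):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: parse one grading-plan line such as
//   "type = JComboBox; order = 1; action = select item 2"
// into key/value pairs the Test Generator Engine can act on.
public class PlanLineParser {
    public static Map<String, String> parse(String line) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String part : line.split(";")) {
            int eq = part.indexOf('=');
            if (eq < 0) continue; // skip malformed fragments
            fields.put(part.substring(0, eq).trim(),
                       part.substring(eq + 1).trim());
        }
        return fields;
    }
}
```

Each parsed map would then be turned into one Jemmy operator call in the generated test engine source file.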
Future Work (Cont.)
 The automated grading service will be updated as follows:
   Specification → write grading plan → Test Engine Generator → Jemmy test engine (Java) → compile → Jemmy test engine (class)
   Student programs → compile → class file, or "not compile"
   Jemmy test engine (class) runs against the class file → grading log files
Figure 3 The updated Automated Grading Service System
References
[1] Jemmy Module. See http://jemmy.netbeans.org
[2] Alan Walworth. Java GUI Testing. Dr. Dobb's Journal, February 1997
[3] Tessella Support Services PLC. Automated GUI Testing. Tessella Scientific Software Solutions, January 1999, issue V1.R1.M1
[4] J. Skrivanek and A. Sotona. Testing Forte™ for Java™
[5] Cay S. Horstmann and Gary Cornell. Core Java, second edition.
[6] A. M. Memon, M. L. Soffa, and M. E. Pollack. Coverage Criteria for GUI Testing.
[7] J. Chen and S. Subramaniam. A GUI Environment to Manipulate FSMs for Testing GUI-Based Applications in Java. In Proceedings of the 34th Hawaii International Conference on System Sciences, 2001
References (Cont.)
[8] A. M. Memon, M. E. Pollack, and M. L. Soffa. Hierarchical GUI test case generation using automated planning. IEEE Transactions on Software Engineering, 27(2):144-155, Feb. 2001
[9] T. Ostrand, A. Anodide, H. Foster, and T. Goradia. A Visual Test Development Environment for GUI Systems. In Proceedings of the ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA-98), pages 82-92, New York, Mar. 25, 1998. ACM Press.
[10] XRunner. See http://www.merc-int.com/products/xrunner6
[11] JavaStar. See http://www.sun.com/suntest/products/index.html
[12] Marathon. See http://marathonman.sourceforge.net
[13] J. Takahashi. An Automated Oracle for Verifying GUI Objects. Software Engineering Notes, 26(4):83-88, July 2001
[14] Mark Fewster. Software Test Automation. Addison-Wesley, New York, 1999