1. INTRODUCTION
   1.1 SUMMARY
   1.2 TEST PLAN OBJECTIVES
   1.3 REFERENCES TO RELATED DOCUMENTS
2. TEST ITEMS
3. FEATURES TO BE TESTED
   3.1 USER INTERFACE
   3.2 DATABASE OPERATIONS
   3.3 SYSTEM INTERFACE
   3.4 REPORTING
   3.5 SECURITY
   3.6 GLOBAL USER BASE
4. FEATURES NOT TO BE TESTED
   4.1 BETA PROGRAMS
   4.2 MULTI-PLATFORM TESTING
5. APPROACH
   5.1 USER INTERFACE
   5.2 DATABASE OPERATIONS
   5.3 REPORTING
   5.4 SECURITY
   5.5 CONSTRAINTS
6. ITEM PASS/FAIL CRITERIA
   6.1 USER INTERFACE
   6.2 DATABASE
   6.3 REPORTING
   6.4 SECURITY
7. TEST DELIVERABLES
   7.1 TEST DOCUMENTS
   7.2 INPUT/OUTPUT OF TEST ITEMS
      7.2.1 User Interface
      7.2.2 Database
      7.2.3 Reporting
      7.2.4 Security
   7.3 TEST TOOLS
8. ENVIRONMENT NEEDS
9. STAFF AND TRAINING NEEDS
   9.1 STAFFING NEEDS
   9.2 TRAINING OPTIONS
10. RISK AND CONTINGENCIES
11. APPROVALS
Test Plan Identifier: Two Wheels Courier System v. 1.0 11/06/03
1. Introduction
1.1 Summary
The purpose of this document is to describe the test plan for the Two
Wheels Courier Company System for Susan VandeVen, instructor for
SWE 4624.
The system provides package tracking for the Two Wheels Courier Company. It
must be able to schedule package pickups and deliveries, cancel customer
requests if needed, track each package, record customer information, and
provide price quotes for package service. Managers will be able to access the
report section of the system for reports on package status, billing and money
received, and daily activity summaries.
1.2 Test Plan Objectives
• To ensure a working product
• To ensure the database is functioning properly according to the requirements
• To ensure that the user interfaces function according to the requirements
1.3 References to Related Documents
Bonneau, Tiffany; Hall, Vianco; Simmons, Bernard; Walker, Norris. “Two
Wheels Local Courier Design Document,” Final Version 1.0, 23 October 2003.
Bonneau, Tiffany; Hall, Vianco; Simmons, Bernard; Walker, Norris. “Two
Wheels Local Courier System Specification Document,” Final Version 1.0,
25 September 2003.
2. Test Items
The test will cover all functionality in the Two Wheels Courier System, which
includes its underlying database system, user interfaces, security issues (with
respect to customer privacy), and report printing capabilities.
3. Features to be Tested
3.1 User Interface
• Check that the address is within the delivery distance
• Detection of a new customer account that duplicates an already existing one
• Removal of records that have had no usage in 90 days
• Enforcement of the package weight limit
• Admin functions only accessible by an admin
• Address, name, and phone number are filled in before placing an order
• No more than two users can be logged in at the same time
• Date validation: a date prior to the current date cannot be used to schedule
  a pickup or delivery
3.2 Database Operations
• The ability to recover from failures such as power loss, operator error,
  crashes, and system software failure through checkpoint facilities and the
  setting of transaction boundaries.
• The handling of concurrent access through locking mechanisms (see the
  sketch after this list).
• The performance of the database as measured by the response time to user
  queries.
• The insertion, deletion, update, and querying of tables in the database.
• The enforcement of entity integrity and referential integrity constraints.
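The locking check above can be rehearsed before the real test environment is
ready. The following is a minimal sketch, using Python's sqlite3 module as a
stand-in for the system's DBMS (an assumption; this plan does not name the
actual database product): one connection holds a write lock on the customer
table while a second connection attempts a concurrent write.

    import sqlite3, tempfile, os

    # A file-backed database so that two connections can contend for it.
    path = os.path.join(tempfile.mkdtemp(), "courier.db")
    writer = sqlite3.connect(path)
    writer.execute("CREATE TABLE customer (phone TEXT PRIMARY KEY, name TEXT)")
    writer.commit()

    # The open (uncommitted) INSERT holds the write lock on the database.
    writer.execute("INSERT INTO customer VALUES ('770-321-1234', 'Jason Rogers')")

    # A concurrent writer must be blocked, producing an error rather than
    # corrupting data; timeout=0.1 makes the sketch fail fast instead of waiting.
    other = sqlite3.connect(path, timeout=0.1)
    try:
        other.execute("INSERT INTO customer VALUES ('678-807-1981', 'Amber Simmons')")
    except sqlite3.OperationalError as err:
        print("concurrent access blocked:", err)  # typically "database is locked"
    writer.commit()  # releasing the lock lets other transactions proceed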
3.3 System Interface
In this particular project, there will be no need for testing of system
interfaces, given that this is a single-server database system with no
external interfaces.
3.4 Reporting
• Proper printing of the Customer Account history report, given a unique
  customer id (telephone number).
• Proper printing of the Outstanding Deliveries report, given the status of
  all packages for the current date.
3.5 Security
• Backup of the Customer Information Database
  o Does it back up correctly?
  o Can the system be easily restored using the backup?
• User access levels do not grant more privileges than they are designed for
• Unauthorized users cannot get access to the system
3.6 Global User Base
For this project, there will be no global user base because this application is
specifically designed for a local user base.
4. Features Not to be Tested
4.1 Beta programs
For this project, no beta program will be run, because only a prototype of the
application will be created; the prototype itself will more or less serve as a
beta-like version.
4.2 Multi-platform testing
For the Two Wheels Courier System, no multi-platform testing is necessary,
given that the system is limited to the WinTel platform.
5. Approach
5.1 User Interface
To test the features of the user interface, depending on the feature under
test, erroneous data will be entered or required data will be purposely left
out, to verify that operator errors are caught by the user interface before
they are exported to the system database.
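As a concrete illustration, the required-field check could be scripted as
below. This is a sketch only: validate_order_form() is a hypothetical helper
mirroring the user interface's validation rules, not part of the actual system.

    def validate_order_form(name, address, phone):
        """Return the list of operator errors the UI is expected to report."""
        errors = []
        if not name:
            errors.append("name is required")
        if not address:
            errors.append("address is required")
        if not phone:
            errors.append("phone number is required")
        return errors

    # Deliberately omit the address field and verify the error is caught
    # before the order could be exported to the system database.
    assert "address is required" in validate_order_form(
        "Jason Rogers", "", "770-321-1234")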
5.2 Database Operations
All database tables will be created prior to the start of testing. The database
will also be populated with valid test data before testing begins. The tester will
follow the test scenario script, and log any anomalies detected in the test log.
During testing, the spooling facility of the Database Management System
should be turned on so that all outputs and error messages are stored to a
file. A printout of the spooling file should be attached to the test log at the
end of testing.
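If the chosen DBMS lacks a built-in spooling facility, the same effect can be
approximated in a test harness. The sketch below (using Python's sqlite3 as a
stand-in DBMS, which is an assumption) writes every statement, result row, and
error message to a spool file that can be printed and attached to the test log.

    import sqlite3

    def run_spooled(conn, statements, spool_path="spool.txt"):
        """Execute each statement, spooling all outputs and errors to a file."""
        with open(spool_path, "a") as spool:
            for sql in statements:
                spool.write("SQL> " + sql + "\n")
                try:
                    for row in conn.execute(sql):
                        spool.write(repr(row) + "\n")
                except sqlite3.Error as err:
                    spool.write("ERROR: " + str(err) + "\n")  # also log anomaly

    conn = sqlite3.connect(":memory:")
    run_spooled(conn, [
        "CREATE TABLE customer (phone TEXT PRIMARY KEY, name TEXT)",
        "SELECT * FROM no_such_table",  # this statement spools an error message
    ])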
5.3 Reporting
To test report printing for the Customer Account report, a customer id will be
entered to verify that the corresponding customer data (e.g., the customer’s
full name, street address, telephone number, and pickup/delivery history)
appears in the output report listing. For the Outstanding Deliveries report, a
specified set of package ids will be marked with a “picked-up” status, and
thus pending delivery; an Outstanding Deliveries report will then be generated
to verify that each of the specified packages (those marked “picked-up” in the
database) shows up in the resulting report with its corresponding information
(e.g., package id, sender, sender’s address, recipient, and destination
address).
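A sketch of the Customer Account half of this procedure is shown below, with
sqlite3 standing in for the system database; the customer table schema and
the sample street address are illustrative assumptions, not taken from the
design document.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (phone TEXT PRIMARY KEY, "
                 "name TEXT, address TEXT)")
    conn.execute("INSERT INTO customer VALUES "
                 "('770-321-1234', 'Jason Rogers', '1 Example St')")

    # Enter a customer id (telephone number) and verify that the corresponding
    # customer data appears in the output report listing.
    report = conn.execute("SELECT name, address FROM customer WHERE phone = ?",
                          ("770-321-1234",)).fetchone()
    assert report == ("Jason Rogers", "1 Example St")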
5.4 Security
This system is protected by unique logins for each user of the system. Each
user will only be allowed to access information in the system according to
their user level (Administrator or Clerk), set at the time of login creation.
If networked at all, this system is designed for use only on an internal
company network. If this system is networked to a computer network that is
accessible from outside the company, it is the user’s responsibility to ensure
that all computers networked to this system have appropriate security
measures installed, so that the database of user information cannot be
accessed by an outside source or by unauthorized users. Backups of customer
information should be maintained both onsite and offsite. The system provides
the ability to back up both the customer and user databases.
5.5 Constraints
Since the Two Wheels Courier System designers will have full access to the
facilities of the university, there will be no constraints on the testing.
6. Item Pass/Fail Criteria
6.1 User Interface
If any of the following failures occur, some or all of the user interfaces
(depending on which features fail) will be given a failed status.

• A new customer account is created in spite of one already existing
• An address out of the delivery radius is accepted
• A record can remain in the system even after 90 days without usage
• A package above the maximum weight limit is accepted for pickup/delivery
• Data are saved to the database without all entry fields being filled
• A date prior to the current date can be used to schedule a package pickup
  or delivery
• A third user can log in to the system
6.2 Database
The database module will be said to have failed a test if the results of a test
do not match the expected result listed in the Test Case Specification.
6.3 Reporting
Both the Customer Account and the Outstanding Deliveries reports will be
tested three times to determine the pass/fail status of each item. If an item
fails all three attempts, the testers will mark the item as failed.
Afterwards, the Two Wheels Courier System designers will meet to discuss the
causes of the failure and possible corrective measures, and hence what is
needed to give the item a pass status.
6.4 Security
If any security violation occurs (e.g., a clerk-level user can log in as an
admin-level user), the security portion of the test will be given a failed
status.
7. Test Deliverables
7.1 Test Documents
• Test Plan Document
• Design Document
7.2 Input/Output of Test Items
7.2.1 User Interface
In the case of the user interfaces, no specific input/output (sample) data
were created, because it was agreed that the data should be random, since
that is how it will arrive when an operator enters it into the system.
7.2.2 Database
Test case ID: DB 001
Test description: To test the database response to a set of valid and invalid
records. (An invalid record contains one or more invalid fields, e.g., a
telephone number with too many digits.)
Input
1. Enter the record using the SQL insert command.
2. If you do not get an error message, use the SQL select command to confirm
   that the record is in the database table.
Output
Valid records will be successfully entered into the database. Attempts to
enter an invalid record will produce an error message.
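A sketch of DB 001 against a stand-in sqlite3 table is shown below; the CHECK
constraint models DBMS-side field validation for the too-many-digits case,
and the table schema is an assumption of this sketch, not the system's actual
schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer ("
                 "phone TEXT PRIMARY KEY CHECK (length(phone) = 12),"
                 "name TEXT NOT NULL)")  # 12 chars, e.g. 770-321-1234

    # Step 1: enter a valid record; step 2: confirm it with a select.
    conn.execute("INSERT INTO customer VALUES ('770-321-1234', 'Jason Rogers')")
    assert conn.execute("SELECT name FROM customer "
                        "WHERE phone = '770-321-1234'").fetchone()

    # An invalid record (a telephone number with too many digits) must
    # produce an error message instead of being entered.
    try:
        conn.execute("INSERT INTO customer VALUES ('770-321-12345', 'Bad Record')")
    except sqlite3.IntegrityError as err:
        print("rejected as expected:", err)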
Test case ID: DB 002
Test description: To ensure that the Database Management System
enforces entity integrity constraints by attempting to enter a record in the
package table with the Package ID field blank.
Input
1. Enter record using SQL insert command, omitting the primary key.
2. Use the SQL select command to check if the record has been
entered in the Package table.
Output
An error message will be produced and the record will not be entered
in the Package table.
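A sketch of this entity-integrity check is below; sqlite3 stands in for the
actual DBMS, and the NOT NULL declaration on the primary key is an assumption
(sqlite does not otherwise reject NULL primary keys in all cases).

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE package ("
                 "package_id TEXT NOT NULL PRIMARY KEY, sender TEXT)")

    # Step 1: enter a record with the primary key omitted (NULL).
    try:
        conn.execute("INSERT INTO package VALUES (NULL, 'Jason Rogers')")
    except sqlite3.IntegrityError as err:
        print("entity integrity enforced:", err)

    # Step 2: confirm the record has NOT been entered in the Package table.
    assert conn.execute("SELECT * FROM package").fetchall() == []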
Test case ID: DB 003
Test description: To verify that database response time is fast enough to
meet system response time performance requirements. The database response
time should be faster than the overall system response time, since response
time may degrade when the database is integrated with other system
components.
Input
1. Enter a valid package ID in package tracking screen.
2. Click submit.
3. Measure time until package status report is complete.
Output
Package status report should be complete within 5 seconds of clicking
the submit button.
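The timing step can be automated as sketched below, with time.perf_counter()
wrapping a stand-in package-status query; the 5-second budget is the figure
given in this test case, and the table schema is illustrative only.

    import sqlite3, time

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE package (package_id TEXT PRIMARY KEY, status TEXT)")
    conn.execute("INSERT INTO package VALUES ('JxR-30062-1234-001', 'picked-up')")

    start = time.perf_counter()  # step 2: "click submit"
    status = conn.execute("SELECT status FROM package WHERE package_id = ?",
                          ("JxR-30062-1234-001",)).fetchone()
    elapsed = time.perf_counter() - start  # step 3: time until report complete

    assert status == ("picked-up",)
    assert elapsed < 5.0  # the report must be complete within 5 seconds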
Test case ID: DB 004
Test description: To verify the ability of the database to recover from
system failure.
Input
1. Make a backup of all database tables on removable disk.
2. Print out all tables in the database with the SQL select * command.
3. Close all other programs that may be running.
4. Create a checkpoint in the database.
5. Start inserting a record in the customer table.
6. Midway into the entry, restart the computer.
7. Restart the DBMS program.
8. Print out all tables in the database with the SQL select * command.
Output
There should be no loss of data, except for the interrupted transaction.
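The recovery steps can be rehearsed with the sketch below, where a file-backed
sqlite3 database stands in for the DBMS (an assumption): a committed
transaction plays the role of the checkpoint, an uncommitted insert plays the
interrupted transaction, and closing and reopening the connection plays the
restart.

    import sqlite3, tempfile, os

    path = os.path.join(tempfile.mkdtemp(), "courier.db")
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE customer (phone TEXT PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO customer VALUES ('770-321-1234', 'Jason Rogers')")
    conn.commit()  # step 4: committed work is the "checkpoint"

    # Steps 5-6: start inserting a record, then "restart" midway (no commit).
    conn.execute("INSERT INTO customer VALUES ('678-807-1981', 'Amber Simmons')")
    conn.close()

    conn = sqlite3.connect(path)  # step 7: restart the DBMS program
    rows = conn.execute("SELECT * FROM customer").fetchall()  # step 8
    # No loss of data except the interrupted transaction.
    assert rows == [("770-321-1234", "Jason Rogers")]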
7.2.3 Reporting
To test the functionality of the reports printing, the following sample
tabulated data will be used to temporarily create three customer accounts
(assuming that the Create New Account logic is correct) in the database to
verify that the output corresponds to the inputted data.
Customer Name     Phone#          Package ID#
Jason Rogers      770-321-1234    JxR-30062-1234-001
Amber Simmons     678-807-1981    AxR-30060-1981-001
Melissa Jones     770-864-1357    MxJ-30067-1357-001
Test case ID: Reports 001
Test description: verify proper printing of Customer Account report.
Input
1) 770-321-1234
2) 770-864-1357
3) 678-807-1981
Output
1) Customer information for Jason Rogers’ account
2) Customer information for Melissa Jones’ account
3) Customer information for Amber Simmons’ account
Test case ID: Reports 002
Test description: confirm proper printing of Outstanding Deliveries report.
Input
MxJ-30067-1357-001
JxR-30062-1234-001 *
AxR-30060-1981-001 *
* Package ID is marked as “picked-up” in the database.
Output
Packages pending delivery for…
Jason Rogers and Amber Simmons
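A sketch of this check, using the sample data from the table above, is shown
below; the package table schema and the "scheduled" status for the unmarked
package are assumptions of the sketch.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE package (package_id TEXT PRIMARY KEY, "
                 "sender TEXT, status TEXT)")
    conn.executemany("INSERT INTO package VALUES (?, ?, ?)", [
        ("MxJ-30067-1357-001", "Melissa Jones", "scheduled"),
        ("JxR-30062-1234-001", "Jason Rogers",  "picked-up"),   # marked *
        ("AxR-30060-1981-001", "Amber Simmons", "picked-up"),   # marked *
    ])

    # Only packages marked "picked-up" should appear as pending delivery.
    pending = {sender for (sender,) in conn.execute(
        "SELECT sender FROM package WHERE status = 'picked-up'")}
    assert pending == {"Jason Rogers", "Amber Simmons"}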
7.2.4 Security
The input of the security test will simply be two or three random usernames
and passwords, where one is given admin-level status and the others
clerk-level status. The output will not be data but rather the result: either
an error message rejecting the user login, or a bypass of the system security
levels.
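One way to script this is sketched below. The USERS table and check_access()
helper are hypothetical stand-ins for the system's login mechanism, and the
usernames and passwords are made-up sample data.

    # username -> (password, access level); made-up sample data only.
    USERS = {
        "admin_user": ("secret1", "admin"),
        "clerk_one":  ("secret2", "clerk"),
        "clerk_two":  ("secret3", "clerk"),
    }

    def check_access(username, password, requested_level):
        """Return an error message for rejected logins, else grant access."""
        record = USERS.get(username)
        if record is None or record[0] != password:
            return "error: login rejected"  # unauthorized user
        if requested_level == "admin" and record[1] != "admin":
            return "error: insufficient privileges"  # clerk cannot act as admin
        return "access granted"

    assert check_access("clerk_one", "secret2", "admin").startswith("error")
    assert check_access("intruder", "guess", "clerk").startswith("error")
    assert check_access("admin_user", "secret1", "admin") == "access granted"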
7.3 Test Tools
In this project, no special tools will be used to test the functionality of
the application; typical code debuggers will be used to test and troubleshoot
any issues that arise.
8. Environment Needs
The Two Wheels Courier System design team does not have any environment
concerns, given that the application will be installed in a small office
building rather than a harsh environment.
9. Staff and Training Needs
9.1 Staffing Needs
Clerks and administrators will need basic training for the first couple of weeks
after the initial installation of the software.
9.2 Training options
Clerks and administrators will have a basic training session twice a week for
the first couple of weeks after the initial installation of the application.
10. Risk and Contingencies
If the system is compromised, the user should be able to restore the system
from the backup of the customer information and table of authorized users.
11. Approvals
After all testing has been completed, the final approval for the Two Wheels
Courier System will be given by Professor Susan VandeVen, at which time the
implementation of the Two Wheels Courier System prototype will begin.