Oracle Database 11g
Real Application Testing
What is Real Application Testing?
• New database option available with EE only
• Includes two new features
• Database Replay (DB Replay)
• SQL Performance Analyzer (SPA)
• SQL Tuning Set (STS) functionality is shared between
Real Application Testing option and EM Tuning Pack
• Advanced functionality requires additional packs
• Diagnostic Pack for DB Replay
• Tuning Pack for SPA
• Price (?)
Real Application Testing
• Value
• Reduces testing cost
• Improves testing quality
• Business Benefit
• Faster technology adoption
• Lower risk
(Diagram: Change → Test → Deploy → Remediate cycle: a solution for the agile business)
Database Replay
The Need for Database Replay
• System changes such as hardware and software upgrades are a fact of life
• Customers want to identify the full impact of a change before going live
• Extensive testing and validation can be expensive in time and money
• Despite expensive testing, the success rate is low
  • Many issues go undetected
  • Changes can negatively impact system availability and performance
• Cause of the low success rate: the inability to test with real-world production workloads leaves many issues undetected
• Database Replay makes real-world testing possible
Database Replay
• Recreate the actual production database workload in a test environment
• Identify, analyze, and fix potential instabilities before making changes to production
• Capture Workload in Production
  • Capture the full production workload with real load and concurrency information
  • Move the captured workload to the test system
• Replay Workload in Test
  • Make the desired changes on the test system
  • Replay the workload with production load and concurrency
  • Honor commit ordering
• Analyze & Report
  • Errors
  • Data divergence
  • Performance divergence
Database Replay: Supported Changes
(Diagram: clients → middle tier → database → storage; recording of external client requests happens at the database, so client and middle-tier changes are unsupported.)
Changes Supported
• Database upgrades, patches
• Schema, parameters
• RAC nodes, interconnect
• OS platforms, OS upgrades
• CPU, memory
• Storage
• Etc.
Comparison of LoadRunner & DB Replay
(Bar chart: time taken, in days, to test the e-Business Suite in each phase: Install & Setup, Understand Application Usage, Identify Key Transactions, Generate Workload, and Run Test, for LoadRunner vs. DB Replay.)
Total testing time:
• DB Replay: 2 weeks
• LoadRunner: 30 weeks
Why DB Replay?
From 150 days of testing to 10 days:
• Artificial workloads → Production workloads
• Partial workflows → Complete workflows
• Months of development → Days of development
• Manual intensive → Automated
• High risk → Low risk
Database Replay Overview
Step 1: Workload Capture
• All external client requests are captured in binary files
• System background and internal activity are excluded
• Minimal performance overhead for capture
• For RAC, shared and local file systems are supported
• Specify an interesting time period for capture, e.g., peak workload, month-end processing, etc.
(Diagram: client requests pass through the middle tier to the production system and are written to capture files File 1 … File n on the file system.)
Capture Options
• Workload can be filtered to customize what is captured
• Filter Types
• Inclusion Filters: Specifies which sessions should be captured
• Exclusion Filters: Specifies which sessions should NOT be
captured
• Filter Attributes: Workload capture can be filtered using any of the
following session attributes
• User
• Program
• Module
• Action
• Service
• Session ID
• Workload capture can be run on demand or scheduled to run at a later time
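As a sketch, a filtered capture could be started from PL/SQL like this; the filter name, directory object (CAPTURE_DIR), program pattern, and duration are all illustrative, not part of the original slides:

```sql
-- Illustrative sketch: capture the workload for one hour, excluding
-- sessions from a monitoring program. Assumes a directory object
-- CAPTURE_DIR that points at adequate disk space.
BEGIN
  DBMS_WORKLOAD_CAPTURE.ADD_FILTER(
    fname      => 'exclude_monitoring',   -- hypothetical filter name
    fattribute => 'PROGRAM',              -- one of the attributes listed above
    fvalue     => 'perl%');

  DBMS_WORKLOAD_CAPTURE.START_CAPTURE(
    name     => 'peak_capture',           -- hypothetical capture name
    dir      => 'CAPTURE_DIR',
    duration => 3600);                    -- seconds; omit to run until stopped
END;
/

-- Stop the capture explicitly if no duration was given:
EXEC DBMS_WORKLOAD_CAPTURE.FINISH_CAPTURE;
```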
Step 2: Process Workload Files
• Set up the test system
  • Logically similar data as production
  • Use RMAN to physically restore the production database from backup
  • Use a snapshot standby
  • Use imp/exp, Data Pump, etc.
• Processing transforms the captured data into replay files and generates the necessary metadata
  • Must be done on the same database version as the replay system
  • Recommended to process on the test system
  • Once processed, the workload can be replayed many times
• For RAC, if using a local file system, copy all capture files to a single location for replay
(Diagram: capture files File 1 … File n are processed on the test system into replay files plus metadata.)
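Processing itself is a single call; a minimal sketch, assuming the capture files were copied to a directory object named CAPTURE_DIR on the test system:

```sql
-- Illustrative sketch: transform captured files into replay files + metadata.
-- Must run on the same database version that will perform the replay.
BEGIN
  DBMS_WORKLOAD_REPLAY.PROCESS_CAPTURE(capture_dir => 'CAPTURE_DIR');
END;
/
```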
Step 3: Replay Workload
• The Replay Driver is a special client program that consumes the processed workload and sends requests to the replay system
• Replay requests preserve the timing, concurrency, and dependencies seen on the capture system
• The Replay Driver consists of one or more clients; for workloads with high concurrency, it may be necessary to start multiple clients to drive the workload
• Once all replay clients are started, workload replay is initiated by the user
(Diagram: replay clients read the replay files File 1 … File n plus metadata and drive the workload against the test system.)
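Replay clients are started at the OS level with the `wrc` executable shipped with the database; the replay itself is controlled from PL/SQL. A sketch with illustrative names (the connection ID and remapped connect string are hypothetical; real values come from DBA_WORKLOAD_CONNECTION_MAP):

```sql
-- Illustrative sketch: point a replay at the processed capture and prepare it.
BEGIN
  DBMS_WORKLOAD_REPLAY.INITIALIZE_REPLAY(
    replay_name => 'peak_replay',         -- hypothetical replay name
    replay_dir  => 'CAPTURE_DIR');

  -- Redirect captured connections to the test system if needed
  DBMS_WORKLOAD_REPLAY.REMAP_CONNECTION(
    connection_id     => 1,               -- illustrative; see DBA_WORKLOAD_CONNECTION_MAP
    replay_connection => 'test-host:1521/testdb');

  -- Default: fully synchronized replay (11g BOOLEAN form)
  DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY(synchronization => TRUE);
END;
/

-- Start one or more wrc clients from the OS, then begin the replay:
EXEC DBMS_WORKLOAD_REPLAY.START_REPLAY;
```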
Replay Options
• Synchronized Replay
  • Workload is replayed in fully synchronized mode
  • Exact same concurrency and timing as the production workload
  • Transaction commit order is honored
  • Ensures minimal data divergence
• Unsynchronized Replay
  • Workload can be replayed in unsynchronized mode
  • Useful for load/stress testing
  • High data divergence
  • Three (3) parameters are provided to control the degree of synchronization
    • Think time synchronization
    • Commit order synchronization
    • Connect (logon) time synchronization
Replay Options
• Unsynchronized Replay Parameters
  • Think time synchronization
    • Controls think time between database calls
    • Auto (default): adjusts think time so as to maintain the captured request rate
    • Percentage:
      • 0%: no think time, highest possible request rate
      • <100%: higher request rate
      • 100%: exact think time
      • >100%: lower request rate
  • Commit order synchronization
    • Controls commit order between transactions
    • In unsynchronized mode, commit order is not honored; transactions are committed as soon as the commit call is issued
  • Connect (logon) time synchronization
    • Controls when sessions are created
    • 0%: all sessions are connected immediately
    • 100% (default): sessions connect at the same time as on the captured system
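The three parameters map onto arguments of DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY. A sketch of an unsynchronized, accelerated replay for stress testing; the percentage values are illustrative:

```sql
-- Illustrative sketch: relax all three synchronization controls.
BEGIN
  DBMS_WORKLOAD_REPLAY.PREPARE_REPLAY(
    synchronization         => FALSE,  -- do not enforce captured commit order
    think_time_scale        => 50,     -- halve think time: higher request rate
    connect_time_scale      => 0,      -- connect all sessions immediately
    think_time_auto_correct => TRUE);  -- shrink think time if replay falls behind
END;
/
```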
Replay Options
• Number of Replay Clients
• Configurable by user
• Client Calibration Advisor recommends number of replay
clients needed for specific workload
• Replay clients are multithreaded clients that can drive multiple
workload sessions each
Analysis & Reporting
• Comprehensive reports are provided for analysis purposes
• Three (3) types of divergence are reported
  • Data divergence: the number of rows returned by each call is compared and divergences are reported
  • Error divergence: error divergence is reported for each call
    • New: error encountered during replay but not seen during capture
    • Not Found: error encountered during capture but not seen during replay
    • Mutated: a different error is produced in replay than during capture
  • Performance divergence
    • Capture and Replay Report: provides high-level performance information
    • ADDM Report: provides in-depth performance analysis
    • AWR, ASH Reports: facilitate comparative or skew analysis
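The capture-vs-replay report can be pulled as a CLOB from PL/SQL; a sketch, where the replay ID of 1 is illustrative and would really come from DBA_WORKLOAD_REPLAYS:

```sql
-- Illustrative sketch: fetch the divergence/performance report for a replay.
SELECT DBMS_WORKLOAD_REPLAY.REPORT(replay_id => 1, format => 'TEXT')
FROM dual;
```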
Workload Types Supported
Supported
• All SQL (DML, DDL, PLSQL) with practically all types of binds
• Full LOB functionality (Cursor based and direct OCI)
• Local transactions
• Login/Logoffs
• Session switching
• Limited PL/SQL RPCs
Limitations
• Direct path load, import/export
• OCI based object navigation (ADTs) and REF binds
• Streams, non-PL/SQL based AQ
• Distributed txns, remote describe/commit operations
• Flashback
• Shared Server
EM Interface: DB Replay Summary
Best Practices
• Capture
  • Provide adequate disk space for the captured workload (binary files)
  • Database restart (optional): recommended to minimize divergence during replay
  • For RAC, use a shared file system
• Test System Setup
  • Ensure data in test is identical to production as of the capture start time, to minimize data divergence during replay
  • Use RMAN backup/restore or the Snapshot Standby feature to set up the test system
  • Reset the system clock to the same time as production if application logic involves SYSDATE usage
• Process Workload
  • Processing the workload has performance overhead and can possibly take a long time
  • Process the workload on the test system rather than on the production system
• Replay
  • Use the Client Calibration Advisor to identify the number of replay clients needed to replay the workload properly
DB Replay Security Model
• Any non-SYS user with the DBA role
  • The SYSDBA role is not mandatory
  • Does not have to be the user whose workload is captured
• "Execute" privileges on the DBMS_WORKLOAD_CAPTURE/REPLAY procedures
  • DBMS_WORKLOAD_CAPTURE
    • START_CAPTURE, FINISH_CAPTURE, REPORT, ADD_FILTER, DELETE_FILTER
  • DBMS_WORKLOAD_REPLAY
    • PROCESS_CAPTURE, INITIALIZE_REPLAY, PREPARE_REPLAY, START_REPLAY, CANCEL, REPORT, ADD_FILTER, REMAP_CONNECTION
• The capture and replay users can be different if they have the appropriate privileges
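A sketch of the grants described above, for a hypothetical dedicated testing user (the user name and password are illustrative):

```sql
-- Illustrative sketch: a non-SYS testing user for DB Replay.
CREATE USER rat_tester IDENTIFIED BY "change_me";
GRANT CREATE SESSION, DBA TO rat_tester;

-- Or, more narrowly, execute privileges on the two packages:
GRANT EXECUTE ON DBMS_WORKLOAD_CAPTURE TO rat_tester;
GRANT EXECUTE ON DBMS_WORKLOAD_REPLAY  TO rat_tester;
```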
SQL Performance
Analyzer (SPA)
SQL Performance Analyzer (SPA)
• Test the impact of a change on SQL query performance
• Capture the SQL workload in production, including statistics and bind variables
• Re-execute the SQL queries in a test environment
• Use SQL Tuning Advisor to tune regressions
(Diagram: SQL is captured from clients via the middle tier on the production database, then re-executed on the test database.)
SPA Benefits
• Enables identification of SQL performance regressions before end users can be impacted
• SPA can help with any change that impacts SQL execution plans
  • DB upgrades
  • Optimizer statistics refresh
  • New indexes, materialized views, partitions, etc.
• Fix regressed SQL with SQL Tuning Advisor and SQL Plan Baselines
• Integrated with the query optimizer
• Captures the SQL workload with low overhead
SQL Performance Analyzer Workflow
(Diagram: execute production SQL and gather statistics in the initial environment → make the change → re-execute the SQL in the post-change environment → compare performance → tune regressed SQL with SQL Tuning Advisor → execute the SQL again in the tuned environment.)
Step 1: Capture SQL Workload
• A SQL Tuning Set (STS) is used to store the SQL workload
• An STS includes:
  • SQL text
  • Bind variables
  • Execution plans
  • Execution statistics
• Incremental capture is used to populate the STS from the cursor cache over a time period
• The SQL tuning set's filtering and ranking capabilities filter out undesirable SQL
(Diagram: SQL is captured incrementally from the cursor cache into a SQL Tuning Set on the production database.)
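The incremental capture can be sketched with DBMS_SQLTUNE; the set name, durations, and schema filter below are illustrative:

```sql
-- Illustrative sketch: poll the cursor cache every 5 minutes for 1 hour,
-- keeping only SQL parsed by a hypothetical application schema.
BEGIN
  DBMS_SQLTUNE.CREATE_SQLSET(
    sqlset_name => 'prod_workload',
    description => 'Peak-period SQL for SPA');

  DBMS_SQLTUNE.CAPTURE_CURSOR_CACHE_SQLSET(
    sqlset_name     => 'prod_workload',
    time_limit      => 3600,   -- total capture window, seconds
    repeat_interval => 300,    -- polling interval, seconds
    basic_filter    => q'[parsing_schema_name = 'APPUSER']');
END;
/
```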
Step 2: Move SQL Workload to Test System
• Copy the SQL tuning set to a staging table ("pack")
• Transport the staging table to the test system (Data Pump, DB link, etc.)
• Copy the SQL tuning set from the staging table ("unpack")
(Diagram: the SQL Tuning Set is exported from the production database's cursor cache capture and imported into the test database.)
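The pack/transport/unpack sequence above can be sketched as follows; the staging table and set names are illustrative:

```sql
-- Illustrative sketch. On production: stage and pack the tuning set.
BEGIN
  DBMS_SQLTUNE.CREATE_STGTAB_SQLSET(table_name => 'STS_STAGE');
  DBMS_SQLTUNE.PACK_STGTAB_SQLSET(
    sqlset_name        => 'prod_workload',
    staging_table_name => 'STS_STAGE');
END;
/

-- Transport STS_STAGE with Data Pump or a DB link, then on the test system:
BEGIN
  DBMS_SQLTUNE.UNPACK_STGTAB_SQLSET(
    sqlset_name        => 'prod_workload',
    replace            => TRUE,
    staging_table_name => 'STS_STAGE');
END;
/
```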
Step 3: Execute SQL Before Making Change
• The "before change" SQL performance version is the SQL workload performance baseline
• SQL performance = execution plans + execution statistics
• Test-execute the SQL in the SQL tuning set:
  • Produces execution plans and statistics
  • Executes SQL serially (no concurrency)
  • Every SQL statement is executed only once
  • Skips DDL/DML effects
• Alternatively, explain-plan the SQL in the SQL tuning set to generate SQL plans only
(Diagram: SQL Performance Analyzer fetches each SQL statement from the tuning set, test-executes it, and saves the execution plan and statistics.)
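These steps map onto DBMS_SQLPA calls; a sketch with illustrative task and execution names:

```sql
-- Illustrative sketch: build an SPA task over the tuning set and record
-- the "before change" baseline.
DECLARE
  tname VARCHAR2(128);
BEGIN
  tname := DBMS_SQLPA.CREATE_ANALYSIS_TASK(
             sqlset_name => 'prod_workload',
             task_name   => 'spa_upgrade_test');

  DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(
    task_name      => 'spa_upgrade_test',
    execution_type => 'TEST EXECUTE',   -- or 'EXPLAIN PLAN' for plans only
    execution_name => 'before_change');
END;
/
```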
Step 4: Execute SQL After Making Change
• Manually implement the planned change:
  • Database upgrade
  • Implementation of tuning recommendations
  • Schema changes
  • Statistics gathering
  • Database parameter changes
  • OS/hardware changes, etc.
• Re-execute the SQL after the change:
  • Test-execute the SQL in the SQL tuning set to generate SQL execution plans and statistics
  • Explain-plan the SQL in the SQL tuning set to generate SQL plans only
(Diagram: SQL Performance Analyzer again fetches each SQL statement from the tuning set, test-executes it, and saves the results.)
Step 5: Compare & Analyze Performance
• Compare performance using different metrics:
  • Elapsed time
  • Parse time
  • Execute elapsed time
  • Execute CPU time
  • Buffer gets
  • Disk reads
  • Disk writes
  • Optimizer cost
• The SPA report shows the impact of the change for each SQL statement:
  • Improved SQL
  • Regressed SQL
  • Unchanged SQL
  • SQL with errors
• Tune regressed SQL using SQL Tuning Advisor
• Analysis results can be used to seed the SQL Plan Management repository
(Diagram: SQL Performance Analyzer compares the two completed executions and produces the analysis report.)
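Continuing with the illustrative names from the earlier steps, the comparison and report might be sketched as (the "after_change" execution is assumed to exist, and buffer gets is chosen here as the comparison metric):

```sql
-- Illustrative sketch: compare the before/after executions, then fetch
-- the report as a CLOB.
BEGIN
  DBMS_SQLPA.EXECUTE_ANALYSIS_TASK(
    task_name        => 'spa_upgrade_test',
    execution_type   => 'COMPARE PERFORMANCE',
    execution_params => DBMS_ADVISOR.ARGLIST(
                          'comparison_metric', 'buffer_gets'));
END;
/

SELECT DBMS_SQLPA.REPORT_ANALYSIS_TASK(
         'spa_upgrade_test', 'HTML', 'ALL', 'ALL')
FROM dual;
```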
EM Interface: SPA Report
SPA Security Model
• Based on DBMS_SQLTUNE/SQL Tuning Advisor
• Requires “Advisor”, “Administer [Any] SQL Tuning”
privileges
• Create/Drop/Alter SQL Profile privileges