METRICS:
A System Architecture for Design Process Optimization
Stephen Fenstermaker*, David George*, Andrew B. Kahng, Stefanus Mantik, and Bart Thielges*
UCLA CS Dept., Los Angeles, CA; *OxSigen LLC, San Jose, CA
Motivations
 How do we improve design productivity?
 Does our design technology / capability yield better productivity than it did last year?
 How do we formally capture best known methods, and how do we identify them in the first place?
 Does our design environment support continuous improvement of the design process?
 Does our design environment support what-if / exploratory design? Does it have early predictors of success / failure?
 Currently, there are no standards or infrastructure for measuring and recording the semiconductor design process
Purpose of METRICS
 Standard infrastructure for the collection and storage of design process information
 Standard list of design metrics and process metrics
 Analyses and reports that are useful for design process optimization
 METRICS allows: collect, data-mine, measure, diagnose, then improve
Related Work
 OxSigen LLC (Siemens 97-99)
 Enterprise- and project-level metrics (“normalized transistors”): Numetrics Management Systems DPMS
 Other in-house data collection systems: e.g., TI (DAC 96 BOF)
 Web-based design support: IPSymphony, WELD, VELA, etc.
 E-commerce infrastructure: Toolwire, iAxess, etc.
 Continuous process improvement
 Data mining and visualization
Outline
 Data collection process and potential benefits
 METRICS system architecture
 METRICS standards
 Current implementation
 Issues and conclusions
Potential Data Collection / Diagnoses
 What happened within the tool as it ran? What was CPU / memory / solution quality? What were the key attributes of the instance? What iterations / branches were made, and under what conditions?
 What else was occurring in the project? Spec revisions, constraint and netlist changes, …
 User performs the same operation repeatedly with nearly identical inputs
   tool is not acting as expected
   solution quality is poor, and knobs are being twiddled
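The last diagnosis above, a user rerunning the same operation on nearly identical inputs, can be spotted by grouping collected run records. A minimal sketch in Python; the record fields `tool_user` and `tool_name` follow the generic metrics naming, while `input_checksum` and the threshold are assumptions for illustration, not part of the METRICS standard:

```python
from collections import Counter

def repeated_runs(records, threshold=3):
    """Flag (user, tool, input-checksum) triples that recur at least
    `threshold` times -- a possible sign of knob-twiddling or a tool
    not acting as expected."""
    counts = Counter(
        (r["tool_user"], r["tool_name"], r["input_checksum"]) for r in records
    )
    return [key for key, n in counts.items() if n >= threshold]

# made-up run records for illustration
runs = [
    {"tool_user": "alice", "tool_name": "place", "input_checksum": "c1"},
    {"tool_user": "alice", "tool_name": "place", "input_checksum": "c1"},
    {"tool_user": "alice", "tool_name": "place", "input_checksum": "c1"},
    {"tool_user": "bob", "tool_name": "route", "input_checksum": "c2"},
]
print(repeated_runs(runs))  # [('alice', 'place', 'c1')]
```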
Benefits
 Benefits for project management
   accurate resource prediction at any point in the design cycle
     up-front estimates for people, time, technology, EDA licenses, IP re-use, …
   accurate project post-mortems
     everything tracked: tools, flows, users, notes
     no “loose”, random data left at project end
   management console
     web-based, status-at-a-glance view of tools, designs, and systems at any point in the project
 Benefits for tool R&D
   feedback on tool usage and the parameters used
   improved benchmarking
METRICS System Architecture
Tools feed metrics to transmitters, embedded via an API, attached as a wrapper, or driven from Java applets. Each transmitter sends XML over the inter/intranet to a web server, which stores the data in the DB of the METRICS Data Warehouse; reporting and data mining run on top of the warehouse.
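The tool-to-server leg of this path can be sketched by encoding one run as XML. A minimal Python sketch; the element names (`metrics_record`, `metric`) are illustrative and not the published METRICS schema:

```python
import xml.etree.ElementTree as ET

def metrics_message(tool_name, metrics):
    """Encode one tool run as an XML message of the kind a transmitter
    would ship to the web server fronting the data warehouse."""
    root = ET.Element("metrics_record")
    ET.SubElement(root, "tool_name").text = tool_name
    for name, value in metrics.items():
        m = ET.SubElement(root, "metric", name=name)
        m.text = str(value)
    return ET.tostring(root, encoding="unicode")

# made-up metric values for illustration
msg = metrics_message("placer", {"num_cells": 51200, "cpu_time": "00:12:47"})
print(msg)
```

In the deployed system the XML would travel over HTTP to the Java servlet, which parses it and inserts the metrics into Oracle8i.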
METRICS Performance
 Transmitter
   low CPU overhead
     multi-threads / processes: non-blocking scheme
     buffering: reduces the number of transmissions
   small memory footprint
     limited buffer size
 Reporting
   web-based: platform- and location-independent
   dynamic report generation: always up-to-date
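The transmitter properties above, a bounded buffer, a non-blocking producer, and batching to cut transmissions, can be sketched in a few lines. This is an assumed design in Python, not the C++ transmittal library itself; `send_batch` stands in for the real network transmission:

```python
import queue
import threading

class Transmitter:
    """Buffered, non-blocking metrics transmitter sketch: a bounded
    queue keeps the memory footprint small, and a background thread
    batches records so the wrapped tool never waits on the network."""

    def __init__(self, send_batch, buffer_size=128, batch=16):
        self.q = queue.Queue(maxsize=buffer_size)  # limited buffer size
        self.send_batch = send_batch
        self.batch = batch
        self.worker = threading.Thread(target=self._run, daemon=True)
        self.worker.start()

    def record(self, name, value):
        try:
            self.q.put_nowait((name, value))  # never blocks the tool
        except queue.Full:
            pass  # drop rather than stall the wrapped tool

    def _run(self):
        pending = []
        while True:
            item = self.q.get()
            if item is None:  # shutdown sentinel
                break
            pending.append(item)
            if len(pending) >= self.batch:
                self.send_batch(pending)  # one transmission per batch
                pending = []
        if pending:
            self.send_batch(pending)  # flush the remainder

    def close(self):
        self.q.put(None)
        self.worker.join()

sent = []
t = Transmitter(sent.append, batch=2)
for i in range(5):
    t.record("cpu_time", i)
t.close()
print(len(sent))  # 3 transmissions: two full batches and a final flush
```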
Example Reports
[Charts, recoverable only in outline:
 “% aborted per machine”: hen 95%, donkey 2%, bull 2%, rat 1%
 “% aborted per task”: synthesis 22%, ATPG 20%, postSyntTA 13%, placedTA 7%, physical 18%, BA 8%, funcSim 7%, LVS 5%
 “LVS convergence”: LVS % (88–100) vs. time (0–600)]
Current Results
 CPU_TIME = 12 + 0.027 × NUM_CELLS (corr = 0.93)
 More plots are accessible at http://xenon.cs.ucla.edu:8080/metrics
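The fitted CPU_TIME model is an ordinary least-squares line over collected (num_cells, cpu_time) pairs. A self-contained sketch of how such a fit and its correlation coefficient are computed; the sample data below is made up to lie exactly on the reported line, so the recovered coefficients match it:

```python
import math

def fit_line(xs, ys):
    """Least-squares fit ys ~ a + b*xs, plus the Pearson correlation r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    b = sxy / sxx            # slope
    a = my - b * mx          # intercept
    r = sxy / math.sqrt(sxx * syy)  # correlation
    return a, b, r

# synthetic samples on cpu_time = 12 + 0.027 * num_cells
cells = [1000, 5000, 10000, 20000, 40000]
cpu = [12 + 0.027 * c for c in cells]
a, b, r = fit_line(cells, cpu)
print(round(a, 3), round(b, 3), round(r, 3))
```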
METRICS Standards
 Standard metrics naming across tools
   same name  same meaning, independent of tool supplier
   generic metrics and tool-specific metrics
   no more ad hoc, incomparable log files
 Standard schema for the metrics database
 Standard middleware for the database interface
 For complete current lists see: http://vlsicad.cs.ucla.edu/GSRC/METRICS
Generic and Specific Tool Metrics

Generic Tool Metrics:
  tool_name       string
  tool_version    string
  tool_vendor     string
  compiled_date   mm/dd/yyyy
  start_time      hh:mm:ss
  end_time        hh:mm:ss
  tool_user       string
  host_name       string
  host_id         string
  cpu_type        string
  os_name         string
  os_version      string
  cpu_time        hh:mm:ss

Placement Tool Metrics:
  num_cells        integer
  num_nets         integer
  layout_size      double
  row_utilization  double
  wirelength       double
  weighted_wl      double

Routing Tool Metrics:
  num_layers       integer
  num_violations   integer
  num_vias         integer
  wirelength       double
  wrong-way_wl     double
  max_congestion   double

Partial list of metrics now being collected in Oracle8i
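These standard names can serve directly as record keys. A sketch of assembling a placement-tool record and checking it against a subset of the listed fields; the type-checking helper and all of the record values are illustrative, not part of the METRICS middleware:

```python
# subsets of the standard metric names, with illustrative Python types
GENERIC = {"tool_name": str, "tool_version": str, "tool_user": str,
           "host_name": str, "cpu_type": str, "os_name": str}
PLACEMENT = {"num_cells": int, "num_nets": int, "layout_size": float,
             "row_utilization": float, "wirelength": float,
             "weighted_wl": float}

def validate(record, schema):
    """Return the field names that are missing or have the wrong type."""
    return [k for k, t in schema.items()
            if k not in record or not isinstance(record[k], t)]

# made-up values for one placement run
rec = {"tool_name": "Capo", "tool_version": "8.0", "tool_user": "smantik",
       "host_name": "xenon", "cpu_type": "sparc", "os_name": "SunOS",
       "num_cells": 51200, "num_nets": 52800, "layout_size": 1.2e7,
       "row_utilization": 0.72, "wirelength": 3.4e6, "weighted_wl": 3.9e6}
print(validate(rec, {**GENERIC, **PLACEMENT}))  # [] -> record conforms
```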
Testbed I: Metricized P&R Flow
LEF/DEF → Capo Placer → Placed DEF → QP ECO → Legal DEF → WRoute → Routed DEF → Congestion Analysis (Congestion Map) → Incr. WRoute → Final DEF; every step in the flow reports to METRICS.
Testbed II: Metricized Cadence SLC Flow
Inputs: DEF, LEF, GCF/TLF, constraints. QP → Incr. Placed DEF → CTGen → Clocked DEF → QP Opt → Optimized DEF → WRoute → Routed DEF → Pearl; every step in the flow reports to METRICS.
Conclusions
 Current status
   complete prototype of the METRICS system with Oracle8i, Java Servlets, XML parser, and a transmittal API library in C++
   METRICS wrappers for the Cadence and Cadence-UCLA flows and front-end tools (Ambit BuildGates and NCSim)
   easiest proof of value: via use of regression suites
 Issues for METRICS constituencies to solve
   security: proprietary and confidential information
   standardization: flow, terminology, data management, etc.
   social: “big brother”, collection of social metrics, etc.
 Ongoing work with the EDA and designer communities to identify tool metrics of interest
   users: metrics needed for design process insight and optimization
   vendors: implementation of the requested metrics, with standardized naming / semantics
http://vlsicad.cs.ucla.edu/GSRC/METRICS