CEG 221
Lesson 5: Algorithm Development II
Mr. David Lippa
Overview
• Algorithm Development II
– Review of basic algorithm development
– Advanced algorithm development
• Optimization of algorithms
• Optimization of code
• Questions
What is an Algorithm?
• An algorithm is a high-level set of clear, step-by-step actions taken to solve a problem, frequently expressed in English or pseudo code.
• Examples of Algorithms:
– Computing the remaining angles and side in an SAS triangle
– Computing an integral using the rectangle approximation method (RAM) or the Trapezoidal Rule
Example: Triangulation with SAS
[Diagram: a triangle with vertices A, B, C; side a = 60, side b = 100, and included angle C = 42°]
• If we return to the SAS triangle, there is not much to be done to improve speed or efficiency, as the computation is very straightforward.
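For concreteness, here is a minimal C sketch of the SAS computation, using the law of cosines for the third side and for one remaining angle; the function name, variable names, and the degree/radian handling are illustrative assumptions, not course code.

#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Solve an SAS triangle: given sides a and b and the included angle C
   (in degrees), compute the third side c and the remaining angles A and B. */
void solve_sas(double a, double b, double C_deg,
               double *c, double *A_deg, double *B_deg)
{
    double C = C_deg * PI / 180.0;                              /* degrees -> radians */
    double side_c = sqrt(a * a + b * b - 2.0 * a * b * cos(C)); /* law of cosines */
    double A = acos((b * b + side_c * side_c - a * a) /
                    (2.0 * b * side_c));                        /* law of cosines for angle A */
    *c = side_c;
    *A_deg = A * 180.0 / PI;
    *B_deg = 180.0 - C_deg - *A_deg;                            /* angles sum to 180 degrees */
}

int main(void)
{
    double c, A, B;
    solve_sas(60.0, 100.0, 42.0, &c, &A, &B);   /* the triangle in the diagram */
    printf("c = %.2f, A = %.2f deg, B = %.2f deg\n", c, A, B);
    return 0;
}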
Example: Trapezoidal Rule
If the interval is [0, 1] and we use 4 trapezoids (width Δx = 0.25), then we need to compute SUM = f(0) + f(0.25) + f(0.25) + f(0.5) + f(0.5) + f(0.75) + f(0.75) + f(1.0). Then AREA = 0.5 * Δx * SUM.
4 trapezoids → 8 computations of f
8 trapezoids → 16 computations of f
1024 trapezoids → 2048 computations of f
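A minimal C sketch of this straightforward version follows; the integrand f and the function name are stand-ins, not course code. Every trapezoid evaluates f at both of its endpoints, which is where the 2n count comes from.

#include <stdio.h>

/* Illustrative integrand -- any f(x) could be substituted. */
static double f(double x)
{
    return x * x;
}

/* Straightforward trapezoidal rule: every trapezoid evaluates f at both
   of its endpoints, so each interior point is evaluated twice
   (2n calls to f for n trapezoids). */
double trap_naive(double lo, double hi, int n)
{
    double dx = (hi - lo) / n;
    double area = 0.0;
    for (int i = 0; i < n; i++) {
        double x0 = lo + i * dx;
        double x1 = lo + (i + 1) * dx;
        area += 0.5 * dx * (f(x0) + f(x1));   /* 2 evaluations per trapezoid */
    }
    return area;
}

int main(void)
{
    /* 4 trapezoids on [0, 1]; the exact integral of x^2 is 1/3 */
    printf("area = %f\n", trap_naive(0.0, 1.0, 4));
    return 0;
}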
Notice a pattern?
• Notice that we compute f(x) twice at every interior point – more times than is necessary.
• Instead, compute SUM2 = 2 * ( f(0) + f(0.25) + f(0.5) + f(0.75) + f(1.0) ) - f(0) - f(1.0), reusing the endpoint values already computed.
• AREA2 = 0.5 * Δx * SUM2
• 4 trapezoids → 5 computations of f
• 8 trapezoids → 9 computations of f
• 1024 trapezoids → 1025 computations of f
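Continuing the sketch above (same illustrative f), the SUM2 rearrangement might look like this in C: each distinct point is evaluated once, and the cached endpoint values are reused for the subtraction.

/* Optimized version: evaluate f once per distinct point -- n + 1 calls --
   double the whole sum, then subtract the cached endpoint values,
   exactly the SUM2 rearrangement above. */
double trap_optimized(double lo, double hi, int n)
{
    double dx = (hi - lo) / n;
    double f_lo = f(lo);            /* cache the endpoints so they are    */
    double f_hi = f(hi);            /* not recomputed for the subtraction */
    double sum = f_lo + f_hi;
    for (int i = 1; i < n; i++)
        sum += f(lo + i * dx);      /* one evaluation per interior point */
    double sum2 = 2.0 * sum - f_lo - f_hi;
    return 0.5 * dx * sum2;         /* AREA2 = 0.5 * dx * SUM2 */
}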
Trapezoidal Rule Improvements
• For a small number of trapezoids, the savings are minor, and the extra bookkeeping (the doubling and the subtractions) costs a few operations of its own.
• For, say, 1024 trapezoids, this is significantly more efficient in terms of the number of mathematical calculations – roughly half as many evaluations of f.
• CONCLUSION: Given that greater accuracy comes with more trapezoids, this optimization is worthwhile, since the algorithm will rarely be used with only a few trapezoids.
Optimizing Implemented Code
• There are other ways to speed up code:
– Sacrifice memory for improved speed (e.g., work from memory rather than from disk whenever possible)
– Avoid algorithms where the work required per element processed grows with n – that is, O(n²) algorithms
– Pass by reference or pointer where appropriate to prevent unnecessary memory copies of large structures (see the sketch after this list)
– Use algorithm analysis to find the cause of the lack of speed
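As an illustration of the pass-by-reference point, here is a small hypothetical C example; the DataSet structure and function names are made up for the sketch.

/* A large structure we would rather not copy on every call;
   the type and sizes are made up for this illustration. */
typedef struct {
    double samples[10000];
    int    count;               /* number of valid entries in samples */
} DataSet;

/* Pass by value: the entire structure (tens of kilobytes here) is
   copied for every call. */
double mean_by_value(DataSet d)
{
    double sum = 0.0;
    for (int i = 0; i < d.count; i++)
        sum += d.samples[i];
    return sum / d.count;
}

/* Pass by pointer to const: only an address is copied, and const
   documents that the callee does not modify the caller's data. */
double mean_by_pointer(const DataSet *d)
{
    double sum = 0.0;
    for (int i = 0; i < d->count; i++)
        sum += d->samples[i];
    return sum / d->count;
}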
Algorithm Analysis
• Big-Oh notation – how much work is required to process n inputs, expressed in terms of n
– Constants are less important for Big-Oh notation
– O(1), O(log₂ n), O(n), O(n log₂ n), O(n²), O(n³), O(2ⁿ), O(n!)
– Associate algorithms with each
• Matrix, Integration, SAS, factorial
• Formal definition
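For reference, one standard statement of the formal definition mentioned above: a function f(n) is O(g(n)) if there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0. Constant factors and lower-order terms can be absorbed into c, which is why they are dropped from Big-Oh estimates.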
Using Algorithm Analysis
• Analyze an algorithm by computing the number of operations performed per unit input (a small example follows this list)
• Avoid O(n²) or worse algorithms
• Convert code to pseudo code if needed, to do a theoretical analysis
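As a small illustrative example (not from the lesson): summing an array does one addition per element, while summing every pairwise product does about n operations per element, which is the O(n²) pattern to avoid when n can be large.

/* One addition per element: about n operations in total, so O(n). */
double sum_all(const double *v, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* One multiply-add for every pair (i, j): about n * n operations, so
   O(n^2) -- doubling n quadruples the work, which is why algorithms
   like this are worth avoiding when n can be large. */
double sum_of_pairwise_products(const double *v, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            s += v[i] * v[j];
    return s;
}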
Algorithm Analysis: Example
• Matrix Multiplication
– Pseudo code – To multiply an m x n matrix [A] and an n x p matrix [B], dot product each row of [A] with each column of [B].
• Results in m * p dot products (see previous notes for pseudo code details)
• Dot Product
– Pseudo code – To dot product two vectors, multiply the first element of each, the second, the third, and so on, and add the products together.
• Results in n multiplications and n additions (see previous notes for pseudo code details)
• RESULT: Matrix multiplication is an O(m * n * p) operation. With square matrices, it is O(n³).
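A C sketch of the pseudo code above might look like the following; the row-major flat-array layout and the function name are assumptions for the sketch, not the course's implementation.

/* Multiply an m x n matrix A by an n x p matrix B into the m x p matrix C.
   Matrices are stored row-major in flat arrays: element (i, j) of an
   r x c matrix M lives at M[i * c + j]. */
void mat_mul(const double *A, const double *B, double *C,
             int m, int n, int p)
{
    for (int i = 0; i < m; i++) {             /* each of the m rows of A    */
        for (int j = 0; j < p; j++) {         /* each of the p columns of B */
            double dot = 0.0;
            for (int k = 0; k < n; k++)       /* n-term dot product         */
                dot += A[i * n + k] * B[k * p + j];
            C[i * p + j] = dot;               /* one of the m * p results   */
        }
    }
}

The two outer loops run m * p times (one pass per dot product) and the inner loop runs n times per dot product, which is where the O(m * n * p) count comes from.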
Next Time
• Building Libraries
• Using Libraries
• Tradeoffs
• Questions
Questions?