Lecture3_IO_BLG336E_2022
Analysis of Algorithms II
Lecture 3:
Growth of Functions, Asymptotic Analysis, Runtimes,
Big-O Notation, Theta, Omega
Algorithm Analysis
Thinking about how the resource requirements of an algorithm will
scale with increasing input size.
Resources:
time
space
Computational Tractability
How to measure complexity?
• The exact measured running time is not a good measure
• It depends on the input
• It depends on the machine used and on who implemented the algorithm
Machine-independent
• A generic uniprocessor random-access machine
(RAM) model
• No concurrent operations
• Each simple operation (e.g. +, -, =, *, if) takes 1 step (see the step-counting example after this list).
• Loops and subroutine calls are not simple operations.
• All memory equally expensive to access
• Constant word size
• Unless we are explicitly manipulating bits
• No memory hierarchy (caches, virtual mem) is modeled
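As a small illustration of this cost model, here is a hypothetical snippet (not from the slides) with its step count, written in Python:

# Hypothetical example: counting RAM-model steps for summing 1..n.
def sum_first_n(n):
    total = 0                  # 1 step (assignment)
    for i in range(1, n + 1):  # loop bookkeeping: about 1 step per iteration
        total += i             # 2 steps per iteration (add, assign)
    return total               # 1 step

In total this is roughly 3n + 2 steps, so the count grows linearly with n.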
Running Time
• Running time T(n): the number of primitive
operations or steps executed for an input of size n.
• Running time depends on input
• already sorted sequence is easier to sort
• Parameterize running time by size of input
• short sequences are easier to sort than long ones
• Generally, we seek upper bounds on running time
• everybody likes a guarantee
Kinds of Analysis
• Worst-case: (usually)
• T(n) = maximum time of algorithm on any input of size n
• Average-case: (sometimes)
• T(n) = expected time of algorithm over all inputs of size n
• Need assumption about statistical distribution of inputs
• Best-case: (difficult to determine)
• Cheat with a slow algorithm that works fast on some input
Worst-Case Analysis
Not only is it too slow to be useful, it is an intellectual cop-out:
it provides us with absolutely no insight into the structure of the
problem.
Exhaustive Search is Slow!
• Because it tries all n! permutations, it is much slower
to use when there are more than 10-20 points. [For
n = 1000, it will not finish in your lifetime!]
Polynomial-Time
Desirable scaling property. When the input size doubles, the algorithm
should only slow down by some constant factor C.
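To see why polynomial running times have this property, consider a generic bound T(n) ≤ c·n^d (the constants c and d here are for illustration). Then T(2n) ≤ c·(2n)^d = 2^d · c·n^d, so doubling the input slows the algorithm down by at most the constant factor C = 2^d, independent of n.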
Worst-Case Polynomial-Time
Exceptions.
Some poly-time algorithms do have high constants and/or
exponents, and are useless in practice.
Some exponential-time (or worse) algorithms are widely used
because the worst-case instances seem to be rare.
Why It Matters
We try to express that an algorithm’s worst-case running time is at
most proportional to some function f(n).
The function f(n) becomes a bound on the running time of the algorithm.
Pseudo-code style: count the number of pseudo-code steps.
A step: assigning a value to a variable, looking up an entry in an array,
following a pointer, a basic arithmetic operation, …
“On any input of size n, the algorithm runs for at most 1.62n² + 3.5n + 8
steps.”
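Checking such a count against a simple bound: for all n ≥ 1 we have 3.5n ≤ 3.5n² and 8 ≤ 8n², so 1.62n² + 3.5n + 8 ≤ 13.12n². The running time is therefore at most proportional to n², with constant of proportionality 13.12 from n0 = 1 onward.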
Big-Oh Notation
Asymptotic Analysis
Example: One-Loop
A. O(1)
B. O(n)
C. O(log n)
D. O(n²)
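The slide’s code is not in this extract; a representative single loop (hypothetical), which touches each of the n elements once and therefore runs in O(n) time:

# Hypothetical one-loop example: a single pass over the input.
def contains(items, target):
    for x in items:        # executes at most n times
        if x == target:
            return True
    return False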
Example: Two-Loops
A. O(1)
B. O(n)
C. O(log n)
D. O(n²)
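Again a hypothetical stand-in for the slide’s missing code: two loops in sequence cost O(n) + O(n), which is still O(n), because sequential costs add rather than multiply:

# Hypothetical two-sequential-loops example.
def min_and_max(items):
    lo = items[0]
    for x in items:        # first pass: O(n)
        lo = x if x < lo else lo
    hi = items[0]
    for x in items:        # second pass: O(n)
        hi = x if x > hi else hi
    return lo, hi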
Example: Two Nested Loops
A. O(1)
B. O(n)
C. O(log n)
D. O(n²)
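A representative pair of nested loops (hypothetical, since the slide’s code is missing): the inner loop runs n times for each of the n outer iterations, for n·n = O(n²) steps:

# Hypothetical nested-loops example: all ordered pairs are examined.
def count_equal_pairs(items):
    n = len(items)
    count = 0
    for i in range(n):          # outer loop: n iterations
        for j in range(n):      # inner loop: n iterations each
            if items[i] == items[j]:
                count += 1
    return count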
Example: Two Nested Loops II
A. O(1)
B. O(n)
C. O(log n)
D. O(n²)
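A common second variant (again hypothetical) is the triangular nested loop. The inner loop shrinks each time, giving n(n-1)/2 inner steps, which is still O(n²):

# Hypothetical triangular nested loops: each unordered pair once.
def count_unordered_pairs(items):
    n = len(items)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):  # inner range shrinks with i
            count += 1
    return count                   # equals n*(n-1)//2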
Big-Oh Definition

Big-Oh Formal Definition
• For a given function g(n), O(g(n)) is the set of
functions:
O(g(n)) = {f(n): there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c g(n)
for all n ≥ n0 }
Example 1
Omega Notation
• For a given function g(n), Ω(g(n)) is the set of
functions:
Ω(g(n)) = {f(n): there exist positive constants c and n0
such that 0 ≤ c g(n) ≤ f(n)
for all n ≥ n0 }
Theta Notation
• For a given function g(n), Θ(g(n)) is the set of
functions:
Θ(g(n)) = {f(n): there exist positive constants c1, c2, and n0
such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n)
for all n ≥ n0 }
Little o: Definition
• For a given function g(n), o(g(n)) is the set of
functions:
o(g(n))= {f(n): for any positive constant c,
there exists a constant n0
such that 0 ≤ f(n) < c g(n)
for all n ≥ n0 }
Little o
• Note the < instead of ≤ in the definition of Little-o:
0 ≤ f(n) < c g(n) for all n ≥ n0
• Contrast this to the definition used for Big-O:
0 ≤ f(n) ≤ c g(n) for all n ≥ n0
• Little-o notation denotes an upper bound that is not
asymptotically tight. We might call this a loose
upper bound.
• Examples: 2n ∈ o(n²), but 2n² ∉ o(n²)
Little o: Definition
• Given that f(n) = o(g(n)), we know that g grows
strictly faster than f. This means that no matter how
small a positive constant c you multiply g by, beyond
some n0, c g(n) will always exceed f(n).
• No graph to demonstrate little-o, but here is an
example:
n² = o(n³), but
n² ≠ o(n²).
Why? Because if c = 1, then f(n) = c g(n), and the
definition insists that f(n) be less than c g(n).
Little omega: Definition
• For a given function g(n), ω(g(n)) is the set of
functions:
ω(g(n)) = {f(n): for any positive constant c,
there exists a constant n0
such that 0 ≤ c g(n) < f(n)
for all n ≥ n0 }
Little omega: Definition
• Note the < instead of ≤ in the definition:
0 ≤ c g(n) < f(n)
• Contrast this to the definition used for Big-Ω:
0 ≤ c g(n) ≤ f(n)
• Little-omega notation denotes a lower bound that is
not asymptotically tight. We might call this a loose
lower bound.
• Examples:
n² ∈ ω(n), but n² ∉ ω(n²); n ∈ ω(lg n)
Little omega
• No graph to demonstrate little-omega, but here is
an example:
n³ is ω(n²), but
n³ ≠ ω(n³).
Why? Because if c = 1, then f(n) = c g(n), and the
definition insists that c g(n) be strictly less than f(n).
Asymptotic Notation
[Figure: three plots of f(n) versus input size n. Theta: c1∙g(n) ≤ f(n) ≤ c2∙g(n) beyond n0; Big-O: f(n) ≤ c∙g(n) beyond n0; Omega: c∙g(n) ≤ f(n) beyond n0.]
Example
Where does this notation come from?
Asymptotic Order of Growth
Asymptotically Tight Bounds
How can we show that f(n) is Θ(g(n))?
Answer:
If the following limit exists and is equal to a constant c > 0, then f(n) is Θ(g(n)):
lim (n→∞) f(n) / g(n) = c
Or:
show that f(n) is both O(g(n)) and Ω(g(n)).
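For example, for the step count quoted earlier: lim (n→∞) (1.62n² + 3.5n + 8) / n² = 1.62, a constant greater than 0, so 1.62n² + 3.5n + 8 is Θ(n²).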
Notation
Properties
Transitivity.
If f = O(g) and g = O(h) then f = O(h).
If f = Ω(g) and g = Ω(h) then f = Ω(h).
If f = Θ(g) and g = Θ(h) then f = Θ(h).
Additivity.
If f = O(h) and g = O(h) then f + g = O(h).
If f = Ω(h) and g = Ω(h) then f + g = Ω(h).
If f = Θ(h) and g = Θ(h) then f + g = Θ(h).
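A one-line justification of transitivity for O (a sketch; the constant names are ours): if f(n) ≤ c1·g(n) for n ≥ n1 and g(n) ≤ c2·h(n) for n ≥ n2, then f(n) ≤ (c1·c2)·h(n) for all n ≥ max(n1, n2), which is exactly f = O(h). The other properties follow by the same kind of constant bookkeeping.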
Asymptotic Bounds for Some Common Functions
2.4 A Survey of Common Running Times
Linear Time: O(n)
Linear time. Running time is at most a constant factor times the size
of the input.
Computing the maximum of n numbers a1, …, an:
max ← a1
for i = 2 to n {
   if (ai > max)
      max ← ai
}
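The same scan as runnable Python (a minimal sketch):

def maximum(a):
    # One pass, keeping the largest value seen so far: O(n).
    best = a[0]
    for x in a[1:]:
        if x > best:
            best = x
    return best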
Linear Time: O(n)
Merging two sorted lists into one sorted output list:
i = 1, j = 1
while (both lists are nonempty) {
   if (ai ≤ bj) append ai to output list and increment i
   else append bj to output list and increment j
}
append remainder of nonempty list to output list
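A runnable sketch of the same merge in Python (the function name is ours):

def merge(a, b):
    # Merge two sorted lists in one linear pass: O(len(a) + len(b)).
    i, j, out = 0, 0, []
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])  # at most one of these two
    out.extend(b[j:])  # remainders is nonempty
    return out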
O(n log n) Time
Largest empty interval. Given n time-stamps x1, …, xn at which copies of
a file arrive at a server, what is the largest interval of time when no
copies of the file arrive?
O(n log n) solution. Sort the time-stamps. Scan the sorted list in
order, identifying the maximum gap between successive time-stamps.
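A Python sketch of this (assuming numeric time-stamps and at least two of them; the sort dominates at O(n log n)):

def largest_gap(timestamps):
    ts = sorted(timestamps)                        # O(n log n)
    return max(b - a for a, b in zip(ts, ts[1:]))  # linear scan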
Quadratic Time: O(n2)
Closest pair of points. Given a list of n points in the plane (x1, y1), …,
(xn, yn), find the pair that is closest.
Remark. Ω(n²) seems inevitable, but this is just an illusion. [See Chapter 5.]
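The obvious quadratic brute force, as a Python sketch (the divide-and-conquer algorithm of Chapter 5 does better):

from itertools import combinations
from math import dist, inf

def closest_pair(points):
    # Try all n*(n-1)/2 pairs of points: O(n^2) distance computations.
    best, best_pair = inf, None
    for p, q in combinations(points, 2):
        d = dist(p, q)
        if d < best:
            best, best_pair = d, (p, q)
    return best_pair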
Cubic Time: O(n³)
Set disjointness. Given n sets S1, …, Sn, is there some pair of these sets that is disjoint?
O(n³) solution. For each pair of sets, determine if they are disjoint.
foreach set Si {
foreach other set Sj {
foreach element p of Si {
determine whether p also belongs to Sj
}
if (no element of Si belongs to Sj)
report that Si and Sj are disjoint
}
}
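A direct Python rendering of the pseudo-code (a sketch):

def report_disjoint_pairs(sets):
    # Pre-convert to Python sets so each membership test is O(1);
    # n^2 pairs, each checked against up to n elements: O(n^3) overall.
    as_sets = [set(s) for s in sets]
    for i, s_i in enumerate(sets):
        for j, s_j in enumerate(as_sets):
            if i == j:
                continue
            if not any(p in s_j for p in s_i):
                print(f"sets {i} and {j} are disjoint")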
Polynomial Time: O(n^k)
Independent set of size k. Given a graph, are there k nodes such that
no two are joined by an edge? (k is a constant.)
Check whether S is an independent set: O(k²) time.
Number of k-element subsets: C(n, k) ≤ n^k / k!
Total running time: O(k² n^k / k!) = O(n^k).
Poly-time for k = 17,
but not practical.
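A sketch of the enumeration in Python (graph given as a node list and an edge list; the names are ours):

from itertools import combinations

def has_independent_set(nodes, edges, k):
    # C(n, k) = O(n^k) candidate subsets, each checked in O(k^2) pair tests.
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):
        if all(frozenset(pair) not in edge_set
               for pair in combinations(subset, 2)):
            return True
    return False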
Exponential Time
S* ← ∅
foreach subset S of nodes {
   check whether S is an independent set
   if (S is the largest independent set seen so far)
      update S* ← S
}
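The same brute force in Python (a sketch; subsets are encoded as bitmasks):

def max_independent_set(nodes, edges):
    # Examines all 2^n subsets of the nodes: exponential time.
    edge_set = {frozenset(e) for e in edges}
    best = set()
    for mask in range(1 << len(nodes)):
        s = {v for i, v in enumerate(nodes) if mask >> i & 1}
        if (len(s) > len(best)
                and all(frozenset((u, v)) not in edge_set
                        for u in s for v in s if u != v)):
            best = s
    return best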
Sublinear Time
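The slide’s content is not in this extract; the canonical example of sublinear time is binary search, which probes only O(log n) entries of a sorted array. A minimal sketch:

def binary_search(a, target):
    # Halve the search interval on every step: O(log n) probes.
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1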
Recap: factorial, exponential
[Figure: growth-rate chart plotting number of operations against number of inputs; the factorial and exponential curves grow fastest (slowest algorithms).]
Time Versus Space Complexities
Two main characteristics of programs:
• Time complexity: ≈ CPU usage
• Space complexity: ≈ RAM usage
Heap
A binary tree with n nodes and height h is
almost complete iff its nodes correspond to the
nodes numbered 1 to n in the complete binary
tree of height h.
A heap is an almost complete binary tree that
satisfies the heap property:
max-heap: For every node i other than the root:
A[Parent(i)] ≥ A[i]
min-heap: For every node i other than the root:
A[Parent(i)] ≤ A[i]
Max-Heap
A max-heap is an almost complete binary tree that
satisfies the heap property:
For every node i other than the root,
A[PARENT(i)] ≥ A[i]
What does this mean?
the value of a node is at most the value of its
parent
the largest element in the heap is stored in the
root
subtrees rooted at a node contain no values
larger than the value of the node itself
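In the usual array encoding of an almost complete binary tree, the index arithmetic and the max-heap check look like this (a sketch using 0-indexed Python lists, whereas the A[PARENT(i)] notation above is 1-indexed):

def parent(i): return (i - 1) // 2  # 0-indexed encoding
def left(i):   return 2 * i + 1
def right(i):  return 2 * i + 2

def is_max_heap(a):
    # Heap property: every node other than the root is <= its parent.
    return all(a[i] <= a[parent(i)] for i in range(1, len(a)))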
Propose-And-Reject Algorithm (Gale Shapley)
Implementing the Gale Shapley Algorithm
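The slides’ details are not in this extract; a compact Python sketch of propose-and-reject under the usual setup (preference lists over indices 0..n-1; all names are ours):

def gale_shapley(men_prefs, women_prefs):
    # men_prefs[m]: women in m's order of preference; women_prefs[w] likewise.
    n = len(men_prefs)
    rank = [[0] * n for _ in range(n)]          # rank[w][m] = w's ranking of m
    for w in range(n):
        for r, m in enumerate(women_prefs[w]):
            rank[w][m] = r
    next_choice = [0] * n                       # next woman on m's list
    wife, husband = [None] * n, [None] * n
    free = list(range(n))                       # initially all men are free
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]        # m proposes to best remaining w
        next_choice[m] += 1
        if husband[w] is None:                  # w free: engage
            wife[m], husband[w] = w, m
        elif rank[w][m] < rank[w][husband[w]]:  # w trades up to m
            free.append(husband[w])
            wife[husband[w]] = None
            wife[m], husband[w] = w, m
        else:
            free.append(m)                      # w rejects m
    return wife                                 # wife[m] = m's match

With the rank array each proposal is handled in O(1) time, and there are at most n² proposals, giving the O(n²) bound.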
Take-home lesson
Basics of Graphs
Breadth-First Search
Depth-First Search
Testing Bipartiteness
Topological Ordering