Cs141 Home/Lecture4

math for asymptotic analysis

(under construction)

Review of chapter 3.

polynomials:

for example, n^2 + 3*n, or n^{1/2} (square root of n)

For any polynomial p(n), multiplying n by a constant factor changes p(n) by at most a constant factor.
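As a quick sanity check (a sketch added here, not part of the original notes), the example polynomial above behaves this way:

```python
# Sketch: doubling n changes p(n) = n^2 + 3n by at most a constant factor.
def p(n):
    return n**2 + 3*n

for n in [10, 100, 1000, 10**6]:
    # For this degree-2 polynomial, p(2n)/p(n) = (4n^2 + 6n)/(n^2 + 3n) < 4.
    assert p(2*n) / p(n) <= 4
```

In general, for a degree-d polynomial the constant factor is about c^d when n grows by a factor of c.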

If your algorithm has worst-case running time bounded by a polynomial in the size of the input, that is good!

exponentials:

b^k = b*b*b*...*b (k times).
b^i * b^j = b^{i+j}
(b^i)^j = b^{i*j}
b^{-i} = (1/b)^i
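These rules can be checked directly in Python (base b = 2 is chosen so the negative-exponent case is exact in floating point):

```python
b, i, j = 2, 5, 4

assert b**i * b**j == b**(i + j)   # b^i * b^j = b^{i+j}
assert (b**i)**j == b**(i * j)     # (b^i)^j = b^{i*j}
assert b**(-i) == (1/b)**i         # b^{-i} = (1/b)^i (exact for b = 2)
```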

Exponentials with base > 1 (e.g. 2^n) grow large very fast. Exponentials with base < 1 (e.g. 2^{-n}) shrink toward 0 very fast.

For any exponential function f(n) (such as 2^n), if you increase n by a constant factor (say a factor of 2), how much can the function increase by?

If your algorithm takes time exponential in the size of the input, it won't be useful for solving very large problems!
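One way to answer the question above for f(n) = 2^n: doubling n squares the value, since 2^{2n} = (2^n)^2. A small check:

```python
# Doubling the input of 2^n squares the output: 2^(2n) = (2^n)^2.
for n in [4, 10, 20, 40]:
    assert 2**(2*n) == (2**n)**2
```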

logarithms:

log_b(n) is roughly the number of times you have to divide n by b to get it down to 1 or less.
log_2(n) is proportional to log_10(n) --- the base of the logarithm doesn't matter (up to constant factors).
log(a*b) = log(a) + log(b)
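The repeated-division description and the product rule can be sketched in Python (the helper name here is made up for this example):

```python
import math

def divisions_to_one(n, b):
    """Count how many times n must be divided by b to get it down to 1 or less."""
    count = 0
    while n > 1:
        n /= b
        count += 1
    return count

assert divisions_to_one(1024, 2) == 10   # log_2(1024) = 10
assert math.isclose(math.log(6 * 7), math.log(6) + math.log(7))   # log(a*b) = log(a) + log(b)
```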

log(n) grows quite slowly. e.g. log_10(10^100) is only 100.

If you increase n by a constant factor (say a factor of 2) how much does log(n) increase by?
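For log, the answer works out to an additive constant: log(2n) = log(n) + log(2), so in base 2 doubling n adds exactly 1. A quick check:

```python
import math

# Doubling n increases log_2(n) by exactly 1: log_2(2n) = log_2(n) + 1.
for n in [10, 1000, 10**6]:
    assert math.isclose(math.log2(2 * n), math.log2(n) + 1)
```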

summations: sum_{i=a}^{b} f(i)

geometric sums such as sum_{i=0}^{n} 2^i = 1 + 2 + 4 + ... + 2^n are proportional to their largest term
for other sums such as sum_{i=1}^{n} i^2, get upper and lower bounds neglecting constant factors.
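Both claims can be checked numerically for a particular n (a sketch, not a proof):

```python
n = 20

# Geometric sum: 1 + 2 + ... + 2^n = 2^(n+1) - 1, which is less than
# twice its largest term 2^n.
geometric = sum(2**i for i in range(n + 1))
assert geometric == 2**(n + 1) - 1
assert geometric < 2 * 2**n

# Sum of squares: between n^3/3 and n^3, i.e. proportional to n^3
# neglecting constant factors.
squares = sum(i**2 for i in range(1, n + 1))
assert n**3 / 3 <= squares <= n**3
```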

Worst-case analysis of running times of algorithms

1. We try to bound the running time from above by a function of the size of the input.

For example, "to bubblesort an n-item list takes at most time proportional to n^2". Here we take the number of items in the list as a measure of the input size.
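A bubblesort instrumented to count comparisons illustrates the n^2 bound (a sketch; the counting is added for this example):

```python
def bubble_sort(items):
    """Sort a list, returning (sorted list, number of comparisons made)."""
    items = list(items)
    comparisons = 0
    for i in range(len(items)):
        for j in range(len(items) - 1 - i):
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items, comparisons

result, count = bubble_sort([5, 3, 8, 1, 2])
assert result == [1, 2, 3, 5, 8]
assert count == 5 * 4 // 2   # exactly n(n-1)/2 comparisons for n = 5
```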

"Euclid's algorithm takes at most time proportional to log(min(i,j))"? What is the size of the input? (Trick question. Generally, by the size of an input instance, we mean the number of characters it takes to write down the instance. So the size of an integer n, in this sense, is proportional to log(n).) Euclid's algorithm runs in polynomial time.
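Euclid's algorithm with an iteration counter (a sketch). Every two iterations at least halve the smaller argument, which is why the iteration count is proportional to log(min(i, j)):

```python
def euclid_gcd(i, j):
    """Return (gcd(i, j), number of loop iterations)."""
    steps = 0
    while j != 0:
        i, j = j, i % j   # the new smaller argument is i mod j < j
        steps += 1
    return i, steps

g, steps = euclid_gcd(252, 105)
assert g == 21
```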

2. We usually neglect constant factors in the running time. ("the time is at most proportional to..." something.)

3. Running time is proportional to the number of basic operations made by the algorithm. What is a basic operation? We need to think about what the machine has to do to support the operation: basic arithmetic, array access, expression evaluation, etc.


Moore's law: Every few years, computers get faster by a constant factor.

If you get a computer that is faster than your old one by a constant factor, the new one can solve bigger problems in the same amount of time. How much bigger? Depends on the algorithm:

polynomial time --- the solvable problem size grows by a constant factor
exponential time --- the solvable problem size grows only by an additive constant
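In numbers (the speedup factor c = 4 below is a hypothetical chosen for the example): with a machine c times faster and the same time budget,

```python
import math

c = 4      # hypothetical speedup factor of the new machine
n = 1000   # problem size the old machine could handle

# n^2-time algorithm: the new machine does c * n^2 work in the same time,
# so the new solvable size is sqrt(c) * n -- a constant factor bigger.
new_poly_n = math.sqrt(c) * n
assert new_poly_n == 2 * n

# 2^n-time algorithm: solve 2^m = c * 2^n for m, giving m = n + log_2(c)
# -- only an additive constant bigger.
new_exp_n = n + math.log2(c)
assert new_exp_n == n + 2
```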

next week: big-O notation

Edited January 12, 2005 10:42 pm by Neal (diff)