# Cs141 Home/Lecture2

Cs141 Home | recent changes | Preferences

/Lecture1 /Lecture3

Euclid's algorithm for greatest common divisor, upper bound on worst-case running time

###### Euclid's algorithm for greatest common divisor

```python
def gcd3(i, j):
    # assumption: i and j are positive integers
    if i == j: return i
    if i == 1 or j == 1: return 1

    if i % j == 0: return j
    if j % i == 0: return i

    if i < j: return gcd3(i, j % i)
    if i > j: return gcd3(i % j, j)
```
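As a quick sanity check (not part of the original notes), gcd3 can be compared against Python's built-in math.gcd; the function is repeated here so the snippet runs on its own:

```python
import math

def gcd3(i, j):
    # Euclid's algorithm, as defined above.
    if i == j: return i
    if i == 1 or j == 1: return 1
    if i % j == 0: return j
    if j % i == 0: return i
    if i < j: return gcd3(i, j % i)
    return gcd3(i % j, j)

# Compare against math.gcd on all pairs of small positive integers.
for a in range(1, 60):
    for b in range(1, 60):
        assert gcd3(a, b) == math.gcd(a, b), (a, b)
```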

Does the algorithm terminate? Yes: each recursive call replaces the larger argument by a remainder smaller than the smaller argument, so max(i,j) strictly decreases, and it cannot decrease forever.

correctness: Is the algorithm correct? Prove that if i < j, the g.c.d. of i and j equals the g.c.d. of i and j mod i. (Any common divisor of i and j also divides j mod i = j - ⌊j/i⌋·i, and conversely any common divisor of i and j mod i divides j.) The correctness of the algorithm then follows by induction.

running time: Prove that if i < j, then j mod i < j/2. Consequently, the larger argument decreases by at least a factor of two every other recursive call. Thus, the number of recursive calls is at most about 2 log2(j), i.e., proportional to log(j).
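The inequality itself can be checked exhaustively for small values (a sketch, not part of the original notes; the two-case proof is in the comments):

```python
# Claim: if 0 < i < j, then j mod i < j/2.
#   Case i <= j/2:  j mod i <= i - 1 < i <= j/2.
#   Case i >  j/2:  j mod i = j - i < j - j/2 = j/2.
for j in range(2, 500):
    for i in range(1, j):
        assert j % i < j / 2, (i, j)
```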

By the same argument, the number of recursive calls is at most proportional to log(i).

Since each recursive call does only a constant number of operations (outside of the recursion), this means the running time is at most proportional to log(min(i,j)).
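To see this bound empirically, the recursion can be instrumented with a call counter (a sketch, not part of the original notes; gcd3_counted and the slack constant +3 are my own):

```python
import math
import random

def gcd3_counted(i, j, calls=1):
    # Same recursion as gcd3 above, but also returns the number of calls made.
    if i == j: return i, calls
    if i == 1 or j == 1: return 1, calls
    if i % j == 0: return j, calls
    if j % i == 0: return i, calls
    if i < j: return gcd3_counted(i, j % i, calls + 1)
    return gcd3_counted(i % j, j, calls + 1)

random.seed(0)
for _ in range(1000):
    i = random.randint(1, 10**9)
    j = random.randint(1, 10**9)
    _, calls = gcd3_counted(i, j)
    # At most about 2 log2(min(i,j)) recursive calls, plus a little slack.
    assert calls <= 2 * math.log2(min(i, j)) + 3, (i, j, calls)
```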

Note that we have only shown an upper bound on the running time. That's why we say the running time is "at most" proportional to log(min(i,j)). A priori, it's possible that better upper bounds can be proven, but this is a pretty good one. Also, note that we've bounded the worst-case running time, because our bound holds for all inputs.

###### lower bounds
How would you prove that no better upper bound can be shown, e.g., rule out a bound like "the running time is at most proportional to log(log(i))"?

It is not enough to show it for one input! You need to exhibit a sequence of larger and larger inputs on which the algorithm actually takes time proportional to log(min(i,j)).

Consider the Fibonacci sequence f(0), f(1), ..., f(n) defined by the relation

f(0) = f(1) = 1
f(n) = f(n-1) + f(n-2) for integer n > 1

Claim: gcd3(f(n),f(n-1)) takes time proportional to n.
To see why, note that f(n-1) < f(n) and

f(n) mod f(n-1) = (f(n-1) + f(n-2)) mod f(n-1) = f(n-2) mod f(n-1) = f(n-2)

(the last step because f(n-2) < f(n-1)), so gcd3(f(n), f(n-1)) calls gcd3(f(n-2), f(n-1)) (and so on down the line).
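This recursion pattern can be observed directly (a sketch, not part of the original notes; gcd3_counted is my own instrumented copy, and it takes the base values to be f(0) = f(1) = 1 so that the sequence is positive and increasing):

```python
def gcd3_counted(i, j, calls=1):
    # gcd3 from above, instrumented to count its recursive calls.
    if i == j: return i, calls
    if i == 1 or j == 1: return 1, calls
    if i % j == 0: return j, calls
    if j % i == 0: return i, calls
    if i < j: return gcd3_counted(i, j % i, calls + 1)
    return gcd3_counted(i % j, j, calls + 1)

# Fibonacci numbers with base values f(0) = f(1) = 1.
f = [1, 1]
for n in range(2, 31):
    f.append(f[n - 1] + f[n - 2])

for n in range(2, 31):
    g, calls = gcd3_counted(f[n], f[n - 1])
    assert g == 1            # consecutive Fibonacci numbers are coprime
    assert calls == n - 1    # the call count grows linearly in n
```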

Claim: n is proportional to log f(n).
Proved in /Lecture3

Assuming the claim is true, we can conclude that there are arbitrarily large inputs (i,j) on which gcd3(i,j) takes time at least proportional to log(min(i,j)).

Combining this with our upper bound from before, we conclude that the worst case running time of gcd3(i,j) is proportional to log(min(i,j)).

continued... /Lecture3
