# Week 3. Complexity Analysis

#### 4.2 Projects and Exercises

This week, I ask you not to start with the compulsory projects. The Wednesday Group Work section contains two warm-up problems, followed by a discussion exercise that we will take up in the plenary debrief after your group discussion.

You should be able to complete those three problems in the group session, and you may have time to proceed with the compulsory projects.

##### 4.2.1 Wednesday Group Work

Problem 4.1 (Goodrich *et al* R-4.1 rephrased) The number of operations executed by algorithms A and B is $8n\log n$ and $2n^3$, respectively. Determine $n_0$ such that A is better than B for $n\ge n_0$.
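
If it helps to sanity-check your algebra, the two operation counts can be compared numerically. A minimal sketch, assuming $\log$ means $\log_2$ (the class and method names here are made up for illustration):

```java
// Sketch: tabulate the two operation counts and find where A starts winning.
// Assumes log base 2; adjust if your course uses a different convention.
public class CompareAB {
    static double opsA(int n) { return 8.0 * n * (Math.log(n) / Math.log(2)); }
    static double opsB(int n) { return 2.0 * Math.pow(n, 3); }

    // First n >= 2 where A performs strictly fewer operations than B.
    static int firstBetter() {
        for (int n = 2; n <= 1000; n++) {
            if (opsA(n) < opsB(n)) return n;
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println("A is better from n = " + firstBetter());
    }
}
```

Since $2n^3$ grows much faster than $8n\log n$, once A wins it keeps winning, so scanning upward is enough here.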

Problem 4.2 (Goodrich *et al* R-4.7 rephrased) Order the following functions by
asymptotic growth rate:

Problem 4.3 (Goodrich *et al* C-4.43 rephrased) Claim: *In any flock of sheep, all sheep have the same colour.*

An alleged proof goes as follows.

*Base Case:* $n=1$. A single sheep clearly has the same colour as itself.

*Induction Step:* Consider a flock of $n>1$ sheep. Take one sheep $a$ out. The remaining $n-1$ sheep have the same colour by induction. Now, replace $a$ and take a different sheep $b$ out. Again, the remaining $n-1$ sheep have the same colour by induction. Hence, all the sheep in the flock have the same colour.

In reality, we know that a flock with black and white sheep has been observed, so what is wrong with the argument?

##### 4.2.2 Compulsory Assignment

Problem 4.4 (Goodrich *et al* C-4.49 rephrased) Let $p(x)$ be a polynomial, i.e.

$$p(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n.$$

1. Describe an $O(n^2)$-time algorithm to compute $p(x)$.
2. Improve the algorithm to $O(n\log n)$ time by improving the calculation of $x^i$.
3. Now consider rewriting as
   $$p(x)=a_0+x\left(a_1+x\left(a_2+x\left(a_3+\cdots +x\left(a_{n-1}+x a_n\right)\cdots\right)\right)\right).$$
   How many arithmetic operations do you need to calculate this (in Big-O notation)?
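
For orientation, here is a sketch of the straightforward evaluation from part 1, alongside a direct transcription of the nested form from part 3 (method names are hypothetical; coefficients are assumed low-order first, i.e. `a[i]` is $a_i$). Counting the operations each one performs is left to you:

```java
// Two ways to evaluate p(x) = a0 + a1*x + ... + an*x^n.
public class PolyEval {
    // Part 1: recompute x^i from scratch for every term (i multiplications each).
    static double evalNaive(double[] a, double x) {
        double total = 0.0;
        for (int i = 0; i < a.length; i++) {
            double power = 1.0;
            for (int j = 0; j < i; j++) power *= x;
            total += a[i] * power;
        }
        return total;
    }

    // Part 3: the nested form, evaluated from the innermost parenthesis outward.
    static double evalNested(double[] a, double x) {
        double result = 0.0;
        for (int i = a.length - 1; i >= 0; i--) {
            result = result * x + a[i];
        }
        return result;
    }

    public static void main(String[] args) {
        double[] a = {1, 2, 3};  // p(x) = 1 + 2x + 3x^2
        System.out.println(evalNaive(a, 2.0));   // p(2) = 1 + 4 + 12
        System.out.println(evalNested(a, 2.0));  // same value, fewer operations
    }
}
```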

Problem 4.5 (Goodrich *et al* R-4.31 rephrased) Al and Bob are arguing about their algorithms. Al claims his $O(n\log n)$-time algorithm is *always* faster than Bob's $O(n^2)$-time algorithm. To settle the issue, they run a set of experiments. To Al's dismay, they find that if $n<100$, the $O(n^2)$-time algorithm runs faster, and only when $n\ge 100$ is the $O(n\log n)$-time one better. Explain how this is possible.
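
One way to see what Al and Bob's experiments might look like: attach concrete (entirely hypothetical) cost functions to the two growth rates and watch where the curves cross. A sketch:

```java
// Illustrative only: hypothetical cost functions with concrete constants.
public class Crossover {
    // Same asymptotic shapes as in the problem; the constants are invented.
    static double costNLogN(int n) { return 15.0 * n * (Math.log(n) / Math.log(2)); }
    static double costNSquared(int n) { return (double) n * n; }

    // Smallest n where the n log n curve dips below the n^2 curve.
    static int crossover() {
        for (int n = 2; n <= 100_000; n++) {
            if (costNLogN(n) < costNSquared(n)) return n;
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println("n log n wins from n = " + crossover());
    }
}
```

With these particular constants the crossover lands near $n = 100$, mirroring the experiments in the problem; think about what that says about the claim "always faster".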

##### 4.2.3 Practice Problems

Problem 4.6 (Goodrich *et al* C-4.35 rephrased) Show that

Problem 4.7 (Goodrich *et al* R-4.28 rephrased) Given an $n$-element array $X$. Algorithm A chooses $\log n$ elements from $X$ at random and executes an $O(n)$-time calculation for each. What is the worst-case running time of A?

Problem 4.8 (Goodrich *et al* R-4.30 rephrased) Given an $n$-element array $X$. Algorithm D calls Algorithm E on each element $X_i$. Algorithm E runs in $O(i)$ time when called on $X_i$. What is the worst-case running time of D?

Problem 4.9 (Goodrich *et al* P-4.55 rephrased) Make an experimental analysis to test the hypothesis that Java's `Arrays.sort()` method runs in $O(n\log n)$ time on average.
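
A minimal harness for such an experiment might look like the sketch below. This is one possible setup only: the fixed seed, the choice of sizes, and timing a single run per size are all simplifications you would want to improve (warm-up runs, repetitions, and averaging, for instance).

```java
import java.util.Arrays;
import java.util.Random;

// Sketch: time Arrays.sort() on random int arrays of growing size.
// If the O(n log n) hypothesis holds, time / (n log n) should stay roughly flat.
public class SortTiming {
    static long timeSort(int n, Random rnd) {
        int[] a = rnd.ints(n).toArray();      // n random ints
        long start = System.nanoTime();
        Arrays.sort(a);
        return System.nanoTime() - start;     // elapsed time in nanoseconds
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);          // fixed seed for reproducibility
        for (int n = 1 << 10; n <= 1 << 20; n <<= 2) {
            long t = timeSort(n, rnd);
            double normalized = t / (n * (Math.log(n) / Math.log(2)));
            System.out.printf("n = %8d  time = %10d ns  t/(n log n) = %.2f%n",
                              n, t, normalized);
        }
    }
}
```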

- Proof techniques: C-4.$x$