Suppose that you have to choose between three algorithms A1, A2, and A3 for a given problem. The worst-case running time of each algorithm is given by:
$W_1(n) = 100n$;
$W_2(n) = 2n\log_{10} n$;
$W_3(n) = 0.1n^2$.
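To get some intuition I tried evaluating the three formulas at a few sample sizes in Python (the sample sizes and the abstract time unit are my own choice, not part of the problem):

```python
import math

# Worst-case running times from the problem statement
def W1(n): return 100 * n
def W2(n): return 2 * n * math.log10(n)
def W3(n): return 0.1 * n ** 2

# Tabulate the three functions at a few sample sizes (my own choices)
for n in [10, 10**3, 10**6, 10**9]:
    print(f"n = {n:>10}: W1 = {W1(n):10.3g}  W2 = {W2(n):10.3g}  W3 = {W3(n):10.3g}")
```

From this it looks like the ordering of the three changes as $n$ grows (A3 is fastest for small $n$, A2 for large $n$), but I'm not sure how to argue it asymptotically.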
(a) Order the algorithms from fastest to slowest in the sense of asymptotic worst-case running time.
(b) Suppose that we run each algorithm on a large problem instance of size n. Then we feed in an input of size $100n$ and re-run the algorithms on the new input. What would you expect to happen to the running times, and why?
(c) Suppose that hardware limitations mean that your algorithm can only accept input of size at most $10^6$. Which algorithm would you prefer, and why? Answer the same question with $10^6$ replaced by $10^3$.
(d) For which problem sizes is A1 the best algorithm to use? Answer the same question for A2 and then for A3.
(e) Suppose that we have $10^9$ time units available. What is the maximum input size that can be processed by each of the algorithms in that time?
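For part (e) I tried to sanity-check my algebra numerically with a binary search over $n$ (the helper `max_n_within` and its search bound `10**12` are my own, not from the problem):

```python
import math

# Same running-time functions as above
def W1(n): return 100 * n
def W2(n): return 2 * n * math.log10(n)
def W3(n): return 0.1 * n ** 2

def max_n_within(W, budget, hi=10**12):
    """Largest n with W(n) <= budget, found by binary search.
    All three W's are increasing for n >= 1, so this is valid;
    the upper bound hi = 10**12 is my own guess, not from the problem."""
    lo = 1
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if W(mid) <= budget:
            lo = mid          # mid still fits in the budget
        else:
            hi = mid - 1      # mid is too slow, shrink the range
    return lo

budget = 10**9  # the 10^9 time units from part (e)
for name, W in [("A1", W1), ("A2", W2), ("A3", W3)]:
    print(f"{name}: largest n with W(n) <= 1e9 is {max_n_within(W, budget):,}")
```

If my algebra is right, this should give $n = 10^7$ for A1 (from $100n = 10^9$) and $n = 10^5$ for A3 (from $0.1n^2 = 10^9$), but I don't see how to invert $2n\log_{10} n$ exactly for A2.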
Can someone please help me with these questions? I am struggling.