Summations while analyzing algorithms.
I'm learning about the analysis of algorithms, and I'm encountering some steps in an explanation that I can't quite follow (in this context, log denotes the base-2 logarithm).
The step from the second line to the third doesn't quite click, but it goes on:
The above step is a bit hazy too...
Which is a worst-case running time of
Basically, I'm lacking insight into a lot of the simplifications/reductions/steps taken to arrive at the final result. I would very much appreciate some clarification on the steps taken, so I can apply this when I need to analyze algorithms myself :)
Re: Summations while analyzing algorithms.
Remember that sum_(i=0)^(r) a = a(r+1). Many of the steps are simplified by this fact. In general, the sum of a term that doesn't depend on the summation index is just that term times the number of iterations, which is the difference of the bounds plus one (here, r - 0 + 1 = r + 1). By the way, are all of these steps really "less than or equal to"? Is it necessary to use "less than or equal to" in every step, or are some of them exact equalities?
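A quick way to convince yourself of the identity is to check it numerically for a few values. A minimal Python sketch (the helper name `constant_sum` is just for illustration):

```python
def constant_sum(a, r):
    # Sum a term that does not depend on the index i:
    # the loop runs for i = 0, 1, ..., r, i.e. r + 1 times.
    return sum(a for i in range(r + 1))

# The identity sum_{i=0}^{r} a = a * (r + 1) holds for each sample.
for a, r in [(3, 0), (3, 5), (7, 10)]:
    assert constant_sum(a, r) == a * (r + 1)
```

The same counting argument is what collapses the inner sums in your derivation: once the summand no longer mentions the index, the sum is just "summand times number of terms".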