Saturday, November 1, 2014

Week 8: Meet [O and Ω]

First of all, let's review the definition of Big-O. The precise definition is "the set of functions that are eventually no more than f, to within a constant factor." If g belongs to O(f), it means g grows no faster than f (or equivalently, f is an upper bound for g).

When we compare g and f, we need to remember these three points:
1. count the number of steps;
2. constant factors don't matter;
3. only the highest-order term matters.

Larry gave us an example that was very easy to understand: "chicken size" is in O("turkey size") means that a chicken grows no faster than a turkey, in the sense that after a certain breakpoint a chicken will always be smaller than a turkey.

Some functions that are in O(n^2): n^2, 2n^2 + 4n + 1.99999, n^2/3.4890 + 1999n.
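For example, to check that 2n^2 + 4n + 1.99999 is in O(n^2), one choice of constants that works (there are many) is c = 8 with breakpoint B = 1, since for every n ≥ 1:

\[
2n^2 + 4n + 1.99999 \;\le\; 2n^2 + 4n^2 + 1.99999\,n^2 \;=\; 7.99999\,n^2 \;\le\; 8n^2 .
\]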


Ω is the opposite of O: if g belongs to Ω(f), it means g grows no slower than f (or equivalently, f is a lower bound for g).
Some functions that are in Ω(n^2): 0.000001n^3, n^3 - 0.000001n.
Here are the formal definitions of O and Ω.
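The usual way to write them, for functions from the naturals to the non-negative reals, is:

\[
O(f) = \{\, g \mid \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow g(n) \le c \cdot f(n) \,\}
\]

\[
\Omega(f) = \{\, g \mid \exists c \in \mathbb{R}^{+},\ \exists B \in \mathbb{N},\ \forall n \in \mathbb{N},\ n \ge B \Rightarrow g(n) \ge c \cdot f(n) \,\}
\]

The only difference is the direction of the inequality. The constant c is what makes constant factors not matter, and the breakpoint B is what "eventually" means.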

Then, let's analyse two algorithms: Insertion sort and Maximum slice.

We think about i, which has n - 1 iterations; then we think about j, which has at most i iterations in the worst case. Don't forget to add the loop guard for each loop: the loop guard means the first line of the loop (its condition) gets checked one extra time at the end, when the loop exits.
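To make this concrete, here is a minimal Python sketch of insertion sort (my own version, not necessarily the exact code from lecture), with comments marking the steps we count:

def insertion_sort(A):
    """Sort the list A in place and return it."""
    for i in range(1, len(A)):               # outer loop: n - 1 iterations, plus one final guard check
        j = i
        # inner loop: at most i iterations in the worst case, plus its own guard check
        while j > 0 and A[j - 1] > A[j]:
            A[j - 1], A[j] = A[j], A[j - 1]  # swap the adjacent pair that is out of order
            j -= 1
    return A

On a reversed list such as [5, 4, 3, 2, 1] the inner loop really does run i times for every i, so the count includes roughly 1 + 2 + ... + (n - 1) inner iterations plus all the guards, and the highest-order term is n^2.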

After comparing these two pictures, we understand that it is OK to over-estimate the number of steps when proving an upper bound, and to under-estimate the number of steps when proving a lower bound.
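For the second algorithm, maximum slice, here is a sketch of one common brute-force version (my own Python, assuming the problem is "find the largest sum over all contiguous slices"):

def max_slice(L):
    """Return the largest sum over all contiguous slices of L (the empty slice counts as 0)."""
    best = 0
    for i in range(len(L)):            # i picks where the slice starts: n iterations
        total = 0
        for j in range(i, len(L)):     # j extends the slice one element at a time: n - i iterations
            total += L[j]
            if total > best:
                best = total
    return best

It shows both directions of the estimate: for an upper bound we can over-count by pretending the inner loop always runs n times, giving at most about n^2 iterations; for a lower bound we can under-count by only looking at the iterations with i < n/2 and j >= n/2, which already gives at least about n^2/4 iterations. Both are proportional to n^2.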