Sunday, November 30, 2014

Week 12 [Computability] and [Countability]

In week 10's lecture we talked about halt. Here is the proof that nobody can write halt(f, n): we assume we could write it and derive a contradiction.
def clever(f):
    # assume halt(f, n) returns True iff the call f(n) halts
    while halt(f, f):
        pass
    return 42
Now consider two cases. Case 1: assume clever(clever) halts; then halt(clever, clever) is True, so clever enters an infinite loop, so clever(clever) does not halt. Case 2: assume clever(clever) does not halt; then halt(clever, clever) is False, so clever returns 42, so clever(clever) halts. We get a contradiction in both cases, so we cannot write halt(f, n). We conclude halt(f, n) is well-defined but non-computable. (1. A function f is well-defined iff we can tell what f(x) is for every x in some domain. 2. A function f is computable iff it is well-defined and we can tell how to compute f(x) for every x in the domain.)

How do we know whether a function is computable or not? We can use reductions. If a function f(n) can be implemented by extending another function g(n), then we say f reduces to g. If g is computable then f is computable; equivalently, if f is non-computable then g is non-computable.

Back to the halt example: a computable initialized would allow a computable halt, which we just showed is impossible, so initialized must be non-computable too. All we need to show here is that halt(f, n) reduces to initialized(f, v). In other words, implement halt(f, n) using initialized(f, v).
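Here is roughly how that reduction can be sketched (a Python-style sketch of my own, not runnable code: initialized is exactly the function we are assuming exists, and the helper name wrapper is mine). Suppose initialized(f, v) decides whether variable v is guaranteed to be initialized before use when f runs:

```
def halt(f, n):
    def wrapper(x):
        f(n)       # loops forever if f(n) does not halt
        print(v)   # v used uninitialized -- reached only if f(n) halts
    # v can possibly be used before initialization iff f(n) halts,
    # so deciding initialized would decide halt
    return not initialized(wrapper, 'v')
```

If initialized were computable, this halt would be computable too, contradicting the proof above.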



My friend Udara mentioned two good videos in his post about halt http://udaracsc165.blogspot.ca/2014/11/omega-theta-halt.html



In week 12's lecture, we used mappings f: X --> Y to explain well-defined (each x maps to a unique y), 1-1 (each y is mapped to by at most one x), and onto (every y has some x that maps to it).
We also talked about countability: any set A whose size is smaller than or equal to the size of N is countable. (Z is countable, Q is countable, R is uncountable.) Remember we can only program countably infinitely many Python functions; there are uncountably many functions, so there are functions we cannot program in Python.
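The fact that Z is countable can be made concrete with an explicit 1-1, onto mapping from N to Z (my own illustration, not from lecture):

```python
def nat_to_int(n):
    """Map N onto Z: 0, 1, 2, 3, 4, ... -> 0, 1, -1, 2, -2, ...

    Odd naturals go to the positives, even naturals to the
    non-positives, so every integer is hit exactly once
    (the mapping is well-defined, 1-1, and onto).
    """
    return (n + 1) // 2 if n % 2 == 1 else -(n // 2)

print([nat_to_int(n) for n in range(7)])  # [0, 1, -1, 2, -2, 3, -3]
```

Listing Z in this zig-zag order is exactly what "size of Z <= size of N" means.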


Sunday, November 16, 2014

Week 10 [Big-Ω Proof]

Let's review the definition of Ω(n^2): a function f(n) is in Ω(n^2) iff there exist c in R+ and B in N such that for all n in N, n >= B implies f(n) >= c(n^2). This gives a lower bound. From the definition we can see that for a big-Omega proof, the most important part is again picking c and B.

The first example we talked about in class is to prove n^2+n ∈ Ω(15n^2+3). There are two tips: tip 1: pick B = 1 and assume n >= 1; tip 2: pick a c small enough to make the right side a lower bound. There are some important tricks.
Under-estimation tricks:
--> remove a positive term (3n^2+2n >= 3n^2);
--> multiply into a negative term (5n^2-n >= 5n^2-n×n = 4n^2).
Over-estimation tricks:
--> remove a negative term (3n^2-2n <= 3n^2);
--> multiply into a positive term (5n^2+n <= 5n^2+n×n = 6n^2).
For a big-O proof, we over-estimate f(n) and under-estimate g(n); for a big-Ω proof, we under-estimate f(n) and over-estimate g(n).
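Following those tricks, one workable choice for this example (my own numbers, not necessarily the ones from lecture) is c = 1/18 and B = 1: under-estimate n^2+n >= n^2, over-estimate 15n^2+3 <= 15n^2+3n^2 = 18n^2 for n >= 1, so n^2+n >= (1/18)(15n^2+3). A quick Python sanity check:

```python
# Check n^2 + n >= c * (15n^2 + 3) with c = 1/18 and B = 1
c, B = 1 / 18, 1
ok = all(n**2 + n >= c * (15 * n**2 + 3) for n in range(B, 1001))
print(ok)  # True
```

Of course a finite check is not a proof, but it is a cheap way to catch a bad choice of c or B before writing the proof.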

Then we proved some general statements about big-O. We introduce the set F (the set of all functions that take a natural number as input and return a non-negative real number) in this kind of proof.
Let's talk about an example: prove for all f, g in F, f in O(g) => g in Ω(f). Intuition: if f grows no faster than g, then g grows no slower than f. Assume f in O(g): n >= B => f <= cg. We want g in Ω(f): n >= B' => g >= c'f. We need to pick the right B' and c', so we try to find the relationship between B and B', and between c and c'.
Since f <= cg, we have g >= (1/c)f. So we can pick c' = 1/c and B' = B.
Here is the proof: assume f ∈ O(g), so there exist c ∈ R+ and B ∈ N such that for all n ∈ N, n >= B implies f(n) <= c·g(n). Pick c' = 1/c and B' = B. Then for all n >= B', g(n) >= (1/c)·f(n) = c'·f(n), so g ∈ Ω(f).

There is a very tricky proof we talked about in class, which is a disproof: we prove the negation of the statement. The tricky part is to pick n wisely.

Thursday, November 6, 2014

Week 9 [Big-O Proof]

Let's review the definition of O(n^2): a function f(n) is in O(n^2) iff there exist c in R+ and B in N such that for all n in N, n >= B implies f(n) <= c(n^2). From the definition we can see that for a big-O proof, the most important part is picking c and B. So here comes the question: how do we pick c and B?

The first example we talked about in class is to prove 3n^2+2n ∈ O(n^2). There are two tips: tip 1: c should probably be larger than 3 (the constant factor of the highest-order term); tip 2: see what happens when n = 1 => 3n^2+2n = 3+2 = 5 = 5n^2, so picking c = 5 with B = 1 is probably good. What happens when we add a constant? For example, to prove 3n^2+2n+5 ∈ O(n^2): same thing. Tip 1: c should be larger than 3; tip 2: see what happens when n = 1 => 3n^2+2n+5 = 3+2+5 = 10 = 10n^2, so picking c = 10 with B = 1 is probably good.
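Those guesses for c and B are easy to sanity-check numerically before writing the proof (a quick script of my own; a finite check is not a proof, but it catches bad picks):

```python
# Check 3n^2 + 2n <= c * n^2 with c = 5 and B = 1:
# for n >= 1 we have 2n <= 2n^2, so 3n^2 + 2n <= 5n^2
c, B = 5, 1
ok = all(3 * n**2 + 2 * n <= c * n**2 for n in range(B, 1001))
print(ok)  # True
```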

After that we talked about a more complex example: to prove 7n^6 - 5n^4 + 2n^3 ∈ O(6n^8 - 4n^5 + n^2). The thinking process is:
--> assume n >= 1 (pick B = 1);
--> upper-bound the left side by over-estimating: 7n^6 - 5n^4 + 2n^3 <= 7n^6 + 2n^3 <= 7n^6 + 2n^6 = 9n^6;
--> lower-bound the right side by under-estimating: 6n^8 - 4n^5 + n^2 >= 6n^8 - 4n^5 >= 6n^8 - 4n^5·n^3 = 2n^8;
--> choose a c that connects the two bounds: 9n^6 <= 9/2 · (2n^8), so c = 9/2 works.
The proof structure looks like:
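The resulting choice c = 9/2, B = 1 can be sanity-checked numerically (a quick script of my own, not part of the lecture):

```python
# Check 7n^6 - 5n^4 + 2n^3 <= c * (6n^8 - 4n^5 + n^2) with c = 9/2, B = 1
c, B = 9 / 2, 1
f = lambda n: 7 * n**6 - 5 * n**4 + 2 * n**3
g = lambda n: 6 * n**8 - 4 * n**5 + n**2
ok = all(f(n) <= c * g(n) for n in range(B, 1001))
print(ok)  # True
```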

Larry also talked about an example which is to prove n^3 ∉ O(3n^2). The process is to negate the definition first and then prove the negation: for all c in R+ and all B in N, there exists n in N such that n >= B and n^3 > c(3n^2).

The key is to figure out c and B. In general, a larger c or a larger B is safe. These two examples are polynomials. How about non-polynomials? We can use limits to prove it. The intuition is that if the ratio f(n)/g(n) approaches infinity as n grows larger and larger, then f(n) grows faster than g(n).
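You can watch this intuition numerically for a non-polynomial pair such as f(n) = 2^n and g(n) = n^2 (an example pair of my own choosing, not necessarily the one from lecture):

```python
# The ratio 2^n / n^2 keeps blowing up as n grows,
# suggesting 2^n grows faster than n^2 (so 2^n is not in O(n^2))
ratios = [2**n / n**2 for n in (10, 20, 30)]
print(ratios)
```

Each ratio is orders of magnitude larger than the previous one; the actual proof uses the limit, but this is the picture behind it.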

Saturday, November 1, 2014

Week 8 Meet [O and Ω]

First of all, let's review the definition of big-O. The precise definition of O(f) is "the set of functions that are eventually no more than f, to within a constant factor." If g belongs to O(f), it means g grows no faster than f (or equivalently, f is an upper bound for g).

When we compare g and f, we need to remember these three points:
1. count the number of steps;
2. constant factors don't matter;
3. only the highest-order term matters.

Larry told us an example which was very easy to understand: "chicken size" is in O("turkey size") means a chicken grows slower than a turkey in the sense that, after a certain breakpoint, a chicken will always be smaller than a turkey.

Functions that are in O(n^2): n^2, 2n^2+4n+1.99999, n^2/3.4890 + 1999n.


Ω is the opposite of O. If g belongs to Ω(f), it means g grows no slower than f (or equivalently, f is a lower bound for g).
Here are the formal definitions of O and Ω: g ∈ O(f) iff there exist c ∈ R+ and B ∈ N such that for all n ∈ N, n >= B implies g(n) <= c·f(n); and g ∈ Ω(f) iff there exist c' ∈ R+ and B' ∈ N such that for all n ∈ N, n >= B' implies g(n) >= c'·f(n).
Functions that are in Ω(n^2): 0.000001n^3, n^3 - 0.000001n.

Then, let's analyse two algorithms: Insertion sort and Maximum slice.

For insertion sort, we think about the outer loop on i, which has n-1 iterations; then think about the inner loop on j, which has up to i iterations in the worst case. Don't forget to add a step for the loop guard of each loop: the loop guard is the loop-condition check at the top of the loop, which is evaluated one extra time (and fails) at the end.
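One way to see these counts concretely is to instrument insertion sort with a step counter (a sketch of my own; here a "step" counts each successful inner-loop check plus the final failing loop guard):

```python
def insertion_sort_steps(data):
    """Insertion sort that also counts inner-loop steps."""
    A = list(data)
    steps = 0
    for i in range(1, len(A)):        # outer loop: n - 1 iterations
        j = i
        while j > 0 and A[j - 1] > A[j]:  # inner loop: up to i iterations
            A[j - 1], A[j] = A[j], A[j - 1]
            j -= 1
            steps += 1                # one step per swap/comparison
        steps += 1                    # the failing loop guard also counts
    return A, steps

print(insertion_sort_steps([5, 4, 3, 2, 1]))  # ([1, 2, 3, 4, 5], 14)
```

On a reverse-sorted list of length n (the worst case) this counts sum of (i + 1) for i = 1 to n-1, which is (n-1)(n+2)/2 steps, quadratic as expected.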




After comparing the two step counts (one over-estimated, one under-estimated), we understand that it is OK to over-estimate the number of steps when proving an upper bound and to under-estimate the number of steps when proving a lower bound.