How to minimize cost

Hypothesis and Cost

Simplified hypothesis

H(x) = Wx  (b = 0)

What does cost(W) look like?

 * W=1, cost(W) = 0

 * W=0, cost(W) = 4.67

 * W=2, cost(W) = 4.67

Let's plot cost(W) as a function of W.

The goal is to find the W at the bottom (center) of the curve!
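The cost values listed above can be reproduced with a short sketch, assuming the lecture's training data x = [1, 2, 3], y = [1, 2, 3]:

```python
# cost(W) = (1/m) * sum((W*x_i - y_i)^2) for the simplified hypothesis H(x) = W*x.
# The training data below is the lecture's example set (an assumption here).
x_data = [1, 2, 3]
y_data = [1, 2, 3]

def cost(W):
    m = len(x_data)
    return sum((W * x - y) ** 2 for x, y in zip(x_data, y_data)) / m

for W in [0, 1, 2]:
    print(f"W={W}, cost(W)={cost(W):.2f}")
# W=0, cost(W)=4.67
# W=1, cost(W)=0.00
# W=2, cost(W)=4.67
```

Sampling more W values and plotting cost(W) gives the bowl-shaped curve the slide refers to.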

Gradient descent algorithm

(gradient = slope, descent = going down)

* Minimize cost function

* Gradient descent is used in many minimization problems

* For a given cost function, cost(W, b), it will find the W, b that minimize cost

* It can be applied to more general functions: cost(w1, w2, ...)

 Several parameters are also possible.
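Since gradient descent applies to any cost(w1, w2, ...), one generic sketch uses numerical (finite-difference) gradients; the quadratic cost function below is an illustrative assumption, not from the slides:

```python
# Generic gradient descent over several parameters using numerical gradients.
# The cost function is an assumed example with its minimum at w1 = 1, w2 = 2.
def f(w1, w2):
    return (w1 - 1) ** 2 + (w2 - 2) ** 2

def numeric_grad(f, w1, w2, h=1e-6):
    # Central finite differences approximate the partial derivatives.
    g1 = (f(w1 + h, w2) - f(w1 - h, w2)) / (2 * h)
    g2 = (f(w1, w2 + h) - f(w1, w2 - h)) / (2 * h)
    return g1, g2

w1, w2 = 0.0, 0.0   # start at (0, 0), as the slide suggests
alpha = 0.1         # learning rate (an assumed value)
for _ in range(100):
    g1, g2 = numeric_grad(f, w1, w2)
    w1 -= alpha * g1   # step opposite the gradient
    w2 -= alpha * g2
```

After the loop, (w1, w2) has converged close to the minimum at (1, 2).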

How does it work?

How would you find the lowest point?

Follow the gradient downhill; eventually you converge to a point where the gradient = 0.

How does it work?

* Start with initial guesses

- Start at 0, 0 (or any other value)

- Keep changing W and b a little bit to try to reduce cost(W, b)

* Each time you change the parameters, you select the gradient which reduces cost(W, b) the most

* Use differentiation (derivatives) for this

* Repeat

* Do so until you converge to a local minimum
* Has an interesting property
- Where you start can determine which minimum you end up in

Dividing by m or by 2m makes no difference to the minimization.
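The remark above about m versus 2m refers to the standard convention of writing the cost with 1/(2m) so the 2 from the square cancels in the derivative; minimizing either form yields the same W. A standard writing of the update rule (the slides' exact formula is not reproduced here):

```latex
% Cost written with 1/(2m) so the derivative has no stray factor of 2:
cost(W) = \frac{1}{2m}\sum_{i=1}^{m}\left(Wx_i - y_i\right)^2

\frac{\partial\, cost(W)}{\partial W} = \frac{1}{m}\sum_{i=1}^{m}\left(Wx_i - y_i\right)x_i

% Gradient descent update with learning rate \alpha:
W := W - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left(Wx_i - y_i\right)x_i
```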

Compute the slope at the current W; if the slope is negative, increase W, and vice versa (i.e., always step opposite the sign of the slope).
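Putting the steps together for the simplified hypothesis H(x) = Wx, a minimal sketch (the learning rate, step count, and starting W are assumptions):

```python
# Gradient descent for H(x) = W*x on the lecture's example data x = [1,2,3], y = [1,2,3].
# Analytic gradient from the 1/(2m) form of the cost:
#     d cost / dW = (1/m) * sum((W*x_i - y_i) * x_i)
x_data = [1, 2, 3]
y_data = [1, 2, 3]
W = 5.0       # deliberately bad initial guess
alpha = 0.1   # learning rate (an assumed value)
m = len(x_data)

for step in range(100):
    grad = sum((W * x - y) * x for x, y in zip(x_data, y_data)) / m
    W -= alpha * grad   # slope positive -> W decreases; slope negative -> W increases
```

After the loop, W has converged to 1, the minimizer found earlier on the cost(W) curve.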

Gradient descent algorithm

Convex function

In a case like this there is a possibility that it does not converge to the minimum (not a convex function).

To apply gradient descent to linear regression, you must first check that the cost function is a convex function.
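For the one-parameter case, a quick numeric sanity check (a heuristic sketch, not a proof) is that the second difference of cost(W) is positive at every sampled W; for H(x) = Wx with squared error, cost(W) is a parabola in W, so this always holds:

```python
# Heuristic convexity check: a convex function has non-negative second differences
#     cost(W + h) - 2*cost(W) + cost(W - h) >= 0
# for all sampled W. Data and sampling grid below are assumptions.
x_data = [1, 2, 3]
y_data = [1, 2, 3]

def cost(W):
    return sum((W * x - y) ** 2 for x, y in zip(x_data, y_data)) / len(x_data)

h = 0.5
samples = [-2 + 0.25 * i for i in range(17)]   # W in [-2, 2]
second_diffs = [cost(W + h) - 2 * cost(W) + cost(W - h) for W in samples]
is_convex = all(d > 0 for d in second_diffs)   # True: the cost bowl is convex
```

Because the bowl is convex, gradient descent reaches the same minimum regardless of the starting W.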
