# Advanced Optimization

- optimize for faster convergence than plain gradient descent

- scale efficiently to problems with a large number of features

These advanced algorithms minimize the cost function given only code that computes the cost J(θ) and its derivative (gradient) terms.
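As a sketch of what such a routine looks like (written here in plain Python rather than Octave, with a hypothetical example cost J(θ) = (θ₁ − 5)² + (θ₂ − 5)², whose minimum is at θ = (5, 5)):

```python
def cost_function(theta):
    # Hypothetical example cost: J(theta) = (theta_1 - 5)^2 + (theta_2 - 5)^2
    j_val = (theta[0] - 5.0) ** 2 + (theta[1] - 5.0) ** 2
    # Derivative terms of J with respect to each component of theta
    gradient = [2.0 * (theta[0] - 5.0), 2.0 * (theta[1] - 5.0)]
    return j_val, gradient
```

The optimizer calls this routine repeatedly; it never needs to know the form of J itself, only its value and gradient at each θ.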

Once we have written code that computes the cost and its gradient, we can write the optimization code that calls it.

Given the options we have set up:

`optimset` creates the set of options for the optimization routine (e.g. turning the gradient objective on and setting a maximum number of iterations)

The exit flag tells us whether the cost function has converged.

initialTheta must be a vector with at least 2 elements.
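Putting the pieces together: in Octave, `fminunc` would take the options, the cost function, and `initialTheta`, and return the optimal θ plus an exit flag. As a minimal stand-in sketch in Python (a plain gradient-descent loop replaces the advanced optimizer; the option names and the example cost are assumptions, not the real `fminunc` API):

```python
def cost_function(theta):
    # Hypothetical example cost: J(theta) = (theta_1 - 5)^2 + (theta_2 - 5)^2
    j_val = (theta[0] - 5.0) ** 2 + (theta[1] - 5.0) ** 2
    gradient = [2.0 * (theta[0] - 5.0), 2.0 * (theta[1] - 5.0)]
    return j_val, gradient

def minimize(cost_fn, initial_theta, options):
    # Stand-in optimizer: gradient descent with a convergence check.
    theta = list(initial_theta)
    exit_flag = 0  # 0 = did not converge within MaxIter
    for _ in range(options["MaxIter"]):
        _, grad = cost_fn(theta)
        if max(abs(g) for g in grad) < options["TolGrad"]:
            exit_flag = 1  # 1 = gradient small enough: converged
            break
        theta = [t - options["LearningRate"] * g for t, g in zip(theta, grad)]
    return theta, exit_flag

# Assumed option names, mimicking the role of optimset
options = {"MaxIter": 1000, "LearningRate": 0.1, "TolGrad": 1e-6}
initial_theta = [0.0, 0.0]  # at least a 2-element vector
opt_theta, exit_flag = minimize(cost_function, initial_theta, options)
```

Here `exit_flag == 1` plays the role of the convergence check, and `opt_theta` ends up near the minimum at (5, 5).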