Regularization applies both to gradient descent on the cost function and to the more advanced optimization methods, which take the cost function and its derivative.
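The regularized gradient-descent update can be sketched as follows. This is a minimal Python sketch using a linear hypothesis and a toy data layout (each feature row starts with the bias term 1); the function name and data are illustrative, not from the notes. The key point is that theta_0 (the bias term) is excluded from the regularization penalty:

```python
def gradient_descent_step(theta, X, y, alpha, lam):
    """One regularized gradient-descent update (linear hypothesis).

    theta: parameter list, X: list of feature rows (first entry 1 for bias),
    y: list of targets, alpha: learning rate, lam: regularization strength.
    theta[0] is the bias term and is excluded from the shrinkage.
    """
    m = len(y)
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        h = sum(t * x for t, x in zip(theta, xi))  # linear hypothesis
        for j, xij in enumerate(xi):
            grad[j] += (h - yi) * xij
    new_theta = []
    for j, t in enumerate(theta):
        reg = (lam / m) * t if j > 0 else 0.0  # no penalty on theta[0]
        new_theta.append(t - alpha * (grad[j] / m + reg))
    return new_theta
```

With lam = 0 this reduces to the ordinary gradient-descent step; a larger lam shrinks every parameter except the bias toward zero on each update.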
An overly complex hypothesis with many high-order polynomial terms is prone to overfitting.
Regularization helps address the overfitting problem.
The regularization for logistic regression is almost identical to the one in linear regression; the only difference is in how the hypothesis is calculated (sigmoid instead of linear).
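A minimal sketch of the regularized logistic-regression cost and gradient, assuming the standard formulas (sigmoid hypothesis, a lambda/(2m) penalty that skips theta_0). Function names and the data layout are my own for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def regularized_cost(theta, X, y, lam):
    """Regularized logistic-regression cost J(theta).

    X: list of feature rows (first entry 1 for the bias term),
    y: list of 0/1 labels, lam: regularization strength lambda.
    theta[0] (the bias term) is NOT regularized.
    """
    m = len(y)
    cost = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        cost += -yi * math.log(h) - (1 - yi) * math.log(1 - h)
    reg = (lam / (2 * m)) * sum(t * t for t in theta[1:])
    return cost / m + reg

def regularized_gradient(theta, X, y, lam):
    """Gradient of the regularized cost above, one entry per theta[j]."""
    m = len(y)
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        for j, xij in enumerate(xi):
            grad[j] += (h - yi) * xij
    for j in range(len(theta)):
        grad[j] /= m
        if j > 0:  # skip the bias term theta[0]
            grad[j] += (lam / m) * theta[j]
    return grad
```

Compared with linear regression, only the hypothesis line changes (sigmoid of the inner product instead of the inner product itself); the regularization terms are identical.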
Now for the more advanced optimization...
Remember that Octave indexes from 1, so theta_0 is theta(1) in Octave, and so on...
What we have learned so far already gives a better understanding of these algorithms than many engineers making tons of money in Silicon Valley have.