Regularized Linear Regression

These notes cover regularizing both the gradient descent and the normal equation algorithms for linear regression.
The goal is to find the theta that minimizes J(theta).
Regularization penalizes all the thetas except theta_0 (theta zero).
The purple box is the regularized derivative of J(theta), and the cyan box is the derivative of J(theta) without regularization.
The two separate update rules, one for theta_0 and one for the remaining thetas, are then combined into the single line shown at the bottom:
theta_j := theta_j * (1 - alpha * lambda / m) - alpha * (1/m) * sum((h(x^(i)) - y^(i)) * x_j^(i))
Since m (the training set size) is usually large, lambda / m is very small, so the factor (1 - alpha * lambda / m) is only slightly less than 1: each update shrinks theta_j a little before the usual gradient step. (The regularization penalty itself is quadratic in theta.)
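The shrink-then-step update above can be sketched in NumPy. This is a minimal illustration, not code from the notes; the helper name and the assumption that X is an (m, n+1) design matrix with a leading column of ones are mine.

```python
import numpy as np

def regularized_gradient_step(theta, X, y, alpha, lam):
    """One regularized gradient descent step for linear regression.

    Illustrative sketch: X is an (m, n+1) design matrix whose first
    column is all ones, theta has shape (n+1,).
    """
    m = X.shape[0]
    error = X @ theta - y              # h(x^(i)) - y^(i) for every example
    grad = (X.T @ error) / m           # unregularized gradient (cyan box)
    reg = (lam / m) * theta            # regularization term (purple box)
    reg[0] = 0.0                       # do not penalize theta_0
    return theta - alpha * (grad + reg)
```

For j >= 1 this is algebraically the same as theta_j * (1 - alpha * lambda / m) - alpha * (1/m) * sum(...), the one-line form at the bottom of the slide.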
Next, the normal equation: the regularized closed-form solution is theta = (X^T X + lambda * M)^(-1) X^T y, where M is the identity matrix with its top-left entry set to 0, so that theta_0 is again not penalized.
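The closed-form version can be sketched the same way. Again this is an illustrative helper of my own naming, not the notes' code; it solves the linear system directly rather than forming an explicit inverse.

```python
import numpy as np

def regularized_normal_equation(X, y, lam):
    """Closed-form regularized linear regression (illustrative sketch).

    Implements theta = (X^T X + lam * M)^(-1) X^T y, where M is the
    identity matrix with its top-left entry zeroed so the intercept
    theta_0 is not penalized.
    """
    n = X.shape[1]
    M = np.eye(n)
    M[0, 0] = 0.0                                  # leave theta_0 unpenalized
    return np.linalg.solve(X.T @ X + lam * M, X.T @ y)
```

A side benefit: for lambda > 0 the matrix X^T X + lambda * M is typically invertible even when X^T X alone is singular (e.g. when m <= n), so regularization also fixes the non-invertibility problem.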