• The SVM seems to do a better job at drawing the decision boundary (marked in black). It chooses the line with the largest margin, which makes it more robust than logistic regression. The optimization problem from the earlier slides leads to this large margin, as will be detailed in the next video (a minimal sketch comparing the two classifiers follows).
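The snippet below is my own illustration, not from the lecture: the synthetic two-cluster data and the scikit-learn LinearSVC / LogisticRegression models are assumptions. It fits both classifiers on separable data and measures the geometric margin each separating line achieves:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression

# Two well-separated clusters, one per class (assumed synthetic data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-2, -2], 0.5, (20, 2)),
               rng.normal([2, 2], 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

for model in (LinearSVC(C=1.0), LogisticRegression()):
    model.fit(X, y)
    w, b = model.coef_[0], model.intercept_[0]
    # Geometric distance from each point to the learned line w.x + b = 0;
    # the smallest of these distances is the margin that line achieves.
    margin = np.min(np.abs(X @ w + b) / np.linalg.norm(w))
    print(type(model).__name__, "margin:", round(margin, 3))
```

On data like this the SVM line typically achieves the larger minimum distance to the training points, which is the robustness the note refers to.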


  • The second example shows how the parameter C affects the behavior of the learning algorithm.
  • C is set to some huge value, for example 100,000.
  • With a huge C, the SVM is very sensitive to outliers. Here the SVM initially does an excellent job, drawing a black-line classifier with a large margin. But then an outlier is added at the bottom left.
  • Remember that C plays the opposite role of lambda (roughly C = 1/lambda). Making C very large causes the SVM to overfit the data: it moves the boundary to the magenta line, which is no longer an ideal classifier.
  • What we want is a C that is not very large; the SVM will then ignore outliers, including ones that make the data non-linearly-separable, such as the outliers drawn among the blue circles.
  • With an ideal C value, the SVM retains the good classifier drawn by the black line even though there are outliers (see the sketch after this list).
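A minimal sketch of the effect of C (again my own example with assumed synthetic data; scikit-learn's SVC with a linear kernel stands in for the classifier in the slides):

```python
import numpy as np
from sklearn.svm import SVC

# Two clusters plus one class-1 outlier placed deep in class-0 territory,
# mimicking the outlier at the bottom left in the slides.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([-2, 0], 0.4, (20, 2)),
               rng.normal([2, 0], 0.4, (20, 2)),
               [[-1.5, -1.0]]])                  # the outlier
y = np.array([0] * 20 + [1] * 20 + [1])

for C in (100_000, 1.0):  # huge C vs. moderate C
    clf = SVC(kernel="linear", C=C).fit(X, y)
    w, b = clf.coef_[0], clf.intercept_[0]
    # A huge C penalizes the outlier's margin violation heavily, pulling
    # the boundary toward it (the magenta line); a moderate C tolerates
    # the violation and stays close to the large-margin black line.
    print(f"C={C}: w={np.round(w, 2)}, b={round(float(b), 2)}")
```

Comparing the printed weights shows how much the boundary moves when C changes, which is the sensitivity described above.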

  • Next, we discuss in more detail how the constrained optimization problem eventually leads to a large margin classifier.