First Kaggle Competition
This tutorial is from:

![](/galleries/Kaggle1st/1.jpg)
Recently, I entered a Kaggle competition for data science. I ranked 342nd out of almost 800 other competitors. Pretty impressive, eh? Here's how I got there.
Boosting, Predecessor
- This is incorrect: as long as each weak learner does at least slightly better than chance, boosting can combine them into a strong learner just fine.
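A quick simulation of that intuition (my own illustration, not from the original notes): if each of many independent classifiers is right only 60% of the time, a simple majority vote over them is right far more often. The accuracy and classifier count below are made-up parameters.

```python
# Assumption for illustration: weak learners are independent, each correct
# with probability 0.6 (slightly better than the 0.5 chance baseline).
import random

random.seed(0)
N_CLASSIFIERS = 101   # odd count, so the majority vote never ties
N_TRIALS = 10_000
P_CORRECT = 0.6

majority_correct = 0
for _ in range(N_TRIALS):
    # Count how many of the weak learners got this trial right.
    votes = sum(random.random() < P_CORRECT for _ in range(N_CLASSIFIERS))
    if votes > N_CLASSIFIERS // 2:
        majority_correct += 1

print(majority_correct / N_TRIALS)  # far above the 0.6 of any single learner
```

Real boosting goes further than this independent-vote picture, since it trains each learner on a reweighted dataset rather than assuming independence.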
Boosting
- So instead of training on uniformly random subsets of the data, boosting focuses each new learner on the "hardest" examples — the ones the current ensemble still gets wrong.
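The reweighting idea can be sketched with a minimal AdaBoost on 1-D decision stumps (my own toy illustration; the dataset and round count are made up). Each round, misclassified examples get their weights increased, so the next stump concentrates on them.

```python
import math

# Toy 1-D dataset: no single threshold separates the +1 and -1 labels.
X = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
y = [1, 1, 1, -1, -1, -1, -1, 1, 1, 1]

def best_stump(weights):
    """Return (threshold, polarity, weighted_error) of the best stump."""
    best = (None, None, float("inf"))
    for thr in X:
        for pol in (1, -1):
            err = sum(w for x, label, w in zip(X, y, weights)
                      if (pol if x < thr else -pol) != label)
            if err < best[2]:
                best = (thr, pol, err)
    return best

weights = [1 / len(X)] * len(X)
ensemble = []  # list of (alpha, threshold, polarity)
for _ in range(3):
    thr, pol, err = best_stump(weights)
    alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))
    ensemble.append((alpha, thr, pol))
    # Re-weight: increase the weight of the "hardest" (misclassified) examples.
    weights = [w * math.exp(-alpha * label * (pol if x < thr else -pol))
               for x, label, w in zip(X, y, weights)]
    total = sum(weights)
    weights = [w / total for w in weights]

def predict(x):
    score = sum(a * (p if x < t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

accuracy = sum(predict(x) == label for x, label in zip(X, y)) / len(X)
print(accuracy)  # 1.0 on this toy set after three rounds
```

No single stump can classify this dataset, but the weighted vote of three can — exactly the weak-learners-into-strong-learner effect described above.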
Instance Based Learning and Others
- This is supervised learning as before: given input data, build a function/model that generalizes and maps inputs to outputs. Instance-based learning, though, skips the explicit model — it stores the training examples themselves and predicts from the examples closest to a new input.
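A minimal k-nearest-neighbors classifier makes the store-and-compare idea concrete (my own sketch; the dataset and `k` value are made up for illustration):

```python
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    # Sort stored examples by squared Euclidean distance to the query.
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), label)
        for x, label in zip(train_X, train_y)
    )
    # Majority vote among the k nearest stored examples.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two small clusters: class "a" near (1, 1), class "b" near (8, 8).
train_X = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
train_y = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(train_X, train_y, (2, 2)))  # "a"
print(knn_predict(train_X, train_y, (9, 9)))  # "b"
```

Note there is no training step at all — all the work happens at query time, which is the defining trade-off of instance-based methods.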
Tools for Neural Networks & Others
- So the way gradient descent works is to take the derivative of the loss function with respect to the parameters, then step the parameters in the direction opposite the gradient.
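The derivative-then-step loop can be shown on a one-parameter example (my own illustration; the loss, learning rate, and step count are arbitrary choices):

```python
# Minimize f(w) = (w - 3)^2, whose true minimum is at w = 3.
def grad(w):
    return 2 * (w - 3)  # derivative of (w - 3)^2

w = 0.0
learning_rate = 0.1
for _ in range(100):
    # Step opposite the gradient: downhill on the loss surface.
    w -= learning_rate * grad(w)

print(w)  # converges to ~3.0
```

Training a neural network is the same loop, just with millions of parameters and the gradient computed by backpropagation.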
Polynomial Regression
- The inverse can't be applied to X directly because X is not a square matrix. That's why we form X^T X, which is square (and invertible when X has full column rank), giving the normal equations w = (X^T X)^-1 X^T y.
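A small worked example of the normal equations (my own sketch; the data is a made-up noiseless quadratic so the recovered coefficients are known): X below is 6-by-3 — tall, not square — while X.T @ X is 3-by-3, which is what makes the inverse computable.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = 2 + 3 * x + 1 * x**2          # known coefficients: [2, 3, 1]

# Polynomial design matrix: columns are 1, x, x^2 (so X is 6x3, not square).
X = np.column_stack([np.ones_like(x), x, x**2])

# Normal equations: w = (X^T X)^-1 X^T y
w = np.linalg.inv(X.T @ X) @ X.T @ y
print(w)  # recovers approximately [2, 3, 1]
```

In practice `np.linalg.solve(X.T @ X, X.T @ y)` or `np.linalg.lstsq` is preferred over forming the inverse explicitly, for numerical stability.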