- last video covered the two halves of collaborative filtering: given features for the movies, learn each user's parameters and predict their ratings; and given the users' parameters, learn features for the movies
- now we try to put it all together
- so instead of doing the two steps sequentially, we solve for both in one go
- earlier, we talked about randomly initializing the users' parameters, using them to estimate features for the movies, then using those features to predict ratings, going back and forth. Now we learn how to do both simultaneously
- if we look at both formulas, they actually minimize the same squared-error term
- both cost functions sum over all (i, j) pairs where movie i was rated by user j, i.e. where r(i,j) = 1. So we can merge them into a single cost function
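The merged squared-error objective can be written out like this (using the course's usual notation, where y^(i,j) is user j's rating of movie i, n_m is the number of movies, and n_u the number of users):

```latex
\min_{x^{(1)},\dots,x^{(n_m)},\;\theta^{(1)},\dots,\theta^{(n_u)}}
\;\frac{1}{2} \sum_{(i,j)\,:\,r(i,j)=1}
\left( (\theta^{(j)})^{T} x^{(i)} - y^{(i,j)} \right)^{2}
```

summing once over every rated (movie, user) pair covers both of the earlier per-movie and per-user sums.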
- next, we combine the regularization terms from both optimization objectives: we just add the regularization term for the thetas and the regularization term for the features x
- finally, the merged cost function is minimized with respect to the thetas and the features simultaneously, instead of earlier when we minimized with respect to the thetas only or the features only
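As a sketch, the merged, regularized cost can be computed in a vectorized way like this (the function name `cofi_cost` and the matrix layout are my own assumptions, not from the lecture):

```python
import numpy as np

def cofi_cost(X, Theta, Y, R, lam):
    """Merged collaborative-filtering cost.

    X:     (num_movies, n)         movie feature vectors
    Theta: (num_users, n)          user parameter vectors
    Y:     (num_movies, num_users) ratings matrix
    R:     (num_movies, num_users) 1 where movie i was rated by user j
    lam:   regularization strength
    """
    # squared error only over rated entries (R masks out the rest)
    err = (X @ Theta.T - Y) * R
    return (0.5 * np.sum(err ** 2)
            + lam / 2 * np.sum(Theta ** 2)   # regularize all thetas
            + lam / 2 * np.sum(X ** 2))      # regularize all features
```

Note there is no special case for a theta0 or x0 term: with the intercept dropped, every component gets regularized.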
- finally, we get rid of x0 = 1, the intercept term
- earlier, we hardcoded x0 = 1 so x would never be all zeros
- now x is more flexible: the algorithm can learn a feature that is always 1 if it wants to. Dropping x0 also keeps the dimensions in sync, so both x and theta are vectors of the same length n
- now for the generalized steps of collaborative filtering
- first, randomly initialize the thetas and the features x to small values, similar to what we did in neural networks (this also breaks symmetry so the learned features differ from each other)
- second, apply gradient descent to minimize our cost function using its partial derivatives. Keep in mind that because x0 is gone, so is theta0, so there is no longer any need for a separate update rule that skips regularization for theta0
- and third, for a user with parameters theta and a movie with learned features x, predict the rating as theta^T x, even for movies the user has never rated...
- so that's how we learn parameters for all the different users and features for all the different movies at once. This is the collaborative filtering algorithm, and it can also predict ratings for movies a user has never seen
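The three steps above can be sketched end to end on a toy ratings matrix (everything here, the data, n = 2 latent features, the learning rate, and the iteration count, is an illustrative assumption, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: 4 movies x 3 users, ratings 1-5; 0 means "not rated" (hypothetical)
Y = np.array([[5, 4, 0],
              [4, 0, 0],
              [0, 1, 5],
              [0, 0, 4]], dtype=float)
R = (Y > 0).astype(float)        # r(i,j) = 1 where a rating exists

n = 2                            # number of latent features (chosen arbitrarily)
X = rng.normal(scale=0.1, size=(4, n))       # step 1: small random init
Theta = rng.normal(scale=0.1, size=(3, n))

alpha, lam = 0.05, 0.1
for _ in range(3000):            # step 2: gradient descent on the joint cost
    err = (X @ Theta.T - Y) * R          # error only on rated entries
    grad_X = err @ Theta + lam * X       # dJ/dX
    grad_Theta = err.T @ X + lam * Theta # dJ/dTheta
    X -= alpha * grad_X
    Theta -= alpha * grad_Theta

# step 3: predict every rating, including unrated movies, as theta^T x
pred = X @ Theta.T
```

After training, `pred` fills in the zero entries of `Y` with estimated ratings; the rated entries should be reproduced closely.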