17/11/03 MWhite's learning notes
1. Classification
Linear regression: y ∈ ℝ
Classification (logistic regression): y ∈ {0, 1, ...}
1.1 Hypothesis Representation
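The hypothesis for logistic regression is the sigmoid (logistic) function applied to the linear term:

```latex
h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}, \qquad 0 \le h_\theta(x) \le 1
% interpretation: h_\theta(x) = P(y = 1 \mid x;\, \theta)
```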
1.2 Logistic Regression Cost Function
Simplified Cost Function
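The per-example cost and its simplified form (which combines the two cases, since y ∈ {0, 1}) are:

```latex
% Per-example cost:
\mathrm{Cost}(h_\theta(x), y) =
\begin{cases}
 -\log h_\theta(x) & \text{if } y = 1\\[2pt]
 -\log\bigl(1 - h_\theta(x)\bigr) & \text{if } y = 0
\end{cases}

% Simplified cost function:
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[\, y^{(i)}\log h_\theta(x^{(i)})
            + \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h_\theta(x^{(i)})\bigr)\Bigr]
```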
1.3 Logistic Regression Gradient Descent
Vectorized implementation:
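The per-parameter update is θ_j := θ_j − α(1/m) Σ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) x_j⁽ⁱ⁾; vectorized, θ := θ − (α/m) Xᵀ(g(Xθ) − y). A minimal NumPy sketch (function names are my own, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for logistic regression.
    X: (m, n) design matrix, first column all ones for the intercept.
    y: (m,) labels in {0, 1}."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)                   # predictions, shape (m,)
        theta -= (alpha / m) * (X.T @ (h - y))   # vectorized update
    return theta
```

Note the update is simultaneous for all components of θ, which the single vectorized line gives for free.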
1.4 Advanced Optimization
Library function: fminunc() (Octave/MATLAB unconstrained minimization)
function [jVal, gradient] = costFunction(theta)
  jVal = [...code to compute J(theta)...];
  gradient = [...code to compute derivative of J(theta)...];
end
options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2,1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
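A rough Python analogue of the fminunc call above (my own sketch, not from the course) uses scipy.optimize.minimize with jac=True, so one function returns both the cost and its gradient, mirroring costFunction:

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Return J(theta) and its gradient, like the Octave costFunction."""
    m = len(y)
    h = sigmoid(X @ theta)
    eps = 1e-12                                  # guards log(0)
    jval = -(y @ np.log(h + eps) + (1 - y) @ np.log(1 - h + eps)) / m
    grad = X.T @ (h - y) / m
    return jval, grad

# analogous to fminunc(@costFunction, initialTheta, options)
X = np.array([[1.0, -1.0], [1.0, 1.0]])          # toy data, intercept column first
y = np.array([0.0, 1.0])
res = minimize(cost_function, np.zeros(2), args=(X, y),
               jac=True, method='BFGS', options={'maxiter': 100})
```

As with fminunc, the optimizer picks the step size itself, so no learning rate α is supplied.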
2. Multiclass Classification
One-vs-all
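One-vs-all trains one binary logistic classifier per class (class c against everything else) and predicts the class whose classifier is most confident. A sketch reusing plain gradient descent (helper names are my own):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, alpha=0.5, iters=2000):
    """One binary logistic-regression classifier per class.
    Returns Theta of shape (num_classes, n)."""
    m, n = X.shape
    Theta = np.zeros((num_classes, n))
    for c in range(num_classes):
        yc = (y == c).astype(float)              # relabel: class c vs. the rest
        for _ in range(iters):
            h = sigmoid(X @ Theta[c])
            Theta[c] -= (alpha / m) * (X.T @ (h - yc))
    return Theta

def predict_one_vs_all(Theta, X):
    # pick the class whose classifier outputs the highest probability
    return np.argmax(sigmoid(X @ Theta.T), axis=1)
```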
3. Overfitting
Regularization penalizes large parameters to reduce overfitting, but the penalty skips θ0: the sum runs over θ1, ..., θn only, so the intercept is never shrunk.
3.1 Regularized Linear Regression
- Cost Function
- Gradient descent
- Normal Equation
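The three items above, written out (θ0 is not penalized, and L is the (n+1)×(n+1) matrix diag(0, 1, ..., 1)):

```latex
% Regularized cost:
J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
            + \lambda\sum_{j=1}^{n}\theta_j^2\right]

% Gradient descent (\theta_0 updated without the shrinkage factor):
\theta_0 := \theta_0 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_0^{(i)}
\theta_j := \theta_j\Bigl(1 - \alpha\frac{\lambda}{m}\Bigr)
            - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_j^{(i)},
            \qquad j \ge 1

% Normal equation with regularization:
\theta = \bigl(X^T X + \lambda L\bigr)^{-1} X^T y
```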
3.2 Regularized Logistic Regression
- Cost Function
- Gradient descent
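Written out, with the same convention that θ0 is not penalized:

```latex
% Regularized logistic-regression cost:
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Bigl[\, y^{(i)}\log h_\theta(x^{(i)})
            + \bigl(1 - y^{(i)}\bigr)\log\bigl(1 - h_\theta(x^{(i)})\bigr)\Bigr]
            + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

% Gradient descent has the same form as in the regularized linear case,
% but with h_\theta(x) = 1 / (1 + e^{-\theta^T x}).
```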