Every combinatorics lecture I find myself thinking: who on earth was clever enough to come up with such an ingenious method?!
In the evening I did the big data labs with a classmate: three experiments, covering HDFS, HBase, and NoSQL. So happy, my first hands-on experience with databases.
I finally figured out why I could never find that command: the tutorial had a typo. A full-width Chinese "-" and an ASCII "-" are not the same character. Argh!
So many mistakes: not cd-ing into the right folder, typing an extra /, copying the shell prompt's $ along with the command, copying the # too, mistyping filenames, forgetting to update the version number after pasting. My head is spinning. Am I really not cut out for computer science?
The way I see it, one big benefit of vectorization is that you no longer have to write for loops.
I think x here is a column vector; when the examples get stacked into the matrix X, each one is transposed into a row, like rows in a traditional database table.
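A tiny sketch of what I mean by skipping the for loop (made-up numbers; the sigmoid is written inline so the snippet stands alone):

```matlab
% Hypothetical data: 3 examples, bias column plus 1 feature.
X = [1 0.5; 1 -1.2; 1 2.0];   % each row is one training example
theta = [0.1; 0.3];

% Loop version: compute one prediction at a time.
h_loop = zeros(size(X, 1), 1);
for i = 1:size(X, 1)
    h_loop(i) = 1 / (1 + exp(-X(i, :) * theta));
end

% Vectorized version: all predictions in one matrix product,
% equivalent to sigmoid(X * theta).
h_vec = 1 ./ (1 + exp(-X * theta));
```

Both give the same m-by-1 vector of predictions; the vectorized form is the one the assignment hints push you toward.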
Oh no, I've already forgotten where the regularized and unregularized versions each show up.
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
% J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
% efficiently vectorized. For example, consider the computation
%
% sigmoid(X * theta)
%
% Each row of the resulting matrix will contain the value of the
% prediction for that example. You can make use of this to vectorize
% the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
% there're many possible vectorized solutions, but one solution
% looks like:
% grad = (unregularized gradient for logistic regression)
% temp = theta;
% temp(1) = 0; % because we don't add anything for j = 0
% grad = grad + YOUR_CODE_HERE (using the temp variable)
%
J = (-y'*log(sigmoid(X*theta)) - (1-y)'*log(1-sigmoid(X*theta)))/m + lambda*sum(theta(2:end).^2)/(2*m);
%grad=(X'*sigmoid(X*theta)-y)/m; % lost track of the parentheses halfway through, ugh MATLAB
grad=(X'*(sigmoid(X*theta)-y))/m;
%theta_s = [0; theta(2:end)]; % below, temp is used instead to zero out the bias term
temp=theta;
temp(1)=0;
%grad=grad+lambda/m*sum(temp); % copied the formula wrong here; there is no summation
grad=grad+lambda/m*temp;
% =============================================================
grad = grad(:);
end
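A quick smoke test I could run in Octave to sanity-check the function (made-up data; assumes sigmoid.m from the assignment is on the path):

```matlab
% Hypothetical check: 3 examples, bias column plus 1 feature.
X = [ones(3,1) [0.5; -1.2; 2.0]];
y = [1; 0; 1];
theta = zeros(2, 1);
lambda = 1;

[J, grad] = lrCostFunction(theta, X, y, lambda);

% With theta all zeros, every prediction is 0.5, so the cost should be
% -log(0.5) = log(2) ~ 0.6931, and the regularization term contributes
% nothing. grad should come back as a 2x1 vector.
```

Checking against a case where the answer is known by hand is the easiest way to catch the parenthesis and summation mistakes above.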