Neural Network Assignment: NN Learning, Coursera Machine Learning (Andrew Ng) WEEK 5

Time: 2021-12-21 15:25:21

In WEEK 5, the assignment asks you to recognize Arabic numerals through supervised learning, implementing multi-class logistic regression with a neural network (NN). Its main purpose is to get a feel for how to compute the cost function in an NN, and the gradient with respect to each parameter (theta) of its hypothesis, using backpropagation.
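For reference, the regularized cost function the assignment wants (this is the standard formula from the ex4 handout; treat my transcription here as a sketch) is

$$J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\Big[y_k^{(i)}\log\big(h_\Theta(x^{(i)})\big)_k + \big(1-y_k^{(i)}\big)\log\big(1-\big(h_\Theta(x^{(i)})\big)_k\big)\Big] + \frac{\lambda}{2m}\sum_{l=1}^{2}\sum_{i}\sum_{j\ge 1}\big(\Theta^{(l)}_{ij}\big)^2,$$

where $K$ = num_labels and the $j\ge 1$ in the regularization term means the bias column of each Theta is skipped.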

The difficulty isn't high, but the problem is that you have to get used to MATLAB's matrices QAQ. As a total noob, I'm already done for. The core parts of the code below will hopefully help classmates who are stuck on this assignment. But please don't just copy the code~ don't~ don't~

% Recode the labels y (values 1..num_labels) into a one-hot matrix ty
ty = zeros(m, num_labels);
for i = 1:m
    for j = 1:num_labels
        if y(i) == j
            ty(i,j) = 1;
        end
    end
end

% Forward propagation, adding the bias column at each layer
a1 = X;
a1 = [ones(size(a1,1),1) a1];
z2 = a1 * Theta1';
a2 = sigmoid(z2);
a2 = [ones(size(a2,1),1) a2];
z3 = a2 * Theta2';
a3 = sigmoid(z3);

% Unregularized cost: accumulate the cross-entropy term over every example and label
for i = 1:m
    for j = 1:num_labels
        J = J - log(1 - a3(i,j))*(1 - ty(i,j))/m - log(a3(i,j))*ty(i,j)/m;
    end
end
% Backpropagation: output-layer error, then hidden-layer error
% (the bias column of Theta2 is dropped when propagating back)
d3 = a3 - ty;
d2 = (d3 * Theta2(:,2:end)) .* sigmoidGradient(z2);
Theta1_grad = Theta1_grad + d2'*a1/m;
Theta2_grad = Theta2_grad + d3'*a2/m;
% -------------------------------------------------------------
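What the backprop block above computes, in formula form (the standard backpropagation equations for this two-Theta network, with $g'$ = sigmoidGradient and the bias column of $\Theta_2$ dropped):

$$\delta^{(3)} = a^{(3)} - y, \qquad \delta^{(2)} = \big(\delta^{(3)}\,\Theta_2(:,2\!:\!\text{end})\big)\odot g'(z^{(2)}), \qquad \Theta^{(l)}_{\text{grad}} = \frac{1}{m}\big(\delta^{(l+1)}\big)^{\top} a^{(l)}.$$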
% Regularization term for the cost: squared weights, skipping the bias column (j = 1)
JJ = 0;
for i = 1:size(Theta1,1)
    for j = 2:size(Theta1,2)
        JJ = JJ + Theta1(i,j)*Theta1(i,j)*lambda/(2*m);
    end
end
for i = 1:size(Theta2,1)
    for j = 2:size(Theta2,2)
        JJ = JJ + Theta2(i,j)*Theta2(i,j)*lambda/(2*m);
    end
end
% Vectorized alternative (note the sum()s, which my first attempt forgot):
%J = J + (lambda/(2*m)) * (sum(sum(Theta1(:,2:end).^2)) + sum(sum(Theta2(:,2:end).^2)));
J = J + JJ;

% Regularization term for the gradients: (lambda/m)*Theta, again skipping the bias column
Theta1_gradd = zeros(size(Theta1));
Theta2_gradd = zeros(size(Theta2));
for i = 2:size(Theta1,2)
    for j = 1:size(Theta1,1)
        Theta1_gradd(j,i) = Theta1(j,i)*lambda/m;
    end
end
for i = 2:size(Theta2,2)
    for j = 1:size(Theta2,1)
        Theta2_gradd(j,i) = Theta2(j,i)*lambda/m;
    end
end
Theta1_grad = Theta1_gradd + Theta1_grad;
Theta2_grad = Theta2_gradd + Theta2_grad;
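In other words, the regularized gradient the last two loops build is (again just restating the handout formula, with $\Delta^{(l)} = (\delta^{(l+1)})^{\top} a^{(l)}$):

$$\frac{\partial J}{\partial \Theta^{(l)}_{ij}} = \frac{1}{m}\Delta^{(l)}_{ij} + \frac{\lambda}{m}\Theta^{(l)}_{ij} \quad (j\ge 1), \qquad \frac{\partial J}{\partial \Theta^{(l)}_{i0}} = \frac{1}{m}\Delta^{(l)}_{i0}.$$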

PS: This noob blogger is forcing himself to write matrix operations next time, no more nested loops!!!
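To hold myself to that, here is a minimal vectorized sketch of the same computation (not verified against the grader; it assumes the same ex4 skeleton variables X, y, m, num_labels, lambda, Theta1, Theta2 and the provided sigmoid/sigmoidGradient helpers, and the one-hot line relies on implicit expansion, i.e. MATLAB R2016b+ or Octave):

% One-hot encode y without loops (y is m x 1 with values 1..num_labels)
ty = double(y == 1:num_labels);        % m x num_labels

% Forward propagation, adding the bias column at each layer
a1 = [ones(m,1) X];
z2 = a1 * Theta1';
a2 = [ones(m,1) sigmoid(z2)];
z3 = a2 * Theta2';
a3 = sigmoid(z3);

% Cross-entropy cost plus regularization (bias columns excluded)
J = -sum(sum(ty .* log(a3) + (1 - ty) .* log(1 - a3))) / m;
J = J + (lambda/(2*m)) * (sum(sum(Theta1(:,2:end).^2)) + sum(sum(Theta2(:,2:end).^2)));

% Backpropagation
d3 = a3 - ty;
d2 = (d3 * Theta2(:,2:end)) .* sigmoidGradient(z2);

% Gradients, regularizing everything except the bias column
Theta1_grad = d2' * a1 / m + (lambda/m) * [zeros(size(Theta1,1),1) Theta1(:,2:end)];
Theta2_grad = d3' * a2 / m + (lambda/m) * [zeros(size(Theta2,1),1) Theta2(:,2:end)];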