Key concepts:
This blogger's summary is very good: it emphasizes the key points and helped a lot with both review and coding. Many thanks: Linear Regression Key Points Summary.
Assignment:
Working through the assignment deepened my understanding of the material. When my code went wrong and I could not figure out why, I re-derived the formulas by hand and read some other people's blogs; quite a few are excellent, such as the one linked above.
The main idea is to use gradient descent to find the parameter values that minimize the cost function J. I added comments to the main parts of the code to make future review easier.
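For review, the formulas being implemented here are the standard ones for univariate linear regression, with hypothesis h_theta(x) = theta_0 + theta_1 * x:

```latex
J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta\big(x^{(i)}\big) - y^{(i)} \right)^2
\qquad
\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\big(x^{(i)}\big) - y^{(i)} \right) x_j^{(i)}, \quad j \in \{0, 1\}
```

On every iteration both parameters must be updated simultaneously, i.e. computed from the same old theta before either is overwritten.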
- Univariate linear regression:
Loading the data:
%% ======================= Part 2: Plotting =======================
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples
% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);
fprintf('Program paused. Press enter to continue.\n');
pause;
The plotData function used above is as follows (it draws the scatter plot):
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
% PLOTDATA(x,y) plots the data points and gives the figure axes labels of
% population and profit.
figure; % open a new figure window
% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
% "figure" and "plot" commands. Set the axes labels using
% the "xlabel" and "ylabel" commands. Assume the
% population and revenue data have been passed in
% as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
% appear as red crosses. Furthermore, you can make the
% markers larger by using plot(..., 'rx', 'MarkerSize', 10);
plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of city in 10,000s');
ylabel('Profit in $10,000s');
% ============================================================
end
Running result:
Cost function and gradient descent:
%% =================== Part 3: Cost and Gradient descent ===================
X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
%Add an intercept feature x0 == 1 to every example, so that theta(0) is also tied to X. X now has n+1 columns, matching the n+1 parameters in theta
theta = zeros(2, 1); % initialize fitting parameters theta(0) and theta(1)
% Some gradient descent settings
iterations = 1500; %number of gradient descent iterations
alpha = 0.01; %learning rate
%%========================== Testing: different theta values give different costs ======================
fprintf('\nTesting the cost function ...\n')
% compute and display initial cost: the cost with theta at its initial value
J = computeCost(X, y, theta);
fprintf('With theta = [0 ; 0]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 32.07\n');
% further testing of the cost function: change theta and recompute the cost
J = computeCost(X, y, [-1 ; 2]);
fprintf('\nWith theta = [-1 ; 2]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 54.24\n');
fprintf('Program paused. Press enter to continue.\n');
pause;
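The computeCost function called above is not shown in this post. As a sketch, a vectorized implementation consistent with these calls (the standard ex1 computeCost.m signature) would look like:

```matlab
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameters for linear regression on the data points in X and y
m = length(y);              % number of training examples
errors = X * theta - y;     % m x 1 vector of prediction errors h(x) - y
J = (errors' * errors) / (2 * m);   % mean squared error over 2m
end
```

With theta = [0; 0] on ex1data1.txt this should print a cost of approximately 32.07, matching the expected value above.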
%%============================ Gradient descent ============================
fprintf('\nRunning Gradient Descent ...\n')
% run gradient descent
theta = gradientDescent(X, y, theta, alpha, iterations);
%gradientDescent internally calls the cost function computeCost
%%============================ Display the best parameters theta found ========================
% print theta
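For completeness, the gradientDescent function called above can be sketched as follows: a vectorized batch update, where J_history (optional in the ex1 template) records the cost each iteration so you can check that it decreases:

```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   Runs num_iters steps of batch gradient descent with learning rate alpha
m = length(y);                      % number of training examples
J_history = zeros(num_iters, 1);    % cost after each iteration
for iter = 1:num_iters
    % Simultaneous update of all parameters:
    % theta_j := theta_j - alpha/m * sum((h(x) - y) .* x_j)
    theta = theta - (alpha / m) * (X' * (X * theta - y));
    J_history(iter) = computeCost(X, y, theta);  % record cost for this step
end
end
```

Because the update is written as one matrix expression, every theta_j is computed from the same old theta, which is exactly the simultaneous update the algorithm requires.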