# Stanford Coursera Machine Learning Programming Exercise 3 (Multi-class Classification with Logistic Regression)

① Visualizing the sample data

② Using logistic regression for multi-class classification (one-vs-all)

The idea is to train one binary classifier per class. For example, with three weather classes:

hθ(1)(x) outputs the predicted probability that it is sunny (y == 1)

hθ(2)(x) outputs the predicted probability that it is cloudy (y == 2)

hθ(3)(x) outputs the predicted probability that it is rainy (y == 3)
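Prediction then picks the class whose hypothesis outputs the highest probability. A minimal NumPy sketch of this argmax rule (the parameter values and the single example below are made up purely for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical trained parameter vectors, one row per class (1=sunny, 2=cloudy, 3=rainy)
all_theta = np.array([[ 1.0,  2.0, -1.0],
                      [-0.5,  0.5,  1.0],
                      [ 0.2, -1.0,  0.3]])

x = np.array([1.0, 0.5, 2.0])       # one example, bias term (the leading 1) included
probs = sigmoid(all_theta @ x)      # h_theta^(k)(x) for k = 1, 2, 3
prediction = np.argmax(probs) + 1   # labels are 1-based, so shift the 0-based index
print(prediction)
```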

③ MATLAB implementation

```matlab
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds
%   to the classifier for label i

% Some useful variables
m = size(X, 1); % number of samples
n = size(X, 2); % number of features

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda.
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1's and 0's that tells you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%
%     % Set options for fmincg
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
%
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);

for c = 1:num_labels % num_labels is the number of logistic regression classifiers
    all_theta(c, :) = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                             initial_theta, options);
end
% =========================================================================
end
```

For lrCostFunction, see the regularized logistic regression implementation (the costFunctionReg.m file) in: http://www.cnblogs.com/hapjin/p/6078530.html
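For reference, a NumPy sketch of the regularized cost and gradient that lrCostFunction is expected to compute (the function name and array shapes here are my own; the bias parameter theta[0] is conventionally excluded from regularization):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic regression cost and gradient.
    theta: (n+1,), X: (m, n+1) with a bias column, y: (m,) of 0/1, lam: lambda."""
    m = len(y)
    h = sigmoid(X @ theta)
    # Cross-entropy term plus L2 penalty (bias theta[0] is not regularized)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m \
        + lam / (2 * m) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

# At theta = 0 every h is 0.5, so the unregularized cost is log(2)
theta = np.zeros(3)
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 0.0, 1.0]])
y = np.array([1.0, 0.0])
J, grad = lr_cost_function(theta, X, y, 1.0)
print(J, grad)
```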

num_labels is the number of classifiers, 10 in total; each classifier (model) recognizes one of the 10 digits.

initial_theta = zeros(n + 1, 1); % initial value of the model parameters θ (here n == 400)

all_theta is a 10×401 matrix; each row stores the parameter vector θ of one classifier (model). Running the for loop above calls the fmincg library function and obtains the parameter vectors θ of all the models.
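The same training loop can be sketched in NumPy, using scipy.optimize.minimize in place of the course's fmincg; this is an illustration under that substitution, not the assignment's code:

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y, lam):
    """Regularized logistic cost and gradient (bias not regularized)."""
    m = len(y)
    h = np.clip(sigmoid(X @ theta), 1e-10, 1 - 1e-10)  # avoid log(0)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m \
        + lam / (2 * m) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

def one_vs_all(X, y, num_labels, lam):
    m, n = X.shape
    X = np.hstack([np.ones((m, 1)), X])      # add bias column
    all_theta = np.zeros((num_labels, n + 1))
    for c in range(1, num_labels + 1):       # one binary classifier per label
        res = minimize(cost_and_grad, np.zeros(n + 1),
                       args=(X, (y == c).astype(float), lam),
                       jac=True, method='L-BFGS-B', options={'maxiter': 50})
        all_theta[c - 1] = res.x
    return all_theta

# Tiny separable example: one feature, labels 1 and 2
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1, 1, 2, 2])
all_theta = one_vs_all(X, y, 2, 0.1)
```

Each row of the returned matrix is one classifier's θ, exactly as in the MATLAB version.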

```matlab
function p = predictOneVsAll(all_theta, X)
%PREDICT Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
%  p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
%  for each example in the matrix X. Note that X contains the examples in
%  rows. all_theta is a matrix where the i-th row is a trained logistic
%  regression theta vector for the i-th class. You should set p to a vector
%  of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
%  for 4 examples)

m = size(X, 1);
num_labels = size(all_theta, 1);

% You need to return the following variables correctly
p = zeros(size(X, 1), 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters (one-vs-all).
%               You should set p to a vector of predictions (from 1 to
%               num_labels).
%
% Hint: This code can be done all vectorized using the max function.
%       In particular, the max function can also return the index of the
%       max element. If your examples are in rows, then you can use
%       max(A, [], 2) to obtain the max for each row.
%

% Take the row-wise maximum of X * all_theta'; p records the column index
% of each row's maximum, i.e. the predicted label.
[~, p] = max(X * all_theta', [], 2);
% =========================================================================
end
```
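The same vectorized prediction in NumPy, where a row-wise argmax plays the role of max(A, [], 2). Note that the sigmoid can be skipped here: it is monotonic, so the largest raw score X * θ also has the largest probability. The parameter values below are made up for illustration:

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    """Return 1-based label predictions for each row of X (no bias column yet)."""
    m = X.shape[0]
    X = np.hstack([np.ones((m, 1)), X])   # add bias column, as in the MATLAB code
    # Row-wise argmax of X @ all_theta.T; +1 converts 0-based index to 1-based label
    return np.argmax(X @ all_theta.T, axis=1) + 1

# Hypothetical parameters for 3 classes over 2 features (plus bias)
all_theta = np.array([[0.0,  1.0, -1.0],
                      [0.0, -1.0,  1.0],
                      [1.0,  0.0,  0.0]])
X = np.array([[2.0, 0.0],
              [0.0, 2.0],
              [0.1, 0.1]])
print(predict_one_vs_all(all_theta, X))
```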
This article is reposted from hapjin's cnblogs blog, original link: http://www.cnblogs.com/hapjin/. To repost, please contact the original author.
