A Brief Introduction to Fisher Discriminant Analysis


Supervised Dimension Reduction

Higher dimensionality generally makes learning tasks harder. Here we introduce a supervised dimension reduction method based on the linear dimension reduction framework introduced in

http://blog.csdn.net/philthinker/article/details/70212147

which can also be simplified as:

$$z = Tx, \quad x \in \mathbb{R}^d,\ z \in \mathbb{R}^m,\ m < d$$

Of course, centering the data in the first place is necessary:

$$x_i \leftarrow x_i - \frac{1}{n}\sum_{i=1}^{n} x_i$$

Fisher discriminant analysis is one of the most basic supervised linear dimension reduction methods. It seeks a $T$ that makes samples with the same label as close as possible and samples with different labels as far apart as possible. To begin with, define the within-class scatter matrix $S^{(w)}$ and the between-class scatter matrix $S^{(b)}$ as:

$$S^{(w)} = \sum_{y=1}^{c} \sum_{i: y_i = y} (x_i - \mu_y)(x_i - \mu_y)^T \in \mathbb{R}^{d \times d}, \qquad S^{(b)} = \sum_{y=1}^{c} n_y \mu_y \mu_y^T \in \mathbb{R}^{d \times d}$$

where
$$\mu_y = \frac{1}{n_y} \sum_{i: y_i = y} x_i$$

$\sum_{i: y_i = y}$ denotes the sum over all samples $i$ satisfying $y_i = y$, and $n_y$ is the number of samples belonging to class $y$.
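As a concrete illustration, the class means and both scatter matrices can be computed directly from these definitions (a minimal NumPy sketch, not from the original post; the helper name `scatter_matrices` is ours):

```python
import numpy as np

def scatter_matrices(X, y):
    """Within-class scatter S_w and between-class scatter S_b
    for data X (n x d) with labels y, following the definitions above."""
    d = X.shape[1]
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]                     # samples i with y_i = c
        mu = Xc.mean(axis=0)               # class mean mu_y
        S_w += (Xc - mu).T @ (Xc - mu)     # sum of (x_i - mu_y)(x_i - mu_y)^T
        S_b += len(Xc) * np.outer(mu, mu)  # n_y * mu_y mu_y^T
    return S_w, S_b
```

A handy sanity check: with these definitions the two matrices add up to the total scatter $X^T X$.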
Then the projection matrix $T$ is defined as the solution of:

$$\max_{T \in \mathbb{R}^{m \times d}} \operatorname{tr}\!\left( (T S^{(w)} T^T)^{-1} T S^{(b)} T^T \right)$$

Intuitively, this objective maximizes the projected between-class scatter $T S^{(b)} T^T$ while minimizing the projected within-class scatter $T S^{(w)} T^T$.
This optimization problem can be solved by an approach similar to the one used in Unsupervised Dimension Reduction, i.e. the generalized eigenvalue problem

$$S^{(b)} \xi = \lambda S^{(w)} \xi$$

where the generalized eigenvalues are $\lambda_1 \ge \dots \ge \lambda_d \ge 0$ and the corresponding eigenvectors are $\xi_1, \dots, \xi_d$. Taking the eigenvectors of the $m$ largest eigenvalues gives the solution for $T$:

$$\hat{T} = (\xi_1, \dots, \xi_m)^T$$
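This solution can be sketched in NumPy by reducing the generalized eigenproblem to an ordinary one, assuming $S^{(w)}$ is invertible (the helper name `fisher_projection` is ours, not from the post):

```python
import numpy as np

def fisher_projection(S_w, S_b, m):
    """Return T_hat = (xi_1, ..., xi_m)^T from S_b xi = lambda S_w xi.
    A sketch that assumes S_w is invertible."""
    # Reduce to an ordinary eigenproblem on S_w^{-1} S_b
    lam, Xi = np.linalg.eig(np.linalg.solve(S_w, S_b))
    order = np.argsort(lam.real)[::-1]   # eigenvalues in descending order
    return Xi[:, order[:m]].real.T       # rows are the leading eigenvectors
```

On two well-separated Gaussian classes this recovers the direction joining the class means, the same behavior as the MATLAB demo that follows.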
% Generate two 2-D classes separated along the first coordinate
n=100; x=randn(n,2);
x(1:n/2,1)=x(1:n/2,1)-4;
x(n/2+1:end,1)=x(n/2+1:end,1)+4;
x=x-repmat(mean(x),[n,1]);      % center the data
y=[ones(n/2,1);2*ones(n/2,1)];  % class labels

% Class means and within-class deviations
m1=mean(x(y==1,:));
x1=x(y==1,:)-repmat(m1,[n/2,1]);
m2=mean(x(y==2,:));
x2=x(y==2,:)-repmat(m2,[n/2,1]);
% Leading generalized eigenvector of S^(b) xi = lambda S^(w) xi
[t,v]=eigs(n/2*(m1')*m1+n/2*(m2')*m2, x1'*x1+x2'*x2, 1);

% Plot the two classes and the learned projection direction
figure(1); clf; hold on; axis([-8 8 -6 6]);
plot(x(y==1,1),x(y==1,2),'bo');
plot(x(y==2,1),x(y==2,2),'rx');
plot(99*[-t(1) t(1)],99*[-t(2) t(2)],'k-');

(Figure: scatter plot of the two classes with the learned Fisher projection direction drawn through the origin.)

Note: when the samples within a class have several peaks (i.e., multimodal distributions), the output of Fisher discriminant analysis may not be ideal. Local Fisher Discriminant Analysis can still work in that case.
