A Brief Introduction to Fisher Discriminant Analysis

Supervised Dimension Reduction

Higher dimensionality generally makes learning tasks harder. Here we introduce a supervised dimension reduction method based on the linear dimension reduction framework introduced in

http://blog.csdn.net/philthinker/article/details/70212147

which can be summarized as:

$$z = Tx, \quad x \in \mathbb{R}^{d}, \; z \in \mathbb{R}^{m}, \; m < d$$

Of course, centering the data beforehand is necessary:
$$x_i \leftarrow x_i - \frac{1}{n}\sum_{i'=1}^{n} x_{i'}$$
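As a quick illustration, here is a minimal MATLAB sketch of centering followed by the linear map z = T*x; the matrix T below is a hypothetical placeholder for illustration, not one learned from the data.

% Minimal sketch of linear dimension reduction z = T*x.
% T here is a hypothetical placeholder, not learned from the data.
n = 5; d = 2; m = 1;
X = randn(n, d);                   % n samples, one per row
X = X - repmat(mean(X), [n, 1]);   % center the data first
T = [1 0];                         % some m-by-d projection matrix
Z = X * T';                        % each row of Z is z_i = T*x_i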

Fisher discriminant analysis is one of the most basic supervised linear dimension reduction methods: we seek a $T$ that makes samples with the same label as close as possible and samples with different labels as far apart as possible. To begin with, define the within-class scatter matrix $S^{(w)}$ and the between-class scatter matrix $S^{(b)}$ as:

$$S^{(w)} = \sum_{y=1}^{c} \sum_{i:y_i=y} (x_i - \mu_y)(x_i - \mu_y)^{T} \in \mathbb{R}^{d \times d}, \qquad S^{(b)} = \sum_{y=1}^{c} n_y \mu_y \mu_y^{T} \in \mathbb{R}^{d \times d}$$

where
$$\mu_y = \frac{1}{n_y} \sum_{i:y_i=y} x_i$$

Here $\sum_{i:y_i=y}$ stands for the sum over all samples $i$ satisfying $y_i = y$, and $n_y$ is the number of samples belonging to class $y$.
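It may help to see the scatter matrices spelled out in code. The sketch below assumes a centered sample matrix x (one sample per row) and labels y in {1,...,c}; the variable names are illustrative only.

% Hedged sketch: class means and the scatter matrices for centered
% data x with labels y in {1,...,c}.
[n, d] = size(x);
c = max(y);
Sw = zeros(d); Sb = zeros(d);
for k = 1:c
    xk = x(y==k, :);                 % samples of class k
    nk = size(xk, 1);
    mu = mean(xk, 1);                % class mean (row vector)
    xc = xk - repmat(mu, [nk, 1]);   % deviations from the class mean
    Sw = Sw + xc' * xc;              % within-class scatter
    Sb = Sb + nk * (mu' * mu);       % between-class scatter (data centered)
end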
Then the projection matrix $T$ is determined by solving:
$$\max_{T \in \mathbb{R}^{m \times d}} \operatorname{tr}\!\left( \left(T S^{(w)} T^{T}\right)^{-1} T S^{(b)} T^{T} \right)$$

In other words, the goal is to maximize the between-class scatter $T S^{(b)} T^{T}$ while minimizing the within-class scatter $T S^{(w)} T^{T}$.
This optimization problem can be solved by an approach similar to the one used in Unsupervised Dimension Reduction, i.e. the generalized eigenvalue problem
$$S^{(b)} \xi = \lambda S^{(w)} \xi$$

where the generalized eigenvalues are $\lambda_1 \geq \cdots \geq \lambda_d \geq 0$ and the corresponding eigenvectors are $\xi_1, \dots, \xi_d$. Taking the eigenvectors of the largest $m$ eigenvalues, we get the solution for $T$:
$$\hat{T} = (\xi_1, \dots, \xi_m)^{T}$$
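The MATLAB demo below draws a two-class toy data set, builds $S^{(b)}$ and $S^{(w)}$ directly from the class means and deviations, solves the generalized eigenvalue problem with eigs, and plots the learned projection direction: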
n=100; x=randn(n,2);                       % 100 samples in 2-D
x(1:n/2,1)=x(1:n/2,1)-4;                   % shift class 1 to the left
x(n/2+1:end,1)=x(n/2+1:end,1)+4;           % shift class 2 to the right
x=x-repmat(mean(x),[n,1]);                 % center the whole data set
y=[ones(n/2,1);2*ones(n/2,1)];             % class labels

m1=mean(x(y==1,:));                        % mean of class 1
x1=x(y==1,:)-repmat(m1,[n/2,1]);           % class-1 deviations
m2=mean(x(y==2,:));                        % mean of class 2
x2=x(y==2,:)-repmat(m2,[n/2,1]);           % class-2 deviations
% Largest generalized eigenvector of S_b*t = lambda*S_w*t, where
% S_b = n/2*m1'*m1 + n/2*m2'*m2 and S_w = x1'*x1 + x2'*x2:
[t,v]=eigs(n/2*(m1')*m1+n/2*(m2')*m2,x1'*x1+x2'*x2,1);

figure(1); clf; hold on; axis([-8 8 -6 6]);
plot(x(y==1,1),x(y==1,2),'bo');            % class 1: blue circles
plot(x(y==2,1),x(y==2,2),'rx');            % class 2: red crosses
plot(99*[-t(1) t(1)],99*[-t(2) t(2)],'k-');  % learned projection direction

(Figure: the two classes plotted with the learned Fisher projection direction.)

Note that when the samples of one class form several peaks (i.e. multimodal clusters), the output is not ideal. Local Fisher Discriminant Analysis may work better in such cases.
