[Python Machine Learning] Lab 03: Logistic Regression (Part 3)

Summary: [Python Machine Learning] Lab 03: Logistic Regression (Part 3)

2.7 Define the regularized gradient descent algorithm

If we want to minimize this cost function with gradient descent, then because the bias weight w_0 is not regularized, the update rule splits into two cases:

Repeat until convergence {
    w_0 := w_0 - alpha * (1/m) * sum_i (h(x_i) - y_i) * x_i0
    w_j := w_j - alpha * [ (1/m) * sum_i (h(x_i) - y_i) * x_ij + (lambda/m) * w_j ]    (j = 1, 2, ..., n)
}
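The routine below relies on the sigmoid, hypothesis, and regularized-cost helpers defined in the earlier parts of this lab. As a reminder, here is a minimal sketch of those helpers, consistent with how they are called below (the exact definitions in the earlier parts may differ slightly):

import numpy as np

def sigmoid(z):
    # logistic function
    return 1/(1+np.exp(-z))

def h(X,w):
    # hypothesis: predicted probability for every sample
    return sigmoid(X@w)

def cost_reg(X,w,y,lambd):
    # regularized cross-entropy cost; the bias weight w[0] is not penalized
    m=X.shape[0]
    y=y.reshape((m,1))
    p=h(X,w)
    cross_entropy=-np.mean(y*np.log(p)+(1-y)*np.log(1-p))
    penalty=(lambd/(2*m))*np.sum(np.square(w[1:,0]))
    return cross_entropy+penalty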

def grandient_reg(X,w,y,iter_num,alpha,lambd):
    # Regularized batch gradient descent; the incoming w is ignored and re-initialized to zeros
    y=y.reshape((X.shape[0],1))
    w=np.zeros((X.shape[1],1))
    cost_lst=[]                      # record the regularized cost at every iteration
    for i in range(iter_num):
        y_pred=h(X,w)-y              # prediction error h(x) - y for every sample
        temp=np.zeros((X.shape[1],1))
        for j in range(0,X.shape[1]):
            if j==0:
                # bias weight w_0: no regularization term
                right_0=np.multiply(y_pred.ravel(),X[:,0])
                gradient_0=1/(X.shape[0])*(np.sum(right_0))
                temp[j,0]=w[j,0]-alpha*(gradient_0)
            else:
                # remaining weights: add the regularization term (lambd/m)*w_j
                right=np.multiply(y_pred.ravel(),X[:,j])
                reg=(lambd/X.shape[0])*w[j,0]
                gradient=1/(X.shape[0])*(np.sum(right))
                temp[j,0]=w[j,0]-alpha*(gradient+reg)
        w=temp                       # simultaneous update of all weights
        cost_lst.append(cost_reg(X,w,y,lambd))
    return w,cost_lst
# Train on the feature-mapped data from Part 2 (X2, y2) and record the cost history
iter_num,alpha,lambd=600000,0.001,1
w2,cost_lst=grandient_reg(X2,w,y2,iter_num,alpha,lambd)
plt.plot(range(iter_num),cost_lst)
[<matplotlib.lines.Line2D at 0x1422dddef40>]

Note the "reg" term in the update equations, and the extra parameter λ (lambd in the code): it is a hyperparameter that controls the strength of the regularization term, not a learning rate. The regularized gradient function above implements exactly these updates; a short sketch of how the choice of λ affects the fit follows below.
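To see the effect of the regularization strength, one can, as a rough sketch (re-using grandient_reg and the X2, y2, w variables from the cells above, with a smaller iteration count and illustrative λ values), train for a few different lambd values and compare the final cost and training accuracy:

# Sketch: effect of the regularization strength lambd (values chosen for illustration)
for lambd_try in (0.0, 1.0, 10.0, 100.0):
    w_try,cost_try=grandient_reg(X2,w,y2,10000,0.001,lambd_try)
    acc=np.mean((sigmoid(X2@w_try).ravel()>=0.5).astype(int)==y2.ravel())
    print(lambd_try,cost_try[-1],acc)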

Initialize the variables just as we did in Part 1.

Exercise 1: Compute the accuracy obtained with regularization

# Apply a 0.5 threshold to the predicted probabilities of the regularized model (weights w2)
y_pred=[1 if item>=0.5 else 0 for item in sigmoid(X2@w2).ravel()]
y_pred=np.array(y_pred)
np.sum(y_pred==y2)/y2.shape[0]
0.8305084745762712

Now let's call the new regularized function with w initialized to zeros to make sure the computation works correctly. Finally, we can use the prediction function from Part 1 to see how well our solution does on the training data.

2.8 Try scikit-learn

from sklearn import linear_model  # scikit-learn's linear models (includes LogisticRegression)
model = linear_model.LogisticRegression(penalty='l2', C=1.0)
model.fit(X2, y2.ravel())
LogisticRegression()
model.score(X2, y2)
0.8389830508474576
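Note that scikit-learn's C parameter is the inverse of the regularization strength (roughly C ≈ 1/λ), so a larger C means weaker regularization. As a quick sketch, one could compare a few illustrative values of C on the training data (the values and max_iter below are assumptions, not part of the original lab):

# Sketch: training accuracy for several values of C (inverse regularization strength)
for C in (0.01, 0.1, 1.0, 10.0):
    model_c = linear_model.LogisticRegression(penalty='l2', C=C, max_iter=1000)
    model_c.fit(X2, y2.ravel())
    print(C, model_c.score(X2, y2))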


import numpy as np
import sklearn.datasets as datasets
from sklearn.linear_model import LogisticRegression
import matplotlib.pyplot as plt

3.1 Prepare the data

X, y = datasets.make_blobs(n_samples=200, n_features=2, centers=2, random_state=0)
X.shape, y.shape
((200, 2), (200,))
X
array([[ 2.8219307 ,  1.25395648],
       [ 1.65581849,  1.26771955],
       [ 3.12377692,  0.44427786],
       ...,
       [ 1.9263585 ,  4.15243012],
       [-0.18887976,  5.20461381]])
plt.scatter(X[:, 0], X[:, 1], c=y)
<matplotlib.collections.PathCollection at 0x142327368e0>

Exercise 2: Complete Section 3.2 — classify the data with the logistic regression model

3.2 Use the plain logistic regression model for classification (using the gradient descent routine from Section 1.4)

# Prepend a bias column of ones as the first feature
X=np.insert(X,0,1,axis=1)
X
array([[ 1.        ,  2.8219307 ,  1.25395648],
       [ 1.        ,  1.65581849,  1.26771955],
       [ 1.        ,  3.12377692,  0.44427786],
       ...,
       [ 1.        ,  1.9263585 ,  4.15243012],
       [ 1.        , -0.18887976,  5.20461381]])
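The next cell calls the unregularized gradient-descent routine grandient from Section 1.4 of this lab. For completeness, a minimal sketch consistent with how it is called here (signature grandient(X, y, iter_num, alpha), returning the weights and the cost history, and re-using the helper h from above plus an assumed unregularized cost helper) might look like:

def cost(X,w,y):
    # unregularized cross-entropy cost
    m=X.shape[0]
    y=y.reshape((m,1))
    p=h(X,w)
    return -np.mean(y*np.log(p)+(1-y)*np.log(1-p))

def grandient(X,y,iter_num,alpha):
    # plain batch gradient descent for logistic regression
    m,n=X.shape
    y=y.reshape((m,1))
    w=np.zeros((n,1))
    cost_lst=[]
    for _ in range(iter_num):
        grad=X.T@(h(X,w)-y)/m      # gradient of the cross-entropy cost
        w=w-alpha*grad
        cost_lst.append(cost(X,w,y))
    return w,cost_lst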
# Run the gradient descent algorithm from Section 1.4
iter_num,alpha=600000,0.001
w,cost_lst=grandient(X,y,iter_num,alpha)
# Plot the cost curve
plt.plot(range(iter_num),cost_lst,"b-o")
[<matplotlib.lines.Line2D at 0x1423849dc70>]

# x1 values of the class-0 samples (the same boolean indexing is used for plotting below)
X[y==0,1]
array([ 2.50904929,  0.30380963,  1.12031365,  0.08848433, -0.65392827,
        1.35269561, -0.27652528,  0.17286041, -0.33963733,  2.07592967,
        ...,
        1.84070628,  2.102906  ,  0.87305123,  0.5626511 , -0.05797276,
        0.87781755,  2.47034915,  3.2460247 ,  1.9263585 , -0.18887976])
# Plot the linear decision boundary: w0 + w1*x1 + w2*x2 = 0, i.e. x2 = -(w0 + w1*x1)/w2
x_exmal=np.linspace(np.min(X[:,1]),np.max(X[:,1]),50)
x2=(-w[0,0]-w[1,0]*x_exmal)/(w[2,0])
plt.plot(x_exmal,x2,"r-o")
plt.scatter(X[y==1,1],X[y==1,2],color="b",marker="o")
plt.scatter(X[y==0,1],X[y==0,2],color="c",marker="^")
plt.show()

# Compute the training accuracy (0.5 threshold)
y_pred=[1 if item>=0.5 else 0  for item in sigmoid(X@w).ravel()]
y_pred=np.array(y_pred)
np.sum(y_pred==y)/y.shape[0]
0.97

Exercise 3: Complete Section 3.3 — classify the data with the regularized logistic regression model

3.3 Use the regularized logistic regression model for classification (using the gradient descent routine from Section 2.7)

y.shape,X.shape,w.shape
((200,), (200, 3), (3, 1))
# Run the regularized gradient descent algorithm from Section 2.7
iter_num,alpha,lambd=600000,0.001,1
w,cost_lst=grandient_reg(X,w,y,iter_num,alpha,lambd)
# Plot the cost curve
plt.plot(range(iter_num),cost_lst,"b-o")
[<matplotlib.lines.Line2D at 0x1423279f070>]

# Plot the linear decision boundary of the regularized model
x_exmal=np.linspace(np.min(X[:,1]),np.max(X[:,1]),50)
x2=(-w[0,0]-w[1,0]*x_exmal)/(w[2,0])
plt.plot(x_exmal,x2,"r-o")
plt.scatter(X[y==1,1],X[y==1,2],color="b",marker="o")
plt.scatter(X[y==0,1],X[y==0,2],color="c",marker="^")
plt.show()

y.shape,X.shape,w.shape
((200,), (200, 3), (3, 1))
# Compute the training accuracy of the regularized model
y_pred=[1 if item>=0.5 else 0  for item in sigmoid(X@w).ravel()]
y_pred=np.array(y_pred)
np.sum(y_pred==y)/y.shape[0]
0.97

Exercise 4: Complete Section 3.4 — classify the data with scikit-learn

3.4 Use scikit-learn

from sklearn.linear_model import LogisticRegression
clf = LogisticRegression().fit(X, y)
clf.score(X,y)
0.97
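As a quick usage check, the fitted model can also score new samples. Note that clf was trained on X with the bias column already inserted, so any new point must include the leading 1 as well; the two points below are hypothetical, chosen only for illustration:

new_points=np.array([[1.0, 2.5, 0.5],
                     [1.0, 0.5, 5.0]])   # [bias, x1, x2]; hypothetical test points
print(clf.predict(new_points))           # predicted class labels
print(clf.predict_proba(new_points))     # class-membership probabilities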

