Iris Dataset Classification Problem (3)

Summary: Iris dataset classification problem

Iris Dataset Classification Problem (2): https://developer.aliyun.com/article/1540969

6. Looping to find the optimal parameters
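The loop below relies on the batched training set train_db and the trainable parameters w1 and b1 defined in part (2). For readers joining here, a minimal sketch of what those definitions typically look like (the shapes follow from the 4 iris features and 3 classes; the batch size of 32 is an assumption carried over from the earlier parts):

import tensorflow as tf

# batch the training split from part (2): 120 samples / batch size 32 -> 4 steps per epoch
train_db = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(32)

# trainable parameters for a single-layer network: 4 input features -> 3 output classes
# (tf.Variable defaults to float32, which matters for the dtype note after the output below)
w1 = tf.Variable(tf.random.truncated_normal([4, 3], stddev=0.1, seed=1))
b1 = tf.Variable(tf.random.truncated_normal([3], stddev=0.1, seed=1))

With those in place, the training loop: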

lr = 0.1  # learning rate
train_loss_results = []  # record each epoch's loss here, to plot the loss curve later
test_acc = []  # record each epoch's accuracy here, to plot the accuracy curve later
epochs = 500  # train for 500 epochs
loss_all = 0  # each epoch has 4 steps; loss_all accumulates the 4 per-step losses
for epoch in range(epochs):  # dataset-level loop: one full pass over the dataset per epoch
    for step, (x_train, y_train) in enumerate(train_db):  # batch-level loop: one batch per step
        with tf.GradientTape() as tape:  # the with block records operations for automatic differentiation
            # y = tf.matmul(x_train, w1) + b1  # raises a dtype error; see the note after the output below
            y = tf.matmul(tf.cast(x_train, dtype=tf.float32), tf.cast(w1, dtype=tf.float32)) + tf.cast(b1, dtype=tf.float32)  # forward pass: multiply-accumulate
            y = tf.nn.softmax(y)  # turn the outputs into a probability distribution, on the same scale as the one-hot labels so their difference can serve as the loss
            y_ = tf.one_hot(y_train, depth=3)  # convert the labels to one-hot encoding for the loss and accuracy computation
            loss = tf.reduce_mean(tf.square(y_ - y))  # mean squared error loss: mse = mean((y_ - y)^2)
            loss_all += loss.numpy()  # accumulate the per-step losses so the epoch average can be computed; the averaged loss is more accurate
        # compute the gradients of the loss with respect to each parameter
        grads = tape.gradient(loss, [w1, b1])
        # gradient descent update: w1 = w1 - lr * w1_grad, b1 = b1 - lr * b1_grad
        w1.assign_sub(lr * grads[0])  # update w1 in place
        b1.assign_sub(lr * grads[1])  # update b1 in place
    # print the loss once per epoch
    print("Epoch {}, loss: {}".format(epoch, loss_all / 4))
    train_loss_results.append(loss_all / 4)  # record the average of the 4 per-step losses
    loss_all = 0  # reset loss_all, ready to record the next epoch
Epoch 0, loss: 0.018348709330894053
Epoch 1, loss: 0.01834014558698982
Epoch 2, loss: 0.018331602681428194
Epoch 3, loss: 0.018323074793443084
Epoch 4, loss: 0.018314557150006294
Epoch 5, loss: 0.018306054058484733
Epoch 6, loss: 0.01829756808001548
Epoch 7, loss: 0.018289093393832445
Epoch 8, loss: 0.018280635005794466
Epoch 9, loss: 0.01827219035476446
Epoch 10, loss: 0.01826376060489565
... (output for Epochs 11 through 494 omitted; the loss keeps decreasing steadily) ...
Epoch 495, loss: 0.015339347766712308
Epoch 496, loss: 0.01533499569632113
Epoch 497, loss: 0.015330647234804928
Epoch 498, loss: 0.015326304011978209
Epoch 499, loss: 0.015321968006901443
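
Note that the test_acc list declared at the top of the cell is never filled by this loop; the per-epoch accuracy evaluation presumably follows in part (4). As a rough sketch of what such an evaluation could look like (assuming a batched test set test_db built analogously to train_db):

total_correct, total_number = 0, 0
for x_test, y_test in test_db:
    y = tf.matmul(tf.cast(x_test, dtype=tf.float32), w1) + b1  # forward pass with the trained parameters
    y = tf.nn.softmax(y)
    pred = tf.argmax(y, axis=1)  # predicted class = index of the largest probability
    pred = tf.cast(pred, dtype=y_test.dtype)  # match dtypes before comparing
    correct = tf.cast(tf.equal(pred, y_test), dtype=tf.int32)
    total_correct += int(tf.reduce_sum(correct))  # count correct predictions in this batch
    total_number += x_test.shape[0]  # count samples in this batch
acc = total_correct / total_number
test_acc.append(acc)
print("Test_acc:", acc)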

A note on the commented-out line in the loop above: if the forward pass is written simply as y = tf.matmul(x_train, w1) + b1, it raises InvalidArgumentError: cannot compute MatMul as input #1 (zero-based) was expected to be a double tensor but is a float tensor [Op:MatMul]. The cause is a dtype mismatch between the two MatMul operands: the features coming out of the dataset are float64 (a "double" tensor, NumPy's default precision), while w1 was created as float32 (a "float" tensor), so TensorFlow expects input #1 (w1) to be double as well. Casting everything to tf.float32, as in the code above, resolves the mismatch. I also tried converting to double and it still failed, most likely because not all of x_train, w1 and b1 were converted consistently. And tf.double does exist: it is simply an alias for tf.float64, so casting all three tensors to it should work just as well as float32.
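
Rather than casting inside every training step, the mismatch can also be fixed once, up front, by converting the features to float32 before the dataset is built. A minimal sketch (assuming x_train here is the raw float64 feature array from part (2)):

x_train = tf.cast(x_train, tf.float32)  # one-time cast: float64 features -> float32
train_db = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(32)
# with this, y = tf.matmul(x_train, w1) + b1 works inside the loop without per-step casts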

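With train_loss_results filled in, the loss curve that the comments refer to can be drawn with matplotlib. A minimal sketch of what that plotting step might look like:

import matplotlib.pyplot as plt

plt.title("Loss Function Curve")  # plot title
plt.xlabel("Epoch")  # x axis: epoch number
plt.ylabel("Loss")  # y axis: average loss of that epoch
plt.plot(train_loss_results, label="$Loss$")  # one averaged loss value per epoch
plt.legend()
plt.show()
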
Iris Dataset Classification Problem (4): https://developer.aliyun.com/article/1540971
