TF DCGAN: Testing DCGAN on the MNIST Dataset with TensorFlow, with a Full Record of the Generation Process

Summary: A full record of testing DCGAN on the MNIST dataset with TensorFlow and of the generation process.

Test Results

Generated sample grids saved during training (file names follow the pattern train_{epoch}_{iteration}; the screenshot of the generated digits is omitted here):

train_00_0099, train_00_0799, train_00_0899, train_01_0506, train_01_0606, train_02_0213, train_02_0313, train_02_1013, train_03_0020, train_03_0720


Full Record of the Test Process


1140~1410


……Starting the test

{'batch_size': <absl.flags._flag.Flag object at 0x000002A2FFDB1B38>,

'beta1': <absl.flags._flag.Flag object at 0x000002A2FE967DA0>,

'checkpoint_dir': <absl.flags._flag.Flag object at 0x000002A281135A20>,

'crop': <absl.flags._flag.BooleanFlag object at 0x000002A281135B70>,

'dataset': <absl.flags._flag.Flag object at 0x000002A281135908>,

'epoch': <absl.flags._flag.Flag object at 0x000002A2F7728048>,

'h': <tensorflow.python.platform.app._HelpFlag object at 0x000002A281135C50>,

'help': <tensorflow.python.platform.app._HelpFlag object at 0x000002A281135C50>,

'helpfull': <tensorflow.python.platform.app._HelpfullFlag object at 0x000002A281135CC0>,

'helpshort': <tensorflow.python.platform.app._HelpshortFlag object at 0x000002A281135D30>,

'input_fname_pattern': <absl.flags._flag.Flag object at 0x000002A281135978>,

'input_height': <absl.flags._flag.Flag object at 0x000002A2810ABCC0>,

'input_width': <absl.flags._flag.Flag object at 0x000002A281135780>,

'learning_rate': <absl.flags._flag.Flag object at 0x000002A2F92D7AC8>,

'output_height': <absl.flags._flag.Flag object at 0x000002A2811357F0>,

'output_width': <absl.flags._flag.Flag object at 0x000002A281135898>,

'sample_dir': <absl.flags._flag.Flag object at 0x000002A281135A90>,

'train': <absl.flags._flag.BooleanFlag object at 0x000002A281135AC8>,

'train_size': <absl.flags._flag.Flag object at 0x000002A2FE974400>,

'visualize': <absl.flags._flag.BooleanFlag object at 0x000002A281135BE0>}
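The dictionary above is simply the repr of the parsed absl Flag objects printed at startup. A minimal sketch of how flags with these names are typically declared in the entry script is shown below; only the flag names come from the dump, while the default values and help strings are assumptions for illustration.

```python
# Minimal sketch of the flag declarations implied by the dump above.
# Only the flag names come from the printed dictionary; the defaults and
# help strings are assumptions for illustration.
import tensorflow as tf

flags = tf.app.flags
flags.DEFINE_integer("epoch", 25, "Number of training epochs")
flags.DEFINE_float("learning_rate", 0.0002, "Learning rate for Adam")
flags.DEFINE_float("beta1", 0.5, "Momentum term beta1 of Adam")
flags.DEFINE_float("train_size", float("inf"), "Upper bound on the number of training images")
flags.DEFINE_integer("batch_size", 64, "Images per training batch")
flags.DEFINE_integer("input_height", 28, "Height of the input images")
flags.DEFINE_integer("input_width", None, "Width of the input images (defaults to input_height)")
flags.DEFINE_integer("output_height", 28, "Height of the generated images")
flags.DEFINE_integer("output_width", None, "Width of the generated images (defaults to output_height)")
flags.DEFINE_string("dataset", "mnist", "Name of the dataset")
flags.DEFINE_string("input_fname_pattern", "*.jpg", "Glob pattern of the input image files")
flags.DEFINE_string("checkpoint_dir", "checkpoint", "Directory to save checkpoints in")
flags.DEFINE_string("sample_dir", "samples", "Directory to save generated samples in")
flags.DEFINE_boolean("train", True, "True for training, False for testing")
flags.DEFINE_boolean("crop", False, "Center-crop the input images if True")
flags.DEFINE_boolean("visualize", False, "Visualize the generator output if True")
FLAGS = flags.FLAGS

def main(_):
    # Printing flags.FLAGS.__flags with pprint yields the dictionary of
    # absl Flag objects seen at the top of the log.
    print(FLAGS.dataset, FLAGS.batch_size)

if __name__ == "__main__":
    tf.app.run()
```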

2018-10-06 11:32:10.690386: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2

data_MNIST\mnist
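data_MNIST\mnist is the directory from which the raw MNIST idx files are read. A minimal sketch of loading the training images with NumPy, assuming the standard MNIST file names:

```python
# Minimal sketch of reading the raw MNIST idx files from a directory such as
# data_MNIST/mnist. The file name below is the standard MNIST archive name;
# the directory layout is an assumption.
import os
import numpy as np

def load_mnist_images(data_dir, fname="train-images-idx3-ubyte"):
    with open(os.path.join(data_dir, fname), "rb") as f:
        raw = np.frombuffer(f.read(), dtype=np.uint8)
    # Skip the 16-byte header (magic number, image count, rows, cols).
    return raw[16:].reshape(-1, 28, 28, 1).astype(np.float32) / 255.0

# images = load_mnist_images("data_MNIST/mnist")  # shape (60000, 28, 28, 1)
```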

---------

Variables: name (type shape) [size]

---------

generator/g_h0_lin/Matrix:0 (float32_ref 110x1024) [112640, bytes: 450560]

generator/g_h0_lin/bias:0 (float32_ref 1024) [1024, bytes: 4096]

generator/g_bn0/beta:0 (float32_ref 1024) [1024, bytes: 4096]

generator/g_bn0/gamma:0 (float32_ref 1024) [1024, bytes: 4096]

generator/g_h1_lin/Matrix:0 (float32_ref 1034x6272) [6485248, bytes: 25940992]

generator/g_h1_lin/bias:0 (float32_ref 6272) [6272, bytes: 25088]

generator/g_bn1/beta:0 (float32_ref 6272) [6272, bytes: 25088]

generator/g_bn1/gamma:0 (float32_ref 6272) [6272, bytes: 25088]

generator/g_h2/w:0 (float32_ref 5x5x128x138) [441600, bytes: 1766400]

generator/g_h2/biases:0 (float32_ref 128) [128, bytes: 512]

generator/g_bn2/beta:0 (float32_ref 128) [128, bytes: 512]

generator/g_bn2/gamma:0 (float32_ref 128) [128, bytes: 512]

generator/g_h3/w:0 (float32_ref 5x5x1x138) [3450, bytes: 13800]

generator/g_h3/biases:0 (float32_ref 1) [1, bytes: 4]

discriminator/d_h0_conv/w:0 (float32_ref 5x5x11x11) [3025, bytes: 12100]

discriminator/d_h0_conv/biases:0 (float32_ref 11) [11, bytes: 44]

discriminator/d_h1_conv/w:0 (float32_ref 5x5x21x74) [38850, bytes: 155400]

discriminator/d_h1_conv/biases:0 (float32_ref 74) [74, bytes: 296]

discriminator/d_bn1/beta:0 (float32_ref 74) [74, bytes: 296]

discriminator/d_bn1/gamma:0 (float32_ref 74) [74, bytes: 296]

discriminator/d_h2_lin/Matrix:0 (float32_ref 3636x1024) [3723264, bytes: 14893056]

discriminator/d_h2_lin/bias:0 (float32_ref 1024) [1024, bytes: 4096]

discriminator/d_bn2/beta:0 (float32_ref 1024) [1024, bytes: 4096]

discriminator/d_bn2/gamma:0 (float32_ref 1024) [1024, bytes: 4096]

discriminator/d_h3_lin/Matrix:0 (float32_ref 1034x1) [1034, bytes: 4136]

discriminator/d_h3_lin/bias:0 (float32_ref 1) [1, bytes: 4]

Total size of variables: 10834690

Total bytes of variables: 43338760
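The variable table above (one line per trainable variable plus the total element and byte counts) is the kind of summary produced by TF-Slim's model analyzer. A minimal sketch, assuming TF 1.x with tf.contrib available:

```python
# Minimal sketch of printing a variable summary in the format shown above,
# using TF-Slim's model analyzer (TF 1.x, tf.contrib available).
import tensorflow as tf
import tensorflow.contrib.slim as slim

def show_all_variables():
    model_vars = tf.trainable_variables()
    # One line per variable: name (dtype shape) [element count, bytes],
    # followed by the "Total size/bytes of variables" footer.
    slim.model_analyzer.analyze_vars(model_vars, print_info=True)
```

Calling this once after the generator and discriminator graphs are built gives a quick sanity check of parameter counts; here the model has about 10.8M parameters (roughly 43 MB).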

[*] Reading checkpoints...

[*] Failed to find a checkpoint

[!] Load failed...
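Because checkpoint_dir contains no previous checkpoint, loading fails and training starts from randomly initialized weights. A minimal sketch of a checkpoint-loading step that prints messages like the ones above (the function name and message wording are assumptions):

```python
# Minimal sketch of a checkpoint-loading step that prints messages like the
# ones above. Function name and message wording are assumptions.
import os
import tensorflow as tf

def load_checkpoint(sess, saver, checkpoint_dir):
    print(" [*] Reading checkpoints...")
    ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
    if ckpt and ckpt.model_checkpoint_path:
        ckpt_name = os.path.basename(ckpt.model_checkpoint_path)
        saver.restore(sess, os.path.join(checkpoint_dir, ckpt_name))
        print(" [*] Loaded checkpoint {}".format(ckpt_name))
        return True
    print(" [*] Failed to find a checkpoint")
    return False

# if not load_checkpoint(sess, saver, FLAGS.checkpoint_dir):
#     print(" [!] Load failed...")  # train from scratch, as in this run
```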

Epoch: [ 0] [   0/1093] time: 3.3617, d_loss: 1.79891801, g_loss: 0.73078763

Epoch: [ 0] [   1/1093] time: 6.4123, d_loss: 1.46442509, g_loss: 0.61579478

Epoch: [ 0] [   2/1093] time: 8.7562, d_loss: 1.49022853, g_loss: 0.67894053

Epoch: [ 0] [   3/1093] time: 10.9214, d_loss: 1.40174472, g_loss: 0.66220653

Epoch: [ 0] [   4/1093] time: 13.3050, d_loss: 1.40663481, g_loss: 0.69936526

Epoch: [ 0] [   5/1093] time: 15.5709, d_loss: 1.38957083, g_loss: 0.68421012

Epoch: [ 0] [   6/1093] time: 17.8600, d_loss: 1.39213061, g_loss: 0.68934584

Epoch: [ 0] [   7/1093] time: 20.4708, d_loss: 1.39794362, g_loss: 0.69806755

Epoch: [ 0] [   8/1093] time: 23.0654, d_loss: 1.43503237, g_loss: 0.70846951

Epoch: [ 0] [   9/1093] time: 25.5358, d_loss: 1.39276147, g_loss: 0.70669782

Epoch: [ 0] [  10/1093] time: 28.2617, d_loss: 1.42136300, g_loss: 0.70364445

Epoch: [ 0] [  11/1093] time: 30.8038, d_loss: 1.40051103, g_loss: 0.70014894

Epoch: [ 0] [  12/1093] time: 33.3130, d_loss: 1.37765169, g_loss: 0.70824486

Epoch: [ 0] [  13/1093] time: 35.6096, d_loss: 1.38219857, g_loss: 0.69451976

Epoch: [ 0] [  14/1093] time: 37.8537, d_loss: 1.36866033, g_loss: 0.70824432

Epoch: [ 0] [  15/1093] time: 40.1426, d_loss: 1.36621869, g_loss: 0.69405836

Epoch: [ 0] [  16/1093] time: 42.7074, d_loss: 1.37535453, g_loss: 0.69518888

Epoch: [ 0] [  17/1093] time: 44.8565, d_loss: 1.36989605, g_loss: 0.69930756

Epoch: [ 0] [  18/1093] time: 46.7869, d_loss: 1.36563087, g_loss: 0.69781649

Epoch: [ 0] [  19/1093] time: 48.7288, d_loss: 1.36397326, g_loss: 0.70866680

Epoch: [ 0] [  20/1093] time: 51.0654, d_loss: 1.38101411, g_loss: 0.69544500

Epoch: [ 0] [  21/1093] time: 53.5399, d_loss: 1.46281934, g_loss: 0.70643008

Epoch: [ 0] [  22/1093] time: 56.5684, d_loss: 1.43966162, g_loss: 0.71961737

Epoch: [ 0] [  23/1093] time: 59.5954, d_loss: 1.42399430, g_loss: 0.72861439

Epoch: [ 0] [  24/1093] time: 62.9032, d_loss: 1.41276562, g_loss: 0.70471978

Epoch: [ 0] [  25/1093] time: 65.7187, d_loss: 1.48300290, g_loss: 0.71538234

Epoch: [ 0] [  26/1093] time: 68.6204, d_loss: 1.39843416, g_loss: 0.68771482

Epoch: [ 0] [  27/1093] time: 70.8153, d_loss: 1.42166626, g_loss: 0.69409549

Epoch: [ 0] [  28/1093] time: 73.5776, d_loss: 1.39594829, g_loss: 0.68035471

Epoch: [ 0] [  29/1093] time: 76.6749, d_loss: 1.39489424, g_loss: 0.69306409

Epoch: [ 0] [  30/1093] time: 79.8282, d_loss: 1.41070235, g_loss: 0.68208236

Epoch: [ 0] [  31/1093] time: 83.5562, d_loss: 1.39976072, g_loss: 0.69344074

Epoch: [ 0] [  32/1093] time: 86.5431, d_loss: 1.39875138, g_loss: 0.69864786

Epoch: [ 0] [  33/1093] time: 89.7386, d_loss: 1.39117682, g_loss: 0.68384939

Epoch: [ 0] [  34/1093] time: 92.1129, d_loss: 1.39306462, g_loss: 0.68603516

Epoch: [ 0] [  35/1093] time: 94.6717, d_loss: 1.39766645, g_loss: 0.67713618

Epoch: [ 0] [  36/1093] time: 97.4150, d_loss: 1.39619994, g_loss: 0.68300879

Epoch: [ 0] [  37/1093] time: 99.9408, d_loss: 1.39534819, g_loss: 0.69076747

Epoch: [ 0] [  38/1093] time: 103.1213, d_loss: 1.39753985, g_loss: 0.68903100

Epoch: [ 0] [  39/1093] time: 105.8520, d_loss: 1.41161013, g_loss: 0.69302136

Epoch: [ 0] [  40/1093] time: 108.9503, d_loss: 1.38997078, g_loss: 0.68370312

Epoch: [ 0] [  41/1093] time: 112.2070, d_loss: 1.39786303, g_loss: 0.69124269

Epoch: [ 0] [  42/1093] time: 115.2431, d_loss: 1.38943410, g_loss: 0.69021893

Epoch: [ 0] [  43/1093] time: 118.6511, d_loss: 1.38621378, g_loss: 0.68407494

Epoch: [ 0] [  44/1093] time: 122.0462, d_loss: 1.39240563, g_loss: 0.69688046

Epoch: [ 0] [  45/1093] time: 125.3139, d_loss: 1.39452100, g_loss: 0.69252259

Epoch: [ 0] [  46/1093] time: 129.0117, d_loss: 1.39167857, g_loss: 0.68246353

Epoch: [ 0] [  47/1093] time: 132.8489, d_loss: 1.39049268, g_loss: 0.69009811

Epoch: [ 0] [  48/1093] time: 136.4826, d_loss: 1.39105415, g_loss: 0.69570535

Epoch: [ 0] [  49/1093] time: 139.8832, d_loss: 1.38744533, g_loss: 0.68307704

Epoch: [ 0] [  50/1093] time: 142.6343, d_loss: 1.39128542, g_loss: 0.68657452

Epoch: [ 0] [  51/1093] time: 145.0365, d_loss: 1.39720774, g_loss: 0.68289292

Epoch: [ 0] [  52/1093] time: 148.8226, d_loss: 1.40998244, g_loss: 0.69946194

Epoch: [ 0] [  53/1093] time: 151.4981, d_loss: 1.42358077, g_loss: 0.69425476

Epoch: [ 0] [  54/1093] time: 154.4366, d_loss: 1.40655017, g_loss: 0.69315112

Epoch: [ 0] [  55/1093] time: 157.9840, d_loss: 1.39314961, g_loss: 0.67903620

Epoch: [ 0] [  56/1093] time: 160.5293, d_loss: 1.39538550, g_loss: 0.68701828

Epoch: [ 0] [  57/1093] time: 162.8455, d_loss: 1.40030372, g_loss: 0.68119174

Epoch: [ 0] [  58/1093] time: 165.5109, d_loss: 1.39839721, g_loss: 0.68374062

Epoch: [ 0] [  59/1093] time: 168.1250, d_loss: 1.40220833, g_loss: 0.67849696

Epoch: [ 0] [  60/1093] time: 170.4443, d_loss: 1.40346980, g_loss: 0.68534362

Epoch: [ 0] [  61/1093] time: 172.5757, d_loss: 1.40919614, g_loss: 0.68264174

Epoch: [ 0] [  62/1093] time: 175.3375, d_loss: 1.41680074, g_loss: 0.69107366

Epoch: [ 0] [  63/1093] time: 178.1931, d_loss: 1.42677331, g_loss: 0.68684256

Epoch: [ 0] [  64/1093] time: 180.9363, d_loss: 1.41873085, g_loss: 0.68174267

Epoch: [ 0] [  65/1093] time: 183.4142, d_loss: 1.41352820, g_loss: 0.69168335

Epoch: [ 0] [  66/1093] time: 186.2004, d_loss: 1.40492952, g_loss: 0.68485790

Epoch: [ 0] [  67/1093] time: 188.9013, d_loss: 1.41416049, g_loss: 0.69247150

Epoch: [ 0] [  68/1093] time: 191.3907, d_loss: 1.44085050, g_loss: 0.70080090

Epoch: [ 0] [  69/1093] time: 193.6596, d_loss: 1.42936659, g_loss: 0.70780182

Epoch: [ 0] [  70/1093] time: 196.2392, d_loss: 1.39855242, g_loss: 0.68066621

Epoch: [ 0] [  71/1093] time: 198.6732, d_loss: 1.39962685, g_loss: 0.68119228

Epoch: [ 0] [  72/1093] time: 201.1359, d_loss: 1.39792156, g_loss: 0.68046838

Epoch: [ 0] [  73/1093] time: 203.9913, d_loss: 1.40156364, g_loss: 0.68185544

Epoch: [ 0] [  74/1093] time: 206.5057, d_loss: 1.40137339, g_loss: 0.68439347

Epoch: [ 0] [  75/1093] time: 208.9730, d_loss: 1.39628625, g_loss: 0.68880224

Epoch: [ 0] [  76/1093] time: 212.1802, d_loss: 1.39695120, g_loss: 0.69053137

Epoch: [ 0] [  77/1093] time: 215.1069, d_loss: 1.39827728, g_loss: 0.67404974

Epoch: [ 0] [  78/1093] time: 217.8231, d_loss: 1.39441288, g_loss: 0.68811285

Epoch: [ 0] [  79/1093] time: 220.8017, d_loss: 1.39862061, g_loss: 0.68243313

Epoch: [ 0] [  80/1093] time: 223.6711, d_loss: 1.39560962, g_loss: 0.68420863

Epoch: [ 0] [  81/1093] time: 226.1243, d_loss: 1.39474165, g_loss: 0.68446684

Epoch: [ 0] [  82/1093] time: 228.9125, d_loss: 1.39735079, g_loss: 0.68914992

Epoch: [ 0] [  83/1093] time: 231.7087, d_loss: 1.40495729, g_loss: 0.67565703

Epoch: [ 0] [  84/1093] time: 234.3499, d_loss: 1.40376186, g_loss: 0.68402076

Epoch: [ 0] [  85/1093] time: 236.8927, d_loss: 1.39633703, g_loss: 0.67996454

Epoch: [ 0] [  86/1093] time: 239.8556, d_loss: 1.40431571, g_loss: 0.68185967

Epoch: [ 0] [  87/1093] time: 242.7527, d_loss: 1.40456629, g_loss: 0.68880403

Epoch: [ 0] [  88/1093] time: 245.2765, d_loss: 1.39363539, g_loss: 0.68647277

Epoch: [ 0] [  89/1093] time: 247.9097, d_loss: 1.39768720, g_loss: 0.68281728

Epoch: [ 0] [  90/1093] time: 250.6797, d_loss: 1.40258384, g_loss: 0.69015211

Epoch: [ 0] [  91/1093] time: 252.9605, d_loss: 1.41010988, g_loss: 0.69163489

Epoch: [ 0] [  92/1093] time: 255.8331, d_loss: 1.39705300, g_loss: 0.67692769

Epoch: [ 0] [  93/1093] time: 258.7976, d_loss: 1.41552734, g_loss: 0.69169050

Epoch: [ 0] [  94/1093] time: 262.1104, d_loss: 1.39865696, g_loss: 0.68793559

Epoch: [ 0] [  95/1093] time: 265.0370, d_loss: 1.40191650, g_loss: 0.68027002

Epoch: [ 0] [  96/1093] time: 267.7568, d_loss: 1.40628874, g_loss: 0.67845261

Epoch: [ 0] [  97/1093] time: 270.7154, d_loss: 1.40095508, g_loss: 0.68664324

Epoch: [ 0] [  98/1093] time: 273.6299, d_loss: 1.41269326, g_loss: 0.68330830

Epoch: [ 0] [  99/1093] time: 276.4041, d_loss: 1.41343331, g_loss: 0.69674391

[Sample] d_loss: 1.39404178, g_loss: 0.71861243

Epoch: [ 0] [ 100/1093] time: 279.9370, d_loss: 1.39926529, g_loss: 0.69326425

Epoch: [ 0] [ 101/1093] time: 282.8589, d_loss: 1.39894390, g_loss: 0.68361241

Epoch: [ 0] [ 102/1093] time: 285.4811, d_loss: 1.39818084, g_loss: 0.69090337

Epoch: [ 0] [ 103/1093] time: 287.6454, d_loss: 1.39627695, g_loss: 0.67909706

Epoch: [ 0] [ 104/1093] time: 290.3276, d_loss: 1.39514160, g_loss: 0.68727589

Epoch: [ 0] [ 105/1093] time: 293.5694, d_loss: 1.40148556, g_loss: 0.68616998

Epoch: [ 0] [ 106/1093] time: 296.7065, d_loss: 1.39823532, g_loss: 0.68184149

Epoch: [ 0] [ 107/1093] time: 299.5040, d_loss: 1.40077090, g_loss: 0.67544007

Epoch: [ 0] [ 108/1093] time: 302.5080, d_loss: 1.40159750, g_loss: 0.68739390

Epoch: [ 0] [ 109/1093] time: 305.3266, d_loss: 1.40064311, g_loss: 0.68674183

Epoch: [ 0] [ 110/1093] time: 308.2463, d_loss: 1.40190828, g_loss: 0.68489563

……
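Each log line reports the elapsed wall-clock time together with the current discriminator loss (d_loss) and generator loss (g_loss). With the standard sigmoid cross-entropy losses, d_loss hovering around ln 4 ≈ 1.386 and g_loss around ln 2 ≈ 0.693, as they do throughout this run, means the discriminator is outputting roughly 0.5 for both real and generated samples, i.e. the two networks are in rough equilibrium. A minimal sketch of these losses and of the per-step log line is given below (TF 1.x; all names are placeholders, not taken from the original code):

```python
# Minimal sketch of the standard DCGAN losses and of the per-step log line
# format used above (TF 1.x). All function and argument names are
# placeholders, not taken from the original code.
import time
import tensorflow as tf

def gan_losses(d_real_logits, d_fake_logits):
    """Non-saturating GAN losses built from the discriminator logits."""
    def ce(logits, labels):
        return tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(logits=logits, labels=labels))
    # Discriminator: push real logits toward 1 and fake logits toward 0.
    d_loss = (ce(d_real_logits, tf.ones_like(d_real_logits)) +
              ce(d_fake_logits, tf.zeros_like(d_fake_logits)))
    # Generator: try to make the discriminator output 1 on fake samples.
    g_loss = ce(d_fake_logits, tf.ones_like(d_fake_logits))
    return d_loss, g_loss

def log_step(epoch, idx, batch_idxs, start_time, err_d, err_g):
    """Reproduces the per-step log line format seen in this record."""
    print("Epoch: [%2d] [%4d/%4d] time: %4.4f, d_loss: %.8f, g_loss: %.8f"
          % (epoch, idx, batch_idxs, time.time() - start_time, err_d, err_g))
```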

Epoch: [ 0] [ 190/1093] time: 535.9742, d_loss: 1.39696872, g_loss: 0.67972469

Epoch: [ 0] [ 191/1093] time: 538.4506, d_loss: 1.39499533, g_loss: 0.68089843

Epoch: [ 0] [ 192/1093] time: 541.1816, d_loss: 1.39483309, g_loss: 0.68199342

Epoch: [ 0] [ 193/1093] time: 544.6827, d_loss: 1.39154720, g_loss: 0.69034952

Epoch: [ 0] [ 194/1093] time: 548.6390, d_loss: 1.38941956, g_loss: 0.68652773

Epoch: [ 0] [ 195/1093] time: 551.9678, d_loss: 1.39027929, g_loss: 0.69264108

Epoch: [ 0] [ 196/1093] time: 555.3258, d_loss: 1.39162266, g_loss: 0.68833613

Epoch: [ 0] [ 197/1093] time: 558.5404, d_loss: 1.40050042, g_loss: 0.68856359

Epoch: [ 0] [ 198/1093] time: 561.3181, d_loss: 1.39854860, g_loss: 0.69332385

Epoch: [ 0] [ 199/1093] time: 563.8952, d_loss: 1.40790129, g_loss: 0.69219285

[Sample] d_loss: 1.39614487, g_loss: 0.70220172

Epoch: [ 0] [ 200/1093] time: 566.5791, d_loss: 1.39575028, g_loss: 0.68371403

Epoch: [ 0] [ 201/1093] time: 568.9093, d_loss: 1.39769495, g_loss: 0.68171024

Epoch: [ 0] [ 202/1093] time: 571.4728, d_loss: 1.40282321, g_loss: 0.67665672

Epoch: [ 0] [ 203/1093] time: 574.0684, d_loss: 1.40040171, g_loss: 0.68347836

Epoch: [ 0] [ 204/1093] time: 576.6086, d_loss: 1.40370631, g_loss: 0.67588425

Epoch: [ 0] [ 205/1093] time: 579.1860, d_loss: 1.40058494, g_loss: 0.67948377

Epoch: [ 0] [ 206/1093] time: 581.7698, d_loss: 1.40094650, g_loss: 0.68511415

Epoch: [ 0] [ 207/1093] time: 584.3541, d_loss: 1.39703560, g_loss: 0.68563807

Epoch: [ 0] [ 208/1093] time: 586.9515, d_loss: 1.39535570, g_loss: 0.69189703

Epoch: [ 0] [ 209/1093] time: 589.5623, d_loss: 1.39087117, g_loss: 0.68965638

Epoch: [ 0] [ 210/1093] time: 592.1490, d_loss: 1.39308906, g_loss: 0.68321383

……
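The [Sample] lines appear roughly every 100 global iterations: the fixed batch of sample noise is pushed through the generator, the losses on that batch are printed, and the resulting image grid is written to sample_dir. These saved grids are the train_XX_XXXX images listed in the Test Results section above. A minimal sketch of such a sampling step is shown below; the merge() helper, the 8x8 grid size, and the assumption that the generator outputs values in [0, 1] are illustrative, not taken from the original code.

```python
# Minimal sketch of the periodic sampling step that produces the
# "[Sample] d_loss: ..., g_loss: ..." lines and the train_{epoch}_{idx}
# grid images. The merge() helper, the 8x8 grid size, and the [0, 1]
# output range are illustrative assumptions.
import numpy as np
import imageio

def merge(images, rows, cols):
    """Tile a batch of HxWx1 images into a single (rows*H)x(cols*W) grid."""
    h, w = images.shape[1], images.shape[2]
    grid = np.zeros((rows * h, cols * w), dtype=images.dtype)
    for n, img in enumerate(images[:rows * cols]):
        r, c = divmod(n, cols)
        grid[r * h:(r + 1) * h, c * w:(c + 1) * w] = img[:, :, 0]
    return grid

def save_sample(sess, sampler, d_loss, g_loss, feed_dict, sample_dir, epoch, idx):
    samples, err_d, err_g = sess.run([sampler, d_loss, g_loss], feed_dict=feed_dict)
    grid = merge(samples, 8, 8)  # e.g. batch of 64 samples -> 8x8 grid
    imageio.imwrite("%s/train_%02d_%04d.png" % (sample_dir, epoch, idx),
                    (grid * 255).astype(np.uint8))
    print("[Sample] d_loss: %.8f, g_loss: %.8f" % (err_d, err_g))
```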

Epoch: [ 0] [ 889/1093] time: 2314.8393, d_loss: 1.39859378, g_loss: 0.67322266

Epoch: [ 0] [ 890/1093] time: 2316.9278, d_loss: 1.39070845, g_loss: 0.68732977

Epoch: [ 0] [ 891/1093] time: 2319.3591, d_loss: 1.39387286, g_loss: 0.67873466

Epoch: [ 0] [ 892/1093] time: 2321.4178, d_loss: 1.39172828, g_loss: 0.68356216

Epoch: [ 0] [ 893/1093] time: 2323.4089, d_loss: 1.39842272, g_loss: 0.67815489

Epoch: [ 0] [ 894/1093] time: 2325.6301, d_loss: 1.39376366, g_loss: 0.68304271

Epoch: [ 0] [ 895/1093] time: 2328.0387, d_loss: 1.39139628, g_loss: 0.67735171

Epoch: [ 0] [ 896/1093] time: 2330.0398, d_loss: 1.39796066, g_loss: 0.67579186

Epoch: [ 0] [ 897/1093] time: 2332.2183, d_loss: 1.39888477, g_loss: 0.66883886

Epoch: [ 0] [ 898/1093] time: 2334.6396, d_loss: 1.39262605, g_loss: 0.67790604

Epoch: [ 0] [ 899/1093] time: 2336.6380, d_loss: 1.38774049, g_loss: 0.68282270

[Sample] d_loss: 1.38685536, g_loss: 0.70143592

Epoch: [ 0] [ 900/1093] time: 2339.1794, d_loss: 1.39559400, g_loss: 0.67823637

Epoch: [ 0] [ 901/1093] time: 2341.5979, d_loss: 1.39618373, g_loss: 0.67359304

Epoch: [ 0] [ 902/1093] time: 2343.6090, d_loss: 1.40060043, g_loss: 0.68315041

Epoch: [ 0] [ 903/1093] time: 2345.6101, d_loss: 1.38607645, g_loss: 0.68459594

Epoch: [ 0] [ 904/1093] time: 2347.6186, d_loss: 1.38612366, g_loss: 0.68465877

Epoch: [ 0] [ 905/1093] time: 2349.8598, d_loss: 1.38972747, g_loss: 0.68110597

Epoch: [ 0] [ 906/1093] time: 2352.2383, d_loss: 1.40021336, g_loss: 0.67477131

Epoch: [ 0] [ 907/1093] time: 2354.2594, d_loss: 1.38780701, g_loss: 0.68614316

Epoch: [ 0] [ 908/1093] time: 2356.4380, d_loss: 1.39729989, g_loss: 0.68168002

Epoch: [ 0] [ 909/1093] time: 2358.8492, d_loss: 1.39604807, g_loss: 0.68169260

Epoch: [ 0] [ 910/1093] time: 2360.8703, d_loss: 1.39347506, g_loss: 0.67698503

……

Epoch: [ 0] [ 990/1093] time: 2534.4882, d_loss: 1.38051999, g_loss: 0.68829250

Epoch: [ 0] [ 991/1093] time: 2536.8594, d_loss: 1.38707495, g_loss: 0.69181627

Epoch: [ 0] [ 992/1093] time: 2538.9105, d_loss: 1.39524150, g_loss: 0.68155080

Epoch: [ 0] [ 993/1093] time: 2540.9216, d_loss: 1.39088154, g_loss: 0.68005645

Epoch: [ 0] [ 994/1093] time: 2543.1603, d_loss: 1.38700223, g_loss: 0.68155348

Epoch: [ 0] [ 995/1093] time: 2545.5215, d_loss: 1.40298247, g_loss: 0.66744435

Epoch: [ 0] [ 996/1093] time: 2547.5300, d_loss: 1.40880179, g_loss: 0.66607797

Epoch: [ 0] [ 997/1093] time: 2549.5310, d_loss: 1.39295077, g_loss: 0.67571455

Epoch: [ 0] [ 998/1093] time: 2551.8797, d_loss: 1.39118791, g_loss: 0.68550998

Epoch: [ 0] [ 999/1093] time: 2554.1409, d_loss: 1.38995099, g_loss: 0.68077219

[Sample] d_loss: 1.39188242, g_loss: 0.69870007

Epoch: [ 0] [1000/1093] time: 2556.5095, d_loss: 1.38937902, g_loss: 0.68420708

Epoch: [ 0] [1001/1093] time: 2559.4411, d_loss: 1.38841224, g_loss: 0.67964196

Epoch: [ 0] [1002/1093] time: 2561.3995, d_loss: 1.39025033, g_loss: 0.68857718

Epoch: [ 0] [1003/1093] time: 2563.4106, d_loss: 1.38774192, g_loss: 0.68713319

Epoch: [ 0] [1004/1093] time: 2565.7818, d_loss: 1.38517952, g_loss: 0.69962525

Epoch: [ 0] [1005/1093] time: 2568.0208, d_loss: 1.39758313, g_loss: 0.68758988

Epoch: [ 0] [1006/1093] time: 2570.0219, d_loss: 1.39658952, g_loss: 0.69050717

Epoch: [ 0] [1007/1093] time: 2572.0104, d_loss: 1.39825773, g_loss: 0.67399806

Epoch: [ 0] [1008/1093] time: 2574.2516, d_loss: 1.39735007, g_loss: 0.68345094

Epoch: [ 0] [1009/1093] time: 2576.4203, d_loss: 1.39032114, g_loss: 0.67591566

Epoch: [ 0] [1010/1093] time: 2578.4213, d_loss: 1.39701056, g_loss: 0.67272741

……

Epoch: [ 0] [1080/1093] time: 2729.6181, d_loss: 1.38660502, g_loss: 0.67934191

Epoch: [ 0] [1081/1093] time: 2731.6592, d_loss: 1.39765692, g_loss: 0.67786539

Epoch: [ 0] [1082/1093] time: 2733.8804, d_loss: 1.38977814, g_loss: 0.67776024

Epoch: [ 0] [1083/1093] time: 2736.3117, d_loss: 1.39641953, g_loss: 0.67741239

Epoch: [ 0] [1084/1093] time: 2738.3027, d_loss: 1.39849305, g_loss: 0.66936278

Epoch: [ 0] [1085/1093] time: 2740.2938, d_loss: 1.39174080, g_loss: 0.67819309

Epoch: [ 0] [1086/1093] time: 2742.3049, d_loss: 1.39430928, g_loss: 0.67690992

Epoch: [ 0] [1087/1093] time: 2744.5361, d_loss: 1.38831007, g_loss: 0.67887449

Epoch: [ 0] [1088/1093] time: 2746.8773, d_loss: 1.38743389, g_loss: 0.67928505

Epoch: [ 0] [1089/1093] time: 2748.8884, d_loss: 1.40019858, g_loss: 0.66898370

Epoch: [ 0] [1090/1093] time: 2750.8895, d_loss: 1.38798690, g_loss: 0.67247820

Epoch: [ 0] [1091/1093] time: 2753.2107, d_loss: 1.39350247, g_loss: 0.67379618

Epoch: [ 0] [1092/1093] time: 2755.4819, d_loss: 1.39420724, g_loss: 0.67820472

Epoch: [ 1] [   0/1093] time: 2757.4730, d_loss: 1.39512217, g_loss: 0.67989075

Epoch: [ 1] [   1/1093] time: 2759.4640, d_loss: 1.40053773, g_loss: 0.67416751

Epoch: [ 1] [   2/1093] time: 2761.6852, d_loss: 1.39061642, g_loss: 0.68139255

Epoch: [ 1] [   3/1093] time: 2764.0665, d_loss: 1.39106679, g_loss: 0.68662095

Epoch: [ 1] [   4/1093] time: 2766.0576, d_loss: 1.39566541, g_loss: 0.68151307

Epoch: [ 1] [   5/1093] time: 2768.2087, d_loss: 1.39311624, g_loss: 0.67771947

Epoch: [ 1] [   6/1093] time: 2770.6400, d_loss: 1.38920772, g_loss: 0.68134636

[Sample] d_loss: 1.37139106, g_loss: 0.70543092

Epoch: [ 1] [   7/1093] time: 2773.0913, d_loss: 1.39394724, g_loss: 0.68192399

Epoch: [ 1] [   8/1093] time: 2775.0624, d_loss: 1.38887393, g_loss: 0.68720746

Epoch: [ 1] [   9/1093] time: 2777.4136, d_loss: 1.38760364, g_loss: 0.67848784

Epoch: [ 1] [  10/1093] time: 2779.6648, d_loss: 1.39168441, g_loss: 0.68020177

……

Epoch: [ 1] [ 100/1093] time: 2974.2737, d_loss: 1.39194596, g_loss: 0.67957735

Epoch: [ 1] [ 101/1093] time: 2976.2621, d_loss: 1.38731110, g_loss: 0.68144447

Epoch: [ 1] [ 102/1093] time: 2978.5533, d_loss: 1.39706790, g_loss: 0.67597866

Epoch: [ 1] [ 103/1093] time: 2980.8618, d_loss: 1.39810014, g_loss: 0.66924357

Epoch: [ 1] [ 104/1093] time: 2982.8729, d_loss: 1.38598788, g_loss: 0.67554963

Epoch: [ 1] [ 105/1093] time: 2984.8740, d_loss: 1.39240956, g_loss: 0.67254972

Epoch: [ 1] [ 106/1093] time: 2986.9812, d_loss: 1.39499450, g_loss: 0.67016041

[Sample] d_loss: 1.38732648, g_loss: 0.69703865

Epoch: [ 1] [ 107/1093] time: 2990.3271, d_loss: 1.40116143, g_loss: 0.66683507

Epoch: [ 1] [ 108/1093] time: 2992.7527, d_loss: 1.39175665, g_loss: 0.68012154

Epoch: [ 1] [ 109/1093] time: 2995.0739, d_loss: 1.39712453, g_loss: 0.67622381

Epoch: [ 1] [ 110/1093] time: 2997.6827, d_loss: 1.39206731, g_loss: 0.68065107

……

Epoch: [ 1] [ 200/1093] time: 3202.6353, d_loss: 1.38041210, g_loss: 0.69372916

Epoch: [ 1] [ 201/1093] time: 3204.9065, d_loss: 1.37933481, g_loss: 0.68821752

Epoch: [ 1] [ 202/1093] time: 3206.9749, d_loss: 1.38175058, g_loss: 0.68613887

Epoch: [ 1] [ 203/1093] time: 3209.2561, d_loss: 1.39573455, g_loss: 0.67698872

Epoch: [ 1] [ 204/1093] time: 3211.8648, d_loss: 1.39549482, g_loss: 0.67765439

Epoch: [ 1] [ 205/1093] time: 3213.9159, d_loss: 1.39421272, g_loss: 0.67087078

Epoch: [ 1] [ 206/1093] time: 3215.9443, d_loss: 1.38698030, g_loss: 0.68094480

[Sample] d_loss: 1.38046920, g_loss: 0.69818783

Epoch: [ 1] [ 207/1093] time: 3218.7558, d_loss: 1.38357759, g_loss: 0.68195403

Epoch: [ 1] [ 208/1093] time: 3220.9143, d_loss: 1.38065100, g_loss: 0.68955791

Epoch: [ 1] [ 209/1093] time: 3222.9153, d_loss: 1.39242363, g_loss: 0.67996120

Epoch: [ 1] [ 210/1093] time: 3225.2766, d_loss: 1.39360881, g_loss: 0.67260170

Epoch: [ 1] [ 211/1093] time: 3227.5352, d_loss: 1.38966787, g_loss: 0.68173468

……

Epoch: [ 1] [ 300/1093] time: 3423.0672, d_loss: 1.38942528, g_loss: 0.68868965

Epoch: [ 1] [ 301/1093] time: 3425.2884, d_loss: 1.39816928, g_loss: 0.67360890

Epoch: [ 1] [ 302/1093] time: 3427.7397, d_loss: 1.39097309, g_loss: 0.67551196

Epoch: [ 1] [ 303/1093] time: 3430.0009, d_loss: 1.38649738, g_loss: 0.68443769

Epoch: [ 1] [ 304/1093] time: 3432.1621, d_loss: 1.37904358, g_loss: 0.68674159

Epoch: [ 1] [ 305/1093] time: 3434.4833, d_loss: 1.38382614, g_loss: 0.68362451

Epoch: [ 1] [ 306/1093] time: 3436.7545, d_loss: 1.39781308, g_loss: 0.67674035

[Sample] d_loss: 1.38065791, g_loss: 0.70156980

Epoch: [ 1] [ 307/1093] time: 3439.1758, d_loss: 1.38384795, g_loss: 0.68276435

Epoch: [ 1] [ 308/1093] time: 3441.1669, d_loss: 1.38682365, g_loss: 0.67517352

Epoch: [ 1] [ 309/1093] time: 3443.1579, d_loss: 1.39301312, g_loss: 0.67435873

Epoch: [ 1] [ 310/1093] time: 3445.3991, d_loss: 1.38605368, g_loss: 0.67695403

Epoch: [ 1] [ 311/1093] time: 3447.7679, d_loss: 1.39315736, g_loss: 0.67680848

Epoch: [ 1] [ 312/1093] time: 3449.7789, d_loss: 1.39378428, g_loss: 0.67591465

Epoch: [ 1] [ 313/1093] time: 3451.9869, d_loss: 1.38802958, g_loss: 0.68172151

Epoch: [ 1] [ 314/1093] time: 3454.4282, d_loss: 1.39341950, g_loss: 0.67019951

Epoch: [ 1] [ 315/1093] time: 3456.4468, d_loss: 1.38873637, g_loss: 0.67581522

……


Epoch: [ 1] [ 500/1093] time: 3862.5952, d_loss: 1.37781966, g_loss: 0.68809289

Epoch: [ 1] [ 501/1093] time: 3864.8864, d_loss: 1.39578390, g_loss: 0.67059171

Epoch: [ 1] [ 502/1093] time: 3866.8775, d_loss: 1.37757528, g_loss: 0.69209492

Epoch: [ 1] [ 503/1093] time: 3868.8760, d_loss: 1.39398217, g_loss: 0.67311525

Epoch: [ 1] [ 504/1093] time: 3871.1472, d_loss: 1.39142919, g_loss: 0.67788839

Epoch: [ 1] [ 505/1093] time: 3873.6359, d_loss: 1.39205325, g_loss: 0.67508668

Epoch: [ 1] [ 506/1093] time: 3875.6370, d_loss: 1.39611387, g_loss: 0.67204535

[Sample] d_loss: 1.37556362, g_loss: 0.69815457

Epoch: [ 1] [ 507/1093] time: 3877.9957, d_loss: 1.39341450, g_loss: 0.67685163

Epoch: [ 1] [ 508/1093] time: 3880.4070, d_loss: 1.39084995, g_loss: 0.67754412

Epoch: [ 1] [ 509/1093] time: 3882.5755, d_loss: 1.40043855, g_loss: 0.66707742

Epoch: [ 1] [ 510/1093] time: 3884.5566, d_loss: 1.38664675, g_loss: 0.67468828

Epoch: [ 1] [ 511/1093] time: 3886.9479, d_loss: 1.39450240, g_loss: 0.66686535

Epoch: [ 1] [ 512/1093] time: 3889.1964, d_loss: 1.38870108, g_loss: 0.67924225

Epoch: [ 1] [ 513/1093] time: 3891.3575, d_loss: 1.39083517, g_loss: 0.68065596

Epoch: [ 1] [ 514/1093] time: 3893.6961, d_loss: 1.38016534, g_loss: 0.68610257

Epoch: [ 1] [ 515/1093] time: 3895.9473, d_loss: 1.38265920, g_loss: 0.68078399

Epoch: [ 1] [ 516/1093] time: 3897.9557, d_loss: 1.39135432, g_loss: 0.67949045

Epoch: [ 1] [ 517/1093] time: 3899.9467, d_loss: 1.38820958, g_loss: 0.67711371

Epoch: [ 1] [ 518/1093] time: 3902.2179, d_loss: 1.39466333, g_loss: 0.68058121

……

Epoch: [ 1] [1077/1093] time: 5125.0453, d_loss: 1.37844777, g_loss: 0.68651271

Epoch: [ 1] [1078/1093] time: 5127.2238, d_loss: 1.38850927, g_loss: 0.68094480

Epoch: [ 1] [1079/1093] time: 5129.5851, d_loss: 1.37683725, g_loss: 0.68991780

Epoch: [ 1] [1080/1093] time: 5131.8035, d_loss: 1.39222741, g_loss: 0.66837865

Epoch: [ 1] [1081/1093] time: 5133.8046, d_loss: 1.38264728, g_loss: 0.67701787

Epoch: [ 1] [1082/1093] time: 5135.7957, d_loss: 1.39265454, g_loss: 0.67443299

Epoch: [ 1] [1083/1093] time: 5138.0342, d_loss: 1.39083576, g_loss: 0.68285644

Epoch: [ 1] [1084/1093] time: 5140.3855, d_loss: 1.39100790, g_loss: 0.67561376

Epoch: [ 1] [1085/1093] time: 5142.3541, d_loss: 1.38509417, g_loss: 0.67930484

Epoch: [ 1] [1086/1093] time: 5144.3652, d_loss: 1.38570714, g_loss: 0.67512459

Epoch: [ 1] [1087/1093] time: 5146.8339, d_loss: 1.37933540, g_loss: 0.67861497

Epoch: [ 1] [1088/1093] time: 5149.1052, d_loss: 1.39024305, g_loss: 0.67276442

Epoch: [ 1] [1089/1093] time: 5151.1162, d_loss: 1.37893343, g_loss: 0.68706435

Epoch: [ 1] [1090/1093] time: 5153.1038, d_loss: 1.38589072, g_loss: 0.67717320

Epoch: [ 1] [1091/1093] time: 5155.4151, d_loss: 1.38973820, g_loss: 0.67712557

Epoch: [ 1] [1092/1093] time: 5157.8440, d_loss: 1.38368809, g_loss: 0.67974091

Epoch: [ 2] [   0/1093] time: 5159.8150, d_loss: 1.38032269, g_loss: 0.67759383

Epoch: [ 2] [   1/1093] time: 5162.0339, d_loss: 1.37580657, g_loss: 0.67377681

Epoch: [ 2] [   2/1093] time: 5164.4152, d_loss: 1.37951207, g_loss: 0.67664278

Epoch: [ 2] [   3/1093] time: 5166.4536, d_loss: 1.39463484, g_loss: 0.67749333

Epoch: [ 2] [   4/1093] time: 5168.4547, d_loss: 1.38351607, g_loss: 0.67323297

Epoch: [ 2] [   5/1093] time: 5170.9160, d_loss: 1.39039516, g_loss: 0.66864181

Epoch: [ 2] [   6/1093] time: 5173.2147, d_loss: 1.39086890, g_loss: 0.68247157

Epoch: [ 2] [   7/1093] time: 5175.3358, d_loss: 1.40376759, g_loss: 0.67604411

Epoch: [ 2] [   8/1093] time: 5177.3142, d_loss: 1.39150715, g_loss: 0.67578733

Epoch: [ 2] [   9/1093] time: 5179.6255, d_loss: 1.37265015, g_loss: 0.68143678

Epoch: [ 2] [  10/1093] time: 5182.0736, d_loss: 1.39045727, g_loss: 0.68101263

Epoch: [ 2] [  11/1093] time: 5184.2047, d_loss: 1.39368677, g_loss: 0.67329615

Epoch: [ 2] [  12/1093] time: 5186.1958, d_loss: 1.39578104, g_loss: 0.67907357

Epoch: [ 2] [  13/1093] time: 5188.6146, d_loss: 1.38878369, g_loss: 0.67477858

[Sample] d_loss: 1.36533904, g_loss: 0.70099354

Epoch: [ 2] [  14/1093] time: 5191.1659, d_loss: 1.39303446, g_loss: 0.68040711

Epoch: [ 2] [  15/1093] time: 5193.3644, d_loss: 1.38526893, g_loss: 0.67990983

Epoch: [ 2] [  16/1093] time: 5195.7657, d_loss: 1.39147758, g_loss: 0.68214095

Epoch: [ 2] [  17/1093] time: 5197.8844, d_loss: 1.36999416, g_loss: 0.69020271

……

Epoch: [ 2] [ 910/1093] time: 7159.6691, d_loss: 1.38843203, g_loss: 0.67605901

Epoch: [ 2] [ 911/1093] time: 7161.7976, d_loss: 1.40062439, g_loss: 0.66792578

Epoch: [ 2] [ 912/1093] time: 7164.0088, d_loss: 1.38792086, g_loss: 0.67560351

Epoch: [ 2] [ 913/1093] time: 7166.4575, d_loss: 1.38766861, g_loss: 0.67637527

[Sample] d_loss: 1.38370931, g_loss: 0.69774455

Epoch: [ 2] [ 914/1093] time: 7168.9888, d_loss: 1.39563513, g_loss: 0.67776477

Epoch: [ 2] [ 915/1093] time: 7171.2900, d_loss: 1.38675511, g_loss: 0.67512888

Epoch: [ 2] [ 916/1093] time: 7173.6588, d_loss: 1.38995445, g_loss: 0.67824239

Epoch: [ 2] [ 917/1093] time: 7175.6899, d_loss: 1.38771570, g_loss: 0.67128199

Epoch: [ 2] [ 918/1093] time: 7177.9085, d_loss: 1.38684642, g_loss: 0.68519258

Epoch: [ 2] [ 919/1093] time: 7180.3298, d_loss: 1.37333655, g_loss: 0.68652695

……

Epoch: [ 3] [ 362/1093] time: 8356.1996, d_loss: 1.39512014, g_loss: 0.66916156

Epoch: [ 3] [ 363/1093] time: 8358.3508, d_loss: 1.39369631, g_loss: 0.67236710

Epoch: [ 3] [ 364/1093] time: 8360.7621, d_loss: 1.38735843, g_loss: 0.68572378

Epoch: [ 3] [ 365/1093] time: 8362.7831, d_loss: 1.39971066, g_loss: 0.67346537

Epoch: [ 3] [ 366/1093] time: 8364.7842, d_loss: 1.39366436, g_loss: 0.67099309

Epoch: [ 3] [ 367/1093] time: 8367.0154, d_loss: 1.38990140, g_loss: 0.67454803

Epoch: [ 3] [ 368/1093] time: 8369.3366, d_loss: 1.38183749, g_loss: 0.68153870

Epoch: [ 3] [ 369/1093] time: 8371.3877, d_loss: 1.38687146, g_loss: 0.67545623

Epoch: [ 3] [ 370/1093] time: 8373.4989, d_loss: 1.38756406, g_loss: 0.68393183

Epoch: [ 3] [ 371/1093] time: 8375.8701, d_loss: 1.39338064, g_loss: 0.68219018

Epoch: [ 3] [ 372/1093] time: 8378.0713, d_loss: 1.38763750, g_loss: 0.67938375

Epoch: [ 3] [ 373/1093] time: 8380.0724, d_loss: 1.39371848, g_loss: 0.67651957

Epoch: [ 3] [ 374/1093] time: 8382.3936, d_loss: 1.38683343, g_loss: 0.67617160

Epoch: [ 3] [ 375/1093] time: 8384.6548, d_loss: 1.37663138, g_loss: 0.68140066

Epoch: [ 3] [ 376/1093] time: 8386.6659, d_loss: 1.37809563, g_loss: 0.68609798

Epoch: [ 3] [ 377/1093] time: 8388.6870, d_loss: 1.39943898, g_loss: 0.67443997

Epoch: [ 3] [ 378/1093] time: 8390.6980, d_loss: 1.39024842, g_loss: 0.67799813

Epoch: [ 3] [ 379/1093] time: 8393.0593, d_loss: 1.38977277, g_loss: 0.67707658

Epoch: [ 3] [ 380/1093] time: 8395.2905, d_loss: 1.38423812, g_loss: 0.68118286

Epoch: [ 3] [ 381/1093] time: 8397.4416, d_loss: 1.38743722, g_loss: 0.67777479

Epoch: [ 3] [ 382/1093] time: 8399.8829, d_loss: 1.37790775, g_loss: 0.68277538

Epoch: [ 3] [ 383/1093] time: 8402.1041, d_loss: 1.38662457, g_loss: 0.68058980

Epoch: [ 3] [ 384/1093] time: 8404.1052, d_loss: 1.39429832, g_loss: 0.67511570

Epoch: [ 3] [ 385/1093] time: 8406.5865, d_loss: 1.38111138, g_loss: 0.68313456

Epoch: [ 3] [ 386/1093] time: 8408.8477, d_loss: 1.38022339, g_loss: 0.68807602

Epoch: [ 3] [ 387/1093] time: 8410.8788, d_loss: 1.37367630, g_loss: 0.68106210

Epoch: [ 3] [ 388/1093] time: 8413.1800, d_loss: 1.37601101, g_loss: 0.68398643

Epoch: [ 3] [ 389/1093] time: 8415.5213, d_loss: 1.38206851, g_loss: 0.68312538

Epoch: [ 3] [ 390/1093] time: 8417.5339, d_loss: 1.39440620, g_loss: 0.67587590

Epoch: [ 3] [ 391/1093] time: 8419.7451, d_loss: 1.38435912, g_loss: 0.68598908

Epoch: [ 3] [ 392/1093] time: 8422.1564, d_loss: 1.38480914, g_loss: 0.67896384

Epoch: [ 3] [ 393/1093] time: 8424.1875, d_loss: 1.39561296, g_loss: 0.67151248

Epoch: [ 3] [ 394/1093] time: 8426.1785, d_loss: 1.38200879, g_loss: 0.67769241

Epoch: [ 3] [ 395/1093] time: 8428.5098, d_loss: 1.38265324, g_loss: 0.67953098

Epoch: [ 3] [ 396/1093] time: 8430.8210, d_loss: 1.38887477, g_loss: 0.68306112

Epoch: [ 3] [ 397/1093] time: 8432.8221, d_loss: 1.37987733, g_loss: 0.68379599

Epoch: [ 3] [ 398/1093] time: 8435.1133, d_loss: 1.38668215, g_loss: 0.68350947

Epoch: [ 3] [ 399/1093] time: 8437.4845, d_loss: 1.38988137, g_loss: 0.67655754

Epoch: [ 3] [ 400/1093] time: 8439.5957, d_loss: 1.39809549, g_loss: 0.66322517

Epoch: [ 3] [ 401/1093] time: 8441.5667, d_loss: 1.38576388, g_loss: 0.67762470

Epoch: [ 3] [ 402/1093] time: 8443.5578, d_loss: 1.39277625, g_loss: 0.67869860

Epoch: [ 3] [ 403/1093] time: 8445.9391, d_loss: 1.37714362, g_loss: 0.68563044

Epoch: [ 3] [ 404/1093] time: 8448.1903, d_loss: 1.38713384, g_loss: 0.68173122

Epoch: [ 3] [ 405/1093] time: 8450.1813, d_loss: 1.38332641, g_loss: 0.67876709

Epoch: [ 3] [ 406/1093] time: 8452.2124, d_loss: 1.38762641, g_loss: 0.67437690

Epoch: [ 3] [ 407/1093] time: 8454.6137, d_loss: 1.39600587, g_loss: 0.67091662

Epoch: [ 3] [ 408/1093] time: 8456.9349, d_loss: 1.39475024, g_loss: 0.67384183

Epoch: [ 3] [ 409/1093] time: 8458.9060, d_loss: 1.38960707, g_loss: 0.67936569

Epoch: [ 3] [ 410/1093] time: 8461.1472, d_loss: 1.40030944, g_loss: 0.67041624

Epoch: [ 3] [ 411/1093] time: 8463.5084, d_loss: 1.39593017, g_loss: 0.67498016

Epoch: [ 3] [ 412/1093] time: 8465.5195, d_loss: 1.38999593, g_loss: 0.67841613

Epoch: [ 3] [ 413/1093] time: 8467.6707, d_loss: 1.38826776, g_loss: 0.67693788

Epoch: [ 3] [ 414/1093] time: 8470.0919, d_loss: 1.38349032, g_loss: 0.68139064

Epoch: [ 3] [ 415/1093] time: 8472.1430, d_loss: 1.39280987, g_loss: 0.67648333

Epoch: [ 3] [ 416/1093] time: 8474.1467, d_loss: 1.38741899, g_loss: 0.68393362

Epoch: [ 3] [ 417/1093] time: 8476.4279, d_loss: 1.38289893, g_loss: 0.68440443

Epoch: [ 3] [ 418/1093] time: 8478.8592, d_loss: 1.38225627, g_loss: 0.68390000

Epoch: [ 3] [ 419/1093] time: 8480.8303, d_loss: 1.38956904, g_loss: 0.68032801

Epoch: [ 3] [ 420/1093] time: 8483.0515, d_loss: 1.39274383, g_loss: 0.67899847

[Sample] d_loss: 1.38313830, g_loss: 0.69713199

Epoch: [ 3] [ 421/1093] time: 8485.8330, d_loss: 1.38047338, g_loss: 0.68255627

Epoch: [ 3] [ 422/1093] time: 8487.8740, d_loss: 1.38204312, g_loss: 0.68332297

Epoch: [ 3] [ 423/1093] time: 8489.8651, d_loss: 1.39266825, g_loss: 0.67830092

Epoch: [ 3] [ 424/1093] time: 8491.8662, d_loss: 1.37269580, g_loss: 0.68564689

Epoch: [ 3] [ 425/1093] time: 8494.2675, d_loss: 1.38354051, g_loss: 0.67787158

Epoch: [ 3] [ 426/1093] time: 8496.5987, d_loss: 1.39322877, g_loss: 0.67212951

Epoch: [ 3] [ 427/1093] time: 8498.5898, d_loss: 1.38431156, g_loss: 0.68219298

Epoch: [ 3] [ 428/1093] time: 8500.9110, d_loss: 1.38419461, g_loss: 0.68294287

Epoch: [ 3] [ 429/1093] time: 8503.6765, d_loss: 1.38120306, g_loss: 0.68784416

Epoch: [ 3] [ 430/1093] time: 8505.8076, d_loss: 1.37416363, g_loss: 0.68105757

Epoch: [ 3] [ 431/1093] time: 8508.4290, d_loss: 1.38731599, g_loss: 0.67775297

Epoch: [ 3] [ 432/1093] time: 8511.1404, d_loss: 1.37534189, g_loss: 0.69095331

Epoch: [ 3] [ 433/1093] time: 8513.5317, d_loss: 1.37824667, g_loss: 0.68170595

Epoch: [ 3] [ 434/1093] time: 8516.2832, d_loss: 1.39408314, g_loss: 0.67305529

Epoch: [ 3] [ 435/1093] time: 8518.4143, d_loss: 1.38113260, g_loss: 0.68300515

Epoch: [ 3] [ 436/1093] time: 8520.6555, d_loss: 1.37323284, g_loss: 0.68911874

Epoch: [ 3] [ 437/1093] time: 8523.2069, d_loss: 1.38123012, g_loss: 0.68157446

Epoch: [ 3] [ 438/1093] time: 8525.7382, d_loss: 1.40273654, g_loss: 0.67261004

Epoch: [ 3] [ 439/1093] time: 8527.9470, d_loss: 1.39703226, g_loss: 0.67010546

Epoch: [ 3] [ 440/1093] time: 8530.5494, d_loss: 1.39484859, g_loss: 0.67308128

Epoch: [ 3] [ 441/1093] time: 8533.0077, d_loss: 1.38215089, g_loss: 0.67669988

Epoch: [ 3] [ 442/1093] time: 8535.0588, d_loss: 1.39523888, g_loss: 0.67494458

Epoch: [ 3] [ 443/1093] time: 8537.3100, d_loss: 1.39211106, g_loss: 0.68104178

Epoch: [ 3] [ 444/1093] time: 8539.8213, d_loss: 1.39172995, g_loss: 0.67493176

Epoch: [ 3] [ 445/1093] time: 8542.0925, d_loss: 1.37271404, g_loss: 0.68719661

