DL之Attention: Multi-class prediction on the ClutteredMNIST handwritten-digit dataset using CNN_Init and ST_CNN (CNN + SpatialTransformer)

Overview: On the ClutteredMNIST handwritten-digit dataset, this post implements multi-class prediction with two models, a plain CNN baseline (CNN_Init) and a CNN with a SpatialTransformer front end (ST_CNN), and walks through the training output and core code of each.

Table of contents

Multi-class prediction on the ClutteredMNIST handwritten-digit dataset using CNN_Init and ST_CNN (CNN + SpatialTransformer)

Data and feature engineering

T1、CNN_Init start

Output

Core code

T2、ST_CNN start

Core code

Related articles

DL之Attention: Multi-class prediction on the ClutteredMNIST handwritten-digit dataset using CNN_Init and ST_CNN (CNN + SpatialTransformer)

DL之Attention: Multi-class prediction on the ClutteredMNIST handwritten-digit dataset using CNN_Init and ST_CNN (CNN + SpatialTransformer), implementation


Multi-class prediction on the ClutteredMNIST handwritten-digit dataset using CNN_Init and ST_CNN (CNN + SpatialTransformer)

Data and feature engineering

Train samples: (50000, 60, 60, 1)
Validation samples: (10000, 60, 60, 1)
Test samples: (10000, 60, 60, 1)
Input shape: (60, 60, 1)
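
The printed shapes come straight from the prepared arrays. As a minimal sketch of that preparation step (the source does not show it; the .npz file name and its keys are assumptions based on the commonly distributed cluttered-MNIST archive, so adjust them to match your copy of the dataset):

import numpy as np
from keras.utils import to_categorical

# Assumed archive and keys; the labels are assumed to be integer class ids.
data = np.load('mnist_cluttered_60x60_6distortions.npz')
X_train = data['x_train'].reshape((-1, 60, 60, 1))
X_valid = data['x_valid'].reshape((-1, 60, 60, 1))
X_test  = data['x_test'].reshape((-1, 60, 60, 1))
y_train = to_categorical(data['y_train'], 10)   # one-hot labels for 10 digits
y_valid = to_categorical(data['y_valid'], 10)
y_test  = to_categorical(data['y_test'], 10)

input_shape = X_train.shape[1:]   # (60, 60, 1)
nb_classes  = 10

print('Train samples:', X_train.shape)
print('Validation samples:', X_valid.shape)
print('Test samples:', X_test.shape)
print('Input shape:', input_shape)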


T1、CNN_Init start

Output

T1、CNN_Init start!
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 58, 58, 32)        320
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 56, 56, 64)        18496
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 28, 28, 64)        0
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 26, 26, 64)        36928
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 13, 13, 64)        0
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 11, 11, 64)        36928
_________________________________________________________________
dropout_1 (Dropout)          (None, 11, 11, 64)        0
_________________________________________________________________
flatten_1 (Flatten)          (None, 7744)              0
_________________________________________________________________
dense_1 (Dense)              (None, 128)               991360
_________________________________________________________________
dropout_2 (Dropout)          (None, 128)               0
_________________________________________________________________
dense_2 (Dense)              (None, 10)                1290
=================================================================
Total params: 1,085,322
Trainable params: 1,085,322
Non-trainable params: 0
_________________________________________________________________
None
Train on 50000 samples, validate on 10000 samples
Epoch 1/30
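
As a quick sanity check, the parameter counts follow from the layer shapes: conv2d_1 has (3 × 3 × 1 + 1) × 32 = 320 parameters, conv2d_2 has (3 × 3 × 32 + 1) × 64 = 18,496, and dense_1 has (7,744 + 1) × 128 = 991,360, where 7,744 = 11 × 11 × 64 is the flattened feature map. Summing all layers gives the reported total of 1,085,322.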

Core code

# (1) Define the model architecture: a plain Keras Sequential CNN baseline
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=input_shape))   # input_shape = (60, 60, 1)
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, kernel_size=(3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(nb_classes, activation='softmax'))   # nb_classes = 10
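
The excerpt stops at the architecture. A hedged sketch of the compile-and-train step that would produce the log above (the optimizer and batch size are assumptions; the 30 epochs, the categorical_accuracy metric, and the 50000/10000 train/validation split match the log):

model.compile(loss='categorical_crossentropy',
              optimizer='adam',   # assumption: the optimizer is not shown in the source
              metrics=['categorical_accuracy'])
print(model.summary())            # prints the layer table above, then "None"
model.fit(X_train, y_train,
          batch_size=128,         # assumption
          epochs=30,              # matches "Epoch 1/30" in the log
          verbose=2,
          validation_data=(X_valid, y_valid))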


T2、ST_CNN start

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 56, 56, 32)        832
_________________________________________________________________
activation_1 (Activation)    (None, 56, 56, 32)        0
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 28, 28, 32)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 24, 24, 64)        51264
_________________________________________________________________
activation_2 (Activation)    (None, 24, 24, 64)        0
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 22, 22, 64)        36928
_________________________________________________________________
activation_3 (Activation)    (None, 22, 22, 64)        0
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 11, 11, 64)        0
_________________________________________________________________
flatten_1 (Flatten)          (None, 7744)              0
_________________________________________________________________
dense_1 (Dense)              (None, 50)                387250
_________________________________________________________________
activation_4 (Activation)    (None, 50)                0
_________________________________________________________________
dense_2 (Dense)              (None, 6)                 306
=================================================================
Total params: 476,580
Trainable params: 476,580
Non-trainable params: 0
_________________________________________________________________
None
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
spatial_transformer_1 (Spati (None, 30, 30, 1)         476580
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 28, 28, 32)        320
_________________________________________________________________
dropout_1 (Dropout)          (None, 28, 28, 32)        0
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 26, 26, 64)        18496
_________________________________________________________________
dropout_2 (Dropout)          (None, 26, 26, 64)        0
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 13, 13, 64)        0
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 11, 11, 64)        36928
_________________________________________________________________
dropout_3 (Dropout)          (None, 11, 11, 64)        0
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 5, 5, 64)          0
_________________________________________________________________
flatten_2 (Flatten)          (None, 1600)              0
_________________________________________________________________
dense_3 (Dense)              (None, 256)               409856
_________________________________________________________________
dropout_4 (Dropout)          (None, 256)               0
_________________________________________________________________
activation_5 (Activation)    (None, 256)               0
_________________________________________________________________
dense_4 (Dense)              (None, 10)                2570
_________________________________________________________________
activation_6 (Activation)    (None, 10)                0
=================================================================
Total params: 944,750
Trainable params: 944,750
Non-trainable params: 0
_________________________________________________________________
None
Train on 50000 samples, validate on 10000 samples
Epoch 1/30
 - 974s - loss: 2.0926 - categorical_accuracy: 0.2345 - val_loss: 1.6258 - val_categorical_accuracy: 0.5949
Epoch 2/30
 - 1007s - loss: 1.0926 - categorical_accuracy: 0.6387 - val_loss: 0.7963 - val_categorical_accuracy: 0.8433
Epoch 3/30
 - 844s - loss: 0.6038 - categorical_accuracy: 0.8118 - val_loss: 0.4906 - val_categorical_accuracy: 0.8977
Epoch 4/30
 - 851s - loss: 0.4351 - categorical_accuracy: 0.8648 - val_loss: 0.3909 - val_categorical_accuracy: 0.9160
Epoch 5/30
 - 864s - loss: 0.3483 - categorical_accuracy: 0.8914 - val_loss: 0.3046 - val_categorical_accuracy: 0.9367
Epoch 6/30
 - 872s - loss: 0.3158 - categorical_accuracy: 0.9027 - val_loss: 0.2826 - val_categorical_accuracy: 0.9349
Epoch 7/30
 - 861s - loss: 0.2772 - categorical_accuracy: 0.9136 - val_loss: 0.3244 - val_categorical_accuracy: 0.9243
Epoch 8/30
 - 862s - loss: 0.2414 - categorical_accuracy: 0.9251 - val_loss: 0.2228 - val_categorical_accuracy: 0.9600
Epoch 9/30
 - 858s - loss: 0.2278 - categorical_accuracy: 0.9287 - val_loss: 0.2305 - val_categorical_accuracy: 0.9556
Epoch 10/30
 - 860s - loss: 0.2150 - categorical_accuracy: 0.9328 - val_loss: 0.2119 - val_categorical_accuracy: 0.9600
Epoch 11/30
 - 862s - loss: 0.2130 - categorical_accuracy: 0.9334 - val_loss: 0.1949 - val_categorical_accuracy: 0.9583
Epoch 12/30
 - 855s - loss: 0.1917 - categorical_accuracy: 0.9410 - val_loss: 0.1841 - val_categorical_accuracy: 0.9595
Epoch 13/30
 - 857s - loss: 0.1891 - categorical_accuracy: 0.9414 - val_loss: 0.2455 - val_categorical_accuracy: 0.9613
Epoch 14/30
 - 862s - loss: 0.1865 - categorical_accuracy: 0.9423 - val_loss: 0.2044 - val_categorical_accuracy: 0.9629
Epoch 15/30
 - 863s - loss: 0.1789 - categorical_accuracy: 0.9446 - val_loss: 0.2147 - val_categorical_accuracy: 0.9647
Epoch 16/30
 - 855s - loss: 0.1708 - categorical_accuracy: 0.9460 - val_loss: 0.1748 - val_categorical_accuracy: 0.9692
Epoch 17/30
 - 859s - loss: 0.1615 - categorical_accuracy: 0.9509 - val_loss: 0.1870 - val_categorical_accuracy: 0.9707
Epoch 18/30
 - 862s - loss: 0.1538 - categorical_accuracy: 0.9514 - val_loss: 0.1906 - val_categorical_accuracy: 0.9689
Epoch 19/30
 - 866s - loss: 0.1494 - categorical_accuracy: 0.9537 - val_loss: 0.1596 - val_categorical_accuracy: 0.9728
Epoch 20/30
 - 864s - loss: 0.1490 - categorical_accuracy: 0.9537 - val_loss: 0.1821 - val_categorical_accuracy: 0.9692
Epoch 21/30
 - 860s - loss: 0.1517 - categorical_accuracy: 0.9524 - val_loss: 0.1579 - val_categorical_accuracy: 0.9701
Epoch 22/30
 - 859s - loss: 0.1506 - categorical_accuracy: 0.9539 - val_loss: 0.1595 - val_categorical_accuracy: 0.9712
Epoch 23/30
 - 859s - loss: 0.1407 - categorical_accuracy: 0.9567 - val_loss: 0.1590 - val_categorical_accuracy: 0.9712
Epoch 24/30
 - 856s - loss: 0.1361 - categorical_accuracy: 0.9569 - val_loss: 0.2160 - val_categorical_accuracy: 0.9723
Epoch 25/30
 - 856s - loss: 0.1348 - categorical_accuracy: 0.9583 - val_loss: 0.1678 - val_categorical_accuracy: 0.9741
Epoch 26/30
 - 856s - loss: 0.1298 - categorical_accuracy: 0.9596 - val_loss: 0.1820 - val_categorical_accuracy: 0.9707
Epoch 27/30
 - 856s - loss: 0.1317 - categorical_accuracy: 0.9597 - val_loss: 0.1998 - val_categorical_accuracy: 0.9738
Epoch 28/30
 - 855s - loss: 0.1325 - categorical_accuracy: 0.9594 - val_loss: 0.1991 - val_categorical_accuracy: 0.9674
Epoch 29/30
 - 856s - loss: 0.1230 - categorical_accuracy: 0.9621 - val_loss: 0.1848 - val_categorical_accuracy: 0.9720
Epoch 30/30
 - 856s - loss: 0.1246 - categorical_accuracy: 0.9611 - val_loss: 0.1754 - val_categorical_accuracy: 0.9755
10000/10000 [==============================] - 56s 6ms/step
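
Two details are worth noting in the summaries above. First, the 476,580 parameters reported for spatial_transformer_1 are exactly the localization network's total; the sampling grid itself adds no trainable weights. Second, output_size=(30, 30) makes the transformer emit a 30×30 resampled crop of the 60×60 input, so the downstream classifier operates on a smaller, digit-centered image than the T1 baseline. Training reaches a validation accuracy of 0.9755 at epoch 30, and the final evaluation pass over the 10,000 test images takes about 56 s.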

Core code

# (2) Build the ST localization network
# TODO (from the source): try more conv layers, and do max pooling on the X- and Y-axes respectively
from keras.models import Sequential
from keras.layers import Conv2D, Convolution2D, MaxPooling2D, Activation, Dropout, Flatten, Dense
# SpatialTransformer is a custom Keras layer used by this example; its import is not shown here.
# `weights` seeds the affine regression layer; see the identity-initialization sketch below.

locnet = Sequential()
# locnet.add(MaxPooling2D(pool_size=(2, 2), input_shape=input_shape))
# locnet.add(Convolution2D(32, (5, 5)))
locnet.add(Convolution2D(32, (5, 5), input_shape=input_shape))
locnet.add(Activation('relu'))
# locnet.add(Dropout(0.2))  # 0.2
locnet.add(MaxPooling2D(pool_size=(2, 2)))
locnet.add(Convolution2D(64, (5, 5)))
locnet.add(Activation('relu'))
# locnet.add(Dropout(0.2))  # 0.3
locnet.add(Convolution2D(64, (3, 3)))
locnet.add(Activation('relu'))
locnet.add(MaxPooling2D(pool_size=(2, 2)))

locnet.add(Flatten())
locnet.add(Dense(50))
locnet.add(Activation('relu'))
locnet.add(Dense(6, weights=weights))   # regresses the 6 affine transform parameters
print(locnet.summary())


# (3) Build the classification CNN on top of the spatial transformer
model = Sequential()
model.add(SpatialTransformer(localization_net=locnet,
                             output_size=(30, 30), input_shape=input_shape))
# model.add(Convolution2D(32, (3, 3), padding='same'))
# model.add(Activation('relu'))
# model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(Convolution2D(64, (3, 3)))
# model.add(Activation('relu'))
# model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(Dropout(0.5))  # 0.25

# E: removed first 3 dropout layers
model.add(Conv2D(32, kernel_size=(3, 3), activation='relu'))
model.add(Dropout(0.5))  # 0.5
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(Dropout(0.5))  # 0.5
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, kernel_size=(3, 3), activation='relu'))
model.add(Dropout(0.5))  # 0.5
model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(Conv2D(64, (3, 3), activation='relu'))
# model.add(Dropout(0.5))
model.add(Flatten())
model.add(Dense(256))  # 256
model.add(Dropout(0.5))  # 0.5
model.add(Activation('relu'))
model.add(Dense(nb_classes))
model.add(Activation('softmax'))
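
The locnet's final Dense(6, weights=weights) references a weights variable that the excerpt never defines. The standard choice for spatial transformers, and only an assumption here, is to seed the six affine parameters with the identity transform so the warp starts as a pass-through. A minimal sketch under that assumption, followed by the same compile-and-train step as in T1 (optimizer and batch size are likewise assumptions):

import numpy as np

# Assumed identity initialization: a zero kernel plus bias [1, 0, 0, 0, 1, 0]
# makes the layer output the identity affine transform at the start of training.
b = np.zeros((2, 3), dtype='float32')
b[0, 0] = 1.0
b[1, 1] = 1.0
W = np.zeros((50, 6), dtype='float32')   # 50 = units of the preceding Dense layer
weights = [W, b.flatten()]

# Define `weights` before building locnet and model as above, then:
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['categorical_accuracy'])
print(model.summary())
model.fit(X_train, y_train,
          batch_size=128,
          epochs=30,
          verbose=2,
          validation_data=(X_valid, y_valid))
score = model.evaluate(X_test, y_test, verbose=1)   # the test-set pass timed above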

