CV — CNN: Implementing Cat-vs-Dog Image Classification with a Convolutional Neural Network (Modified AlexNet; Training/Evaluation/Inference) on the TensorFlow Framework

Overview: Implementing cat-vs-dog image classification with a convolutional neural network (a modified AlexNet, covering training, evaluation, and inference) on the TensorFlow framework.


Table of Contents

Implementing Cat-vs-Dog Image Classification with a CNN (Modified AlexNet; Training/Evaluation/Inference) on the TensorFlow Framework

Dataset Introduction

Output Results

Predictions with the model.ckpt-6000 checkpoint

Only one case was predicted incorrectly, as shown below

Training Results

Core Code


Implementing Cat-vs-Dog Image Classification with a CNN (Modified AlexNet; Training/Evaluation/Inference) on the TensorFlow Framework

Dataset Introduction

Dataset download: Dogs vs. Cats Redux: Kernels Edition | Kaggle

    The train folder contains 25,000 images of dogs and cats, and each image in this folder carries its label as part of the filename. The test folder contains 12,500 images named by numeric id. For each image in the test set, you should predict the probability that the image is a dog (1 = dog, 0 = cat).
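
Because the label is encoded in the filename (cat.*.jpg / dog.*.jpg), the training labels have to be parsed from the file names. Below is a minimal data-preparation sketch of that step; the get_files helper name and the train-directory path are illustrative assumptions, not taken from the original code, while the 0 = cat / 1 = dog convention follows the Kaggle description above.

# Minimal data-preparation sketch (assumed helper, not from the original article).
# Labels follow the Kaggle convention: 0 = cat, 1 = dog.
import os

def get_files(train_dir):
    """Collect image paths and integer labels from filenames like cat.0.jpg / dog.1.jpg."""
    image_list, label_list = [], []
    for fname in os.listdir(train_dir):
        prefix = fname.split('.')[0]          # 'cat' or 'dog'
        if prefix == 'cat':
            image_list.append(os.path.join(train_dir, fname))
            label_list.append(0)
        elif prefix == 'dog':
            image_list.append(os.path.join(train_dir, fname))
            label_list.append(1)
    return image_list, label_list

# Example usage (the path is an assumption):
# images, labels = get_files('./data/train/')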

Output Results

Predictions with the model.ckpt-6000 checkpoint

Only one case was predicted incorrectly, as shown in the comparison table below; a sketch of restoring a checkpoint for prediction follows the table.

Each cell gives the predicted class and its probability for the corresponding model.ckpt checkpoint (rows 1–15 are cat images, rows 16–31 are dog images).

No. | Image | ckpt-4000 | ckpt-6000 | ckpt-8000 | ckpt-10000 | ckpt-12000
1 | cat (1).jpg | cat 0.631 | dog 0.740 | dog 0.781 | dog 0.976 | dog 0.991
2 | cat (10).jpg | dog 0.697 | cat 0.566 | cat 0.925 | cat 0.925 | cat 0.816
3 | cat (11).jpg | cat 0.927 | cat 0.988 | cat 1.000 | cat 1.000 | cat 1.000
4 | cat (12).jpg | dog 0.746 | dog 0.723 | dog 0.822 | dog 0.998 | dog 1.000
5 | cat (13).jpg | cat 0.933 | cat 0.983 | cat 0.997 | cat 1.000 | cat 1.000
6 | cat (14).jpg | dog 0.657 | cat 0.597 | dog 0.758 | dog 0.695 | cat 0.544
7 | cat (15).jpg | dog 0.578 | dog 0.535 | dog 0.526 | dog 0.750 | dog 0.569
8 | cat (2).jpg | cat 0.649 | cat 0.637 | cat 0.844 | cat 0.996 | cat 0.998
9 | cat (3).jpg | dog 0.668 | cat 0.538 | cat 0.710 | cat 0.968 | cat 0.995
10 | cat (4).jpg | dog 0.856 | dog 0.780 | dog 0.831 | dog 0.974 | dog 0.976
11 | cat (5).jpg | cat 0.812 | cat 0.776 | cat 0.505 | cat 0.732 | dog 0.608
12 | cat (6).jpg | cat 0.524 | dog 0.661 | dog 0.748 | dog 0.970 | dog 0.987
13 | cat (7).jpg | dog 0.612 | cat 0.845 | cat 0.894 | cat 0.987 | cat 0.728
14 | cat (8).jpg | dog 0.823 | dog 0.948 | dog 0.920 | dog 0.982 | dog 0.999
15 | cat (9).jpg | cat 0.697 | cat 0.704 | dog 0.819 | cat 0.930 | dog 0.718
16 | dog (1).jpg | dog 0.987 | dog 0.995 | dog 0.999 | dog 1.000 | dog 1.000
17 | dog (10).jpg | dog 0.628 | cat 0.629 | cat 0.994 | cat 1.000 | cat 1.000
18 | dog (11).jpg | dog 0.804 | dog 0.879 | dog 0.993 | dog 1.000 | dog 1.000
19 | dog (12).jpg | cat 0.704 | cat 0.758 | dog 0.503 | dog 0.653 | cat 0.985
20 | dog (13).jpg | dog 0.987 | dog 0.997 | dog 1.000 | dog 1.000 | dog 1.000
21 | dog (14).jpg | dog 0.815 | dog 0.844 | dog 0.904 | dog 0.996 | dog 0.950
22 | dog (15).jpg | dog 0.917 | dog 0.984 | dog 0.999 | dog 1.000 | dog 1.000
23 | dog (16).jpg | dog 0.883 | dog 0.931 | dog 0.830 | dog 0.975 | dog 0.983
24 | dog (2).jpg | dog 0.934 | dog 0.982 | dog 0.998 | dog 1.000 | dog 1.000
25 | dog (3).jpg | dog 0.993 | dog 1.000 | dog 1.000 | dog 1.000 | dog 1.000
26 | dog (4).jpg | dog 0.693 | dog 0.754 | dog 0.976 | dog 0.515 | dog 0.995
27 | dog (5).jpg | dog 0.916 | dog 0.976 | dog 0.993 | dog 0.998 | dog 1.000
28 | dog (6).jpg | dog 0.947 | dog 0.989 | dog 0.999 | dog 1.000 | dog 1.000
29 | dog (7).jpg | cat 0.526 | cat 0.685 | cat 0.961 | cat 1.000 | cat 1.000
30 | dog (8).jpg | dog 0.981 | dog 0.998 | dog 1.000 | dog 1.000 | dog 1.000
31 | dog (9).jpg | dog 0.899 | dog 0.983 | dog 0.999 | dog 1.000 | dog 1.000
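
For reference, the following is a rough sketch of how a saved checkpoint such as model.ckpt-6000 could be restored to score a single image and print the cat/dog probabilities listed above. It assumes the network from the Core Code section is wrapped in an inference(images, batch_size, n_classes) function, a 208x208 RGB input size, and a logs_train_dir checkpoint directory; these names and sizes are assumptions, not values taken from the original article.

# Hypothetical single-image prediction sketch (TensorFlow 1.x).
import os
import numpy as np
import tensorflow as tf
from PIL import Image

def evaluate_one_image(image_path, logs_train_dir, ckpt_name='model.ckpt-6000'):
    # Load and resize the image; 208x208 is an assumed input size.
    image = Image.open(image_path).convert('RGB').resize((208, 208))
    image_array = np.array(image, dtype=np.float32)

    with tf.Graph().as_default():
        x = tf.placeholder(tf.float32, shape=[208, 208, 3])
        image_batch = tf.reshape(x, [1, 208, 208, 3])
        # `inference` is assumed to build the network shown in the Core Code section.
        logits = inference(image_batch, batch_size=1, n_classes=2)
        probs = tf.nn.softmax(logits)

        saver = tf.train.Saver()
        with tf.Session() as sess:
            saver.restore(sess, os.path.join(logs_train_dir, ckpt_name))
            prediction = sess.run(probs, feed_dict={x: image_array})
            if prediction[0, 0] > prediction[0, 1]:
                print('%s  P(cat) = %.3f' % (os.path.basename(image_path), prediction[0, 0]))
            else:
                print('%s  P(dog) = %.3f' % (os.path.basename(image_path), prediction[0, 1]))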

Training Results

1. Step 0, train loss = 0.69, train accuracy = 78.12%
2. Step 50, train loss = 0.69, train accuracy = 43.75%
3. Step 100, train loss = 0.70, train accuracy = 46.88%
4. Step 150, train loss = 0.65, train accuracy = 75.00%
5. Step 200, train loss = 0.66, train accuracy = 59.38%
6. Step 250, train loss = 0.66, train accuracy = 62.50%
7. Step 300, train loss = 0.72, train accuracy = 40.62%
8. Step 350, train loss = 0.66, train accuracy = 62.50%
9. Step 400, train loss = 0.58, train accuracy = 68.75%
10. Step 450, train loss = 0.70, train accuracy = 65.62%
11. Step 500, train loss = 0.68, train accuracy = 56.25%
12. Step 550, train loss = 0.51, train accuracy = 81.25%
13. Step 600, train loss = 0.54, train accuracy = 75.00%
14. Step 650, train loss = 0.64, train accuracy = 68.75%
15. Step 700, train loss = 0.69, train accuracy = 53.12%
16. Step 750, train loss = 0.57, train accuracy = 71.88%
17. Step 800, train loss = 0.80, train accuracy = 50.00%
18. Step 850, train loss = 0.62, train accuracy = 59.38%
19. Step 900, train loss = 0.59, train accuracy = 65.62%
20. Step 950, train loss = 0.54, train accuracy = 71.88%
21. Step 1000, train loss = 0.57, train accuracy = 68.75%
22. Step 1050, train loss = 0.56, train accuracy = 78.12%
23. Step 1100, train loss = 0.66, train accuracy = 59.38%
24. Step 1150, train loss = 0.50, train accuracy = 84.38%
25. Step 1200, train loss = 0.46, train accuracy = 81.25%
26. Step 1250, train loss = 0.57, train accuracy = 59.38%
27. Step 1300, train loss = 0.37, train accuracy = 81.25%
28. Step 1350, train loss = 0.64, train accuracy = 62.50%
29. Step 1400, train loss = 0.44, train accuracy = 81.25%
30. Step 1450, train loss = 0.46, train accuracy = 84.38%
31. Step 1500, train loss = 0.50, train accuracy = 71.88%
32. Step 1550, train loss = 0.58, train accuracy = 62.50%
33. Step 1600, train loss = 0.43, train accuracy = 75.00%
34. Step 1650, train loss = 0.55, train accuracy = 71.88%
35. Step 1700, train loss = 0.50, train accuracy = 71.88%
36. Step 1750, train loss = 0.46, train accuracy = 75.00%
37. Step 1800, train loss = 0.81, train accuracy = 53.12%
38. Step 1850, train loss = 0.41, train accuracy = 90.62%
39. Step 1900, train loss = 0.65, train accuracy = 68.75%
40. Step 1950, train loss = 0.37, train accuracy = 84.38%
41. Step 2000, train loss = 0.39, train accuracy = 81.25%
42. Step 2050, train loss = 0.45, train accuracy = 84.38%
43. Step 2100, train loss = 0.44, train accuracy = 78.12%
44. Step 2150, train loss = 0.59, train accuracy = 65.62%
45. Step 2200, train loss = 0.51, train accuracy = 78.12%
46. Step 2250, train loss = 0.42, train accuracy = 81.25%
47. Step 2300, train loss = 0.32, train accuracy = 87.50%
48. Step 2350, train loss = 0.48, train accuracy = 75.00%
49. Step 2400, train loss = 0.54, train accuracy = 71.88%
50. Step 2450, train loss = 0.51, train accuracy = 71.88%
51. Step 2500, train loss = 0.73, train accuracy = 59.38%
52. Step 2550, train loss = 0.52, train accuracy = 78.12%
53. Step 2600, train loss = 0.65, train accuracy = 62.50%
54. Step 2650, train loss = 0.52, train accuracy = 71.88%
55. Step 2700, train loss = 0.48, train accuracy = 71.88%
56. Step 2750, train loss = 0.37, train accuracy = 84.38%
57. Step 2800, train loss = 0.46, train accuracy = 78.12%
58. Step 2850, train loss = 0.40, train accuracy = 84.38%
59. Step 2900, train loss = 0.45, train accuracy = 81.25%
60. Step 2950, train loss = 0.36, train accuracy = 84.38%
61. Step 3000, train loss = 0.46, train accuracy = 75.00%
62. Step 3050, train loss = 0.53, train accuracy = 71.88%
63. Step 3100, train loss = 0.37, train accuracy = 84.38%
64. Step 3150, train loss = 0.53, train accuracy = 75.00%
65. Step 3200, train loss = 0.52, train accuracy = 75.00%
66. Step 3250, train loss = 0.62, train accuracy = 65.62%
67. Step 3300, train loss = 0.58, train accuracy = 71.88%
68. Step 3350, train loss = 0.71, train accuracy = 65.62%
69. Step 3400, train loss = 0.43, train accuracy = 78.12%
70. Step 3450, train loss = 0.46, train accuracy = 78.12%
71. Step 3500, train loss = 0.46, train accuracy = 71.88%
72. Step 3550, train loss = 0.53, train accuracy = 68.75%
73. Step 3600, train loss = 0.44, train accuracy = 75.00%
74. Step 3650, train loss = 0.55, train accuracy = 65.62%
75. Step 3700, train loss = 0.62, train accuracy = 75.00%
76. Step 3750, train loss = 0.48, train accuracy = 75.00%
77. Step 3800, train loss = 0.66, train accuracy = 53.12%
78. Step 3850, train loss = 0.53, train accuracy = 75.00%
79. Step 3900, train loss = 0.36, train accuracy = 81.25%
80. Step 3950, train loss = 0.37, train accuracy = 87.50%
81. Step 4000, train loss = 0.46, train accuracy = 78.12%
82. Step 4050, train loss = 0.36, train accuracy = 84.38%
83. Step 4100, train loss = 0.34, train accuracy = 78.12%
84. Step 4150, train loss = 0.48, train accuracy = 78.12%
85. Step 4200, train loss = 0.43, train accuracy = 87.50%
86. Step 4250, train loss = 0.34, train accuracy = 84.38%
87. Step 4300, train loss = 0.28, train accuracy = 87.50%
88. Step 4350, train loss = 0.19, train accuracy = 96.88%
89. Step 4400, train loss = 0.46, train accuracy = 71.88%
90. Step 4450, train loss = 0.33, train accuracy = 84.38%
91. Step 4500, train loss = 0.55, train accuracy = 75.00%
92. Step 4550, train loss = 0.31, train accuracy = 93.75%
93. Step 4600, train loss = 0.30, train accuracy = 84.38%
94. Step 4650, train loss = 0.38, train accuracy = 84.38%
95. Step 4700, train loss = 0.36, train accuracy = 84.38%
96. Step 4750, train loss = 0.32, train accuracy = 87.50%
97. Step 4800, train loss = 0.36, train accuracy = 81.25%
98. Step 4850, train loss = 0.36, train accuracy = 87.50%
99. Step 4900, train loss = 0.49, train accuracy = 71.88%
100. Step 4950, train loss = 0.51, train accuracy = 68.75%
101. Step 5000, train loss = 0.59, train accuracy = 68.75%
102. Step 5050, train loss = 0.55, train accuracy = 75.00%
103. Step 5100, train loss = 0.71, train accuracy = 68.75%
104. Step 5150, train loss = 0.48, train accuracy = 71.88%
105. Step 5200, train loss = 0.39, train accuracy = 90.62%
106. Step 5250, train loss = 0.49, train accuracy = 81.25%
107. Step 5300, train loss = 0.36, train accuracy = 81.25%
108. Step 5350, train loss = 0.31, train accuracy = 90.62%
109. Step 5400, train loss = 0.39, train accuracy = 87.50%
110. Step 5450, train loss = 0.34, train accuracy = 78.12%
111. Step 5500, train loss = 0.29, train accuracy = 84.38%
112. Step 5550, train loss = 0.21, train accuracy = 93.75%
113. Step 5600, train loss = 0.41, train accuracy = 78.12%
114. Step 5650, train loss = 0.38, train accuracy = 84.38%
115. Step 5700, train loss = 0.27, train accuracy = 87.50%
116. Step 5750, train loss = 0.24, train accuracy = 90.62%
117. Step 5800, train loss = 0.17, train accuracy = 96.88%
118. Step 5850, train loss = 0.23, train accuracy = 93.75%
119. Step 5900, train loss = 0.37, train accuracy = 71.88%
120. Step 5950, train loss = 0.49, train accuracy = 71.88%
121. Step 6000, train loss = 0.43, train accuracy = 81.25%
122. Step 6050, train loss = 0.33, train accuracy = 87.50%
123. Step 6100, train loss = 0.46, train accuracy = 75.00%
124. Step 6150, train loss = 0.61, train accuracy = 81.25%
125. Step 6200, train loss = 0.34, train accuracy = 84.38%
126. Step 6250, train loss = 0.63, train accuracy = 71.88%
127. Step 6300, train loss = 0.21, train accuracy = 90.62%
128. Step 6350, train loss = 0.21, train accuracy = 90.62%
129. Step 6400, train loss = 0.27, train accuracy = 87.50%
130. Step 6450, train loss = 0.17, train accuracy = 87.50%
131. Step 6500, train loss = 0.34, train accuracy = 87.50%
132. Step 6550, train loss = 0.34, train accuracy = 87.50%
133. Step 6600, train loss = 0.32, train accuracy = 84.38%
134. Step 6650, train loss = 0.39, train accuracy = 84.38%
135. Step 6700, train loss = 0.38, train accuracy = 84.38%
136. Step 6750, train loss = 0.41, train accuracy = 84.38%
137. Step 6800, train loss = 0.49, train accuracy = 81.25%
138. Step 6850, train loss = 0.36, train accuracy = 84.38%
139. Step 6900, train loss = 0.20, train accuracy = 93.75%
140. Step 6950, train loss = 0.13, train accuracy = 93.75%
141. Step 7000, train loss = 0.31, train accuracy = 87.50%
142. Step 7050, train loss = 0.18, train accuracy = 93.75%
143. Step 7100, train loss = 0.23, train accuracy = 90.62%
144. Step 7150, train loss = 0.13, train accuracy = 96.88%
145. Step 7200, train loss = 0.14, train accuracy = 96.88%
146. Step 7250, train loss = 0.32, train accuracy = 84.38%
147. Step 7300, train loss = 0.18, train accuracy = 93.75%
148. Step 7350, train loss = 0.14, train accuracy = 100.00%
149. Step 7400, train loss = 0.60, train accuracy = 75.00%
150. Step 7450, train loss = 0.20, train accuracy = 93.75%
151. Step 7500, train loss = 0.13, train accuracy = 93.75%
152. Step 7550, train loss = 0.22, train accuracy = 90.62%
153. Step 7600, train loss = 0.13, train accuracy = 96.88%
154. Step 7650, train loss = 0.20, train accuracy = 93.75%
155. Step 7700, train loss = 0.24, train accuracy = 90.62%
156. Step 7750, train loss = 0.19, train accuracy = 93.75%
157. Step 7800, train loss = 0.16, train accuracy = 93.75%
158. Step 7850, train loss = 0.08, train accuracy = 100.00%
159. Step 7900, train loss = 0.10, train accuracy = 96.88%
160. Step 7950, train loss = 0.13, train accuracy = 93.75%
161. Step 8000, train loss = 0.18, train accuracy = 90.62%
162. Step 8050, train loss = 0.27, train accuracy = 93.75%
163. Step 8100, train loss = 0.04, train accuracy = 100.00%
164. Step 8150, train loss = 0.27, train accuracy = 87.50%
165. Step 8200, train loss = 0.06, train accuracy = 96.88%
166. Step 8250, train loss = 0.12, train accuracy = 100.00%
167. Step 8300, train loss = 0.28, train accuracy = 87.50%
168. Step 8350, train loss = 0.24, train accuracy = 90.62%
169. Step 8400, train loss = 0.16, train accuracy = 93.75%
170. Step 8450, train loss = 0.11, train accuracy = 93.75%
171. Step 8500, train loss = 0.13, train accuracy = 96.88%
172. Step 8550, train loss = 0.05, train accuracy = 100.00%
173. Step 8600, train loss = 0.10, train accuracy = 93.75%
174. Step 8650, train loss = 0.14, train accuracy = 100.00%
175. Step 8700, train loss = 0.21, train accuracy = 90.62%
176. Step 8750, train loss = 0.09, train accuracy = 96.88%
177. Step 8800, train loss = 0.11, train accuracy = 96.88%
178. Step 8850, train loss = 0.10, train accuracy = 96.88%
179. Step 8900, train loss = 0.12, train accuracy = 93.75%
180. Step 8950, train loss = 0.48, train accuracy = 81.25%
181. Step 9000, train loss = 0.07, train accuracy = 100.00%
182. Step 9050, train loss = 0.03, train accuracy = 100.00%
183. Step 9100, train loss = 0.10, train accuracy = 93.75%
184. Step 9150, train loss = 0.05, train accuracy = 96.88%
185. Step 9200, train loss = 0.04, train accuracy = 100.00%
186. Step 9250, train loss = 0.03, train accuracy = 100.00%
187. Step 9300, train loss = 0.04, train accuracy = 96.88%
188. Step 9350, train loss = 0.08, train accuracy = 100.00%
189. Step 9400, train loss = 0.05, train accuracy = 100.00%
190. Step 9450, train loss = 0.15, train accuracy = 90.62%
191. Step 9500, train loss = 0.03, train accuracy = 100.00%
192. Step 9550, train loss = 0.05, train accuracy = 100.00%
193. Step 9600, train loss = 0.15, train accuracy = 96.88%
194. Step 9650, train loss = 0.03, train accuracy = 100.00%
195. Step 9700, train loss = 0.02, train accuracy = 100.00%
196. Step 9750, train loss = 0.08, train accuracy = 96.88%
197. Step 9800, train loss = 0.04, train accuracy = 100.00%
198. Step 9850, train loss = 0.06, train accuracy = 96.88%
199. Step 9900, train loss = 0.03, train accuracy = 100.00%
200. Step 9950, train loss = 0.03, train accuracy = 100.00%
201. Step 10000, train loss = 0.11, train accuracy = 93.75%
202. Step 10050, train loss = 0.02, train accuracy = 100.00%
203. Step 10100, train loss = 0.01, train accuracy = 100.00%
204. Step 10150, train loss = 0.05, train accuracy = 96.88%
205. Step 10200, train loss = 0.07, train accuracy = 96.88%
206. Step 10250, train loss = 0.06, train accuracy = 96.88%
207. Step 10300, train loss = 0.03, train accuracy = 100.00%
208. Step 10350, train loss = 0.08, train accuracy = 96.88%
209. Step 10400, train loss = 0.05, train accuracy = 96.88%
210. Step 10450, train loss = 0.02, train accuracy = 100.00%
211. Step 10500, train loss = 0.22, train accuracy = 93.75%
212. Step 10550, train loss = 0.06, train accuracy = 100.00%
213. Step 10600, train loss = 0.02, train accuracy = 100.00%
214. Step 10650, train loss = 0.02, train accuracy = 100.00%
215. Step 10700, train loss = 0.03, train accuracy = 100.00%
216. Step 10750, train loss = 0.15, train accuracy = 96.88%
217. Step 10800, train loss = 0.05, train accuracy = 100.00%
218. Step 10850, train loss = 0.02, train accuracy = 100.00%
219. Step 10900, train loss = 0.04, train accuracy = 96.88%
220. Step 10950, train loss = 0.05, train accuracy = 96.88%
221. Step 11000, train loss = 0.02, train accuracy = 100.00%
222. Step 11050, train loss = 0.10, train accuracy = 96.88%
223. Step 11100, train loss = 0.08, train accuracy = 96.88%
224. Step 11150, train loss = 0.02, train accuracy = 100.00%
225. Step 11200, train loss = 0.01, train accuracy = 100.00%
226. Step 11250, train loss = 0.06, train accuracy = 96.88%
227. Step 11300, train loss = 0.18, train accuracy = 93.75%
228. Step 11350, train loss = 0.02, train accuracy = 100.00%
229. Step 11400, train loss = 0.04, train accuracy = 100.00%
230. Step 11450, train loss = 0.03, train accuracy = 100.00%
231. Step 11500, train loss = 0.01, train accuracy = 100.00%
232. Step 11550, train loss = 0.02, train accuracy = 100.00%
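
The log above reports the training loss and accuracy every 50 steps, and checkpoints such as model.ckpt-6000 appear every 2,000 steps. A session loop of roughly the following shape would produce it; this is only a sketch under assumptions: the graph ops train_op, loss, and accuracy are assumed to be built as in the Core Code section and the loss/accuracy sketch that follows it, the input pipeline is assumed to use TensorFlow queue runners, and MAX_STEP and logs_train_dir are illustrative values.

# Hypothetical training-loop sketch (TensorFlow 1.x); `train_op`, `loss`, and
# `accuracy` are assumed to be defined elsewhere in the graph.
import tensorflow as tf

MAX_STEP = 12000                     # assumption; matches the largest checkpoint above
logs_train_dir = './logs/train/'     # assumed checkpoint directory

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)   # assumed queue-based input pipeline
    try:
        for step in range(MAX_STEP + 1):
            _, tra_loss, tra_acc = sess.run([train_op, loss, accuracy])
            if step % 50 == 0:
                # matches the log format shown above
                print('Step %d, train loss = %.2f, train accuracy = %.2f%%'
                      % (step, tra_loss, tra_acc * 100.0))
            if step % 2000 == 0 and step > 0:
                # writes files such as model.ckpt-6000 used in the prediction table
                saver.save(sess, logs_train_dir + 'model.ckpt', global_step=step)
    finally:
        coord.request_stop()
        coord.join(threads)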

Core Code

# Inference graph of the modified AlexNet (TensorFlow 1.x API). `images`,
# `batch_size`, and `n_classes` are assumed to be defined elsewhere (the input
# batch tensor and the hyperparameters).
import tensorflow as tf

# conv1: 3x3 convolution, 16 output channels
# (scope name assumed; the original snippet starts inside this block)
with tf.variable_scope('conv1') as scope:
    weights = tf.get_variable('weights',
                              shape=[3, 3, 3, 16],
                              dtype=tf.float32,
                              initializer=tf.truncated_normal_initializer(stddev=0.1, dtype=tf.float32))
    biases = tf.get_variable('biases',
                             shape=[16],
                             dtype=tf.float32,
                             initializer=tf.constant_initializer(0.1))
    conv = tf.nn.conv2d(images, weights, strides=[1, 1, 1, 1], padding='SAME')
    pre_activation = tf.nn.bias_add(conv, biases)
    conv1 = tf.nn.relu(pre_activation, name=scope.name)

# pooling1 + local response normalization
with tf.variable_scope('pooling1_lrn') as scope:
    pool1 = tf.nn.max_pool(conv1, ksize=[1, 3, 3, 1], strides=[1, 2, 2, 1], padding='SAME', name='pooling1')
    norm1 = tf.nn.lrn(pool1, depth_radius=4, bias=1.0, alpha=0.001 / 9.0, beta=0.75, name='norm1')

# conv2: 3x3 convolution, 16 -> 16 channels
with tf.variable_scope('conv2') as scope:
    weights = tf.get_variable('weights',
                              shape=[3, 3, 16, 16],
                              dtype=tf.float32,
                              initializer=tf.truncated_normal_initializer(stddev=0.1, dtype=tf.float32))
    biases = tf.get_variable('biases',
                             shape=[16],
                             dtype=tf.float32,
                             initializer=tf.constant_initializer(0.1))
    conv = tf.nn.conv2d(norm1, weights, strides=[1, 1, 1, 1], padding='SAME')
    pre_activation = tf.nn.bias_add(conv, biases)
    conv2 = tf.nn.relu(pre_activation, name='conv2')

# normalization + pooling2 (LRN applied before pooling this time)
with tf.variable_scope('pooling2_lrn') as scope:
    norm2 = tf.nn.lrn(conv2, depth_radius=4, bias=1.0, alpha=0.001 / 9.0, beta=0.75, name='norm2')
    pool2 = tf.nn.max_pool(norm2, ksize=[1, 3, 3, 1], strides=[1, 1, 1, 1], padding='SAME', name='pooling2')

# local3: first fully connected layer (flattened features -> 128 units)
with tf.variable_scope('local3') as scope:
    reshape = tf.reshape(pool2, shape=[batch_size, -1])
    dim = reshape.get_shape()[1].value
    weights = tf.get_variable('weights',
                              shape=[dim, 128],
                              dtype=tf.float32,
                              initializer=tf.truncated_normal_initializer(stddev=0.005, dtype=tf.float32))
    biases = tf.get_variable('biases',
                             shape=[128],
                             dtype=tf.float32,
                             initializer=tf.constant_initializer(0.1))
    local3 = tf.nn.relu(tf.matmul(reshape, weights) + biases, name=scope.name)

# local4: second fully connected layer (128 -> 128 units)
with tf.variable_scope('local4') as scope:
    weights = tf.get_variable('weights',
                              shape=[128, 128],
                              dtype=tf.float32,
                              initializer=tf.truncated_normal_initializer(stddev=0.005, dtype=tf.float32))
    biases = tf.get_variable('biases',
                             shape=[128],
                             dtype=tf.float32,
                             initializer=tf.constant_initializer(0.1))
    local4 = tf.nn.relu(tf.matmul(local3, weights) + biases, name='local4')

# softmax_linear: output layer producing the class logits
with tf.variable_scope('softmax_linear') as scope:
    weights = tf.get_variable('softmax_linear',
                              shape=[128, n_classes],
                              dtype=tf.float32,
                              initializer=tf.truncated_normal_initializer(stddev=0.005, dtype=tf.float32))
    biases = tf.get_variable('biases',
                             shape=[n_classes],
                             dtype=tf.float32,
                             initializer=tf.constant_initializer(0.1))
    softmax_linear = tf.add(tf.matmul(local4, weights), biases, name='softmax_linear')
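
The snippet above ends at the softmax_linear logits. A sketch of the loss, optimizer, and evaluation ops that would turn those logits into the train loss and train accuracy printed in the training log is given below; the Adam optimizer, the 0.0001 learning rate, and the labels tensor (integer labels from the input pipeline, 0 = cat, 1 = dog) are assumptions and are not shown in the original article.

# Hypothetical loss / optimizer / evaluation sketch (TensorFlow 1.x), building on
# the `softmax_linear` logits above; `labels` is assumed to be an int32 tensor of
# shape [batch_size] coming from the input pipeline (0 = cat, 1 = dog).
with tf.variable_scope('loss'):
    cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
        logits=softmax_linear, labels=labels, name='xentropy_per_example')
    loss = tf.reduce_mean(cross_entropy, name='loss')

with tf.variable_scope('optimizer'):
    optimizer = tf.train.AdamOptimizer(learning_rate=0.0001)   # optimizer and learning rate are assumptions
    global_step = tf.Variable(0, name='global_step', trainable=False)
    train_op = optimizer.minimize(loss, global_step=global_step)

with tf.variable_scope('accuracy'):
    correct = tf.nn.in_top_k(softmax_linear, labels, 1)
    accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))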

