DL - LSTM: Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (with deeper layers, based on Keras)

Overview: Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (with deeper layers, based on Keras).

 

Contents

Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (with deeper layers, based on Keras)

Design Approach

Output

Core Code


 

 

 

Single-character prediction on the "wonderland" (Alice's Adventures in Wonderland) novel dataset using an LSTM (with deeper layers, based on Keras)

Design Approach

Dataset download: https://download.csdn.net/download/qq_41185868/13767751

 

 

Output

1. Using TensorFlow backend.
2. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:523: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
3.   _np_qint8 = np.dtype([("qint8", np.int8, 1)])
4. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:524: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
5.   _np_quint8 = np.dtype([("quint8", np.uint8, 1)])
6. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
7.   _np_qint16 = np.dtype([("qint16", np.int16, 1)])
8. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:526: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
9.   _np_quint16 = np.dtype([("quint16", np.uint16, 1)])
10. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:527: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
11.   _np_qint32 = np.dtype([("qint32", np.int32, 1)])
12. F:\Program Files\Python\Python36\lib\site-packages\tensorflow\python\framework\dtypes.py:532: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
13.   np_resource = np.dtype([("resource", np.ubyte, 1)])
14. [nltk_data] Error loading punkt: <urlopen error [Errno 11004]
15. [nltk_data]     getaddrinfo failed>
16. raw_text[:10] : alice's ad
17. Total Characters: 144413
18. chars ['\n', ' ', '!', '"', "'", '(', ')', '*', ',', '-', '.', '0', '3', ':', ';', '?', '[', ']', '_', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']
19. Total Vocab: 45
20. sentences 1625 ["alice's adventures in wonderland\n\nlewis carroll\n\nthe millennium fulcrum edition 3.0\n\nchapter i. down the rabbit-hole\n\nalice was beginning to get very tired of sitting by her sister on the\nbank, and of having nothing to do: once or twice she had peeped into the\nbook her sister was reading, but it had no pictures or conversations in\nit, 'and what is the use of a book,' thought alice 'without pictures or\nconversations?'", 'so she was considering in her own mind (as well as she could, for the\nhot day made her feel very sleepy and stupid), whether the pleasure\nof making a daisy-chain would be worth the trouble of getting up and\npicking the daisies, when suddenly a white rabbit with pink eyes ran\nclose by her.', "there was nothing so very remarkable in that; nor did alice think it so\nvery much out of the way to hear the rabbit say to itself, 'oh dear!", 'oh dear!', "i shall be late!'"]
21. lengths (1625,) [420 289 140 ... 636 553   7]
22. CharMapInt_dict 45 {'\n': 0, ' ': 1, '!': 2, '"': 3, "'": 4, '(': 5, ')': 6, '*': 7, ',': 8, '-': 9, '.': 10, '0': 11, '3': 12, ':': 13, ';': 14, '?': 15, '[': 16, ']': 17, '_': 18, 'a': 19, 'b': 20, 'c': 21, 'd': 22, 'e': 23, 'f': 24, 'g': 25, 'h': 26, 'i': 27, 'j': 28, 'k': 29, 'l': 30, 'm': 31, 'n': 32, 'o': 33, 'p': 34, 'q': 35, 'r': 36, 's': 37, 't': 38, 'u': 39, 'v': 40, 'w': 41, 'x': 42, 'y': 43, 'z': 44}
23. IntMapChar_dict 45 {0: '\n', 1: ' ', 2: '!', 3: '"', 4: "'", 5: '(', 6: ')', 7: '*', 8: ',', 9: '-', 10: '.', 11: '0', 12: '3', 13: ':', 14: ';', 15: '?', 16: '[', 17: ']', 18: '_', 19: 'a', 20: 'b', 21: 'c', 22: 'd', 23: 'e', 24: 'f', 25: 'g', 26: 'h', 27: 'i', 28: 'j', 29: 'k', 30: 'l', 31: 'm', 32: 'n', 33: 'o', 34: 'p', 35: 'q', 36: 'r', 37: 's', 38: 't', 39: 'u', 40: 'v', 41: 'w', 42: 'x', 43: 'y', 44: 'z'}
24. dataX: 144313 100 [[19, 30, 27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32], [30, 27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1], [27, 21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38], [21, 23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38, 26], [23, 4, 37, 1, 19, 22, 40, 23, 32, 38, 39, 36, 23, 37, 1, 27, 32, 1, 41, 33, 32, 22, 23, 36, 30, 19, 32, 22, 0, 0, 30, 23, 41, 27, 37, 1, 21, 19, 36, 36, 33, 30, 30, 0, 0, 38, 26, 23, 1, 31, 27, 30, 30, 23, 32, 32, 27, 39, 31, 1, 24, 39, 30, 21, 36, 39, 31, 1, 23, 22, 27, 38, 27, 33, 32, 1, 12, 10, 11, 0, 0, 21, 26, 19, 34, 38, 23, 36, 1, 27, 10, 1, 22, 33, 41, 32, 1, 38, 26, 23]]
25. dataY: 144313 [1, 38, 26, 23, 1]
26. Total patterns: 144313
27. X_train.shape (144313, 100, 1)
28. Y_train.shape (144313, 45)
29. Init data,after read_out, chars: 
30.  144313 alice's adventures in wonderland
31. 
32. lewis carroll
33. 
34. tge millennium fulcrum edition 3.0
35. 
36. cgapter i. down
37. _________________________________________________________________
38. Layer (type)                 Output Shape              Param #   
39. =================================================================
40. F:\File_Jupyter\实用代码\NeuralNetwork(神经网络)\CharacterLanguageLSTM.py:135: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
41.   LSTM_Model.fit(X_train[:train_index], Y_train[:train_index], nb_epoch=10, batch_size=64, callbacks=callbacks_list)
42. lstm_1 (LSTM)                (None, 256)               264192    
43. _________________________________________________________________
44. dropout_1 (Dropout)          (None, 256)               0         
45. _________________________________________________________________
46. dense_1 (Dense)              (None, 45)                11565     
47. =================================================================
48. Total params: 275,757
49. Trainable params: 275,757
50. Non-trainable params: 0
51. _________________________________________________________________
52. LSTM_Model 
53.  None
54. Epoch 1/10
55. 2020-12-23 23:42:07.919094: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
56. 
57.   64/1000 [>.............................] - ETA: 29s - loss: 3.8086
58.  128/1000 [==>...........................] - ETA: 15s - loss: 3.7953
59.  192/1000 [====>.........................] - ETA: 11s - loss: 3.7823
60.  256/1000 [======>.......................] - ETA: 8s - loss: 3.7692 
61.  320/1000 [========>.....................] - ETA: 7s - loss: 3.7552
62.  384/1000 [==========>...................] - ETA: 5s - loss: 3.7372
63.  448/1000 [============>.................] - ETA: 4s - loss: 3.7026
64.  512/1000 [==============>...............] - ETA: 4s - loss: 3.6552
65.  576/1000 [================>.............] - ETA: 3s - loss: 3.5955
66.  640/1000 [==================>...........] - ETA: 2s - loss: 3.5678
67.  704/1000 [====================>.........] - ETA: 2s - loss: 3.5116
68.  768/1000 [======================>.......] - ETA: 1s - loss: 3.4778
69.  832/1000 [=======================>......] - ETA: 1s - loss: 3.4441
70.  896/1000 [=========================>....] - ETA: 0s - loss: 3.4278
71.  960/1000 [===========================>..] - ETA: 0s - loss: 3.4092
72. 1000/1000 [==============================] - 7s 7ms/step - loss: 3.3925
73. 
74. Epoch 00001: loss improved from inf to 3.39249, saving model to hdf5/weights-improvement-01-3.3925.hdf5
75. Epoch 2/10
76. 
77.   64/1000 [>.............................] - ETA: 4s - loss: 3.1429
78.  128/1000 [==>...........................] - ETA: 4s - loss: 3.1370
79.  192/1000 [====>.........................] - ETA: 3s - loss: 3.1034
80.  256/1000 [======>.......................] - ETA: 3s - loss: 3.1038
81.  320/1000 [========>.....................] - ETA: 3s - loss: 3.0962
82.  384/1000 [==========>...................] - ETA: 2s - loss: 3.1055
83.  448/1000 [============>.................] - ETA: 2s - loss: 3.0986
84.  512/1000 [==============>...............] - ETA: 2s - loss: 3.0628
85.  576/1000 [================>.............] - ETA: 2s - loss: 3.0452
86.  640/1000 [==================>...........] - ETA: 1s - loss: 3.0571
87.  704/1000 [====================>.........] - ETA: 1s - loss: 3.0684
88.  768/1000 [======================>.......] - ETA: 1s - loss: 3.0606
89.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0596
90.  896/1000 [=========================>....] - ETA: 0s - loss: 3.0529
91.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0484
92. 1000/1000 [==============================] - 5s 5ms/step - loss: 3.0371
93. 
94. Epoch 00002: loss improved from 3.39249 to 3.03705, saving model to hdf5/weights-improvement-02-3.0371.hdf5
95. Epoch 3/10
96. 
97.   64/1000 [>.............................] - ETA: 4s - loss: 3.1671
98.  128/1000 [==>...........................] - ETA: 4s - loss: 3.0008
99.  192/1000 [====>.........................] - ETA: 4s - loss: 3.0159
100.  256/1000 [======>.......................] - ETA: 4s - loss: 3.0019
101.  320/1000 [========>.....................] - ETA: 3s - loss: 3.0056
102.  384/1000 [==========>...................] - ETA: 3s - loss: 3.0156
103.  448/1000 [============>.................] - ETA: 2s - loss: 3.0392
104.  512/1000 [==============>...............] - ETA: 2s - loss: 3.0243
105.  576/1000 [================>.............] - ETA: 2s - loss: 3.0226
106.  640/1000 [==================>...........] - ETA: 1s - loss: 3.0162
107.  704/1000 [====================>.........] - ETA: 1s - loss: 3.0238
108.  768/1000 [======================>.......] - ETA: 1s - loss: 3.0195
109.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0286
110.  896/1000 [=========================>....] - ETA: 0s - loss: 3.0272
111.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0214
112. 1000/1000 [==============================] - 6s 6ms/step - loss: 3.0225
113. 
114. Epoch 00003: loss improved from 3.03705 to 3.02249, saving model to hdf5/weights-improvement-03-3.0225.hdf5
115. Epoch 4/10
116. 
117.   64/1000 [>.............................] - ETA: 5s - loss: 2.7843
118.  128/1000 [==>...........................] - ETA: 5s - loss: 2.8997
119.  192/1000 [====>.........................] - ETA: 4s - loss: 2.9975
120.  256/1000 [======>.......................] - ETA: 4s - loss: 3.0150
121.  320/1000 [========>.....................] - ETA: 3s - loss: 3.0025
122.  384/1000 [==========>...................] - ETA: 3s - loss: 3.0442
123.  448/1000 [============>.................] - ETA: 3s - loss: 3.0494
124.  512/1000 [==============>...............] - ETA: 2s - loss: 3.0398
125.  576/1000 [================>.............] - ETA: 2s - loss: 3.0170
126.  640/1000 [==================>...........] - ETA: 2s - loss: 3.0421
127.  704/1000 [====================>.........] - ETA: 1s - loss: 3.0366
128.  768/1000 [======================>.......] - ETA: 1s - loss: 3.0339
129.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0316
130.  896/1000 [=========================>....] - ETA: 0s - loss: 3.0361
131.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0326
132. 1000/1000 [==============================] - 6s 6ms/step - loss: 3.0352
133. 
134. Epoch 00004: loss did not improve from 3.02249
135. Epoch 5/10
136. 
137.   64/1000 [>.............................] - ETA: 4s - loss: 2.8958
138.  128/1000 [==>...........................] - ETA: 4s - loss: 2.9239
139.  192/1000 [====>.........................] - ETA: 4s - loss: 2.9044
140.  256/1000 [======>.......................] - ETA: 4s - loss: 2.9417
141.  320/1000 [========>.....................] - ETA: 3s - loss: 2.9674
142.  384/1000 [==========>...................] - ETA: 3s - loss: 2.9646
143.  448/1000 [============>.................] - ETA: 3s - loss: 2.9629
144.  512/1000 [==============>...............] - ETA: 2s - loss: 2.9707
145.  576/1000 [================>.............] - ETA: 2s - loss: 2.9699
146.  640/1000 [==================>...........] - ETA: 1s - loss: 2.9594
147.  704/1000 [====================>.........] - ETA: 1s - loss: 2.9830
148.  768/1000 [======================>.......] - ETA: 1s - loss: 2.9773
149.  832/1000 [=======================>......] - ETA: 0s - loss: 2.9774
150.  896/1000 [=========================>....] - ETA: 0s - loss: 2.9891
151.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0070
152. 1000/1000 [==============================] - 5s 5ms/step - loss: 3.0120
153. 
154. Epoch 00005: loss improved from 3.02249 to 3.01205, saving model to hdf5/weights-improvement-05-3.0120.hdf5
155. Epoch 6/10
156. 
157.   64/1000 [>.............................] - ETA: 4s - loss: 3.0241
158.  128/1000 [==>...........................] - ETA: 4s - loss: 3.0463
159.  192/1000 [====>.........................] - ETA: 3s - loss: 3.0364
160.  256/1000 [======>.......................] - ETA: 3s - loss: 2.9712
161.  320/1000 [========>.....................] - ETA: 3s - loss: 2.9840
162.  384/1000 [==========>...................] - ETA: 3s - loss: 2.9887
163.  448/1000 [============>.................] - ETA: 2s - loss: 2.9785
164.  512/1000 [==============>...............] - ETA: 2s - loss: 2.9852
165.  576/1000 [================>.............] - ETA: 2s - loss: 2.9893
166.  640/1000 [==================>...........] - ETA: 1s - loss: 2.9931
167.  704/1000 [====================>.........] - ETA: 1s - loss: 2.9790
168.  768/1000 [======================>.......] - ETA: 1s - loss: 2.9962
169.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0166
170.  896/1000 [=========================>....] - ETA: 0s - loss: 3.0213
171.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0143
172. 1000/1000 [==============================] - 5s 5ms/step - loss: 3.0070
173. 
174. Epoch 00006: loss improved from 3.01205 to 3.00701, saving model to hdf5/weights-improvement-06-3.0070.hdf5
175. Epoch 7/10
176. 
177.   64/1000 [>.............................] - ETA: 5s - loss: 3.0738
178.  128/1000 [==>...........................] - ETA: 5s - loss: 3.0309
179.  192/1000 [====>.........................] - ETA: 4s - loss: 2.9733
180.  256/1000 [======>.......................] - ETA: 4s - loss: 2.9728
181.  320/1000 [========>.....................] - ETA: 4s - loss: 2.9422
182.  384/1000 [==========>...................] - ETA: 3s - loss: 2.9496
183.  448/1000 [============>.................] - ETA: 3s - loss: 2.9548
184.  512/1000 [==============>...............] - ETA: 3s - loss: 2.9635
185.  576/1000 [================>.............] - ETA: 2s - loss: 2.9614
186.  640/1000 [==================>...........] - ETA: 2s - loss: 2.9537
187.  704/1000 [====================>.........] - ETA: 1s - loss: 2.9454
188.  768/1000 [======================>.......] - ETA: 1s - loss: 2.9649
189.  832/1000 [=======================>......] - ETA: 1s - loss: 2.9814
190.  896/1000 [=========================>....] - ETA: 0s - loss: 2.9955
191.  960/1000 [===========================>..] - ETA: 0s - loss: 2.9948
192. 1000/1000 [==============================] - 6s 6ms/step - loss: 2.9903
193. 
194. Epoch 00007: loss improved from 3.00701 to 2.99027, saving model to hdf5/weights-improvement-07-2.9903.hdf5
195. Epoch 8/10
196. 
197.   64/1000 [>.............................] - ETA: 5s - loss: 2.9248
198.  128/1000 [==>...........................] - ETA: 4s - loss: 2.9293
199.  192/1000 [====>.........................] - ETA: 4s - loss: 2.9820
200.  256/1000 [======>.......................] - ETA: 4s - loss: 3.0261
201.  320/1000 [========>.....................] - ETA: 3s - loss: 2.9989
202.  384/1000 [==========>...................] - ETA: 3s - loss: 3.0101
203.  448/1000 [============>.................] - ETA: 3s - loss: 3.0050
204.  512/1000 [==============>...............] - ETA: 2s - loss: 3.0155
205.  576/1000 [================>.............] - ETA: 2s - loss: 3.0414
206.  640/1000 [==================>...........] - ETA: 2s - loss: 3.0180
207.  704/1000 [====================>.........] - ETA: 1s - loss: 3.0295
208.  768/1000 [======================>.......] - ETA: 1s - loss: 2.9996
209.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0151
210.  896/1000 [=========================>....] - ETA: 0s - loss: 3.0201
211.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0063
212. 1000/1000 [==============================] - 6s 6ms/step - loss: 3.0064
213. 
214. Epoch 00008: loss did not improve from 2.99027
215. Epoch 9/10
216. 
217.   64/1000 [>.............................] - ETA: 4s - loss: 2.8417
218.  128/1000 [==>...........................] - ETA: 4s - loss: 2.9652
219.  192/1000 [====>.........................] - ETA: 4s - loss: 2.9907
220.  256/1000 [======>.......................] - ETA: 3s - loss: 3.0133
221.  320/1000 [========>.....................] - ETA: 3s - loss: 3.0092
222.  384/1000 [==========>...................] - ETA: 3s - loss: 3.0139
223.  448/1000 [============>.................] - ETA: 2s - loss: 3.0453
224.  512/1000 [==============>...............] - ETA: 2s - loss: 3.0481
225.  576/1000 [================>.............] - ETA: 2s - loss: 3.0434
226.  640/1000 [==================>...........] - ETA: 1s - loss: 3.0158
227.  704/1000 [====================>.........] - ETA: 1s - loss: 3.0141
228.  768/1000 [======================>.......] - ETA: 1s - loss: 3.0203
229.  832/1000 [=======================>......] - ETA: 0s - loss: 3.0068
230.  896/1000 [=========================>....] - ETA: 0s - loss: 2.9980
231.  960/1000 [===========================>..] - ETA: 0s - loss: 3.0016
232. 1000/1000 [==============================] - 5s 5ms/step - loss: 2.9944
233. 
234. Epoch 00009: loss did not improve from 2.99027
235. Epoch 10/10
236. 
237.   64/1000 [>.............................] - ETA: 4s - loss: 3.0100
238.  128/1000 [==>...........................] - ETA: 4s - loss: 3.0620
239.  192/1000 [====>.........................] - ETA: 4s - loss: 3.0169
240.  256/1000 [======>.......................] - ETA: 3s - loss: 3.0289
241.  320/1000 [========>.....................] - ETA: 3s - loss: 3.0060
242.  384/1000 [==========>...................] - ETA: 3s - loss: 2.9940
243.  448/1000 [============>.................] - ETA: 2s - loss: 2.9823
244.  512/1000 [==============>...............] - ETA: 2s - loss: 2.9686
245.  576/1000 [================>.............] - ETA: 2s - loss: 2.9699
246.  640/1000 [==================>...........] - ETA: 1s - loss: 2.9710
247.  704/1000 [====================>.........] - ETA: 1s - loss: 2.9625
248.  768/1000 [======================>.......] - ETA: 1s - loss: 2.9748
249.  832/1000 [=======================>......] - ETA: 0s - loss: 2.9794
250.  896/1000 [=========================>....] - ETA: 0s - loss: 2.9788
251.  960/1000 [===========================>..] - ETA: 0s - loss: 2.9802
252. 1000/1000 [==============================] - 5s 5ms/step - loss: 2.9963
253. 
254. Epoch 00010: loss did not improve from 2.99027
255. LSTM_Pre_word.shape: 
256.  (3, 45)
257. after LSTM read_out, chars: 
258.  3 ["\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n '\n\n!!\n\n\n\n !\n\n! ' \n\n\n\n\n", "\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n '\n\n!!\n\n\n\n !\n\n! ' \n\n\n\n\n", '\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n "\n\n!!\n\n \n !\n\n! \' \n\n\n\n\n']
259. LSTM_Model,Seed:
260. " ent down its head to hide a smile: some of the other birds
261. tittered audibly.
262. 
263. 'what i was going to s "
264. 199 100
265. 
266.  Generated Sequence:
267. 
268. 
269.  Done.
270. _________________________________________________________________
271. Layer (type)                 Output Shape              Param #   
272. =================================================================
273. lstm_2 (LSTM)                (None, 100, 256)          264192
274. _________________________________________________________________
275. dropout_2 (Dropout)          (None, 100, 256)          0
276. _________________________________________________________________
277. lstm_3 (LSTM)                (None, 64)                82176
278. _________________________________________________________________
279. dropout_3 (Dropout)          (None, 64)                0
280. _________________________________________________________________
281. dense_2 (Dense)              (None, 45)                2925
282. =================================================================
283. Total params: 349,293
284. Trainable params: 349,293
285. Non-trainable params: 0
286. _________________________________________________________________
287. DeepLSTM_Model 
288. None
289. F:\File_Jupyter\实用代码\NeuralNetwork(神经网络)\CharacterLanguageLSTM.py:246: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
290.   DeepLSTM_Model.fit(X_train[:train_index], Y_train[:train_index], nb_epoch=2, batch_size=256, callbacks=callbacks_list)
291. Epoch 1/2
292. 
293. 256/1000 [======>.......................] - ETA: 11s - loss: 3.8128
294. 512/1000 [==============>...............] - ETA: 5s - loss: 3.8058
295. 768/1000 [======================>.......] - ETA: 2s - loss: 3.7976
296. 1000/1000 [==============================] - 10s 10ms/step - loss: 3.7883
297. 
298. Epoch 00001: loss improved from inf to 3.78827, saving model to hdf5/weights-improvement-01-3.7883.hdf5
299. Epoch 2/2
300. 
301. 256/1000 [======>.......................] - ETA: 5s - loss: 3.7167
302. 512/1000 [==============>...............] - ETA: 4s - loss: 3.6880
303. 768/1000 [======================>.......] - ETA: 1s - loss: 3.6622
304. 1000/1000 [==============================] - 8s 8ms/step - loss: 3.6151
305. 
306. Epoch 00002: loss improved from 3.78827 to 3.61512, saving model to hdf5/weights-improvement-02-3.6151.hdf5
307. DeepLSTM_Pre_word.shape: 
308.  (3, 45)
309. after DeepLSTM read_out, chars: 
310. 3 ["\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n '\n\n!!\n\n\n\n !\n\n! ' \n\n\n\n\n", "\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n '\n\n!!\n\n\n\n !\n\n! ' \n\n\n\n\n", '\n,\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n!\n\n "\n\n!!\n\n \n !\n\n! \' \n\n\n\n\n']
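The shapes and dictionaries printed at the top of this log (Total Vocab: 45, a 100-character window, X_train of shape (144313, 100, 1), one-hot Y_train of shape (144313, 45)) come from the data-preparation step. Below is a minimal sketch of that step, reconstructed from the logged values; the file path and variable names such as raw_text and seq_length are assumptions rather than the original code.

import numpy as np
from keras.utils import np_utils

# Read and lower-case the novel text (the path is an assumption)
raw_text = open('wonderland.txt', encoding='utf-8').read().lower()

# Build character <-> integer mappings (45 distinct characters in the log)
chars = sorted(set(raw_text))
CharMapInt_dict = {c: i for i, c in enumerate(chars)}
IntMapChar_dict = {i: c for i, c in enumerate(chars)}

# Slide a 100-character window over the text: each window is one input
# sequence (dataX) and the character that follows it is the target (dataY)
seq_length = 100
dataX, dataY = [], []
for i in range(len(raw_text) - seq_length):
    seq_in = raw_text[i:i + seq_length]
    seq_out = raw_text[i + seq_length]
    dataX.append([CharMapInt_dict[c] for c in seq_in])
    dataY.append(CharMapInt_dict[seq_out])

# Reshape to [samples, time steps, features], scale to [0, 1], and one-hot
# encode the targets -> shapes (144313, 100, 1) and (144313, 45) as logged
X_train = np.reshape(dataX, (len(dataX), seq_length, 1)) / float(len(chars))
Y_train = np_utils.to_categorical(dataY)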

 

 

Core Code

# Model 1 -- single-layer LSTM: 256 units over (seq_length, 1) inputs,
# softmax over the 45-character vocabulary
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Embedding

LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print('LSTM_Model \n', LSTM_Model.summary())
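After compiling, the log above shows this model being fit on the first 1,000 patterns with a ModelCheckpoint callback that saves to hdf5/weights-improvement-*.hdf5. The following is a minimal sketch of that training call, assuming the checkpoint settings implied by the log (the exact arguments are not shown in the original post).

from keras.callbacks import ModelCheckpoint

# Save weights whenever the training loss improves; the filename pattern
# matches the "hdf5/weights-improvement-..." files in the log
filepath = "hdf5/weights-improvement-{epoch:02d}-{loss:.4f}.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')
callbacks_list = [checkpoint]

# The log trains on only the first 1000 patterns; `epochs` replaces the
# deprecated `nb_epoch` argument that triggers the UserWarning in the log
train_index = 1000
LSTM_Model.fit(X_train[:train_index], Y_train[:train_index],
               epochs=10, batch_size=64, callbacks=callbacks_list)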
# Model 2 -- LSTM with a learned character embedding instead of scaled raw
# integers; chars_len is the vocabulary size (45) and seq_length is the
# window length (100)
embedding_vector_length = 32
LSTMWithE_Model = Sequential()
LSTMWithE_Model.add(Embedding(chars_len, embedding_vector_length, input_length=seq_length))
LSTMWithE_Model.add(LSTM(256))
LSTMWithE_Model.add(Dropout(0.2))
LSTMWithE_Model.add(Dense(Y_train.shape[1], activation='softmax'))
LSTMWithE_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print(LSTMWithE_Model.summary())
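Note that this embedding-based variant takes integer character indices of shape (samples, seq_length) rather than the normalized 3-D X_train used by the other two models. A hedged sketch of how it could be fed follows; this call does not appear in the original post.

import numpy as np

# Integer-coded windows, shape (samples, 100); the Embedding layer does the
# lookup, so no scaling to [0, 1] is needed here
X_train_int = np.array(dataX)
LSTMWithE_Model.fit(X_train_int[:1000], Y_train[:1000], epochs=2, batch_size=64)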
# Model 3 -- deeper (stacked) LSTM: the first LSTM returns full sequences so
# that a second LSTM layer can be stacked on top of it
DeepLSTM_Model = Sequential()
DeepLSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2]), return_sequences=True))
DeepLSTM_Model.add(Dropout(0.2))
DeepLSTM_Model.add(LSTM(64))
DeepLSTM_Model.add(Dropout(0.2))
DeepLSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))
DeepLSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print('DeepLSTM_Model \n', DeepLSTM_Model.summary())
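The "Seed" and "Generated Sequence" sections of the log come from sampling the trained model one character at a time. Below is a minimal sketch of such a generation loop, assuming greedy argmax decoding and the dataX / IntMapChar_dict structures from the data-preparation sketch above; the original post's exact implementation is not shown.

import numpy as np

# Pick a random 100-character window from the corpus as the seed
start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])
print('Seed:\n"', ''.join(IntMapChar_dict[v] for v in pattern), '"')

# Generate 100 characters, feeding each prediction back into the window
for _ in range(100):
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(chars))
    prediction = LSTM_Model.predict(x, verbose=0)
    index = int(np.argmax(prediction))   # greedy: most probable next char
    print(IntMapChar_dict[index], end='')
    pattern.append(index)
    pattern = pattern[1:]                # slide the window forward
print('\nDone.')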

 

