DL - LSTM: Word Prediction on the wonderland (Alice's Adventures in Wonderland) Novel Dataset with a Keras-Based LSTM

Overview: Train a Keras-based LSTM on the wonderland (Alice's Adventures in Wonderland) novel dataset to predict the next word.

 

Contents

Word Prediction on the wonderland (Alice's Adventures in Wonderland) Novel Dataset with a Keras-Based LSTM

Design Approach

Output

Core Code

Word Prediction on the wonderland (Alice's Adventures in Wonderland) Novel Dataset with a Keras-Based LSTM

Design Approach

To be updated…

Output

rawtext_BySpaceConnect: ALICE'S ADVENTURES IN WONDERLAND  Lewis Carroll  THE MILLENNIUM FULCRUM EDITION 3.0  CHAPTER I. Down the Rabbit-Hole  Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?'  So she was considering in her own mind (as well as she could, for the hot day 
rawtext2WordLists: ["ALICE'S", 'ADVENTURES', 'IN', 'WONDERLAND', 'Lewis', 'Carroll', 'THE', 'MILLENNIUM', 'FULCRUM', 'EDITION', '3.0', 'CHAPTER', 'I', 'Down', 'the', 'Rabbit-Hole', 'Alice', 'was', 'beginning', 'to', 'get', 'very', 'tired', 'of', 'sitting', 'by', 'her', 'sister', 'on', 'the', 'bank', 'and', 'of', 'having', 'nothing', 'to', 'do', 'once', 'or', 'twice', 'she', 'had', 'peeped', 'into', 'the', 'book', 'her', 'sister', 'was', 'reading', 'but', 'it', 'had', 'no', 'pictures', 'or', 'conversations', 'in', 'it', 'and', 'what', 'is', 'the', 'use', 'of', 'a', 'book', 'thought', 'Alice', 'without', 'pictures', 'or', 'conversations', 'So', 'she', 'was', 'considering', 'in', 'her', 'own', 'mind', 'as', 'well', 'as', 'she', 'could', 'for', 'the', 'hot', 'day', 'made', 'her', 'feel', 'very', 'sleepy', 'and', 'stupid', 'whether', 'the', 'pleasure', 'of', 'making', 'a', 'daisy-chain', 'would', 'be', 'worth', 'the', 'trouble', 'of', 'getting', 'up', 'and', 'picking', 'the', 'daisies', 'when', 'suddenly', 'a', 'White', 'Rabbit', 'with', 'pink', 'eyes', 'ran', 'close', 'by', 'her', 'There', 'was', 'nothing', 'so', 'VERY', 'remarkable', 'in', 'that', 'nor', 'did', 'Alice', 'think', 'it', 'so', 'VERY', 'much', 'out', 'of', 'the', 'way', 'to', 'hear', 'the', 'Rabbit', 'say', 'to', 'itself', 'Oh', 'dear', 'Oh', 'dear', 'I', 'shall', 'be', 'late', 'when', 'she', 'thought', 'it', 'over', 'afterwards', 'it', 'occurred', 'to', 'her', 'that', 'she', 'ought', 'to', 'have', 'wondered', 'at', 'this', 'but', 'at', 'the', 'time', 'it', 'all', 'seemed', 'quite', 'natural', 'but', 'when', 'the', 'Rabbit', 'actually', 'TOOK', 'A', 'WATCH', 'OUT', 'OF', 'ITS', 'WAISTCOAT-POCKET', 'and', 'looked', 'at', 'it', 'and', 'then', 'hurried', 'on', 'Alice', 'started', 'to', 'her', 'feet', 'for', 'it', 'flashed', 'across', 'her', 'mind', 'that', 'she', 'had', 'never', 'before', 'seen', 'a', 'rabbit', 'with', 'either', 'a', 'waistcoat-pocket', 'or', 'a', 'watch', 'to', 'take', 'out', 'of', 'it', 'and', 'burning', 'with', 'curiosity', 'she', 'ran', 'across', 'the', 'field', 'after', 'it', 'and', 'fortunately', 'was', 'just', 'in', 'time', 'to', 'see', 'it', 'pop', 'down', 'a', 'large', 'rabbit-hole', 'under', 'the', 'hedge', 'In', 'another', 'moment', 'down', 'went', 'Alice', 'after', 'it', 'never', 'once', 'considering', 'how', 'in', 'the', 'world', 'she', 'was', 'to', 'get', 'out', 'again', 'The', 'rabbit-hole', 'went', 'straight', 'on', 'like', 'a', 'tunnel', 'for', 'some', 'way', 'and', 'then', 'dipped', 'suddenly', 'down', 'so', 'suddenly', 'that', 'Alice', 'had', 'not', 'a', 'moment', 'to', 'think', 'about', 'stopping', 'herself', 'before', 'she', 'found', 'herself', 'falling', 'down', 'a', 'very', 'deep', 'well', 'Either', 'the', 'well', 'was', 'very', 'deep', 'or', 'she', 'fell', 'very', 'slowly', 'for', 'she', 'had', 'plenty', 'of', 'time', 'as', 'she', 'went', 'down', 'to', 'look', 'about', 'her', 'and', 'to', 'wonder', 'what', 'was', 'going', 'to', 'happen', 'next', 'First', 'she', 'tried', 'to', 'look', 'down', 'and', 'make', 'out', 'what', 'she', 'was', 'coming', 'to', 'but', 'it', 'was', 'too', 'dark', 'to', 'see', 'anything', 'then', 'she', 'looked', 'at', 'the', 'sides', 'of', 'the', 'well', 'and', 'noticed', 'that', 'they', 'were', 'filled', 'with', 'cupboards', 'and', 'book-shelves', 'here', 'and', 'there', 'she', 'saw', 'maps', 'and', 'pictures', 'hung', 'upon', 'pegs', 'She', 'took', 'down', 'a', 'jar', 'from', 'one', 'of', 'the', 'shelves', 'as', 'she', 'passed', 'it', 'was', 'labelled', 'ORANGE', 
'MARMALADE', 'but', 'to', 'her', 'great', 'disappointment', 'it', 'was', 'empty', 'she', 'did', 'not', 'like', 'to', 'drop', 'the', 'jar', 'for', 'fear', 'of', 'killing', 'somebody', 'so', 'managed', 'to', 'put', 'it', 'into', 'one', 'of', 'the', 'cupboards', 'as', 'she', 'fell', 'past', 'it', 'Well', 'thought', 'Alice', 'to', 'herself', 'after', 'such', 'a', 'fall', 'as', 'this', 'I', 'shall', 'think', 'nothing', 'of', 'tumbling', 'down', 'stairs', 'How', 'brave', "they'll", 'all', 'think', 'me', 'at', 'home', 'Why', 'I', "wouldn't", 'say']
rawtext_BySpace: ALICE'S ADVENTURES IN WONDERLAND Lewis Carroll THE MILLENNIUM FULCRUM EDITION 3.0 CHAPTER I Down the Rabbit Hole Alice was beginning to get very tired of sitting by her sister on the bank and of having nothing to do once or twice she had peeped into the book her sister was reading but it had no pictures or conversations in it and what is the use of a book thought Alice without pictures or conversations So she was considering in her own mind as well as she could for the hot day made her feel very
words_num: 26694
vocab_num: 3063
dataX: 26594 100 [[19, 18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713], [18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144], [238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006], [547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851], [278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851, 594]]
dataY: 26594 [2144, 2006, 1851, 594, 1074]
Total patterns: 26594
X_train.shape (26594, 100, 1)
Y_train.shape (26594, 3063)
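
These shapes come from a sliding-window preprocessing step: every run of 100 consecutive words becomes one input pattern and the word that follows it becomes the label, which is why 26694 words yield 26694 - 100 = 26594 patterns. Below is a minimal sketch of that step, assuming the older Keras np_utils API; the file name, variable names, and tokenization details (the blog also strips punctuation, omitted here) are assumptions, not the author's exact code.

import numpy as np
from keras.utils import np_utils

# Load the novel and split it into word tokens (punctuation stripping omitted)
raw_text = open('wonderland.txt').read()
word_list = raw_text.replace('\n', ' ').split()     # -> words_num tokens (26694)

# Index the sorted vocabulary so every word maps to a unique integer
vocab = sorted(set(word_list))                      # -> vocab_num words (3063)
word_to_int = {w: i for i, w in enumerate(vocab)}

# Slide a 100-word window over the text: 100 input words -> 1 target word
seq_length = 100
dataX, dataY = [], []
for i in range(len(word_list) - seq_length):
    dataX.append([word_to_int[w] for w in word_list[i:i + seq_length]])
    dataY.append(word_to_int[word_list[i + seq_length]])

# (samples, timesteps, features) inputs scaled into [0, 1); one-hot targets
X_train = np.reshape(dataX, (len(dataX), seq_length, 1)) / float(len(vocab))
Y_train = np_utils.to_categorical(dataY)            # shape (26594, 3063)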
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 256)               264192
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 3063)              787191
=================================================================
Total params: 1,051,383
Trainable params: 1,051,383
Non-trainable params: 0
_________________________________________________________________
LSTM_Model 
None
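
A quick check on the parameter counts above: the LSTM layer holds 4 × 256 × (256 + 1 + 1) = 264,192 weights (four gates, each connected to the 256 recurrent units, the single input feature, and a bias), and the softmax layer holds 3063 × (256 + 1) = 787,191, which together give the 1,051,383 total.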

……

Epoch 00005: loss improved from 6.26403 to 6.26198, saving model to hdf5/word-weights-improvement-05-6.2620.hdf5
Epoch 6/10

128/26594 [..............................] - ETA: 2:09 - loss: 6.8378
256/26594 [..............................] - ETA: 2:06 - loss: 6.4136
384/26594 [..............................] - ETA: 2:01 - loss: 6.3299
512/26594 [..............................] - ETA: 1:57 - loss: 6.4469
640/26594 [..............................] - ETA: 1:57 - loss: 6.4133

……

Epoch 00008: loss improved from 6.25725 to 6.25487, saving model to hdf5/word-weights-improvement-08-6.2549.hdf5
Epoch 9/10

128/26594 [..............................] - ETA: 1:57 - loss: 6.2336
256/26594 [..............................] - ETA: 2:02 - loss: 6.1897
384/26594 [..............................] - ETA: 2:04 - loss: 6.3229
512/26594 [..............................] - ETA: 2:01 - loss: 6.3550
640/26594 [..............................] - ETA: 2:02 - loss: 6.3279
768/26594 [..............................] - ETA: 2:05 - loss: 6.2614
896/26594 [>.............................] - ETA: 2:06 - loss: 6.2433
1024/26594 [>.............................] - ETA: 2:07 - loss: 6.2477
……

25216/26594 [===========================>..] - ETA: 6s - loss: 6.2456
25344/26594 [===========================>..] - ETA: 6s - loss: 6.2469
25472/26594 [===========================>..] - ETA: 5s - loss: 6.2477
25600/26594 [===========================>..] - ETA: 4s - loss: 6.2486
25728/26594 [============================>.] - ETA: 4s - loss: 6.2480
25856/26594 [============================>.] - ETA: 3s - loss: 6.2483
25984/26594 [============================>.] - ETA: 2s - loss: 6.2487
26112/26594 [============================>.] - ETA: 2s - loss: 6.2485
26240/26594 [============================>.] - ETA: 1s - loss: 6.2483
26368/26594 [============================>.] - ETA: 1s - loss: 6.2482
26496/26594 [============================>.] - ETA: 0s - loss: 6.2485
26594/26594 [==============================] - 129s 5ms/step - loss: 6.2499

Epoch 00009: loss improved from 6.25487 to 6.24987, saving model to hdf5/word-weights-improvement-09-6.2499.hdf5
Epoch 10/10

128/26594 [..............................] - ETA: 1:56 - loss: 6.4864
256/26594 [..............................] - ETA: 2:04 - loss: 6.2577
384/26594 [..............................] - ETA: 2:07 - loss: 6.2857
512/26594 [..............................] - ETA: 2:10 - loss: 6.3230
……

25856/26594 [============================>.] - ETA: 3s - loss: 6.2426
25984/26594 [============================>.] - ETA: 3s - loss: 6.2447
26112/26594 [============================>.] - ETA: 2s - loss: 6.2446
26240/26594 [============================>.] - ETA: 1s - loss: 6.2449
26368/26594 [============================>.] - ETA: 1s - loss: 6.2467
26496/26594 [============================>.] - ETA: 0s - loss: 6.2461
26594/26594 [==============================] - 135s 5ms/step - loss: 6.2465

Epoch 00010: loss improved from 6.24987 to 6.24646, saving model to hdf5/word-weights-improvement-10-6.2465.hdf5
LSTM_Pre_word.shape: 
 (3, 3063)

LSTM_Model, Seed:
" cheerfully he seems to grin How neatly spread his claws And welcome little fishes in With gently smiling jaws I'm sure those are not the right words said poor Alice and her eyes filled with tears again as she went on I must be Mabel after all and I shall have to go and live in that poky little house and have next to no toys to play with and oh ever so many lessons to learn No I've made up my mind about it if I'm Mabel I'll stay down here It'll be no use their putting their heads "
199 100

 Generated Sequence:
the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the

 Done.
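
The degenerate output is what a loss of about 6.25 predicts: the network has learned little beyond raw word frequencies, so greedy argmax decoding keeps emitting the most common word, 'the' (more epochs, an Embedding layer, or sampling from the softmax instead of taking the argmax are the usual remedies). Below is a minimal sketch of such a generation loop, reusing the hypothetical dataX, vocab, and word_to_int names from the preprocessing sketch above; it is an illustration, not the author's exact code.

import numpy as np

int_to_word = {i: w for w, i in word_to_int.items()}

# Pick a random 100-word training window as the seed
start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])
print('Seed:\n"', ' '.join(int_to_word[i] for i in pattern), '"')

# Generate 200 words, feeding each prediction back into the window
generated = []
for _ in range(200):
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(vocab))
    probs = LSTM_Model.predict(x, verbose=0)[0]   # distribution over the 3063 words
    index = int(np.argmax(probs))                 # greedy (argmax) decoding
    generated.append(int_to_word[index])
    pattern = pattern[1:] + [index]               # slide the window forward one word
print(' '.join(generated))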

 

 

 

Core Code

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM

# Single LSTM layer (256 units) over (timesteps=100, features=1) input windows
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))                                    # regularization
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))   # one output per vocabulary word
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
# summary() prints the table itself and returns None, hence the trailing "None" in the output above
print('LSTM_Model \n', LSTM_Model.summary())
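
The checkpoint messages in the training log come from a ModelCheckpoint callback that saves the weights whenever the training loss improves. A minimal sketch of the training call follows; the file-name pattern mirrors the names in the log, and batch_size=128 is inferred from the 128-sample steps in the progress bars, so treat both as assumptions.

from keras.callbacks import ModelCheckpoint

# Encode epoch and loss in the file name, e.g. word-weights-improvement-10-6.2465.hdf5
filepath = 'hdf5/word-weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')

# 10 epochs with mini-batches of 128, matching the progress bars in the output
LSTM_Model.fit(X_train, Y_train, epochs=10, batch_size=128, callbacks=[checkpoint])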

 

