DL LSTM: Word-Level Prediction on the "Alice's Adventures in Wonderland" Novel Dataset with an LSTM (Keras)

Summary: Train a word-level LSTM (Keras) on the text of "Alice's Adventures in Wonderland" and use it to predict the next word in a sequence.

Output

The run below loads the novel, tokenizes it into a word list, encodes each word as an integer, builds 100-word input windows, trains a single-layer LSTM for 10 epochs, and finally generates text from a 100-word seed.

rawtext_BySpaceConnect: ALICE'S ADVENTURES IN WONDERLAND  Lewis Carroll  THE MILLENNIUM FULCRUM EDITION 3.0  CHAPTER I. Down the Rabbit-Hole  Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?'  So she was considering in her own mind (as well as she could, for the hot day

rawtext2WordLists: ["ALICE'S", 'ADVENTURES', 'IN', 'WONDERLAND', 'Lewis', 'Carroll', 'THE', 'MILLENNIUM', 'FULCRUM', 'EDITION', '3.0', 'CHAPTER', 'I', 'Down', 'the', 'Rabbit-Hole', 'Alice', 'was', 'beginning', 'to', 'get', 'very', 'tired', 'of', 'sitting', 'by', 'her', 'sister', 'on', 'the', 'bank', 'and', 'of', 'having', 'nothing', 'to', 'do', 'once', 'or', 'twice', 'she', 'had', 'peeped', 'into', 'the', 'book', 'her', 'sister', 'was', 'reading', 'but', 'it', 'had', 'no', 'pictures', 'or', 'conversations', 'in', 'it', 'and', 'what', 'is', 'the', 'use', 'of', 'a', 'book', 'thought', 'Alice', 'without', 'pictures', 'or', 'conversations', 'So', 'she', 'was', 'considering', 'in', 'her', 'own', 'mind', 'as', 'well', 'as', 'she', 'could', 'for', 'the', 'hot', 'day', 'made', 'her', 'feel', 'very', 'sleepy', 'and', 'stupid', 'whether', 'the', 'pleasure', 'of', 'making', 'a', 'daisy-chain', 'would', 'be', 'worth', 'the', 'trouble', 'of', 'getting', 'up', 'and', 'picking', 'the', 'daisies', 'when', 'suddenly', 'a', 'White', 'Rabbit', 'with', 'pink', 'eyes', 'ran', 'close', 'by', 'her', 'There', 'was', 'nothing', 'so', 'VERY', 'remarkable', 'in', 'that', 'nor', 'did', 'Alice', 'think', 'it', 'so', 'VERY', 'much', 'out', 'of', 'the', 'way', 'to', 'hear', 'the', 'Rabbit', 'say', 'to', 'itself', 'Oh', 'dear', 'Oh', 'dear', 'I', 'shall', 'be', 'late', 'when', 'she', 'thought', 'it', 'over', 'afterwards', 'it', 'occurred', 'to', 'her', 'that', 'she', 'ought', 'to', 'have', 'wondered', 'at', 'this', 'but', 'at', 'the', 'time', 'it', 'all', 'seemed', 'quite', 'natural', 'but', 'when', 'the', 'Rabbit', 'actually', 'TOOK', 'A', 'WATCH', 'OUT', 'OF', 'ITS', 'WAISTCOAT-POCKET', 'and', 'looked', 'at', 'it', 'and', 'then', 'hurried', 'on', 'Alice', 'started', 'to', 'her', 'feet', 'for', 'it', 'flashed', 'across', 'her', 'mind', 'that', 'she', 'had', 'never', 'before', 'seen', 'a', 'rabbit', 'with', 'either', 'a', 'waistcoat-pocket', 'or', 'a', 'watch', 'to', 'take', 'out', 'of', 'it', 'and', 'burning', 'with', 'curiosity', 'she', 'ran', 'across', 'the', 'field', 'after', 'it', 'and', 'fortunately', 'was', 'just', 'in', 'time', 'to', 'see', 'it', 'pop', 'down', 'a', 'large', 'rabbit-hole', 'under', 'the', 'hedge', 'In', 'another', 'moment', 'down', 'went', 'Alice', 'after', 'it', 'never', 'once', 'considering', 'how', 'in', 'the', 'world', 'she', 'was', 'to', 'get', 'out', 'again', 'The', 'rabbit-hole', 'went', 'straight', 'on', 'like', 'a', 'tunnel', 'for', 'some', 'way', 'and', 'then', 'dipped', 'suddenly', 'down', 'so', 'suddenly', 'that', 'Alice', 'had', 'not', 'a', 'moment', 'to', 'think', 'about', 'stopping', 'herself', 'before', 'she', 'found', 'herself', 'falling', 'down', 'a', 'very', 'deep', 'well', 'Either', 'the', 'well', 'was', 'very', 'deep', 'or', 'she', 'fell', 'very', 'slowly', 'for', 'she', 'had', 'plenty', 'of', 'time', 'as', 'she', 'went', 'down', 'to', 'look', 'about', 'her', 'and', 'to', 'wonder', 'what', 'was', 'going', 'to', 'happen', 'next', 'First', 'she', 'tried', 'to', 'look', 'down', 'and', 'make', 'out', 'what', 'she', 'was', 'coming', 'to', 'but', 'it', 'was', 'too', 'dark', 'to', 'see', 'anything', 'then', 'she', 'looked', 'at', 'the', 'sides', 'of', 'the', 'well', 'and', 'noticed', 'that', 'they', 'were', 'filled', 'with', 'cupboards', 'and', 'book-shelves', 'here', 'and', 'there', 'she', 'saw', 'maps', 'and', 'pictures', 'hung', 'upon', 'pegs', 'She', 'took', 'down', 'a', 'jar', 'from', 'one', 'of', 'the', 'shelves', 'as', 'she', 'passed', 'it', 'was', 'labelled', 'ORANGE', 
'MARMALADE', 'but', 'to', 'her', 'great', 'disappointment', 'it', 'was', 'empty', 'she', 'did', 'not', 'like', 'to', 'drop', 'the', 'jar', 'for', 'fear', 'of', 'killing', 'somebody', 'so', 'managed', 'to', 'put', 'it', 'into', 'one', 'of', 'the', 'cupboards', 'as', 'she', 'fell', 'past', 'it', 'Well', 'thought', 'Alice', 'to', 'herself', 'after', 'such', 'a', 'fall', 'as', 'this', 'I', 'shall', 'think', 'nothing', 'of', 'tumbling', 'down', 'stairs', 'How', 'brave', "they'll", 'all', 'think', 'me', 'at', 'home', 'Why', 'I', "wouldn't", 'say']

rawtext_BySpace: ALICE'S ADVENTURES IN WONDERLAND Lewis Carroll THE MILLENNIUM FULCRUM EDITION 3.0 CHAPTER I Down the Rabbit Hole Alice was beginning to get very tired of sitting by her sister on the bank and of having nothing to do once or twice she had peeped into the book her sister was reading but it had no pictures or conversations in it and what is the use of a book thought Alice without pictures or conversations So she was considering in her own mind as well as she could for the hot day made her feel very

words_num: 26694

vocab_num: 3063
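The preprocessing code that produces the outputs above is not shown in the post. A minimal sketch that yields a newline-collapsed raw string, a punctuation-trimmed word list, and word/index lookup tables might look like the following; the filename wonderland.txt and the exact tokenization rule are assumptions:

import re

# Load the novel and collapse newlines/whitespace into single spaces
# (-> rawtext_BySpaceConnect); "wonderland.txt" is an assumed filename.
raw_text = open('wonderland.txt', encoding='utf-8').read()
rawtext_BySpaceConnect = ' '.join(raw_text.split())

# Tokenize into words, keeping in-word apostrophes, hyphens, and digits
# (-> rawtext2WordLists); the post's exact rule is not shown.
words = re.findall(r"[A-Za-z0-9'.-]+", rawtext_BySpaceConnect)
words = [w.strip(".,'\"") for w in words]   # trim punctuation stuck to word edges
words = [w for w in words if w]
rawtext_BySpace = ' '.join(words)

# Integer-encode the vocabulary
word_to_int = {w: i for i, w in enumerate(sorted(set(words)))}
int_to_word = {i: w for w, i in word_to_int.items()}
print('words_num:', len(words))             # 26694 in the run above
print('vocab_num:', len(word_to_int))       # 3063 in the run above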

dataX: 26594 100 [[19, 18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713], [18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144], [238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006], [547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851], [278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851, 594]]

dataY: 26594 [2144, 2006, 1851, 594, 1074]

Total patterns: 26594
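The dataX/dataY dumps above come from a sliding window over the integer-encoded word list: each input is 100 consecutive word indices (note how each row is the previous one shifted by a single word) and the target is the word that follows, giving words_num - seq_length = 26694 - 100 = 26594 patterns. A sketch:

seq_length = 100
dataX, dataY = [], []
for i in range(len(words) - seq_length):
    seq_in = words[i:i + seq_length]               # 100-word context window
    seq_out = words[i + seq_length]                # the word to be predicted
    dataX.append([word_to_int[w] for w in seq_in])
    dataY.append(word_to_int[seq_out])
print('Total patterns:', len(dataX))               # 26594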

X_train.shape (26594, 100, 1)

Y_train.shape (26594, 3063)
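The shapes show the inputs reshaped into Keras's [samples, timesteps, features] layout with a single value per timestep, and the targets one-hot encoded over the 3063-word vocabulary. In this classic recipe the indices are usually also scaled to [0, 1] by dividing by the vocabulary size, though the post does not show this step:

import numpy as np
from keras.utils import np_utils

n_patterns = len(dataX)
# One scaled word index per timestep -> (26594, 100, 1)
X_train = np.reshape(dataX, (n_patterns, seq_length, 1)) / float(len(word_to_int))
# One-hot targets over the vocabulary -> (26594, 3063)
Y_train = np_utils.to_categorical(dataY)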

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_1 (LSTM)                (None, 256)               264192
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 3063)              787191
=================================================================
Total params: 1,051,383
Trainable params: 1,051,383
Non-trainable params: 0
_________________________________________________________________
LSTM_Model
None

……

Epoch 00005: loss improved from 6.26403 to 6.26198, saving model to hdf5/word-weights-improvement-05-6.2620.hdf5

Epoch 6/10

 128/26594 [..............................] - ETA: 2:09 - loss: 6.8378

 256/26594 [..............................] - ETA: 2:06 - loss: 6.4136

 384/26594 [..............................] - ETA: 2:01 - loss: 6.3299

 512/26594 [..............................] - ETA: 1:57 - loss: 6.4469

 640/26594 [..............................] - ETA: 1:57 - loss: 6.4133

……

Epoch 00008: loss improved from 6.25725 to 6.25487, saving model to hdf5/word-weights-improvement-08-6.2549.hdf5

Epoch 9/10

 128/26594 [..............................] - ETA: 1:57 - loss: 6.2336

 256/26594 [..............................] - ETA: 2:02 - loss: 6.1897

 384/26594 [..............................] - ETA: 2:04 - loss: 6.3229

 512/26594 [..............................] - ETA: 2:01 - loss: 6.3550

 640/26594 [..............................] - ETA: 2:02 - loss: 6.3279

 768/26594 [..............................] - ETA: 2:05 - loss: 6.2614

 896/26594 [>.............................] - ETA: 2:06 - loss: 6.2433

1024/26594 [>.............................] - ETA: 2:07 - loss: 6.2477

……

25216/26594 [===========================>..] - ETA: 6s - loss: 6.2456

25344/26594 [===========================>..] - ETA: 6s - loss: 6.2469

25472/26594 [===========================>..] - ETA: 5s - loss: 6.2477

25600/26594 [===========================>..] - ETA: 4s - loss: 6.2486

25728/26594 [============================>.] - ETA: 4s - loss: 6.2480

25856/26594 [============================>.] - ETA: 3s - loss: 6.2483

25984/26594 [============================>.] - ETA: 2s - loss: 6.2487

26112/26594 [============================>.] - ETA: 2s - loss: 6.2485

26240/26594 [============================>.] - ETA: 1s - loss: 6.2483

26368/26594 [============================>.] - ETA: 1s - loss: 6.2482

26496/26594 [============================>.] - ETA: 0s - loss: 6.2485

26594/26594 [==============================] - 129s 5ms/step - loss: 6.2499

Epoch 00009: loss improved from 6.25487 to 6.24987, saving model to hdf5/word-weights-improvement-09-6.2499.hdf5

Epoch 10/10

 128/26594 [..............................] - ETA: 1:56 - loss: 6.4864

 256/26594 [..............................] - ETA: 2:04 - loss: 6.2577

 384/26594 [..............................] - ETA: 2:07 - loss: 6.2857

 512/26594 [..............................] - ETA: 2:10 - loss: 6.3230

……

25856/26594 [============================>.] - ETA: 3s - loss: 6.2426

25984/26594 [============================>.] - ETA: 3s - loss: 6.2447

26112/26594 [============================>.] - ETA: 2s - loss: 6.2446

26240/26594 [============================>.] - ETA: 1s - loss: 6.2449

26368/26594 [============================>.] - ETA: 1s - loss: 6.2467

26496/26594 [============================>.] - ETA: 0s - loss: 6.2461

26594/26594 [==============================] - 135s 5ms/step - loss: 6.2465

Epoch 00010: loss improved from 6.24987 to 6.24646, saving model to hdf5/word-weights-improvement-10-6.2465.hdf5
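The checkpoint messages in the log and the 128-sample progress increments suggest training with batch_size=128, epochs=10, and a ModelCheckpoint callback that saves whenever the training loss improves; a plausible sketch, with the filename pattern inferred from the log lines:

from keras.callbacks import ModelCheckpoint

# Filename pattern inferred from "hdf5/word-weights-improvement-05-6.2620.hdf5"
filepath = 'hdf5/word-weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')

LSTM_Model.fit(X_train, Y_train, epochs=10, batch_size=128,
               callbacks=[checkpoint])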

LSTM_Pre_word.shape:

(3, 3063)

LSTM_Model,Seed:

" cheerfully he seems to grin How neatly spread his claws And welcome little fishes in With gently smiling jaws I'm sure those are not the right words said poor Alice and her eyes filled with tears again as she went on I must be Mabel after all and I shall have to go and live in that poky little house and have next to no toys to play with and oh ever so many lessons to learn No I've made up my mind about it if I'm Mabel I'll stay down here It'll be no use their putting their heads "

199 100

Generated Sequence:

the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the

Done.
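The generated sequence collapses to the single most frequent word, "the". This is the expected failure mode when training stalls around loss 6.25 after only 10 epochs: under greedy argmax decoding, a barely-trained softmax keeps returning the corpus's most common token. The generation loop itself is not shown in the post; a common sketch under the same recipe (seed with 100 encoded words, predict, take the argmax, slide the window) is:

import numpy as np

# Seed: 100 word indices taken from a random position in dataX
start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])

generated = []
for _ in range(200):
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(word_to_int))
    prediction = LSTM_Model.predict(x, verbose=0)   # shape (1, 3063)
    index = int(np.argmax(prediction))              # greedy decoding
    generated.append(int_to_word[index])
    pattern = pattern[1:] + [index]                 # slide the window by one word
print('Generated Sequence:\n', ' '.join(generated))

Sampling from the softmax distribution instead of taking the argmax (or training for many more epochs, or stacking a second LSTM layer) is the usual way out of this repetition trap.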


Core Code

# Imports needed to make this excerpt runnable (standalone Keras 2.x API)
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

# Single LSTM layer: 256 units over (timesteps=100, features=1) inputs
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))                                   # regularization
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))  # one output per vocabulary word
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')

# summary() prints the table itself and returns None, which is why the
# log above shows a stray "None"; print the label and summary separately
print('LSTM_Model')
LSTM_Model.summary()
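As a check, the parameter counts in the summary follow directly from the architecture: the LSTM layer has 4 × (input_dim + units + 1) × units = 4 × (1 + 256 + 1) × 256 = 264,192 weights, and the Dense layer has 256 × 3063 + 3063 = 787,191, for 1,051,383 parameters in total.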

