DL/LSTM: Word-level prediction with a Keras LSTM on the Alice's Adventures in Wonderland novel dataset


Output

rawtext_BySpaceConnect: ALICE'S ADVENTURES IN WONDERLAND  Lewis Carroll  THE MILLENNIUM FULCRUM EDITION 3.0  CHAPTER I. Down the Rabbit-Hole  Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?'  So she was considering in her own mind (as well as she could, for the hot day

rawtext2WordLists: ["ALICE'S", 'ADVENTURES', 'IN', 'WONDERLAND', 'Lewis', 'Carroll', 'THE', 'MILLENNIUM', 'FULCRUM', 'EDITION', '3.0', 'CHAPTER', 'I', 'Down', 'the', 'Rabbit-Hole', 'Alice', 'was', 'beginning', 'to', 'get', 'very', 'tired', 'of', 'sitting', 'by', 'her', 'sister', 'on', 'the', 'bank', 'and', 'of', 'having', 'nothing', 'to', 'do', 'once', 'or', 'twice', 'she', 'had', 'peeped', 'into', 'the', 'book', 'her', 'sister', 'was', 'reading', 'but', 'it', 'had', 'no', 'pictures', 'or', 'conversations', 'in', 'it', 'and', 'what', 'is', 'the', 'use', 'of', 'a', 'book', 'thought', 'Alice', 'without', 'pictures', 'or', 'conversations', 'So', 'she', 'was', 'considering', 'in', 'her', 'own', 'mind', 'as', 'well', 'as', 'she', 'could', 'for', 'the', 'hot', 'day', 'made', 'her', 'feel', 'very', 'sleepy', 'and', 'stupid', 'whether', 'the', 'pleasure', 'of', 'making', 'a', 'daisy-chain', 'would', 'be', 'worth', 'the', 'trouble', 'of', 'getting', 'up', 'and', 'picking', 'the', 'daisies', 'when', 'suddenly', 'a', 'White', 'Rabbit', 'with', 'pink', 'eyes', 'ran', 'close', 'by', 'her', 'There', 'was', 'nothing', 'so', 'VERY', 'remarkable', 'in', 'that', 'nor', 'did', 'Alice', 'think', 'it', 'so', 'VERY', 'much', 'out', 'of', 'the', 'way', 'to', 'hear', 'the', 'Rabbit', 'say', 'to', 'itself', 'Oh', 'dear', 'Oh', 'dear', 'I', 'shall', 'be', 'late', 'when', 'she', 'thought', 'it', 'over', 'afterwards', 'it', 'occurred', 'to', 'her', 'that', 'she', 'ought', 'to', 'have', 'wondered', 'at', 'this', 'but', 'at', 'the', 'time', 'it', 'all', 'seemed', 'quite', 'natural', 'but', 'when', 'the', 'Rabbit', 'actually', 'TOOK', 'A', 'WATCH', 'OUT', 'OF', 'ITS', 'WAISTCOAT-POCKET', 'and', 'looked', 'at', 'it', 'and', 'then', 'hurried', 'on', 'Alice', 'started', 'to', 'her', 'feet', 'for', 'it', 'flashed', 'across', 'her', 'mind', 'that', 'she', 'had', 'never', 'before', 'seen', 'a', 'rabbit', 'with', 'either', 'a', 'waistcoat-pocket', 'or', 'a', 'watch', 'to', 'take', 'out', 'of', 'it', 'and', 'burning', 'with', 'curiosity', 'she', 'ran', 'across', 'the', 'field', 'after', 'it', 'and', 'fortunately', 'was', 'just', 'in', 'time', 'to', 'see', 'it', 'pop', 'down', 'a', 'large', 'rabbit-hole', 'under', 'the', 'hedge', 'In', 'another', 'moment', 'down', 'went', 'Alice', 'after', 'it', 'never', 'once', 'considering', 'how', 'in', 'the', 'world', 'she', 'was', 'to', 'get', 'out', 'again', 'The', 'rabbit-hole', 'went', 'straight', 'on', 'like', 'a', 'tunnel', 'for', 'some', 'way', 'and', 'then', 'dipped', 'suddenly', 'down', 'so', 'suddenly', 'that', 'Alice', 'had', 'not', 'a', 'moment', 'to', 'think', 'about', 'stopping', 'herself', 'before', 'she', 'found', 'herself', 'falling', 'down', 'a', 'very', 'deep', 'well', 'Either', 'the', 'well', 'was', 'very', 'deep', 'or', 'she', 'fell', 'very', 'slowly', 'for', 'she', 'had', 'plenty', 'of', 'time', 'as', 'she', 'went', 'down', 'to', 'look', 'about', 'her', 'and', 'to', 'wonder', 'what', 'was', 'going', 'to', 'happen', 'next', 'First', 'she', 'tried', 'to', 'look', 'down', 'and', 'make', 'out', 'what', 'she', 'was', 'coming', 'to', 'but', 'it', 'was', 'too', 'dark', 'to', 'see', 'anything', 'then', 'she', 'looked', 'at', 'the', 'sides', 'of', 'the', 'well', 'and', 'noticed', 'that', 'they', 'were', 'filled', 'with', 'cupboards', 'and', 'book-shelves', 'here', 'and', 'there', 'she', 'saw', 'maps', 'and', 'pictures', 'hung', 'upon', 'pegs', 'She', 'took', 'down', 'a', 'jar', 'from', 'one', 'of', 'the', 'shelves', 'as', 'she', 'passed', 'it', 'was', 'labelled', 'ORANGE', 
'MARMALADE', 'but', 'to', 'her', 'great', 'disappointment', 'it', 'was', 'empty', 'she', 'did', 'not', 'like', 'to', 'drop', 'the', 'jar', 'for', 'fear', 'of', 'killing', 'somebody', 'so', 'managed', 'to', 'put', 'it', 'into', 'one', 'of', 'the', 'cupboards', 'as', 'she', 'fell', 'past', 'it', 'Well', 'thought', 'Alice', 'to', 'herself', 'after', 'such', 'a', 'fall', 'as', 'this', 'I', 'shall', 'think', 'nothing', 'of', 'tumbling', 'down', 'stairs', 'How', 'brave', "they'll", 'all', 'think', 'me', 'at', 'home', 'Why', 'I', "wouldn't", 'say']

rawtext_BySpace: ALICE'S ADVENTURES IN WONDERLAND Lewis Carroll THE MILLENNIUM FULCRUM EDITION 3.0 CHAPTER I Down the Rabbit Hole Alice was beginning to get very tired of sitting by her sister on the bank and of having nothing to do once or twice she had peeped into the book her sister was reading but it had no pictures or conversations in it and what is the use of a book thought Alice without pictures or conversations So she was considering in her own mind as well as she could for the hot day made her feel very
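The three dumps above trace the text preprocessing: the raw Gutenberg file is collapsed onto a single line, split into words, and stripped of punctuation. The article never prints its own loading code, so the following is only a rough sketch of that step; the file name, the regex, and everything except the printed variable names are assumptions, and quirks in the output (the token '3.0' survives with its dot, while 'Rabbit-Hole' becomes 'Rabbit Hole' in the third dump) show the original used somewhat different rules.

import re

# Hypothetical reconstruction of the preprocessing implied by the dumps above.
raw_text = open('wonderland.txt', encoding='utf-8').read()          # file name assumed
rawtext_BySpaceConnect = ' '.join(raw_text.split())                 # collapse newlines and runs of spaces
rawtext2WordLists = re.findall(r"[\w'-]+", rawtext_BySpaceConnect)  # keep letters, apostrophes, hyphens
rawtext_BySpace = ' '.join(rawtext2WordLists)
print('words_num:', len(rawtext2WordLists))                         # 26694 in the article's run
print('vocab_num:', len(set(rawtext2WordLists)))                    # 3063 distinct words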

words_num: 26694

vocab_num: 3063

dataX: 26594 100 [[19, 18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713], [18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144], [238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006], [547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851], [278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851, 594]]

dataY: 26594 [2144, 2006, 1851, 594, 1074]

Total patterns: 26594
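These numbers fit together: each of the 3063 distinct words is mapped to an integer index, and a window of 100 consecutive word indices (dataX) is used to predict the following word (dataY), sliding one word at a time, which yields 26694 - 100 = 26594 patterns. A minimal sketch of that construction, reusing rawtext2WordLists from the preprocessing sketch (mapping via a sorted vocabulary is an assumption consistent with the printed indices):

seq_length = 100

# Map each distinct word to an integer index (sorted order assumed).
vocab = sorted(set(rawtext2WordLists))
word_to_int = {w: i for i, w in enumerate(vocab)}

dataX, dataY = [], []
for i in range(len(rawtext2WordLists) - seq_length):
    seq_in = rawtext2WordLists[i:i + seq_length]   # 100-word input window
    seq_out = rawtext2WordLists[i + seq_length]    # the word to be predicted
    dataX.append([word_to_int[w] for w in seq_in])
    dataY.append(word_to_int[seq_out])
print('Total patterns:', len(dataX))               # 26694 - 100 = 26594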

X_train.shape (26594, 100, 1)

Y_train.shape (26594, 3063)
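The shapes show the standard Keras recipe for this kind of model: the integer windows are reshaped to [samples, time steps, features] = (26594, 100, 1), and the target indices are one-hot encoded over the 3063-word vocabulary. A sketch assuming that recipe, reusing dataX, dataY, seq_length, and vocab from the previous sketch (the division by the vocabulary size is a common normalization, not confirmed by the article):

import numpy as np
from keras.utils import np_utils

# [samples, time steps, features]; indices commonly scaled into [0, 1].
X_train = np.reshape(dataX, (len(dataX), seq_length, 1)) / float(len(vocab))
# One target class per vocabulary word.
Y_train = np_utils.to_categorical(dataY)
print('X_train.shape', X_train.shape)  # (26594, 100, 1)
print('Y_train.shape', Y_train.shape)  # (26594, 3063)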

_________________________________________________________________
Layer (type)                 Output Shape              Param #  
=================================================================
lstm_1 (LSTM)                (None, 256)               264192    
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0        
_________________________________________________________________
dense_1 (Dense)              (None, 3063)              787191    
=================================================================
Total params: 1,051,383
Trainable params: 1,051,383
Non-trainable params: 0
_________________________________________________________________
LSTM_Model
None

……

Epoch 00005: loss improved from 6.26403 to 6.26198, saving model to hdf5/word-weights-improvement-05-6.2620.hdf5

Epoch 6/10

 128/26594 [..............................] - ETA: 2:09 - loss: 6.8378

 256/26594 [..............................] - ETA: 2:06 - loss: 6.4136

 384/26594 [..............................] - ETA: 2:01 - loss: 6.3299

 512/26594 [..............................] - ETA: 1:57 - loss: 6.4469

 640/26594 [..............................] - ETA: 1:57 - loss: 6.4133

……

Epoch 00008: loss improved from 6.25725 to 6.25487, saving model to hdf5/word-weights-improvement-08-6.2549.hdf5

Epoch 9/10

 128/26594 [..............................] - ETA: 1:57 - loss: 6.2336

 256/26594 [..............................] - ETA: 2:02 - loss: 6.1897

 384/26594 [..............................] - ETA: 2:04 - loss: 6.3229

 512/26594 [..............................] - ETA: 2:01 - loss: 6.3550

 640/26594 [..............................] - ETA: 2:02 - loss: 6.3279

 768/26594 [..............................] - ETA: 2:05 - loss: 6.2614

 896/26594 [>.............................] - ETA: 2:06 - loss: 6.2433

1024/26594 [>.............................] - ETA: 2:07 - loss: 6.2477

……

25216/26594 [===========================>..] - ETA: 6s - loss: 6.2456

25344/26594 [===========================>..] - ETA: 6s - loss: 6.2469

25472/26594 [===========================>..] - ETA: 5s - loss: 6.2477

25600/26594 [===========================>..] - ETA: 4s - loss: 6.2486

25728/26594 [============================>.] - ETA: 4s - loss: 6.2480

25856/26594 [============================>.] - ETA: 3s - loss: 6.2483

25984/26594 [============================>.] - ETA: 2s - loss: 6.2487

26112/26594 [============================>.] - ETA: 2s - loss: 6.2485

26240/26594 [============================>.] - ETA: 1s - loss: 6.2483

26368/26594 [============================>.] - ETA: 1s - loss: 6.2482

26496/26594 [============================>.] - ETA: 0s - loss: 6.2485

26594/26594 [==============================] - 129s 5ms/step - loss: 6.2499

Epoch 00009: loss improved from 6.25487 to 6.24987, saving model to hdf5/word-weights-improvement-09-6.2499.hdf5

Epoch 10/10

 128/26594 [..............................] - ETA: 1:56 - loss: 6.4864

 256/26594 [..............................] - ETA: 2:04 - loss: 6.2577

 384/26594 [..............................] - ETA: 2:07 - loss: 6.2857

 512/26594 [..............................] - ETA: 2:10 - loss: 6.3230

……

25856/26594 [============================>.] - ETA: 3s - loss: 6.2426

25984/26594 [============================>.] - ETA: 3s - loss: 6.2447

26112/26594 [============================>.] - ETA: 2s - loss: 6.2446

26240/26594 [============================>.] - ETA: 1s - loss: 6.2449

26368/26594 [============================>.] - ETA: 1s - loss: 6.2467

26496/26594 [============================>.] - ETA: 0s - loss: 6.2461

26594/26594 [==============================] - 135s 5ms/step - loss: 6.2465

Epoch 00010: loss improved from 6.24987 to 6.24646, saving model to hdf5/word-weights-improvement-10-6.2465.hdf5
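The checkpoint filenames, the 128-sample progress increments, and "Epoch 10/10" imply a ModelCheckpoint callback monitoring training loss plus a 10-epoch fit with batch_size=128. A sketch consistent with the log above (the exact fit arguments are inferred, not shown by the article):

from keras.callbacks import ModelCheckpoint

# Filename pattern reconstructed from the saved files in the log.
filepath = 'hdf5/word-weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')
# epochs=10 and batch_size=128 inferred from 'Epoch 10/10' and the 128-step progress lines.
LSTM_Model.fit(X_train, Y_train, epochs=10, batch_size=128, callbacks=[checkpoint])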

LSTM_Pre_word.shape:

(3, 3063)

LSTM_Model,Seed:

" cheerfully he seems to grin How neatly spread his claws And welcome little fishes in With gently smiling jaws I'm sure those are not the right words said poor Alice and her eyes filled with tears again as she went on I must be Mabel after all and I shall have to go and live in that poky little house and have next to no toys to play with and oh ever so many lessons to learn No I've made up my mind about it if I'm Mabel I'll stay down here It'll be no use their putting their heads "

199 100

Generated Sequence:

the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the

Done.
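Generation takes a 100-word seed pattern (the "199 100" line is most likely the randomly chosen seed index and the window length), predicts a distribution over the 3063 words, appends the argmax word, and slides the window one step. After only 10 epochs the loss is still about 6.25, far from fitting the data, so the network has effectively collapsed to the corpus's most frequent word, "the", and greedy argmax decoding then repeats it forever; longer training, stacked LSTM layers, or temperature sampling are the usual remedies. A hedged sketch of the loop, reusing word_to_int, dataX, and vocab from the earlier sketches (generating 200 words is an assumption):

import numpy as np

int_to_word = {i: w for w, i in word_to_int.items()}

# Start from a randomly chosen 100-word pattern.
start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])
generated = []
for _ in range(200):                                     # number of words to generate (assumed)
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(vocab))
    prediction = LSTM_Model.predict(x, verbose=0)        # shape (1, 3063)
    index = int(np.argmax(prediction))                   # greedy decoding
    generated.append(int_to_word[index])
    pattern = pattern[1:] + [index]                      # slide the window one word
print(' '.join(generated))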


Core Code

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

# Single LSTM layer: 256 units over (time steps=100, features=1) inputs.
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))                                   # regularize before the output layer
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))  # one output unit per vocabulary word
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
# summary() prints the table itself and returns None, hence the trailing 'None' in the log.
print('LSTM_Model \n', LSTM_Model.summary())
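For generation one would normally reload the best checkpoint written during training before calling predict; for example, with the last file saved in the log above (a sketch, not code shown in the article):

# Reload the last checkpoint from the training log, then re-compile.
LSTM_Model.load_weights('hdf5/word-weights-improvement-10-6.2465.hdf5')
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')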


