DL - LSTM: Word-level prediction on the "Alice's Adventures in Wonderland" novel dataset using an LSTM (Keras)


Output

rawtext_BySpaceConnect: ALICE'S ADVENTURES IN WONDERLAND  Lewis Carroll  THE MILLENNIUM FULCRUM EDITION 3.0  CHAPTER I. Down the Rabbit-Hole  Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?'  So she was considering in her own mind (as well as she could, for the hot day

rawtext2WordLists: ["ALICE'S", 'ADVENTURES', 'IN', 'WONDERLAND', 'Lewis', 'Carroll', 'THE', 'MILLENNIUM', 'FULCRUM', 'EDITION', '3.0', 'CHAPTER', 'I', 'Down', 'the', 'Rabbit-Hole', 'Alice', 'was', 'beginning', 'to', 'get', 'very', 'tired', 'of', 'sitting', 'by', 'her', 'sister', 'on', 'the', 'bank', 'and', 'of', 'having', 'nothing', 'to', 'do', 'once', 'or', 'twice', 'she', 'had', 'peeped', 'into', 'the', 'book', 'her', 'sister', 'was', 'reading', 'but', 'it', 'had', 'no', 'pictures', 'or', 'conversations', 'in', 'it', 'and', 'what', 'is', 'the', 'use', 'of', 'a', 'book', 'thought', 'Alice', 'without', 'pictures', 'or', 'conversations', 'So', 'she', 'was', 'considering', 'in', 'her', 'own', 'mind', 'as', 'well', 'as', 'she', 'could', 'for', 'the', 'hot', 'day', 'made', 'her', 'feel', 'very', 'sleepy', 'and', 'stupid', 'whether', 'the', 'pleasure', 'of', 'making', 'a', 'daisy-chain', 'would', 'be', 'worth', 'the', 'trouble', 'of', 'getting', 'up', 'and', 'picking', 'the', 'daisies', 'when', 'suddenly', 'a', 'White', 'Rabbit', 'with', 'pink', 'eyes', 'ran', 'close', 'by', 'her', 'There', 'was', 'nothing', 'so', 'VERY', 'remarkable', 'in', 'that', 'nor', 'did', 'Alice', 'think', 'it', 'so', 'VERY', 'much', 'out', 'of', 'the', 'way', 'to', 'hear', 'the', 'Rabbit', 'say', 'to', 'itself', 'Oh', 'dear', 'Oh', 'dear', 'I', 'shall', 'be', 'late', 'when', 'she', 'thought', 'it', 'over', 'afterwards', 'it', 'occurred', 'to', 'her', 'that', 'she', 'ought', 'to', 'have', 'wondered', 'at', 'this', 'but', 'at', 'the', 'time', 'it', 'all', 'seemed', 'quite', 'natural', 'but', 'when', 'the', 'Rabbit', 'actually', 'TOOK', 'A', 'WATCH', 'OUT', 'OF', 'ITS', 'WAISTCOAT-POCKET', 'and', 'looked', 'at', 'it', 'and', 'then', 'hurried', 'on', 'Alice', 'started', 'to', 'her', 'feet', 'for', 'it', 'flashed', 'across', 'her', 'mind', 'that', 'she', 'had', 'never', 'before', 'seen', 'a', 'rabbit', 'with', 'either', 'a', 'waistcoat-pocket', 'or', 'a', 'watch', 'to', 'take', 'out', 'of', 'it', 'and', 'burning', 'with', 'curiosity', 'she', 'ran', 'across', 'the', 'field', 'after', 'it', 'and', 'fortunately', 'was', 'just', 'in', 'time', 'to', 'see', 'it', 'pop', 'down', 'a', 'large', 'rabbit-hole', 'under', 'the', 'hedge', 'In', 'another', 'moment', 'down', 'went', 'Alice', 'after', 'it', 'never', 'once', 'considering', 'how', 'in', 'the', 'world', 'she', 'was', 'to', 'get', 'out', 'again', 'The', 'rabbit-hole', 'went', 'straight', 'on', 'like', 'a', 'tunnel', 'for', 'some', 'way', 'and', 'then', 'dipped', 'suddenly', 'down', 'so', 'suddenly', 'that', 'Alice', 'had', 'not', 'a', 'moment', 'to', 'think', 'about', 'stopping', 'herself', 'before', 'she', 'found', 'herself', 'falling', 'down', 'a', 'very', 'deep', 'well', 'Either', 'the', 'well', 'was', 'very', 'deep', 'or', 'she', 'fell', 'very', 'slowly', 'for', 'she', 'had', 'plenty', 'of', 'time', 'as', 'she', 'went', 'down', 'to', 'look', 'about', 'her', 'and', 'to', 'wonder', 'what', 'was', 'going', 'to', 'happen', 'next', 'First', 'she', 'tried', 'to', 'look', 'down', 'and', 'make', 'out', 'what', 'she', 'was', 'coming', 'to', 'but', 'it', 'was', 'too', 'dark', 'to', 'see', 'anything', 'then', 'she', 'looked', 'at', 'the', 'sides', 'of', 'the', 'well', 'and', 'noticed', 'that', 'they', 'were', 'filled', 'with', 'cupboards', 'and', 'book-shelves', 'here', 'and', 'there', 'she', 'saw', 'maps', 'and', 'pictures', 'hung', 'upon', 'pegs', 'She', 'took', 'down', 'a', 'jar', 'from', 'one', 'of', 'the', 'shelves', 'as', 'she', 'passed', 'it', 'was', 'labelled', 'ORANGE', 
'MARMALADE', 'but', 'to', 'her', 'great', 'disappointment', 'it', 'was', 'empty', 'she', 'did', 'not', 'like', 'to', 'drop', 'the', 'jar', 'for', 'fear', 'of', 'killing', 'somebody', 'so', 'managed', 'to', 'put', 'it', 'into', 'one', 'of', 'the', 'cupboards', 'as', 'she', 'fell', 'past', 'it', 'Well', 'thought', 'Alice', 'to', 'herself', 'after', 'such', 'a', 'fall', 'as', 'this', 'I', 'shall', 'think', 'nothing', 'of', 'tumbling', 'down', 'stairs', 'How', 'brave', "they'll", 'all', 'think', 'me', 'at', 'home', 'Why', 'I', "wouldn't", 'say']

rawtext_BySpace: ALICE'S ADVENTURES IN WONDERLAND Lewis Carroll THE MILLENNIUM FULCRUM EDITION 3.0 CHAPTER I Down the Rabbit Hole Alice was beginning to get very tired of sitting by her sister on the bank and of having nothing to do once or twice she had peeped into the book her sister was reading but it had no pictures or conversations in it and what is the use of a book thought Alice without pictures or conversations So she was considering in her own mind as well as she could for the hot day made her feel very

words_num: 26694

vocab_num: 3063
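
The two counts above come from collapsing the raw text's whitespace, stripping punctuation into a clean word list, and indexing the sorted vocabulary. Below is a minimal sketch of that preprocessing; the file path, tokenizing regex, and variable names are assumptions, not the author's exact code:

import re

raw_text = open('wonderland.txt', encoding='utf-8').read()               # hypothetical path
raw_text = ' '.join(raw_text.split())                                    # collapse whitespace (rawtext_BySpaceConnect)
word_list = re.findall(r"[A-Za-z0-9]+(?:['.-][A-Za-z0-9]+)*", raw_text)  # rough tokenization (rawtext2WordLists)
vocab = sorted(set(word_list))
word_to_int = {w: i for i, w in enumerate(vocab)}                        # word -> integer index
int_to_word = {i: w for w, i in word_to_int.items()}                     # integer index -> word

print('words_num:', len(word_list))   # 26694 in the run above
print('vocab_num:', len(vocab))       # 3063 in the run above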

dataX: 26594 100 [[19, 18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713], [18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144], [238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006], [547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851], [278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851, 594]]

dataY: 26594 [2144, 2006, 1851, 594, 1074]

Total patterns: 26594
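
dataX and dataY are built with a sliding window: each sample is 100 consecutive word indices and its label is the word that follows, so 26694 words yield 26694 - 100 = 26594 patterns. A hedged sketch of that windowing, reusing the names from the sketch above:

seq_length = 100
dataX, dataY = [], []
for i in range(len(word_list) - seq_length):            # 26694 - 100 = 26594 windows
    seq_in = word_list[i:i + seq_length]                # 100-word input window
    seq_out = word_list[i + seq_length]                 # the word to be predicted
    dataX.append([word_to_int[w] for w in seq_in])
    dataY.append(word_to_int[seq_out])
print('Total patterns:', len(dataX))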

X_train.shape (26594, 100, 1)

Y_train.shape (26594, 3063)
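
To match these shapes, the index windows are reshaped to (samples, timesteps, 1), i.e. one scalar feature per step, and the targets are one-hot encoded over the 3063-word vocabulary. Scaling the indices by the vocabulary size is the usual convention in this family of tutorials and is an assumption here:

import numpy as np
from keras.utils import np_utils

n_patterns, n_vocab = len(dataX), len(vocab)
X_train = np.reshape(dataX, (n_patterns, seq_length, 1)) / float(n_vocab)  # (26594, 100, 1), scaled to [0, 1)
Y_train = np_utils.to_categorical(dataY)                                   # (26594, 3063) one-hot targets
print('X_train.shape', X_train.shape)
print('Y_train.shape', Y_train.shape)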

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_1 (LSTM)                (None, 256)               264192
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 3063)              787191
=================================================================
Total params: 1,051,383
Trainable params: 1,051,383
Non-trainable params: 0
_________________________________________________________________
LSTM_Model
None

……

Epoch 00005: loss improved from 6.26403 to 6.26198, saving model to hdf5/word-weights-improvement-05-6.2620.hdf5

Epoch 6/10

 128/26594 [..............................] - ETA: 2:09 - loss: 6.8378

 256/26594 [..............................] - ETA: 2:06 - loss: 6.4136

 384/26594 [..............................] - ETA: 2:01 - loss: 6.3299

 512/26594 [..............................] - ETA: 1:57 - loss: 6.4469

 640/26594 [..............................] - ETA: 1:57 - loss: 6.4133

……

Epoch 00008: loss improved from 6.25725 to 6.25487, saving model to hdf5/word-weights-improvement-08-6.2549.hdf5

Epoch 9/10

 128/26594 [..............................] - ETA: 1:57 - loss: 6.2336

 256/26594 [..............................] - ETA: 2:02 - loss: 6.1897

 384/26594 [..............................] - ETA: 2:04 - loss: 6.3229

 512/26594 [..............................] - ETA: 2:01 - loss: 6.3550

 640/26594 [..............................] - ETA: 2:02 - loss: 6.3279

 768/26594 [..............................] - ETA: 2:05 - loss: 6.2614

 896/26594 [>.............................] - ETA: 2:06 - loss: 6.2433

1024/26594 [>.............................] - ETA: 2:07 - loss: 6.2477

……

25216/26594 [===========================>..] - ETA: 6s - loss: 6.2456

25344/26594 [===========================>..] - ETA: 6s - loss: 6.2469

25472/26594 [===========================>..] - ETA: 5s - loss: 6.2477

25600/26594 [===========================>..] - ETA: 4s - loss: 6.2486

25728/26594 [============================>.] - ETA: 4s - loss: 6.2480

25856/26594 [============================>.] - ETA: 3s - loss: 6.2483

25984/26594 [============================>.] - ETA: 2s - loss: 6.2487

26112/26594 [============================>.] - ETA: 2s - loss: 6.2485

26240/26594 [============================>.] - ETA: 1s - loss: 6.2483

26368/26594 [============================>.] - ETA: 1s - loss: 6.2482

26496/26594 [============================>.] - ETA: 0s - loss: 6.2485

26594/26594 [==============================] - 129s 5ms/step - loss: 6.2499

Epoch 00009: loss improved from 6.25487 to 6.24987, saving model to hdf5/word-weights-improvement-09-6.2499.hdf5

Epoch 10/10

 128/26594 [..............................] - ETA: 1:56 - loss: 6.4864

 256/26594 [..............................] - ETA: 2:04 - loss: 6.2577

 384/26594 [..............................] - ETA: 2:07 - loss: 6.2857

 512/26594 [..............................] - ETA: 2:10 - loss: 6.3230

……

25856/26594 [============================>.] - ETA: 3s - loss: 6.2426

25984/26594 [============================>.] - ETA: 3s - loss: 6.2447

26112/26594 [============================>.] - ETA: 2s - loss: 6.2446

26240/26594 [============================>.] - ETA: 1s - loss: 6.2449

26368/26594 [============================>.] - ETA: 1s - loss: 6.2467

26496/26594 [============================>.] - ETA: 0s - loss: 6.2461

26594/26594 [==============================] - 135s 5ms/step - loss: 6.2465

Epoch 00010: loss improved from 6.24987 to 6.24646, saving model to hdf5/word-weights-improvement-10-6.2465.hdf5

LSTM_Pre_word.shape:

(3, 3063)

LSTM_Model,Seed:

" cheerfully he seems to grin How neatly spread his claws And welcome little fishes in With gently smiling jaws I'm sure those are not the right words said poor Alice and her eyes filled with tears again as she went on I must be Mabel after all and I shall have to go and live in that poky little house and have next to no toys to play with and oh ever so many lessons to learn No I've made up my mind about it if I'm Mabel I'll stay down here It'll be no use their putting their heads "

199 100

Generated Sequence:

the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the

Done.
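
Generation starts from a random 100-word seed, predicts a distribution over the 3063 vocabulary words, greedily appends the argmax, and slides the window forward one step. With the loss still around 6.25 after 10 epochs, the argmax collapses onto the most frequent word, which is why the generated sequence degenerates into "the". A sketch of the loop under the same naming assumptions as above:

start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])                               # 100-word integer seed
print('Seed:', ' '.join(int_to_word[i] for i in pattern))

generated = []
for _ in range(200):
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(n_vocab)
    prediction = LSTM_Model.predict(x, verbose=0)          # shape (1, 3063)
    index = int(np.argmax(prediction))                     # greedy pick; collapses to 'the' when undertrained
    generated.append(int_to_word[index])
    pattern.append(index)
    pattern = pattern[1:]                                  # slide the window forward
print('Generated Sequence:', ' '.join(generated))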


Core Code

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

# A single-layer LSTM over inputs of shape (timesteps=100, features=1)
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))                                    # regularize the 256 recurrent features
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))   # one output unit per vocabulary word (3063)
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print('LSTM_Model \n', LSTM_Model.summary())    # summary() prints its table and returns None,
                                                # hence the stray "None" in the output above
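
The checkpoint files in the training log (hdf5/word-weights-improvement-NN-loss.hdf5) come from a ModelCheckpoint callback that saves whenever the monitored loss improves. A sketch of how training is likely launched; epochs=10 and batch_size=128 are read off the log, the rest is an assumption:

from keras.callbacks import ModelCheckpoint

filepath = 'hdf5/word-weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')
LSTM_Model.fit(X_train, Y_train, epochs=10, batch_size=128, callbacks=[checkpoint])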

