DL Attention-ED: A complete record of training and testing an attention-based encoder-decoder (ED) LSTM translation model with TF NMT on a Chinese-English parallel corpus, translating English into Chinese


Test output results


轻轻的我走了, 正如我轻轻的来; 我轻轻的招手, 作别...
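The post only shows the decoded Chinese output above; the inference call itself is not included. Assuming the model was trained with the tensorflow/nmt tutorial code (which the log below strongly suggests), translating a file of English sentences with the trained checkpoint would look roughly like the sketch below; the input and output file names are placeholders, not paths from the original run.

```python
import subprocess

# Hypothetical inference invocation for tensorflow/nmt; only --out_dir is taken
# from the log, the two file paths are illustrative placeholders.
subprocess.run([
    "python", "-m", "nmt.nmt",
    "--out_dir=tmp/nmt_model_zh",
    "--inference_input_file=tmp/nmt_zh/my_infer_file.en",
    "--inference_output_file=tmp/nmt_model_zh/output_infer",
], check=True)
```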




Model monitoring


1. SCALARS

[TensorBoard scalar charts: clipped_gradient, grad_norm, train_loss, dev_bleu, dev_ppl, lr_1, test_bleu, test_ppl, train_ppl]
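The scalar curves above (gradient norms, train/dev/test perplexity, BLEU, learning rate) are ordinary TF 1.x summaries written by the training loop. As a minimal, illustrative sketch (not the nmt source itself) of how such values end up as TensorBoard scalars:

```python
import tensorflow as tf  # TensorFlow 1.x, matching the 2018 log below

# Placeholders standing in for values the training loop already computes.
train_loss = tf.placeholder(tf.float32, [], name="train_loss_value")
learning_rate = tf.placeholder(tf.float32, [], name="lr_value")

summary_op = tf.summary.merge([
    tf.summary.scalar("train_loss", train_loss),
    tf.summary.scalar("lr", learning_rate),
])

# The log directory name is assumed; TensorBoard reads whatever dir it is pointed at.
writer = tf.summary.FileWriter("tmp/nmt_model_zh/train_log")
with tf.Session() as sess:
    s = sess.run(summary_op, feed_dict={train_loss: 123.4, learning_rate: 1.0})
    writer.add_summary(s, global_step=100)  # one point on the train_loss curve
writer.close()
```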

2. IMAGES

[TensorBoard attention image: attention_images_1/image/0, step 6,000]
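Each attention image is a per-sentence alignment matrix (decoder steps vs. source positions) collected from the attention mechanism during inference. With the AttentionWrapper's alignment_history enabled, such matrices can be turned into image summaries roughly as follows; this is a sketch of the idea, not necessarily the exact nmt code.

```python
import tensorflow as tf  # TensorFlow 1.x

# alignment_history stacked over decoding steps: [tgt_len, batch, src_len]
alignments = tf.placeholder(tf.float32, [None, None, None], name="alignment_history")

# Reorder to [batch, src_len, tgt_len], add a channel axis and scale to 0-255,
# so every example becomes one grayscale attention image like the one above.
images = tf.expand_dims(tf.transpose(alignments, [1, 2, 0]), -1) * 255.0
attention_summary = tf.summary.image("attention_images", images)
```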




3. GRAPHS

[TensorBoard computation-graph screenshots]






Full record of the training process


Start training
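The log starts after the job was already launched, so the exact command line is not shown. Reconstructed from the hparams dump that follows, a tensorflow/nmt training run with these settings would be started roughly as below; treat it as an assumption, not the author's verbatim command.

```python
import subprocess

# Assumed launch command; the flag values are copied from the hparams printed in the log.
subprocess.run([
    "python", "-m", "nmt.nmt",
    "--attention=scaled_luong",
    "--src=en", "--tgt=zh",
    "--vocab_prefix=tmp/nmt_zh/vocab",
    "--train_prefix=tmp/nmt_zh/train",
    "--dev_prefix=tmp/nmt_zh/dev",
    "--test_prefix=tmp/nmt_zh/test",
    "--out_dir=tmp/nmt_model_zh",
    "--num_train_steps=200000",
    "--steps_per_stats=100",
    "--num_layers=3",
    "--num_units=256",
    "--dropout=0.2",
    "--metrics=bleu",
], check=True)
```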

# Job id 0

# Loading hparams from tmp/nmt_model_zh\hparams

 saving hparams to tmp/nmt_model_zh\hparams

 saving hparams to tmp/nmt_model_zh\best_bleu\hparams

 attention=scaled_luong

 attention_architecture=standard

 batch_size=128

 beam_width=0

 best_bleu=0

 best_bleu_dir=tmp/nmt_model_zh\best_bleu

 bpe_delimiter=None

 colocate_gradients_with_ops=True

 decay_factor=0.98

 decay_steps=10000

 dev_prefix=tmp/nmt_zh/dev

 dropout=0.2

 encoder_type=uni

 eos=</s>

 epoch_step=0

 forget_bias=1.0

 infer_batch_size=32

 init_op=uniform

 init_weight=0.1

 learning_rate=1.0

 length_penalty_weight=0.0

 log_device_placement=False

 max_gradient_norm=5.0

 max_train=0

 metrics=['bleu']

 num_buckets=5

 num_embeddings_partitions=0

 num_gpus=1

 num_layers=3

 num_residual_layers=0

 num_train_steps=200000

 num_units=256

 optimizer=sgd

 out_dir=tmp/nmt_model_zh

 pass_hidden_state=True

 random_seed=None

 residual=False

 share_vocab=False

 sos=<s>

 source_reverse=False

 src=en

 src_max_len=50

 src_max_len_infer=None

 src_vocab_file=tmp/nmt_zh/vocab.en

 src_vocab_size=35028

 start_decay_step=0

 steps_per_external_eval=None

 steps_per_stats=100

 test_prefix=tmp/nmt_zh/test

 tgt=zh

 tgt_max_len=50

 tgt_max_len_infer=None

 tgt_vocab_file=tmp/nmt_zh/vocab.zh

 tgt_vocab_size=53712

 time_major=True

 train_prefix=tmp/nmt_zh/train

 unit_type=lstm

 vocab_prefix=tmp/nmt_zh/vocab

# creating train graph ...

 num_layers = 3, num_residual_layers=0

 cell 0  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

WARNING:tensorflow:From

Instructions for updating:

This class is deprecated, please use tf.nn.rnn_cell.LSTMCell, which supports all the feature this cell currently has. Please replace the existing code with tf.nn.rnn_cell.LSTMCell(name='basic_lstm_cell').

 cell 1  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

 cell 0  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

 cell 1  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DropoutWrapper, dropout=0.2   DeviceWrapper, device=/gpu:0

 start_decay_step=0, learning_rate=1, decay_steps 10000,decay_factor 0.98

# Trainable variables

 embeddings/encoder/embedding_encoder:0, (35028, 256),

 embeddings/decoder/embedding_decoder:0, (53712, 256),

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/memory_layer/kernel:0, (256, 256),

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (768, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/luong_attention/attention_g:0, (), /device:GPU:0

 dynamic_seq2seq/decoder/attention/attention_layer/kernel:0, (512, 256), /device:GPU:0

 dynamic_seq2seq/decoder/output_projection/kernel:0, (256, 53712), /device:GPU:0
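The variable shapes above follow directly from the hparams: every LSTM kernel is (input_size + num_units, 4 * num_units), so the encoder's cell_0 kernel is (256 + 256, 1024) = (512, 1024), while the decoder's cell_0 kernel is (256 + 256 + 256, 1024) = (768, 1024) because the AttentionWrapper concatenates the 256-dim attention vector with the 256-dim target embedding. The attention_g scalar and the memory_layer/attention_layer kernels come from the scaled Luong attention mechanism and its wrapper. A minimal sketch of how such a decoder cell is assembled with the TF 1.x contrib API (illustrative, not the nmt source itself):

```python
import tensorflow as tf  # TensorFlow 1.x contrib API, as in this 2018 log

num_units, num_layers, dropout = 256, 3, 0.2  # from the hparams above

# Stand-ins for the encoder results (batch-major here for simplicity;
# the run above actually uses time_major=True).
encoder_outputs = tf.placeholder(tf.float32, [None, None, num_units])
source_sequence_length = tf.placeholder(tf.int32, [None])

def lstm_cell():
    cell = tf.contrib.rnn.BasicLSTMCell(num_units, forget_bias=1.0)
    cell = tf.contrib.rnn.DropoutWrapper(cell, input_keep_prob=1.0 - dropout)
    return tf.contrib.rnn.DeviceWrapper(cell, "/gpu:0")

decoder_cell = tf.contrib.rnn.MultiRNNCell([lstm_cell() for _ in range(num_layers)])

# attention=scaled_luong -> multiplicative (Luong) attention with a learned scale,
# which is the luong_attention/attention_g:0 scalar listed above.
attention_mechanism = tf.contrib.seq2seq.LuongAttention(
    num_units, memory=encoder_outputs,
    memory_sequence_length=source_sequence_length, scale=True)

# attention_layer_size creates the (2*num_units, num_units) attention_layer kernel;
# alignment_history is what later feeds the attention images in TensorBoard.
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    decoder_cell, attention_mechanism,
    attention_layer_size=num_units, alignment_history=True)
```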

# creating eval graph ...

 num_layers = 3, num_residual_layers=0

 cell 0  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 1  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 0  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 1  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 start_decay_step=0, learning_rate=1, decay_steps 10000,decay_factor 0.98

# Trainable variables

 embeddings/encoder/embedding_encoder:0, (35028, 256),

 embeddings/decoder/embedding_decoder:0, (53712, 256),

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/memory_layer/kernel:0, (256, 256),

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (768, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/luong_attention/attention_g:0, (), /device:GPU:0

 dynamic_seq2seq/decoder/attention/attention_layer/kernel:0, (512, 256), /device:GPU:0

 dynamic_seq2seq/decoder/output_projection/kernel:0, (256, 53712), /device:GPU:0

# creating infer graph ...

 num_layers = 3, num_residual_layers=0

 cell 0  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 1  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 0  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 1  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 cell 2  LSTM, forget_bias=1  DeviceWrapper, device=/gpu:0

 start_decay_step=0, learning_rate=1, decay_steps 10000,decay_factor 0.98

# Trainable variables

 embeddings/encoder/embedding_encoder:0, (35028, 256),

 embeddings/decoder/embedding_decoder:0, (53712, 256),

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/encoder/rnn/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/memory_layer/kernel:0, (256, 256),

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/kernel:0, (768, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_0/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_1/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/kernel:0, (512, 1024), /device:GPU:0

 dynamic_seq2seq/decoder/attention/multi_rnn_cell/cell_2/basic_lstm_cell/bias:0, (1024,), /device:GPU:0

 dynamic_seq2seq/decoder/attention/luong_attention/attention_g:0, (), /device:GPU:0

 dynamic_seq2seq/decoder/attention/attention_layer/kernel:0, (512, 256), /device:GPU:0

 dynamic_seq2seq/decoder/output_projection/kernel:0, (256, 53712),

# log_file=tmp/nmt_model_zh\log_1539923931

2018-10-19 12:38:51.109178: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2

 created train model with fresh parameters, time 0.94s

 created infer model with fresh parameters, time 0.50s

 # 215

   src: second , the human capital , knowledge reserves , and other high - grade economic elements have formed the comparative advantages of the province in terms of technical innovation and the development of new high - tech industries .

   ref: 其二 , 人力 资本 , 知识 储备 等 高级 经济 要素 , 形成 了 吉林省 技术 创新 和 发展 高新技术 产业 的 比较 优势 .

   nmt: b'\xe9\xaa\x87 \xe6\x9d\x83\xe8\xa1\xa1 \xe6\x9d\x83\xe8\xa1\xa1 \xe6\x9d\x83\xe8\xa1\xa1 \xe6\x9d\x83\xe8\xa1\xa1 \xe6\x9d\x83\xe8\xa1\xa1 \xe5\x8d\x8a\xe8\xbe\xb9\xe5\xa4\xa9 \xe5\x8d\x8a\xe8\xbe\xb9\xe5\xa4\xa9 \xe5\x8d\x8a\xe8\xbe\xb9\xe5\xa4\xa9 \xe8\xb4\xa7\xe6\xac\xbe \xe8\xb4\xa7\xe6\xac\xbe \xe8\xb4\xa7\xe6\xac\xbe \xe8\xb4\xa7\xe6\xac\xbe \xe8\xb4\xa7\xe6\xac\xbe \xe6\xb1\x9f\xe5\x8d\xab\xe5\x9b\xbd \xe6\xb1\x9f\xe5\x8d\xab\xe5\x9b\xbd \xe6\xb1\x9f\xe5\x8d\xab\xe5\x9b\xbd \xe6\xb1\x9f\xe5\x8d\xab\xe5\x9b\xbd \xe6\xa1\x91\xe4\xb9\x85\xe7\xbe\x8e \xe6\xa1\x91\xe4\xb9\x85\xe7\xbe\x8e \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe9\x93\xb8\xe6\x88\x90\xe5\xa4\xa7\xe9\x94\x99 \xe4\xbe\x9b\xe9\x94\x80 \xe6\x9d\x8e\xe4\xb8\xbd\xe8\xbe\x89 \xe6\x9d\x8e\xe4\xb8\xbd\xe8\xbe\x89 \xe6\x9d\x8e\xe4\xb8\xbd\xe8\xbe\x89 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x97\xa5\xe6\x9c\xac\xe6\xb5\xb7 \xe6\x9d\x8e\xe6\xb0\xb8\xe5\x88\x9d \xe6\x9d\x8e\xe6\xb0\xb8\xe5\x88\x9d \xe6\x9d\x8e\xe6\xb0\xb8\xe5\x88\x9d \xe6\x9d\x8e\xe6\xb0\xb8\xe5\x88\x9d \xe6\x9d\x8e\xe6\xb0\xb8\xe5\x88\x9d \xe5\x9b\xb4\xe6\xad\xbc \xe5\x9b\xb4\xe6\xad\xbc \xe5\x9b\xb4\xe6\xad\xbc \xe5\x9b\xb4\xe6\xad\xbc \xe5\x9b\xb4\xe6\xad\xbc \xe6\x9d\x8e\xe5\xae\x89\xe5\xb9\xb3 \xe6\x9d\x8e\xe5\xae\x89\xe5\xb9\xb3 \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe5\xae\x8b\xe8\xb1\xab \xe6\xaf\x8f\xe5\x86\xb5\xe6\x84\x88\xe4\xb8\x8b \xe5\xbd\xb1\xe5\x93\x8d\xe5\x88\xb0 \xe5\xbd\xb1\xe5\x93\x8d\xe5\x88\xb0 \xe5\xbd\xb1\xe5\x93\x8d\xe5\x88\xb0 \xe5\xbd\xb1\xe5\x93\x8d\xe5\x88\xb0 \xe5\xbd\xb1\xe5\x93\x8d\xe5\x88\xb0 \xe7\xbb\xa7\xe4\xbd\x8d \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbe\x9d\xe9\x82\xa3\xe5\x90\x90\xe6\x8b\x89 \xe4\xbb\xa5\xe9\x82\xbb\xe4\xb8\xba\xe5\xa3\x91'

 created eval model with fresh parameters, time 0.61s

2018-10-19 12:38:57.783584: W tensorflow/core/framework/allocator.cc:113] Allocation of 641750976 exceeds 10% of system memory.

2018-10-19 12:39:01.312543: W tensorflow/core/framework/allocator.cc:113] Allocation of 812769984 exceeds 10% of system memory.

2018-10-19 12:39:05.244676: W tensorflow/core/framework/allocator.cc:113] Allocation of 660013056 exceeds 10% of system memory.

2018-10-19 12:39:08.661067: W tensorflow/core/framework/allocator.cc:113] Allocation of 800523648 exceeds 10% of system memory.

 eval dev: perplexity 53779.34, time 16s, Fri Oct 19 12:39:11 2018.

2018-10-19 12:39:14.135415: W tensorflow/core/framework/allocator.cc:113] Allocation of 641750976 exceeds 10% of system memory.

 eval test: perplexity 53779.35, time 17s, Fri Oct 19 12:39:29 2018.
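As a sanity check on these first numbers: with fresh parameters the dev/test perplexity (about 53,779) is essentially the target vocabulary size (tgt_vocab_size=53712), i.e. the model is still close to a uniform guess over the vocabulary. The ppl values in this log are the exponential of the average per-target-word cross-entropy, roughly as in this sketch (the overflow guard the real code needs is omitted):

```python
import math

def perplexity(total_cross_entropy, total_target_words):
    """exp of the average per-target-word cross-entropy (natural log)."""
    return math.exp(total_cross_entropy / total_target_words)

# A uniform guess over a 53,712-word vocabulary costs ln(53712) nats per word,
# which is the order of magnitude of the first eval above.
print(perplexity(math.log(53712) * 1000, 1000))  # ~53712.0
```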

2018-10-19 12:39:29.544562: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 12:39:29.544589: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 12:39:29.544778: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 created infer model with fresh parameters, time 0.50s

# Start step 0, lr 1, Fri Oct 19 12:39:29 2018

# Init train iterator, skipping 0 elements

 global step 100 lr 1 step-time 12.18s wps 0.61K ppl 370383.80 bleu 0.00

 global step 200 lr 1 step-time 11.44s wps 0.64K ppl 69384.04 bleu 0.00

 global step 300 lr 1 step-time 11.49s wps 0.65K ppl 22598.76 bleu 0.00

 global step 400 lr 1 step-time 11.55s wps 0.64K ppl 14178.53 bleu 0.00

 global step 500 lr 1 step-time 11.50s wps 0.65K ppl 10184.07 bleu 0.00

 global step 600 lr 1 step-time 12.13s wps 0.61K ppl 6656.94 bleu 0.00

 global step 700 lr 1 step-time 13.51s wps 0.55K ppl 2673.24 bleu 0.00

# Finished an epoch, step 785. Perform external evaluation

2018-10-19 15:16:44.846427: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 15:16:44.846427: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 15:16:44.867191: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 created infer model with fresh parameters, time 1.07s

 # 64

   src: under the leadership of the state civil air defense office , the center will assume overall work related to china 's civil air defense engineering as well as nuclear and chemical protection .

   ref: 这个 中心 将 在 人民 办公室  下 , 担负 我国 人防 工程 化 防护 的 全面 工作 .

   nmt: b'\xe6\x96\xb9\xe5\x85\xb3\xe6\x96\xbc 2020\xe5\xb9\xb4 \xe9\x9f\xa6\xe5\xa4\x9a \xe9\x9f\xa6\xe5\xa4\x9a 19-02-2000 19-02-2000 19-02-2000 19-02-2000 \xe7\xb3\xbb\xe7\xbb\x9f\xe5\x9c\xb0 \xe7\xb3\xbb\xe7\xbb\x9f\xe5\x9c\xb0 \xe5\x97\x85\xe5\x87\xba \xe5\x97\x85\xe5\x87\xba \xe5\x97\x85\xe5\x87\xba \xe6\xac\xa1\xe4\xbe\x8d \xe6\xac\xa1\xe4\xbe\x8d \xe6\xac\xa1\xe4\xbe\x8d \xe6\xac\xa1\xe4\xbe\x8d \xe6\xac\xa1\xe4\xbe\x8d \xe5\x8f\x8d\xe8\xb4\xaa\xe5\xb1\x80 \xe5\x8f\x8d\xe8\xb4\xaa\xe5\xb1\x80 \xe5\x8f\x8d\xe8\xb4\xaa\xe5\xb1\x80 \xe8\x96\x84\xe2\x85\xb0\xe5\x82\xb2 \xe8\x96\x84\xe2\x85\xb0\xe5\x82\xb2 \xe8\x96\x84\xe2\x85\xb0\xe5\x82\xb2 \xe8\x96\x84\xe2\x85\xb0\xe5\x82\xb2 \xe5\x8f\x8c\xe5\xba\xa7 \xe5\x8f\x8c\xe5\xba\xa7 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe6\x96\xbd\xe5\x90\x9b\xe7\x8e\x89 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe9\x99\xa2 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\xa4\xa7\xe4\xbd\xbf\xe7\xba\xa7 \xe5\x98\x89 \xe5\x98\x89 \xe5\x98\x89 \xe5\x98\x89 \xe5\x98\x89 \xe5\x98\x89 \xe5\x98\x89 \xe6\xb1\x82\xe6\x95\x99 \xe6\xb1\x82\xe6\x95\x99 \xe6\xb1\x82\xe6\x95\x99'

2018-10-19 15:16:46.110426: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 15:16:46.110445: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 15:16:46.110444: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 created infer model with fresh parameters, time 0.88s

 global step 800 lr 1 step-time 12.50s wps 0.57K ppl 2048.36 bleu 0.00

 global step 900 lr 1 step-time 13.74s wps 0.54K ppl 1883.18 bleu 0.00

 global step 1000 lr 1 step-time 14.76s wps 0.50K ppl 1583.37 bleu 0.00

# Save eval, global step 1000

2018-10-19 16:07:39.926588: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 16:07:39.926612: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 16:07:39.926588: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-1000, time 1.62s

 # 125

   src: the united states suffered heavy casualties in the korean war in the 1950 's and the vietnam war of the 1960 's , especially the latter , and this directly caused strong antiwar sentiment within the united states and played a very great driving role in the ultimate and undoubted us defeat in the war .

   ref: 最终 战败 无疑 起到 了 巨大 的 推动 作用 .

   nmt: b'\xe4\xbb\x96 \xe8\xaf\xb4 , \xe4\xbb\x96 \xe5\x9c\xa8 \xe4\xbb\x96 , \xe4\xbb\x96 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 . </s>'

2018-10-19 16:07:41.873322: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 16:07:41.873376: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded eval model parameters from tmp/nmt_model_zh\translate.ckpt-1000, time 1.54s

 eval dev: perplexity 1055.18, time 19s, Fri Oct 19 16:08:01 2018.

 eval test: perplexity 1055.18, time 22s, Fri Oct 19 16:08:24 2018.

 global step 1100 lr 1 step-time 14.13s wps 0.53K ppl 1435.07 bleu 0.00

 global step 1200 lr 1 step-time 13.73s wps 0.54K ppl 1276.74 bleu 0.00

 global step 1300 lr 1 step-time 12.87s wps 0.58K ppl 1170.78 bleu 0.00

 global step 1400 lr 1 step-time 14.89s wps 0.50K ppl 1066.19 bleu 0.00

 global step 1500 lr 1 step-time 15.34s wps 0.48K ppl 1046.93 bleu 0.00

# Finished an epoch, step 1570. Perform external evaluation

2018-10-19 18:26:54.761219: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 18:26:54.784158: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 18:26:54.784232: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-1000, time 3.16s

 # 21

   src: this saying figuratively shows that this is indeed an absurd cycle .

   ref: 这个 说法 以 拟人化 手法 生动 地 表明 , 这 的确 是 个 怪圈 .

   nmt: b'\xe4\xbb\x96 \xe8\xaf\xb4 , \xe4\xbb\x96 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 \xe7\x9a\x84 . </s>'

2018-10-19 18:26:55.501112: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 18:26:55.501112: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 18:26:55.501131: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-1000, time 0.59s

# External evaluation, global step 1000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 34s, Fri Oct 19 18:27:29 2018.

 bleu dev: 0.0

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 1000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 33s, Fri Oct 19 18:28:04 2018.

 bleu test: 0.0

 saving hparams to tmp/nmt_model_zh\hparams

 global step 1600 lr 1 step-time 17.33s wps 0.41K ppl 935.22 bleu 0.00

 global step 1700 lr 1 step-time 17.57s wps 0.43K ppl 914.06 bleu 0.00

 global step 1800 lr 1 step-time 15.53s wps 0.48K ppl 833.33 bleu 0.00

 global step 1900 lr 1 step-time 14.46s wps 0.51K ppl 784.47 bleu 0.00

 global step 2000 lr 1 step-time 15.27s wps 0.49K ppl 730.18 bleu 0.00

# Save eval, global step 2000

2018-10-19 20:21:36.887237: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 20:21:36.887234: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 20:21:36.887234: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-2000, time 2.08s

 # 171

   src: the sino - dprk traditional friendship created and nurtured personally by chairman mao zedong , premier zhou enlai , comrade deng xiaoping , president kim il - song and other leaders of the older generation have withstood historical tests and have taken deep roots in the hearts of peoples of the two countries .

   ref: 友谊 经受 了 历史 的 考验 , 已 深深 扎根 於 人民 心中 .

   nmt: b'\xe4\xb8\xad\xe5\x9b\xbd \xe6\x98\xaf \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 . </s>'

2018-10-19 20:21:37.898203: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 20:21:37.898825: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded eval model parameters from tmp/nmt_model_zh\translate.ckpt-2000, time 0.49s

 eval dev: perplexity 694.89, time 26s, Fri Oct 19 20:22:04 2018.

 eval test: perplexity 694.89, time 28s, Fri Oct 19 20:22:32 2018.

 global step 2100 lr 1 step-time 14.39s wps 0.51K ppl 689.72 bleu 0.00

 global step 2200 lr 1 step-time 13.70s wps 0.54K ppl 649.35 bleu 0.00

 global step 2300 lr 1 step-time 13.78s wps 0.54K ppl 609.52 bleu 0.00

# Finished an epoch, step 2355. Perform external evaluation

2018-10-19 21:44:15.494641: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 21:44:15.494641: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 21:44:15.494654: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-2000, time 0.75s

 # 277

   src: the complete reunification of the motherland is the trend of the times .

   ref: 统一 是 大势所趋 .

   nmt: b'\xe4\xb8\xad\xe5\x9b\xbd \xe6\x98\xaf \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 . </s>'

2018-10-19 21:44:16.136067: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-19 21:44:16.136100: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-19 21:44:16.136067: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-2000, time 0.57s

# External evaluation, global step 2000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 44s, Fri Oct 19 21:45:00 2018.

 bleu dev: 0.0

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 2000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 42s, Fri Oct 19 21:45:43 2018.

 bleu test: 0.0

 saving hparams to tmp/nmt_model_zh\hparams

 global step 2400 lr 1 step-time 13.68s wps 0.52K ppl 578.41 bleu 0.00

 global step 2500 lr 1 step-time 14.21s wps 0.52K ppl 532.20 bleu 0.00

 global step 2600 lr 1 step-time 14.41s wps 0.52K ppl 522.94 bleu 0.00

 global step 2700 lr 1 step-time 13.97s wps 0.53K ppl 477.39 bleu 0.00

 global step 2800 lr 1 step-time 11.82s wps 0.63K ppl 458.58 bleu 0.00

 global step 2900 lr 1 step-time 11.21s wps 0.66K ppl 426.22 bleu 0.00

 global step 3000 lr 1 step-time 11.19s wps 0.66K ppl 408.34 bleu 0.00

# Save eval, global step 3000

2018-10-20 00:04:43.141117: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 00:04:43.141117: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 00:04:43.162239: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 1.25s

 # 22

   src: we will seize the historic opportunity of the development of the western region and earnestly study new ideas , mechanism , and measures suited to tibet 's characteristics .

   ref: 我们 要 抓住大 开发 的 历史 机遇 , 认真 研究 适应 特点 的 新 思路 , 新 机制 , 新 措施 .

   nmt: b'\xe5\x9c\xa8 \xe8\xbf\x99 \xe4\xb8\x80 \xe9\x97\xae\xe9\xa2\x98 \xe4\xb8\x8a , \xe5\x9c\xa8 \xe7\xbb\x8f\xe6\xb5\x8e \xe4\xb8\x8a , \xe7\xbb\x8f\xe6\xb5\x8e \xe5\x8f\x91\xe5\xb1\x95 , \xe5\x8f\x91\xe5\xb1\x95 \xe5\x8f\x91\xe5\xb1\x95 , \xe5\x8f\x91\xe5\xb1\x95 \xe5\x8f\x91\xe5\xb1\x95 , \xe5\x8f\x91\xe5\xb1\x95 \xe5\x8f\x91\xe5\xb1\x95 . </s>'

2018-10-20 00:04:43.760353: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 00:04:43.760378: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded eval model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 0.45s

 eval dev: perplexity 387.54, time 15s, Sat Oct 20 00:04:59 2018.

 eval test: perplexity 387.54, time 17s, Sat Oct 20 00:05:17 2018.

 global step 3100 lr 1 step-time 11.16s wps 0.66K ppl 385.81 bleu 0.00

# Finished an epoch, step 3140. Perform external evaluation

2018-10-20 00:30:34.679555: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 00:30:34.680522: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 00:30:34.680586: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 0.56s

 # 232

   src: because taiwan 's new leader deliberately obscured the one - china principle and denied the " 1992 consensus " of the association for relations across the taiwan strait and the strait exchange foundation , the two sides have been unable to ease their tense relations and achieve a breakthrough in their political impasse .

   ref: 过去 一 年 , .

   nmt: b'\xe5\x9c\xa8 \xe8\xbf\x99 \xe4\xb8\x80 \xe9\x97\xae\xe9\xa2\x98 \xe4\xb8\x8a , \xe5\x9c\xa8 \xe7\xbb\x8f\xe6\xb5\x8e \xe4\xb8\x8a , \xe4\xb8\xad\xe5\x9b\xbd \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 \xe7\xad\x89 \xe7\xbb\x8f\xe6\xb5\x8e \xe5\x90\x88\xe4\xbd\x9c , \xe4\xbf\x83\xe8\xbf\x9b \xe7\xbb\x8f\xe6\xb5\x8e \xe5\x8f\x91\xe5\xb1\x95 , \xe4\xbf\x83\xe8\xbf\x9b \xe7\xbb\x8f\xe6\xb5\x8e \xe5\x8f\x91\xe5\xb1\x95 . </s>'

2018-10-20 00:30:35.313256: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 00:30:35.313274: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 00:30:35.313289: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 0.38s

# External evaluation, global step 3000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 30s, Sat Oct 20 00:31:05 2018.

 bleu dev: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 3000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 28s, Sat Oct 20 00:31:37 2018.

 bleu test: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

 global step 3200 lr 1 step-time 10.83s wps 0.66K ppl 357.57 bleu 0.17

 global step 3300 lr 1 step-time 11.15s wps 0.66K ppl 342.12 bleu 0.17

 global step 3400 lr 1 step-time 11.37s wps 0.66K ppl 325.57 bleu 0.17

 global step 3500 lr 1 step-time 11.14s wps 0.66K ppl 312.20 bleu 0.17

 global step 3600 lr 1 step-time 11.14s wps 0.66K ppl 307.63 bleu 0.17

 global step 3700 lr 1 step-time 11.28s wps 0.66K ppl 296.23 bleu 0.17

 global step 3800 lr 1 step-time 11.14s wps 0.66K ppl 278.77 bleu 0.17

 global step 3900 lr 1 step-time 11.11s wps 0.67K ppl 275.24 bleu 0.17

# Finished an epoch, step 3925. Perform external evaluation

2018-10-20 02:57:30.646778: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 02:57:30.646780: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 02:57:30.646874: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 0.54s

 # 120

   src: in china , socialism has succeeded in greatly emancipating and developing productive forces in society .

   ref: 生产力 得到 极大 解放 和 发展 .

   nmt: b'\xe5\x9c\xa8 \xe8\xbf\x99 \xe4\xb8\x80 \xe9\x97\xae\xe9\xa2\x98 \xe4\xb8\x8a , \xe6\x88\x91\xe4\xbb\xac \xe8\xa6\x81 \xe5\x8a\xa0\xe5\xbc\xba \xe5\x8f\x91\xe5\xb1\x95 \xe5\x92\x8c \xe5\x8f\x91\xe5\xb1\x95 . </s>'

2018-10-20 02:57:31.074731: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 02:57:31.074738: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 02:57:31.074738: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-3000, time 0.35s

# External evaluation, global step 3000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 29s, Sat Oct 20 02:58:01 2018.

 bleu dev: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 3000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 30s, Sat Oct 20 02:58:31 2018.

 bleu test: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

 global step 4000 lr 1 step-time 10.69s wps 0.66K ppl 252.99 bleu 0.17

# Save eval, global step 4000

2018-10-20 03:12:28.024198: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 03:12:28.024218: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 03:12:28.024223: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-4000, time 0.36s

 # 224

   src: the ninth guangdong provincial people 's congress concluded its meeting this morning . pursuant to the proposal of guangdong provincial higher people 's court president lu botao , ling qiman and li yifeng were appointed vice presidents of the provincial higher people 's court .

   ref: 今天 上午 闭幕 的  .

   nmt: b'\xe5\x9c\xa8 \xe8\xb0\x88\xe5\x88\xb0 \xe4\xb8\xad\xe5\x9b\xbd \xe5\x85\xb1\xe4\xba\xa7\xe5\x85\x9a , \xe5\x9b\xbd\xe5\x8a\xa1\xe9\x99\xa2 \xe6\x80\xbb\xe7\x90\x86 \xe6\x9c\xb1\xe9\x8e\x94\xe5\x9f\xba \xe4\xbb\x8a\xe5\xa4\xa9 \xe4\xb8\x8a\xe5\x8d\x88 , \xe5\x9b\xbd\xe5\x8a\xa1\xe9\x99\xa2 \xe5\x89\xaf\xe6\x80\xbb\xe7\x90\x86 \xe9\x92\xb1\xe5\x85\xb6\xe7\x90\x9b , \xe5\x9b\xbd\xe5\x8a\xa1\xe9\x99\xa2 , \xe4\xb9\xa6\xe8\xae\xb0\xe5\xa4\x84 , \xe4\xb9\xa6\xe8\xae\xb0\xe5\xa4\x84 , \xe4\xb9\xa6\xe8\xae\xb0\xe5\xa4\x84 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 , \xe6\x9b\xbe\xe5\xba\x86\xe7\xba\xa2 . </s>'

2018-10-20 03:12:28.632761: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 03:12:28.632820: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 loaded eval model parameters from tmp/nmt_model_zh\translate.ckpt-4000, time 0.36s

 eval dev: perplexity 252.28, time 15s, Sat Oct 20 03:12:44 2018.

 eval test: perplexity 252.28, time 16s, Sat Oct 20 03:13:00 2018.

 global step 4100 lr 1 step-time 11.56s wps 0.64K ppl 246.02 bleu 0.17

 global step 4200 lr 1 step-time 11.29s wps 0.66K ppl 237.87 bleu 0.17

 global step 4300 lr 1 step-time 11.32s wps 0.66K ppl 233.55 bleu 0.17

 global step 4400 lr 1 step-time 11.36s wps 0.66K ppl 221.90 bleu 0.17

 global step 4500 lr 1 step-time 11.06s wps 0.66K ppl 224.83 bleu 0.17

 global step 4600 lr 1 step-time 11.32s wps 0.66K ppl 211.41 bleu 0.17

 global step 4700 lr 1 step-time 11.21s wps 0.66K ppl 213.31 bleu 0.17

# Finished an epoch, step 4710. Perform external evaluation

2018-10-20 05:26:06.693124: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 05:26:06.693124: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 05:26:06.693124: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-4000, time 0.84s

 # 33

   src: peng peiyun and he luli respectively attended the plenary meetings of the zhejiang and beijing delegations .

   ref: 云 参加 了 浙江 代表团 的 全体 会议 . 何鲁丽 参加 了 北京 代表团 的 全体 会议 .

   nmt: b'\xe4\xb8\xad\xe5\x9b\xbd \xe6\x94\xbf\xe5\xba\x9c \xe5\x92\x8c \xe4\xba\xba\xe6\xb0\x91 , \xe4\xba\xba\xe6\xb0\x91 , \xe7\x9b\xb4\xe8\xbe\x96\xe5\xb8\x82 , \xe7\x9b\xb4\xe8\xbe\x96\xe5\xb8\x82 , \xe7\x9b\xb4\xe8\xbe\x96\xe5\xb8\x82 . </s>'

2018-10-20 05:26:07.132553: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 05:26:07.132553: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 05:26:07.132618: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-4000, time 0.37s

# External evaluation, global step 4000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 45s, Sat Oct 20 05:26:52 2018.

 bleu dev: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 4000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 43s, Sat Oct 20 05:27:39 2018.

 bleu test: 0.2

 saving hparams to tmp/nmt_model_zh\hparams

 global step 4800 lr 1 step-time 10.88s wps 0.66K ppl 199.44 bleu 0.18

 global step 4900 lr 1 step-time 11.33s wps 0.66K ppl 192.33 bleu 0.18

 global step 5000 lr 1 step-time 11.41s wps 0.65K ppl 186.30 bleu 0.18

# Save eval, global step 5000

2018-10-20 06:22:32.135143: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 06:22:32.135150: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 06:22:32.135223: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.76s

 # 119

   src: as a result they could only avoid the differences of opinion and delay the solutions .

   ref: 结果 只能 是 回避 分歧 , 推迟 解决 .

   nmt: b'\xe4\xbb\x8e \xe6\xa0\xb9\xe6\x9c\xac\xe4\xb8\x8a \xe7\x9c\x8b , \xe6\x88\x91\xe4\xbb\xac \xe5\xbf\x85\xe9\xa1\xbb \xe5\x9d\x9a\xe6\x8c\x81 \xe4\xb8\x80\xe4\xb8\xaa \xe4\xb8\xad\xe5\x9b\xbd \xe7\x9a\x84 \xe5\x8e\x9f\xe5\x88\x99 . </s>'

2018-10-20 06:22:32.937075: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 06:22:32.937075: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

 loaded eval model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.94s

 eval dev: perplexity 188.49, time 15s, Sat Oct 20 06:22:49 2018.

 eval test: perplexity 188.49, time 17s, Sat Oct 20 06:23:06 2018.

2018-10-20 06:23:09.323110: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 06:23:09.323133: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 06:23:09.323159: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.36s

 # 208

   src: by the middle of this century , we should build the western region into a new region enjoying economic prosperity , social progress , a good living environment , and nationality unity , and having green mountains and clean rivers .

   ref: 到 21世纪中叶 , 将 西部 地区 建成 一个 经济 繁荣 , 社会 进步 , 生活 安定 , 民族团结 , 山川 秀美 的 新 西部 .

   nmt: b'\xe4\xbb\x8e \xe4\xbb\x8a\xe5\xb9\xb4 \xe4\xb8\x8b\xe5\x8d\x8a\xe5\xb9\xb4 \xe5\xbc\x80\xe5\xa7\x8b , \xe6\x88\x91\xe5\x9b\xbd \xe5\xb0\x86 \xe7\xbb\xa7\xe7\xbb\xad \xe5\x8a\xa0\xe5\xbc\xba \xe5\xaf\xb9 \xe8\xa5\xbf\xe9\x83\xa8 \xe5\x9c\xb0\xe5\x8c\xba \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 , \xe5\xb9\xb6 \xe8\xbf\x9b\xe4\xb8\x80\xe6\xad\xa5 \xe6\x8e\xa8\xe5\x8a\xa8 \xe4\xb8\xad\xe5\x9b\xbd \xe6\x94\xbf\xe5\xba\x9c \xe5\xaf\xb9 \xe4\xb8\xad\xe5\x9b\xbd \xe4\xba\xba\xe6\xb0\x91 \xe7\x9a\x84 \xe5\x88\xa9\xe7\x9b\x8a . </s>'

2018-10-20 06:23:09.887178: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 06:23:09.887162: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 06:23:09.887366: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.77s

# External evaluation, global step 5000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 29s, Sat Oct 20 06:23:39 2018.

 bleu dev: 0.4

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 5000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 29s, Sat Oct 20 06:24:12 2018.

 bleu test: 0.4

 saving hparams to tmp/nmt_model_zh\hparams

 global step 5100 lr 1 step-time 11.38s wps 0.65K ppl 185.57 bleu 0.39

 global step 5200 lr 1 step-time 11.30s wps 0.65K ppl 178.95 bleu 0.39

 global step 5300 lr 1 step-time 11.18s wps 0.66K ppl 174.15 bleu 0.39

 global step 5400 lr 1 step-time 11.39s wps 0.66K ppl 173.02 bleu 0.39

# Finished an epoch, step 5495. Perform external evaluation

2018-10-20 07:57:03.000782: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 07:57:03.000804: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 07:57:03.000921: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.82s

 # 6

   src: we should truly realize " six unifications , " especially the unification of fees to improve efficiency . we should never allow the practice of " ostensibly combining the three types of inspections while failing to do so in private , " and should truly combine the " three types of inspections . "

   ref: 真正 实现率 , 决不能 出现 那种 " 明合暗 不合 , 面 合心 不合 " 的 现 像 , 把真正 落实 下去 .

   nmt: b'\xe6\x88\x91\xe4\xbb\xac \xe8\xa6\x81 \xe6\x8a\x8a " \xe4\xb8\x89 \xe4\xbb\xa3\xe8\xa1\xa8 " \xe4\xb8\xba \xe6\xa0\xb8\xe5\xbf\x83 \xe7\x9a\x84 \xe6\x80\x9d\xe6\x83\xb3 \xe6\x94\xbf\xe6\xb2\xbb \xe5\xbb\xba\xe8\xae\xbe \xe4\xbd\x9c\xe4\xb8\xba \xe6\x96\xb0 \xe7\x9a\x84 \xe4\xbc\x9f\xe5\xa4\xa7 \xe7\x9a\x84 \xe4\xbc\x9f\xe5\xa4\xa7 \xe4\xbb\xbb\xe5\x8a\xa1 , \xe6\x88\x91\xe4\xbb\xac \xe5\x85\x9a \xe7\x9a\x84 \xe5\xbb\xba\xe8\xae\xbe \xe6\x98\xaf \xe4\xb8\x80\xe4\xb8\xaa \xe6\x96\xb0 \xe7\x9a\x84 \xe5\x8f\x91\xe5\xb1\x95 \xe6\x97\xb6\xe6\x9c\x9f , \xe6\x98\xaf \xe6\x88\x91\xe4\xbb\xac \xe5\x85\x9a \xe7\x9a\x84 \xe5\xbb\xba\xe8\xae\xbe \xe7\x9a\x84 \xe4\xbc\x9f\xe5\xa4\xa7 \xe4\xbb\xbb\xe5\x8a\xa1 . </s>'

2018-10-20 07:57:04.047105: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.en is already initialized.

2018-10-20 07:57:04.047147: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

2018-10-20 07:57:04.047113: I tensorflow/core/kernels/lookup_util.cc:376] Table trying to initialize from file tmp/nmt_zh/vocab.zh is already initialized.

 loaded infer model parameters from tmp/nmt_model_zh\translate.ckpt-5000, time 0.79s

# External evaluation, global step 5000

 decoding to output tmp/nmt_model_zh\output_dev.

 done, num sentences 400, time 30s, Sat Oct 20 07:57:35 2018.

 bleu dev: 0.4

 saving hparams to tmp/nmt_model_zh\hparams

# External evaluation, global step 5000

 decoding to output tmp/nmt_model_zh\output_test.

 done, num sentences 400, time 30s, Sat Oct 20 07:58:06 2018.

 bleu test: 0.4

 saving hparams to tmp/nmt_model_zh\hparams

 global step 5500 lr 1 step-time 10.96s wps 0.64K ppl 170.63 bleu 0.39

 global step 5600 lr 1 step-time 11.52s wps 0.65K ppl 154.09 bleu 0.39

……

 global step 6100

