波士顿房价预测
波士顿房价预测问题是根据一些特定的房屋属性(如房间数量,面积等)来预测波士顿地区房屋的中位数价格。这个问题是一个典型的回归问题,目标是利用给定的特征数据来预测连续的房价数值。
机器学习-Sklearn
# 导入机器学习库 from sklearn.linear_model import LinearRegression, SGDRegressor, Ridge, LogisticRegression, RidgeCV from sklearn.datasets import load_boston from sklearn.model_selection import train_test_split from sklearn.preprocessing import StandardScaler from sklearn.metrics import mean_squared_error from sklearn.externals import joblib from sklearn.metrics import r2_score from sklearn.neural_network import MLPRegressor import pandas as pd import numpy as np import warnings warnings.filterwarnings("ignore")
# 波士顿房价数据集 lb = load_boston() lb
{'data': array([[6.3200e-03, 1.8000e+01, 2.3100e+00, ..., 1.5300e+01, 3.9690e+02, 4.9800e+00], [2.7310e-02, 0.0000e+00, 7.0700e+00, ..., 1.7800e+01, 3.9690e+02, 9.1400e+00], [2.7290e-02, 0.0000e+00, 7.0700e+00, ..., 1.7800e+01, 3.9283e+02, 4.0300e+00], ..., [6.0760e-02, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9690e+02, 5.6400e+00], [1.0959e-01, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9345e+02, 6.4800e+00], [4.7410e-02, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9690e+02, 7.8800e+00]]), 'target': array([24. , 21.6, 34.7, 33.4, 36.2, 28.7, 22.9, 27.1, 16.5, 18.9, 15. , 18.9, 21.7, 20.4, 18.2, 19.9, 23.1, 17.5, 20.2, 18.2, 13.6, 19.6, 15.2, 14.5, 15.6, 13.9, 16.6, 14.8, 18.4, 21. , 12.7, 14.5, 13.2, 13.1, 13.5, 18.9, 20. , 21. , 24.7, 30.8, 34.9, 26.6, 25.3, 24.7, 21.2, 19.3, 20. , 16.6, 14.4, 19.4, 19.7, 20.5, 25. , 23.4, 18.9, 35.4, 24.7, 31.6, 23.3, 19.6, 18.7, 16. , 22.2, 25. , 33. , 23.5, 19.4, 22. , 17.4, 20.9, 24.2, 21.7, 22.8, 23.4, 24.1, 21.4, 20. , 20.8, 21.2, 20.3, 28. , 23.9, 24.8, 22.9, 23.9, 26.6, 22.5, 22.2, 23.6, 28.7, 22.6, 22. , 22.9, 25. , 20.6, 28.4, 21.4, 38.7, 43.8, 33.2, 27.5, 26.5, 18.6, 19.3, 20.1, 19.5, 19.5, 20.4, 19.8, 19.4, 21.7, 22.8, 18.8, 18.7, 18.5, 18.3, 21.2, 19.2, 20.4, 19.3, 22. , 20.3, 20.5, 17.3, 18.8, 21.4, 15.7, 16.2, 18. , 14.3, 19.2, 19.6, 23. , 18.4, 15.6, 18.1, 17.4, 17.1, 13.3, 17.8, 14. , 14.4, 13.4, 15.6, 11.8, 13.8, 15.6, 14.6, 17.8, 15.4, 21.5, 19.6, 15.3, 19.4, 17. , 15.6, 13.1, 41.3, 24.3, 23.3, 27. , 50. , 50. , 50. , 22.7, 25. , 50. , 23.8, 23.8, 22.3, 17.4, 19.1, 23.1, 23.6, 22.6, 29.4, 23.2, 24.6, 29.9, 37.2, 39.8, 36.2, 37.9, 32.5, 26.4, 29.6, 50. , 32. , 29.8, 34.9, 37. , 30.5, 36.4, 31.1, 29.1, 50. , 33.3, 30.3, 34.6, 34.9, 32.9, 24.1, 42.3, 48.5, 50. , 22.6, 24.4, 22.5, 24.4, 20. , 21.7, 19.3, 22.4, 28.1, 23.7, 25. , 23.3, 28.7, 21.5, 23. , 26.7, 21.7, 27.5, 30.1, 44.8, 50. , 37.6, 31.6, 46.7, 31.5, 24.3, 31.7, 41.7, 48.3, 29. , 24. , 25.1, 31.5, 23.7, 23.3, 22. , 20.1, 22.2, 23.7, 17.6, 18.5, 24.3, 20.5, 24.5, 26.2, 24.4, 24.8, 29.6, 42.8, 21.9, 20.9, 44. , 50. , 36. , 30.1, 33.8, 43.1, 48.8, 31. , 36.5, 22.8, 30.7, 50. , 43.5, 20.7, 21.1, 25.2, 24.4, 35.2, 32.4, 32. , 33.2, 33.1, 29.1, 35.1, 45.4, 35.4, 46. , 50. , 32.2, 22. , 20.1, 23.2, 22.3, 24.8, 28.5, 37.3, 27.9, 23.9, 21.7, 28.6, 27.1, 20.3, 22.5, 29. , 24.8, 22. , 26.4, 33.1, 36.1, 28.4, 33.4, 28.2, 22.8, 20.3, 16.1, 22.1, 19.4, 21.6, 23.8, 16.2, 17.8, 19.8, 23.1, 21. , 23.8, 23.1, 20.4, 18.5, 25. , 24.6, 23. , 22.2, 19.3, 22.6, 19.8, 17.1, 19.4, 22.2, 20.7, 21.1, 19.5, 18.5, 20.6, 19. , 18.7, 32.7, 16.5, 23.9, 31.2, 17.5, 17.2, 23.1, 24.5, 26.6, 22.9, 24.1, 18.6, 30.1, 18.2, 20.6, 17.8, 21.7, 22.7, 22.6, 25. , 19.9, 20.8, 16.8, 21.9, 27.5, 21.9, 23.1, 50. , 50. , 50. , 50. , 50. , 13.8, 13.8, 15. , 13.9, 13.3, 13.1, 10.2, 10.4, 10.9, 11.3, 12.3, 8.8, 7.2, 10.5, 7.4, 10.2, 11.5, 15.1, 23.2, 9.7, 13.8, 12.7, 13.1, 12.5, 8.5, 5. , 6.3, 5.6, 7.2, 12.1, 8.3, 8.5, 5. , 11.9, 27.9, 17.2, 27.5, 15. , 17.2, 17.9, 16.3, 7. , 7.2, 7.5, 10.4, 8.8, 8.4, 16.7, 14.2, 20.8, 13.4, 11.7, 8.3, 10.2, 10.9, 11. , 9.5, 14.5, 14.1, 16.1, 14.3, 11.7, 13.4, 9.6, 8.7, 8.4, 12.8, 10.5, 17.1, 18.4, 15.4, 10.8, 11.8, 14.9, 12.6, 14.1, 13. , 13.4, 15.2, 16.1, 17.8, 14.9, 14.1, 12.7, 13.5, 14.9, 20. , 16.4, 17.7, 19.5, 20.2, 21.4, 19.9, 19. , 19.1, 19.1, 20.1, 19.9, 19.6, 23.2, 29.8, 13.8, 13.3, 16.7, 12. , 14.6, 21.4, 23. , 23.7, 25. , 21.8, 20.6, 21.2, 19.1, 20.6, 15.2, 7. , 8.1, 13.6, 20.1, 21.8, 24.5, 23.1, 19.7, 18.3, 21.2, 17.5, 16.8, 22.4, 20.6, 23.9, 22. , 11.9]), 'feature_names': array(['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT'], dtype='<U7'), 'DESCR': "Boston House Prices dataset\n===========================\n\nNotes\n------\nData Set Characteristics: \n\n :Number of Instances: 506 \n\n :Number of Attributes: 13 numeric/categorical predictive\n \n :Median Value (attribute 14) is usually the target\n\n :Attribute Information (in order):\n - CRIM per capita crime rate by town\n - ZN proportion of residential land zoned for lots over 25,000 sq.ft.\n - INDUS proportion of non-retail business acres per town\n - CHAS Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)\n - NOX nitric oxides concentration (parts per 10 million)\n - RM average number of rooms per dwelling\n - AGE proportion of owner-occupied units built prior to 1940\n - DIS weighted distances to five Boston employment centres\n - RAD index of accessibility to radial highways\n - TAX full-value property-tax rate per $10,000\n - PTRATIO pupil-teacher ratio by town\n - B 1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town\n - LSTAT % lower status of the population\n - MEDV Median value of owner-occupied homes in $1000's\n\n :Missing Attribute Values: None\n\n :Creator: Harrison, D. and Rubinfeld, D.L.\n\nThis is a copy of UCI ML housing dataset.\nhttp://archive.ics.uci.edu/ml/datasets/Housing\n\n\nThis dataset was taken from the StatLib library which is maintained at Carnegie Mellon University.\n\nThe Boston house-price data of Harrison, D. and Rubinfeld, D.L. 'Hedonic\nprices and the demand for clean air', J. Environ. Economics & Management,\nvol.5, 81-102, 1978. Used in Belsley, Kuh & Welsch, 'Regression diagnostics\n...', Wiley, 1980. N.B. Various transformations are used in the table on\npages 244-261 of the latter.\n\nThe Boston house-price data has been used in many machine learning papers that address regression\nproblems. \n \n**References**\n\n - Belsley, Kuh & Welsch, 'Regression diagnostics: Identifying Influential Data and Sources of Collinearity', Wiley, 1980. 244-261.\n - Quinlan,R. (1993). Combining Instance-Based and Model-Based Learning. In Proceedings on the Tenth International Conference of Machine Learning, 236-243, University of Massachusetts, Amherst. Morgan Kaufmann.\n - many more! (see http://archive.ics.uci.edu/ml/datasets/Housing)\n"}
# 划分训练集与测试集 X_train, X_test, y_train, y_test = train_test_split(lb.data, lb.target, test_size = 0.2) X_train, X_test, y_train, y_test
(array([[6.72400e-02, 0.00000e+00, 3.24000e+00, ..., 1.69000e+01, 3.75210e+02, 7.34000e+00], [3.75780e-01, 0.00000e+00, 1.05900e+01, ..., 1.86000e+01, 3.95240e+02, 2.39800e+01], [1.39134e+01, 0.00000e+00, 1.81000e+01, ..., 2.02000e+01, 1.00630e+02, 1.51700e+01], ..., [2.89600e-01, 0.00000e+00, 9.69000e+00, ..., 1.92000e+01, 3.96900e+02, 2.11400e+01], [1.42310e-01, 0.00000e+00, 1.00100e+01, ..., 1.78000e+01, 3.88740e+02, 1.04500e+01], [1.75050e-01, 0.00000e+00, 5.96000e+00, ..., 1.92000e+01, 3.93430e+02, 1.01300e+01]]), array([[4.68400e-02, 0.00000e+00, 3.41000e+00, ..., 1.78000e+01, 3.92180e+02, 8.81000e+00], [1.00245e+00, 0.00000e+00, 8.14000e+00, ..., 2.10000e+01, 3.80230e+02, 1.19800e+01], [1.54450e-01, 2.50000e+01, 5.13000e+00, ..., 1.97000e+01, 3.90680e+02, 6.86000e+00], ..., [6.26300e-02, 0.00000e+00, 1.19300e+01, ..., 2.10000e+01, 3.91990e+02, 9.67000e+00], [5.82401e+00, 0.00000e+00, 1.81000e+01, ..., 2.02000e+01, 3.96900e+02, 1.07400e+01], [1.87000e-02, 8.50000e+01, 4.15000e+00, ..., 1.79000e+01, 3.92430e+02, 6.36000e+00]]), array([22.6, 19.3, 11.7, 20. , 24.3, 8.5, 14.6, 28.7, 15.4, 30.3, 21.4, 19.1, 20.5, 19.3, 50. , 36.2, 25. , 28.5, 17.8, 18.9, 23.9, 16.5, 30.8, 19.9, 23.2, 18.8, 32. , 29.4, 12. , 20.7, 44.8, 19.6, 19.3, 10.5, 14.1, 15.6, 23.8, 50. , 48.3, 23.7, 18.6, 33.8, 27.5, 20.1, 20.7, 35.1, 25. , 24.1, 32.7, 14.4, 13.5, 12.8, 33.2, 16.8, 21.6, 28. , 23.9, 20.4, 10.8, 22.9, 23.2, 34.7, 14.3, 22.6, 18.7, 21.5, 33.2, 22.2, 33. , 20.4, 27.9, 23.1, 19.1, 15.6, 29.6, 24.4, 23.5, 13.6, 19.2, 29.1, 35.2, 22.2, 17.2, 9.6, 18.7, 13.4, 18.8, 25.1, 15. , 21.1, 46.7, 18.4, 19.4, 17.2, 8.8, 14.5, 19.9, 38.7, 16.6, 24.8, 29.9, 20.2, 19.8, 23. , 33.1, 22. , 18.3, 12.7, 20.9, 11. , 25. , 18.5, 45.4, 17.7, 21.8, 19.4, 30.7, 18.6, 23.2, 24.8, 22.6, 29.8, 27.5, 22.8, 29.6, 50. , 29. , 10.5, 21.4, 21.9, 10.4, 20.6, 28.7, 37. , 20.6, 31.1, 23.9, 19.1, 30.1, 13.1, 16.7, 23.9, 23.8, 19.3, 18.1, 13.8, 16.8, 50. , 24.5, 26.6, 50. , 32.4, 17.5, 25. , 14.5, 43.8, 19.5, 18.5, 13.6, 19.4, 11.3, 18.5, 22.2, 34.6, 24. , 50. , 22.2, 18.2, 11.7, 22.2, 18.7, 19.3, 21.7, 21.2, 9.5, 25. , 6.3, 22. , 17.8, 10.4, 23.1, 30.5, 7.5, 13.1, 20.5, 21.8, 20.8, 19.5, 22.8, 19.6, 21.4, 13.5, 17.1, 36.5, 24.6, 7.2, 22.9, 33.4, 15.6, 7. , 16.6, 12.7, 26.2, 28.6, 34.9, 16.3, 31.5, 15.2, 10.9, 13.9, 12.1, 22.1, 31. , 19.6, 21.4, 41.3, 23.4, 50. , 41.7, 15.3, 15. , 17.1, 20. , 20.3, 23. , 24.2, 50. , 19.4, 35.4, 20.3, 8.3, 10.2, 17.2, 18. , 17.4, 32.9, 21.1, 20.1, 21.5, 24.3, 24.5, 48.5, 24.6, 20.8, 50. , 36.2, 14. , 21.7, 23.7, 26.6, 24.7, 24.5, 10.2, 36.4, 17.8, 19.9, 13.3, 25. , 13. , 8.4, 13.4, 26.5, 27.5, 17.6, 31.7, 32.2, 22.7, 10.2, 16.1, 20.4, 20. , 20.6, 16.7, 20.1, 19.5, 13.2, 21.2, 50. , 42.8, 8.5, 42.3, 19.8, 17.9, 24.4, 10.9, 16.1, 23.1, 50. , 23.1, 24.3, 22.3, 36.1, 22. , 17.3, 13.8, 15. , 50. , 12.3, 9.7, 13.3, 24.8, 19.4, 39.8, 23.7, 12.6, 31.5, 21.7, 20.3, 13.1, 15.7, 19.6, 13.8, 22.5, 22. , 14.9, 20.2, 20.6, 18.9, 14.8, 21. , 18.4, 22. , 50. , 25.2, 19.8, 23.8, 14.1, 33.4, 12.5, 23.1, 24.7, 19.1, 21.4, 13.3, 13.8, 23.1, 27.5, 25.3, 50. , 23.7, 17.1, 5. , 43.5, 17.4, 8.3, 17.8, 18.4, 22.3, 24.8, 15.6, 16.2, 17.4, 28.2, 13.9, 17. , 31.2, 24.1, 32.5, 26.4, 46. , 17.8, 20.5, 16. , 5. , 28.7, 30.1, 16.2, 29.8, 18.2, 20.6, 43.1, 21.2, 16.1, 21.2, 18.3, 21.9, 37.6, 50. , 8.8, 22. , 29. , 23.8, 15.1, 25. , 21.7, 14.5, 13.8, 23.6, 21.9, 17.5, 23. , 23.9, 22. , 22.5, 37.3, 31.6, 16.5, 27.1, 21.2, 19.9, 15.6, 19.7, 18.5, 24.7]), array([22.6, 21. , 23.3, 7.4, 16.4, 30.1, 35.4, 27.9, 21.7, 15.4, 22.9, 20.9, 22.9, 33.3, 28.1, 8.4, 7.2, 36. , 22.5, 19.4, 33.1, 14.1, 14.9, 37.2, 14.4, 23.6, 8.1, 23.3, 24.4, 21.7, 28.4, 27.1, 20. , 20.4, 15.2, 14.3, 19.7, 32. , 13.4, 20.6, 11.9, 48.8, 14.2, 18.9, 21.7, 20.1, 24. , 11.8, 19.6, 24.4, 13.1, 5.6, 50. , 11.9, 15.2, 29.1, 23.4, 34.9, 18.9, 22.8, 13.4, 44. , 7.2, 20.1, 22.4, 17.5, 20.8, 18.2, 22.7, 25. , 7. , 24.1, 26.6, 20.3, 19.5, 37.9, 21. , 12.7, 26.7, 31.6, 28.4, 23.2, 23.3, 19. , 22.6, 19.2, 11.8, 22.8, 8.7, 26.4, 19. , 20. , 34.9, 27. , 23.3, 14.6, 11.5, 14.9, 21.6, 22.4, 23. , 23.1]))
# 为数据增加一个维度,相当于把[1, 5, 10]变成[[1, 5, 10]] # 在新版sklearn中,所有数据都应该是二维矩阵,哪怕它只是单独一行或一列 y_train = y_train.reshape(-1,1) y_test = y_test.reshape(-1,1)
# 进行标准化 std_x = StandardScaler() X_train = std_x.fit_transform(X_train) X_test = std_x.transform(X_test) std_y = StandardScaler() y_train = std_y.fit_transform(y_train) y_test = std_y.transform(y_test)
#线性回归器LinearRegression lr = LinearRegression() lr.fit(X_train, y_train) print("r2 score of Linear regression is",r2_score(y_test,lr.predict(X_test)))
r2 score of Linear regression is 0.7778158147557528
#岭回归 cv = RidgeCV(alphas=np.logspace(-3, 2, 100)) cv.fit (X_train , y_train) print("r2 score of Linear regression is",r2_score(y_test,cv.predict(X_test)))
r2 score of Linear regression is 0.7798009579941207
#线性回归器SGDRegressor sgd = SGDRegressor() sgd.fit(X_train, y_train) print("r2 score of Linear regression is",r2_score(y_test,sgd.predict(X_test)))
r2 score of Linear regression is 0.77430200232832
深度学习-Keras
程序设计
# 使用Keras试试 from keras.models import Sequential from keras.layers import Dense #基准NN #使用标准化后的数据 seq = Sequential() #构建神经网络模型 #input_dim来隐含的指定输入数据shape seq.add(Dense(64, activation='relu',input_dim=lb.data.shape[1])) seq.add(Dense(64, activation='relu')) seq.add(Dense(1, activation='relu')) seq.compile(optimizer='rmsprop', loss='mse', metrics=['mae']) seq.fit(X_train, y_train, epochs=300, batch_size = 16, shuffle = False) score = seq.evaluate(X_test, y_test,batch_size=16) #loss value & metrics values print("score:",score) print('r2 score:',r2_score(y_test, seq.predict(X_test)))
Using TensorFlow backend. WARNING:tensorflow:From /home/nlp/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:422: The name tf.global_variables is deprecated. Please use tf.compat.v1.global_variables instead. Epoch 1/300 404/404 [==============================] - 0s 688us/step - loss: 0.8272 - mae: 0.6820 Epoch 2/300 404/404 [==============================] - 0s 223us/step - loss: 0.5090 - mae: 0.5378 Epoch 3/300 404/404 [==============================] - 0s 114us/step - loss: 0.4726 - mae: 0.5096 Epoch 4/300 404/404 [==============================] - 0s 112us/step - loss: 0.4600 - mae: 0.4980 Epoch 5/300 404/404 [==============================] - 0s 299us/step - loss: 0.4517 - mae: 0.4884 Epoch 6/300 404/404 [==============================] - 0s 148us/step - loss: 0.4444 - mae: 0.4838 Epoch 7/300 404/404 [==============================] - 0s 109us/step - loss: 0.4388 - mae: 0.4784 Epoch 8/300 404/404 [==============================] - 0s 148us/step - loss: 0.4342 - mae: 0.4745 Epoch 9/300 404/404 [==============================] - 0s 166us/step - loss: 0.4316 - mae: 0.4721 Epoch 10/300 404/404 [==============================] - 0s 194us/step - loss: 0.4271 - mae: 0.4695 Epoch 11/300 404/404 [==============================] - 0s 110us/step - loss: 0.4242 - mae: 0.4667 Epoch 12/300 404/404 [==============================] - 0s 109us/step - loss: 0.4212 - mae: 0.4646 Epoch 13/300 404/404 [==============================] - 0s 190us/step - loss: 0.4190 - mae: 0.4630 Epoch 14/300 404/404 [==============================] - 0s 297us/step - loss: 0.4165 - mae: 0.4602 Epoch 15/300 404/404 [==============================] - 0s 129us/step - loss: 0.4142 - mae: 0.4584 Epoch 16/300 404/404 [==============================] - 0s 116us/step - loss: 0.4130 - mae: 0.4575 Epoch 17/300 404/404 [==============================] - 0s 167us/step - loss: 0.4098 - mae: 0.4558 Epoch 18/300 404/404 [==============================] - 0s 218us/step - loss: 0.4082 - mae: 0.4540 Epoch 19/300 404/404 [==============================] - 0s 253us/step - loss: 0.4072 - mae: 0.4529 Epoch 20/300 404/404 [==============================] - 0s 189us/step - loss: 0.4053 - mae: 0.4511 Epoch 21/300 404/404 [==============================] - 0s 235us/step - loss: 0.4042 - mae: 0.4501 Epoch 22/300 404/404 [==============================] - 0s 360us/step - loss: 0.4028 - mae: 0.4477 Epoch 23/300 404/404 [==============================] - 0s 184us/step - loss: 0.4020 - mae: 0.4463 Epoch 24/300 404/404 [==============================] - 0s 250us/step - loss: 0.4009 - mae: 0.4456 Epoch 25/300 404/404 [==============================] - 0s 331us/step - loss: 0.4002 - mae: 0.4446 Epoch 26/300 404/404 [==============================] - 0s 170us/step - loss: 0.3994 - mae: 0.4429 Epoch 27/300 404/404 [==============================] - 0s 159us/step - loss: 0.3982 - mae: 0.4430 Epoch 28/300 404/404 [==============================] - 0s 198us/step - loss: 0.3990 - mae: 0.4427 Epoch 29/300 404/404 [==============================] - 0s 224us/step - loss: 0.3970 - mae: 0.4412 Epoch 30/300 404/404 [==============================] - 0s 189us/step - loss: 0.3966 - mae: 0.4399 Epoch 31/300 404/404 [==============================] - 0s 165us/step - loss: 0.3894 - mae: 0.4406 Epoch 32/300 404/404 [==============================] - 0s 116us/step - loss: 0.3777 - mae: 0.4396 Epoch 33/300 404/404 [==============================] - 0s 87us/step - loss: 0.3740 - mae: 0.4369 Epoch 34/300 404/404 [==============================] - 0s 61us/step - loss: 0.3708 - mae: 0.4341 Epoch 35/300 404/404 [==============================] - 0s 130us/step - loss: 0.3690 - mae: 0.4330 Epoch 36/300 404/404 [==============================] - 0s 93us/step - loss: 0.3670 - mae: 0.4315 Epoch 37/300 404/404 [==============================] - ETA: 0s - loss: 0.3741 - mae: 0.447 - 0s 87us/step - loss: 0.3653 - mae: 0.4295 Epoch 38/300 404/404 [==============================] - 0s 64us/step - loss: 0.3633 - mae: 0.4279 Epoch 39/300 404/404 [==============================] - 0s 119us/step - loss: 0.3622 - mae: 0.4272 Epoch 40/300 404/404 [==============================] - 0s 76us/step - loss: 0.3604 - mae: 0.4254 Epoch 41/300 404/404 [==============================] - 0s 75us/step - loss: 0.3587 - mae: 0.4243 Epoch 42/300 404/404 [==============================] - 0s 66us/step - loss: 0.3578 - mae: 0.4238 Epoch 43/300 404/404 [==============================] - 0s 85us/step - loss: 0.3565 - mae: 0.4222 Epoch 44/300 404/404 [==============================] - 0s 81us/step - loss: 0.3557 - mae: 0.4221 Epoch 45/300 404/404 [==============================] - 0s 102us/step - loss: 0.3550 - mae: 0.4213 Epoch 46/300 404/404 [==============================] - 0s 68us/step - loss: 0.3542 - mae: 0.4202 Epoch 47/300 404/404 [==============================] - 0s 87us/step - loss: 0.3537 - mae: 0.4191 Epoch 48/300 404/404 [==============================] - 0s 64us/step - loss: 0.3553 - mae: 0.4193 Epoch 49/300 404/404 [==============================] - 0s 88us/step - loss: 0.3517 - mae: 0.4171 Epoch 50/300 404/404 [==============================] - 0s 79us/step - loss: 0.3523 - mae: 0.4172 Epoch 51/300 404/404 [==============================] - 0s 135us/step - loss: 0.3520 - mae: 0.4165 Epoch 52/300 404/404 [==============================] - 0s 90us/step - loss: 0.3510 - mae: 0.4154 Epoch 53/300 404/404 [==============================] - 0s 110us/step - loss: 0.3504 - mae: 0.4173 Epoch 54/300 404/404 [==============================] - 0s 112us/step - loss: 0.3501 - mae: 0.4145 Epoch 55/300 404/404 [==============================] - 0s 80us/step - loss: 0.3499 - mae: 0.4164 Epoch 56/300 404/404 [==============================] - 0s 230us/step - loss: 0.3492 - mae: 0.4129 Epoch 57/300 404/404 [==============================] - 0s 132us/step - loss: 0.3492 - mae: 0.4134 Epoch 58/300 404/404 [==============================] - 0s 102us/step - loss: 0.3508 - mae: 0.4135 Epoch 59/300 404/404 [==============================] - 0s 133us/step - loss: 0.3481 - mae: 0.4149 Epoch 60/300 404/404 [==============================] - 0s 50us/step - loss: 0.3488 - mae: 0.4151 Epoch 61/300 404/404 [==============================] - 0s 112us/step - loss: 0.3481 - mae: 0.4116 Epoch 62/300 404/404 [==============================] - 0s 286us/step - loss: 0.3477 - mae: 0.4123 Epoch 63/300 404/404 [==============================] - 0s 258us/step - loss: 0.3475 - mae: 0.4117 Epoch 64/300 404/404 [==============================] - 0s 227us/step - loss: 0.3470 - mae: 0.4110 Epoch 65/300 404/404 [==============================] - 0s 177us/step - loss: 0.3483 - mae: 0.4113 Epoch 66/300 404/404 [==============================] - 0s 170us/step - loss: 0.3472 - mae: 0.4116 Epoch 67/300 404/404 [==============================] - 0s 130us/step - loss: 0.3467 - mae: 0.4084 Epoch 68/300 404/404 [==============================] - 0s 209us/step - loss: 0.3467 - mae: 0.4108 Epoch 69/300 404/404 [==============================] - 0s 140us/step - loss: 0.3460 - mae: 0.4086 Epoch 70/300 404/404 [==============================] - 0s 196us/step - loss: 0.3459 - mae: 0.4100 Epoch 71/300 404/404 [==============================] - 0s 170us/step - loss: 0.3458 - mae: 0.4077 Epoch 72/300 404/404 [==============================] - 0s 182us/step - loss: 0.3467 - mae: 0.4111 Epoch 73/300 404/404 [==============================] - 0s 162us/step - loss: 0.3455 - mae: 0.4075 Epoch 74/300 404/404 [==============================] - 0s 223us/step - loss: 0.3456 - mae: 0.4085 Epoch 75/300 404/404 [==============================] - 0s 157us/step - loss: 0.3462 - mae: 0.4107 Epoch 76/300 404/404 [==============================] - 0s 120us/step - loss: 0.3450 - mae: 0.4069 Epoch 77/300 404/404 [==============================] - 0s 170us/step - loss: 0.3460 - mae: 0.4106 Epoch 78/300 404/404 [==============================] - 0s 149us/step - loss: 0.3452 - mae: 0.4070 Epoch 79/300 404/404 [==============================] - 0s 126us/step - loss: 0.3452 - mae: 0.4065 Epoch 80/300 404/404 [==============================] - 0s 85us/step - loss: 0.3451 - mae: 0.4104 Epoch 81/300 404/404 [==============================] - 0s 164us/step - loss: 0.3454 - mae: 0.4087 Epoch 82/300 404/404 [==============================] - 0s 113us/step - loss: 0.3453 - mae: 0.4077 Epoch 83/300 404/404 [==============================] - 0s 158us/step - loss: 0.3447 - mae: 0.4053 Epoch 84/300 404/404 [==============================] - 0s 155us/step - loss: 0.3443 - mae: 0.4057 Epoch 85/300 404/404 [==============================] - 0s 160us/step - loss: 0.3442 - mae: 0.4076 Epoch 86/300 404/404 [==============================] - 0s 221us/step - loss: 0.3451 - mae: 0.4076 Epoch 87/300 404/404 [==============================] - 0s 343us/step - loss: 0.3463 - mae: 0.4114 Epoch 88/300 404/404 [==============================] - 0s 218us/step - loss: 0.3442 - mae: 0.4031 Epoch 89/300 404/404 [==============================] - 0s 130us/step - loss: 0.3446 - mae: 0.4074 Epoch 90/300 404/404 [==============================] - 0s 210us/step - loss: 0.3449 - mae: 0.4073 Epoch 91/300 404/404 [==============================] - 0s 101us/step - loss: 0.3454 - mae: 0.4082 Epoch 92/300 404/404 [==============================] - 0s 187us/step - loss: 0.3435 - mae: 0.4025 Epoch 93/300 404/404 [==============================] - 0s 148us/step - loss: 0.3445 - mae: 0.4082 Epoch 94/300 404/404 [==============================] - 0s 155us/step - loss: 0.3430 - mae: 0.4022 Epoch 95/300 404/404 [==============================] - 0s 158us/step - loss: 0.3454 - mae: 0.4084 Epoch 96/300 404/404 [==============================] - 0s 191us/step - loss: 0.3449 - mae: 0.4074 Epoch 97/300 404/404 [==============================] - 0s 224us/step - loss: 0.3438 - mae: 0.4032 Epoch 98/300 404/404 [==============================] - 0s 122us/step - loss: 0.3455 - mae: 0.4057 Epoch 99/300 404/404 [==============================] - 0s 248us/step - loss: 0.3446 - mae: 0.4051 Epoch 100/300 404/404 [==============================] - 0s 223us/step - loss: 0.3438 - mae: 0.4053 Epoch 101/300 404/404 [==============================] - 0s 225us/step - loss: 0.3447 - mae: 0.4060 Epoch 102/300 404/404 [==============================] - 0s 255us/step - loss: 0.3439 - mae: 0.4032 Epoch 103/300 404/404 [==============================] - 0s 169us/step - loss: 0.3446 - mae: 0.4084 Epoch 104/300 404/404 [==============================] - 0s 196us/step - loss: 0.3450 - mae: 0.4062 Epoch 105/300 404/404 [==============================] - 0s 153us/step - loss: 0.3440 - mae: 0.4034 Epoch 106/300 404/404 [==============================] - 0s 64us/step - loss: 0.3442 - mae: 0.4056 Epoch 107/300 404/404 [==============================] - 0s 179us/step - loss: 0.3451 - mae: 0.4082 Epoch 108/300 404/404 [==============================] - 0s 163us/step - loss: 0.3438 - mae: 0.4055 Epoch 109/300 404/404 [==============================] - 0s 120us/step - loss: 0.3441 - mae: 0.4057 Epoch 110/300 404/404 [==============================] - 0s 221us/step - loss: 0.3448 - mae: 0.4066 Epoch 111/300 404/404 [==============================] - 0s 106us/step - loss: 0.3435 - mae: 0.4037 Epoch 112/300 404/404 [==============================] - 0s 103us/step - loss: 0.3428 - mae: 0.4026 Epoch 113/300 404/404 [==============================] - 0s 107us/step - loss: 0.3455 - mae: 0.4093 Epoch 114/300 404/404 [==============================] - 0s 204us/step - loss: 0.3444 - mae: 0.4060 Epoch 115/300 404/404 [==============================] - 0s 266us/step - loss: 0.3445 - mae: 0.4035 Epoch 116/300 404/404 [==============================] - 0s 223us/step - loss: 0.3436 - mae: 0.4035 Epoch 117/300 404/404 [==============================] - 0s 142us/step - loss: 0.3432 - mae: 0.4039 Epoch 118/300 404/404 [==============================] - 0s 277us/step - loss: 0.3447 - mae: 0.4073 Epoch 119/300 404/404 [==============================] - 0s 257us/step - loss: 0.3440 - mae: 0.4047 Epoch 120/300 404/404 [==============================] - 0s 203us/step - loss: 0.3432 - mae: 0.4035 Epoch 121/300 404/404 [==============================] - 0s 261us/step - loss: 0.3449 - mae: 0.4056 Epoch 122/300 404/404 [==============================] - 0s 161us/step - loss: 0.3430 - mae: 0.4031 Epoch 123/300 404/404 [==============================] - 0s 206us/step - loss: 0.3442 - mae: 0.4038 Epoch 124/300 404/404 [==============================] - 0s 151us/step - loss: 0.3433 - mae: 0.4024 Epoch 125/300 404/404 [==============================] - 0s 142us/step - loss: 0.3434 - mae: 0.4043 Epoch 126/300 404/404 [==============================] - 0s 116us/step - loss: 0.3438 - mae: 0.4036 Epoch 127/300 404/404 [==============================] - 0s 103us/step - loss: 0.3439 - mae: 0.4013 Epoch 128/300 404/404 [==============================] - 0s 218us/step - loss: 0.3440 - mae: 0.4071 Epoch 129/300 404/404 [==============================] - 0s 180us/step - loss: 0.3432 - mae: 0.4007 Epoch 130/300 404/404 [==============================] - 0s 122us/step - loss: 0.3442 - mae: 0.4040 Epoch 131/300 404/404 [==============================] - 0s 95us/step - loss: 0.3439 - mae: 0.4056 Epoch 132/300 404/404 [==============================] - 0s 91us/step - loss: 0.3439 - mae: 0.4049 Epoch 133/300 404/404 [==============================] - 0s 175us/step - loss: 0.3423 - mae: 0.3999 Epoch 134/300 404/404 [==============================] - 0s 125us/step - loss: 0.3445 - mae: 0.4068 Epoch 135/300 404/404 [==============================] - 0s 311us/step - loss: 0.3428 - mae: 0.4003 Epoch 136/300 404/404 [==============================] - 0s 177us/step - loss: 0.3446 - mae: 0.4046 Epoch 137/300 404/404 [==============================] - 0s 99us/step - loss: 0.3432 - mae: 0.4024 Epoch 138/300 404/404 [==============================] - 0s 74us/step - loss: 0.3443 - mae: 0.4050 Epoch 139/300 404/404 [==============================] - 0s 94us/step - loss: 0.3430 - mae: 0.4019 Epoch 140/300 404/404 [==============================] - 0s 123us/step - loss: 0.3442 - mae: 0.4045 Epoch 141/300 404/404 [==============================] - 0s 265us/step - loss: 0.3440 - mae: 0.4024 Epoch 142/300 404/404 [==============================] - 0s 246us/step - loss: 0.3428 - mae: 0.4014 Epoch 143/300 404/404 [==============================] - 0s 109us/step - loss: 0.3430 - mae: 0.4018 Epoch 144/300 404/404 [==============================] - 0s 129us/step - loss: 0.3444 - mae: 0.4061 Epoch 145/300 404/404 [==============================] - 0s 92us/step - loss: 0.3428 - mae: 0.4011 Epoch 146/300 404/404 [==============================] - 0s 80us/step - loss: 0.3426 - mae: 0.4000 Epoch 147/300 404/404 [==============================] - 0s 100us/step - loss: 0.3429 - mae: 0.4000 Epoch 148/300 404/404 [==============================] - 0s 119us/step - loss: 0.3445 - mae: 0.4033 Epoch 149/300 404/404 [==============================] - 0s 64us/step - loss: 0.3432 - mae: 0.4045 Epoch 150/300 404/404 [==============================] - 0s 83us/step - loss: 0.3430 - mae: 0.4018 Epoch 151/300 404/404 [==============================] - 0s 89us/step - loss: 0.3434 - mae: 0.4027 Epoch 152/300 404/404 [==============================] - 0s 88us/step - loss: 0.3433 - mae: 0.4020 Epoch 153/300 404/404 [==============================] - 0s 68us/step - loss: 0.3426 - mae: 0.3991 Epoch 154/300 404/404 [==============================] - 0s 93us/step - loss: 0.3437 - mae: 0.4046 Epoch 155/300 404/404 [==============================] - 0s 88us/step - loss: 0.3433 - mae: 0.4027 Epoch 156/300 404/404 [==============================] - 0s 219us/step - loss: 0.3433 - mae: 0.4023 Epoch 157/300 404/404 [==============================] - 0s 104us/step - loss: 0.3429 - mae: 0.4012 Epoch 158/300 404/404 [==============================] - 0s 117us/step - loss: 0.3438 - mae: 0.4029 Epoch 159/300 404/404 [==============================] - 0s 126us/step - loss: 0.3431 - mae: 0.4024 Epoch 160/300 404/404 [==============================] - 0s 89us/step - loss: 0.3436 - mae: 0.4018 Epoch 161/300 404/404 [==============================] - 0s 100us/step - loss: 0.3440 - mae: 0.4044 Epoch 162/300 404/404 [==============================] - 0s 181us/step - loss: 0.3428 - mae: 0.4013 Epoch 163/300 404/404 [==============================] - 0s 72us/step - loss: 0.3435 - mae: 0.4031 Epoch 164/300 404/404 [==============================] - 0s 76us/step - loss: 0.3426 - mae: 0.4001 Epoch 165/300 404/404 [==============================] - 0s 108us/step - loss: 0.3443 - mae: 0.4018 Epoch 166/300 404/404 [==============================] - 0s 77us/step - loss: 0.3425 - mae: 0.3993 Epoch 167/300 404/404 [==============================] - 0s 116us/step - loss: 0.3445 - mae: 0.4065 Epoch 168/300 404/404 [==============================] - 0s 109us/step - loss: 0.3427 - mae: 0.4014 Epoch 169/300 404/404 [==============================] - 0s 164us/step - loss: 0.3427 - mae: 0.3975 Epoch 170/300 404/404 [==============================] - 0s 208us/step - loss: 0.3428 - mae: 0.4019 Epoch 171/300 404/404 [==============================] - 0s 194us/step - loss: 0.3428 - mae: 0.4018 Epoch 172/300 404/404 [==============================] - 0s 122us/step - loss: 0.3434 - mae: 0.4012 Epoch 173/300 404/404 [==============================] - 0s 221us/step - loss: 0.3425 - mae: 0.4002 Epoch 174/300 404/404 [==============================] - 0s 98us/step - loss: 0.3443 - mae: 0.4053 Epoch 175/300 404/404 [==============================] - 0s 153us/step - loss: 0.3430 - mae: 0.3996 Epoch 176/300 404/404 [==============================] - 0s 106us/step - loss: 0.3435 - mae: 0.4020 Epoch 177/300 404/404 [==============================] - 0s 83us/step - loss: 0.3433 - mae: 0.4021 Epoch 178/300 404/404 [==============================] - 0s 158us/step - loss: 0.3434 - mae: 0.4018 Epoch 179/300 404/404 [==============================] - 0s 168us/step - loss: 0.3427 - mae: 0.4021 Epoch 180/300 404/404 [==============================] - 0s 189us/step - loss: 0.3435 - mae: 0.4024 Epoch 181/300 404/404 [==============================] - 0s 127us/step - loss: 0.3433 - mae: 0.4019 Epoch 182/300 404/404 [==============================] - 0s 180us/step - loss: 0.3431 - mae: 0.4014 Epoch 183/300 404/404 [==============================] - 0s 179us/step - loss: 0.3429 - mae: 0.3998 Epoch 184/300 404/404 [==============================] - 0s 62us/step - loss: 0.3430 - mae: 0.4006 Epoch 185/300 404/404 [==============================] - 0s 147us/step - loss: 0.3430 - mae: 0.4003 Epoch 186/300 404/404 [==============================] - 0s 86us/step - loss: 0.3428 - mae: 0.3997 Epoch 187/300 404/404 [==============================] - 0s 88us/step - loss: 0.3426 - mae: 0.3995 Epoch 188/300 404/404 [==============================] - 0s 185us/step - loss: 0.3436 - mae: 0.4005 Epoch 189/300 404/404 [==============================] - 0s 178us/step - loss: 0.3431 - mae: 0.4019 Epoch 190/300 404/404 [==============================] - 0s 108us/step - loss: 0.3419 - mae: 0.3982 Epoch 191/300 404/404 [==============================] - 0s 111us/step - loss: 0.3436 - mae: 0.4059 Epoch 192/300 404/404 [==============================] - 0s 117us/step - loss: 0.3431 - mae: 0.4012 Epoch 193/300 404/404 [==============================] - 0s 224us/step - loss: 0.3424 - mae: 0.3980 Epoch 194/300 404/404 [==============================] - 0s 117us/step - loss: 0.3430 - mae: 0.4022 Epoch 195/300 404/404 [==============================] - 0s 213us/step - loss: 0.3441 - mae: 0.4024 Epoch 196/300 404/404 [==============================] - 0s 135us/step - loss: 0.3419 - mae: 0.3966 Epoch 197/300 404/404 [==============================] - 0s 82us/step - loss: 0.3434 - mae: 0.4033 Epoch 198/300 404/404 [==============================] - 0s 129us/step - loss: 0.3423 - mae: 0.4000 Epoch 199/300 404/404 [==============================] - 0s 180us/step - loss: 0.3441 - mae: 0.4050 Epoch 200/300 404/404 [==============================] - 0s 88us/step - loss: 0.3428 - mae: 0.4001 Epoch 201/300 404/404 [==============================] - 0s 176us/step - loss: 0.3427 - mae: 0.4009 Epoch 202/300 404/404 [==============================] - 0s 103us/step - loss: 0.3422 - mae: 0.3985 Epoch 203/300 404/404 [==============================] - 0s 219us/step - loss: 0.3429 - mae: 0.4008 Epoch 204/300 404/404 [==============================] - 0s 82us/step - loss: 0.3419 - mae: 0.3975 Epoch 205/300 404/404 [==============================] - 0s 62us/step - loss: 0.3439 - mae: 0.4042 Epoch 206/300 404/404 [==============================] - 0s 117us/step - loss: 0.3418 - mae: 0.3973 Epoch 207/300 404/404 [==============================] - 0s 91us/step - loss: 0.3432 - mae: 0.4021 Epoch 208/300 404/404 [==============================] - 0s 60us/step - loss: 0.3436 - mae: 0.4039 Epoch 209/300 404/404 [==============================] - 0s 170us/step - loss: 0.3423 - mae: 0.3976 Epoch 210/300 404/404 [==============================] - 0s 84us/step - loss: 0.3429 - mae: 0.4025 Epoch 211/300 404/404 [==============================] - 0s 105us/step - loss: 0.3425 - mae: 0.4000 Epoch 212/300 404/404 [==============================] - 0s 88us/step - loss: 0.3429 - mae: 0.3990 Epoch 213/300 404/404 [==============================] - 0s 78us/step - loss: 0.3419 - mae: 0.3979 Epoch 214/300 404/404 [==============================] - 0s 107us/step - loss: 0.3431 - mae: 0.4028 Epoch 215/300 404/404 [==============================] - 0s 81us/step - loss: 0.3430 - mae: 0.3996 Epoch 216/300 404/404 [==============================] - 0s 88us/step - loss: 0.3425 - mae: 0.3988 Epoch 217/300 404/404 [==============================] - 0s 67us/step - loss: 0.3420 - mae: 0.3992 Epoch 218/300 404/404 [==============================] - 0s 87us/step - loss: 0.3425 - mae: 0.4016 Epoch 219/300 404/404 [==============================] - 0s 79us/step - loss: 0.3436 - mae: 0.4004 Epoch 220/300 404/404 [==============================] - 0s 123us/step - loss: 0.3418 - mae: 0.3983 Epoch 221/300 404/404 [==============================] - 0s 125us/step - loss: 0.3442 - mae: 0.4021 Epoch 222/300 404/404 [==============================] - 0s 62us/step - loss: 0.3423 - mae: 0.4005 Epoch 223/300 404/404 [==============================] - 0s 82us/step - loss: 0.3415 - mae: 0.3970 Epoch 224/300 404/404 [==============================] - 0s 78us/step - loss: 0.3432 - mae: 0.4030 Epoch 225/300 404/404 [==============================] - 0s 98us/step - loss: 0.3430 - mae: 0.4006 Epoch 226/300 404/404 [==============================] - 0s 111us/step - loss: 0.3429 - mae: 0.4007 Epoch 227/300 404/404 [==============================] - 0s 120us/step - loss: 0.3424 - mae: 0.4007 Epoch 228/300 404/404 [==============================] - 0s 72us/step - loss: 0.3426 - mae: 0.3984 Epoch 229/300 404/404 [==============================] - 0s 117us/step - loss: 0.3419 - mae: 0.3979 Epoch 230/300 404/404 [==============================] - 0s 98us/step - loss: 0.3436 - mae: 0.4034 Epoch 231/300 404/404 [==============================] - 0s 63us/step - loss: 0.3422 - mae: 0.3994 Epoch 232/300 404/404 [==============================] - 0s 107us/step - loss: 0.3430 - mae: 0.4020 Epoch 233/300 404/404 [==============================] - 0s 188us/step - loss: 0.3425 - mae: 0.3997 Epoch 234/300 404/404 [==============================] - 0s 100us/step - loss: 0.3426 - mae: 0.4005 Epoch 235/300 404/404 [==============================] - 0s 117us/step - loss: 0.3426 - mae: 0.3995 Epoch 236/300 404/404 [==============================] - 0s 175us/step - loss: 0.3421 - mae: 0.3980 Epoch 237/300 404/404 [==============================] - 0s 104us/step - loss: 0.3425 - mae: 0.4006 Epoch 238/300 404/404 [==============================] - 0s 71us/step - loss: 0.3423 - mae: 0.3982 Epoch 239/300 404/404 [==============================] - 0s 51us/step - loss: 0.3419 - mae: 0.4000 Epoch 240/300 404/404 [==============================] - 0s 74us/step - loss: 0.3440 - mae: 0.4016 Epoch 241/300 404/404 [==============================] - 0s 82us/step - loss: 0.3418 - mae: 0.3980 Epoch 242/300 404/404 [==============================] - 0s 122us/step - loss: 0.3416 - mae: 0.3971 Epoch 243/300 404/404 [==============================] - 0s 156us/step - loss: 0.3424 - mae: 0.3987 Epoch 244/300 404/404 [==============================] - 0s 58us/step - loss: 0.3424 - mae: 0.4002 Epoch 245/300 404/404 [==============================] - 0s 75us/step - loss: 0.3412 - mae: 0.3954 Epoch 246/300 404/404 [==============================] - 0s 80us/step - loss: 0.3421 - mae: 0.3997 Epoch 247/300 404/404 [==============================] - 0s 124us/step - loss: 0.3434 - mae: 0.4032 Epoch 248/300 404/404 [==============================] - 0s 109us/step - loss: 0.3424 - mae: 0.3991 Epoch 249/300 404/404 [==============================] - 0s 90us/step - loss: 0.3412 - mae: 0.3958 Epoch 250/300 404/404 [==============================] - 0s 77us/step - loss: 0.3427 - mae: 0.3999 Epoch 251/300 404/404 [==============================] - 0s 70us/step - loss: 0.3426 - mae: 0.4001 Epoch 252/300 404/404 [==============================] - 0s 80us/step - loss: 0.3418 - mae: 0.3980 Epoch 253/300 404/404 [==============================] - 0s 82us/step - loss: 0.3417 - mae: 0.3987 Epoch 254/300 404/404 [==============================] - 0s 207us/step - loss: 0.3431 - mae: 0.4014 Epoch 255/300 404/404 [==============================] - 0s 132us/step - loss: 0.3422 - mae: 0.3978 Epoch 256/300 404/404 [==============================] - 0s 128us/step - loss: 0.3416 - mae: 0.3966 Epoch 257/300 404/404 [==============================] - 0s 192us/step - loss: 0.3424 - mae: 0.3968 Epoch 258/300 404/404 [==============================] - 0s 99us/step - loss: 0.3418 - mae: 0.3971 Epoch 259/300 404/404 [==============================] - 0s 93us/step - loss: 0.3423 - mae: 0.3977 Epoch 260/300 404/404 [==============================] - 0s 84us/step - loss: 0.3422 - mae: 0.3994 Epoch 261/300 404/404 [==============================] - 0s 195us/step - loss: 0.3419 - mae: 0.3976 Epoch 262/300 404/404 [==============================] - 0s 121us/step - loss: 0.3413 - mae: 0.3943 Epoch 263/300 404/404 [==============================] - 0s 107us/step - loss: 0.3432 - mae: 0.4019 Epoch 264/300 404/404 [==============================] - 0s 56us/step - loss: 0.3414 - mae: 0.3955 Epoch 265/300 404/404 [==============================] - 0s 62us/step - loss: 0.3429 - mae: 0.3983 Epoch 266/300 404/404 [==============================] - 0s 87us/step - loss: 0.3422 - mae: 0.3986 Epoch 267/300 404/404 [==============================] - 0s 212us/step - loss: 0.3414 - mae: 0.3953 Epoch 268/300 404/404 [==============================] - 0s 176us/step - loss: 0.3423 - mae: 0.3989 Epoch 269/300 404/404 [==============================] - 0s 97us/step - loss: 0.3426 - mae: 0.3987 Epoch 270/300 404/404 [==============================] - 0s 175us/step - loss: 0.3409 - mae: 0.3932 Epoch 271/300 404/404 [==============================] - 0s 78us/step - loss: 0.3427 - mae: 0.4008 Epoch 272/300 404/404 [==============================] - 0s 74us/step - loss: 0.3423 - mae: 0.3999 Epoch 273/300 404/404 [==============================] - 0s 72us/step - loss: 0.3413 - mae: 0.3934 Epoch 274/300 404/404 [==============================] - 0s 111us/step - loss: 0.3429 - mae: 0.4001 Epoch 275/300 404/404 [==============================] - 0s 99us/step - loss: 0.3418 - mae: 0.3981 Epoch 276/300 404/404 [==============================] - 0s 138us/step - loss: 0.3424 - mae: 0.3986 Epoch 277/300 404/404 [==============================] - 0s 92us/step - loss: 0.3420 - mae: 0.3996 Epoch 278/300 404/404 [==============================] - 0s 99us/step - loss: 0.3430 - mae: 0.3979 Epoch 279/300 404/404 [==============================] - 0s 82us/step - loss: 0.3419 - mae: 0.3960 Epoch 280/300 404/404 [==============================] - 0s 96us/step - loss: 0.3416 - mae: 0.3967 Epoch 281/300 404/404 [==============================] - 0s 79us/step - loss: 0.3425 - mae: 0.3987 Epoch 282/300 404/404 [==============================] - 0s 89us/step - loss: 0.3420 - mae: 0.3963 Epoch 283/300 404/404 [==============================] - 0s 65us/step - loss: 0.3412 - mae: 0.3942 Epoch 284/300 404/404 [==============================] - 0s 78us/step - loss: 0.3416 - mae: 0.3955 Epoch 285/300 404/404 [==============================] - 0s 84us/step - loss: 0.3421 - mae: 0.3962 Epoch 286/300 404/404 [==============================] - 0s 88us/step - loss: 0.3418 - mae: 0.3975 Epoch 287/300 404/404 [==============================] - 0s 64us/step - loss: 0.3411 - mae: 0.3958 Epoch 288/300 404/404 [==============================] - 0s 105us/step - loss: 0.3429 - mae: 0.3992 Epoch 289/300 404/404 [==============================] - 0s 93us/step - loss: 0.3419 - mae: 0.3973 Epoch 290/300 404/404 [==============================] - 0s 129us/step - loss: 0.3409 - mae: 0.3931 Epoch 291/300 404/404 [==============================] - 0s 87us/step - loss: 0.3419 - mae: 0.3976 Epoch 292/300 404/404 [==============================] - 0s 128us/step - loss: 0.3416 - mae: 0.3982 Epoch 293/300 404/404 [==============================] - 0s 99us/step - loss: 0.3417 - mae: 0.3967 Epoch 294/300 404/404 [==============================] - 0s 93us/step - loss: 0.3413 - mae: 0.3946 Epoch 295/300 404/404 [==============================] - 0s 157us/step - loss: 0.3419 - mae: 0.3972 Epoch 296/300 404/404 [==============================] - 0s 113us/step - loss: 0.3426 - mae: 0.3958 Epoch 297/300 404/404 [==============================] - 0s 69us/step - loss: 0.3412 - mae: 0.3954 Epoch 298/300 404/404 [==============================] - 0s 75us/step - loss: 0.3419 - mae: 0.3972 Epoch 299/300 404/404 [==============================] - 0s 44us/step - loss: 0.3406 - mae: 0.3922 Epoch 300/300 404/404 [==============================] - 0s 54us/step - loss: 0.3423 - mae: 0.4000 102/102 [==============================] - 0s 309us/step score: [0.44682052264026567, 0.4667647182941437] r2 score: 0.45171405496274775
本段代码是一个基于Keras构建神经网络回归模型的示例。该模型使用了Sequential模型,依次添加了三个Dense层,激活函数分别为ReLU和线性函数。其中第一层的输入维度使用了lb.data.shape[1]来获取数据的特征数,以此来指定了输入数据的形状。
在模型的编译过程中,使用了rmsprop优化器,损失函数为MSE,评估指标为MAE。训练过程中使用了X_train和y_train训练集,并设置了epochs为300,batch_size为16,shuffle为False。最后通过evaluate方法计算了模型在测试集上的得分,并使用r2_score计算了模型的R2指标。
此外,这段代码还使用了数据标准化的预处理方法,使得数据的分布更加符合标准正态分布,有助于提高模型的准确性。
总的来说,这段代码是一个简单但完整的神经网络模型,可以用于解决回归问题并作为其他问题的基础模型进行修改和调试。同时,使用Keras编写神经网络模型也相对容易上手,方便大家进行学习和实践。