Mon Apr 29 15:11:41 2019  hourly_wages
python version: 3.6.8
keras version:  2.2.4

Neural network for a multivariate regression problem. The data is read from an
external file and contains 534 records with 10 columns: 9 features plus the
hourly-wage target. The training set uses 450 of the records and the test set
uses the remaining 84, each with the 9 features.

train_data[0, 0:9]: [ 0.  8. 21. 35.  1.  1.  0.  1.  0.]
train_targets[0]:   5.1
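The script that produced this log is not included here. Below is a minimal sketch of the kind of Keras model and fit() call that could generate the training output that follows. The layer sizes, activations, optimizer, mean-squared-error loss, batch size of 32 (the Keras default), early-stopping patience, and the randomly generated stand-in arrays are all assumptions, chosen only to be consistent with the 9 input features, the 360/90 train/validation split of the 450 training records, and the run ending after 24 of the 30 requested epochs.

    # Minimal sketch, not the original script: a small Keras regression model
    # consistent with the log below.  Layer sizes, optimizer, loss and the
    # early-stopping patience are guesses; the data arrays are random stand-ins
    # for the 450 training records read from the external file.
    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.callbacks import EarlyStopping

    n_features = 9                                    # 9 features per record

    train_data = np.random.rand(450, n_features)      # stand-in training features
    train_targets = 5.0 + 20.0 * np.random.rand(450)  # stand-in hourly wages

    model = Sequential()
    model.add(Dense(32, activation='relu', input_shape=(n_features,)))
    model.add(Dense(32, activation='relu'))
    model.add(Dense(1))                               # single linear output (regression)

    model.compile(optimizer='adam', loss='mse')       # the losses below look like MSE

    # validation_split=0.2 reproduces the 360/90 split; an EarlyStopping callback
    # with patience=3 would explain the run stopping after epoch 24 of 30.
    early_stop = EarlyStopping(monitor='val_loss', patience=3)
    model.fit(train_data, train_targets,
              epochs=30, batch_size=32,
              validation_split=0.2,
              callbacks=[early_stop], verbose=1)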
Training:
Train on 360 samples, validate on 90 samples
Epoch  1/30 - 0s 526us/step - loss: 33.9855 - val_loss: 43.1946
Epoch  2/30 - 0s  24us/step - loss: 30.1776 - val_loss: 38.5688
Epoch  3/30 - 0s  22us/step - loss: 27.4063 - val_loss: 31.7084
Epoch  4/30 - 0s  22us/step - loss: 24.2516 - val_loss: 29.0803
Epoch  5/30 - 0s  22us/step - loss: 22.8150 - val_loss: 26.5093
Epoch  6/30 - 0s  21us/step - loss: 21.8046 - val_loss: 24.8767
Epoch  7/30 - 0s  22us/step - loss: 21.5499 - val_loss: 24.7906
Epoch  8/30 - 0s  22us/step - loss: 21.1386 - val_loss: 24.0752
Epoch  9/30 - 0s  22us/step - loss: 20.7820 - val_loss: 23.6670
Epoch 10/30 - 0s  22us/step - loss: 20.5280 - val_loss: 23.5214
Epoch 11/30 - 0s  23us/step - loss: 20.3169 - val_loss: 23.7110
Epoch 12/30 - 0s  23us/step - loss: 20.1277 - val_loss: 23.5924
Epoch 13/30 - 0s  22us/step - loss: 19.8265 - val_loss: 22.8444
Epoch 14/30 - 0s  22us/step - loss: 19.7716 - val_loss: 23.1190
Epoch 15/30 - 0s  23us/step - loss: 19.6957 - val_loss: 23.0540
Epoch 16/30 - 0s  22us/step - loss: 19.6217 - val_loss: 22.5916
Epoch 17/30 - 0s  22us/step - loss: 19.5573 - val_loss: 23.1809
Epoch 18/30 - 0s  23us/step - loss: 19.4848 - val_loss: 22.2367
Epoch 19/30 - 0s  23us/step - loss: 19.4093 - val_loss: 22.5820
Epoch 20/30 - 0s  23us/step - loss: 19.4535 - val_loss: 22.5006
Epoch 21/30 - 0s  23us/step - loss: 19.3585 - val_loss: 21.9781
Epoch 22/30 - 0s  23us/step - loss: 19.3029 - val_loss: 22.2624
Epoch 23/30 - 0s  22us/step - loss: 19.2317 - val_loss: 22.4516
Epoch 24/30 - 0s  22us/step - loss: 19.1656 - val_loss: 22.4065

Testing (index: predicted wage  actual wage):
0: 8.613173 10.280000
1: 10.327003 15.000000
2: 12.371456 12.000000
3: 9.502140 10.580000
4: 10.768035 5.850000
5: 12.993261 11.220000
6: 9.719757 8.560000
7: 10.818596 13.890000
8: 9.499022 5.710000
9: 9.899394 15.790000
10: 11.727979 7.500000
11: 8.891652 11.250000
12: 10.935744 6.150000
13: 11.074924 13.450000
14: 8.605358 6.250000
15: 8.276031 6.500000
16: 8.255935 12.000000
17: 8.997073 8.500000
18: 13.459388 8.000000
19: 8.538110 5.750000
20: 8.192931 15.730000
21: 10.410794 9.860000
22: 11.661864 13.510000
23: 9.378822 5.400000
24: 11.283679 6.250000
25: 11.695477 5.500000
26: 7.074957 5.000000
27: 13.641491 6.250000
28: 6.797038 5.750000
29: 10.551037 20.500000
30: 6.773502 5.000000
31: 13.478882 7.000000
32: 11.451565 18.000000
33: 12.089472 12.000000
34: 8.300384 20.400000
35: 12.392324 22.200000
36: 10.553313 16.420000
37: 7.161630 8.630000
38: 9.636646 19.380000
39: 10.925756 14.000000
40: 11.079876 10.000000
41: 10.475279 15.950000
42: 11.245818 20.000000
43: 9.378822 10.000000
44: 7.689336 24.980000
45: 8.368313 11.250000
46: 12.085558 22.830000
47: 12.517554 10.200000
48: 7.411417 10.000000
49: 12.836534 14.000000
50: 10.455183 12.500000
51: 9.533056 5.790000
52: 12.390264 24.980000
53: 6.193356 4.350000
54: 9.744098 11.250000
55: 9.356634 6.670000
56: 11.183578 8.000000
57: 9.899394 18.160000
58: 10.410112 12.000000
59: 11.836884 8.890000
60: 10.755291 9.500000
61: 10.240329 13.650000
62: 13.581266 12.000000
63: 9.794769 15.000000
64: 11.271618 12.670000
65: 9.933876 7.380000
66: 10.002602 15.560000
67: 8.209909 7.450000
68: 9.204113 6.250000
69: 8.184885 6.250000
70: 9.511932 9.370000
71: 11.544620 22.500000
72: 9.940220 7.500000
73: 7.844634 7.000000
74: 7.415190 5.750000
75: 9.291468 7.670000
76: 9.657424 12.500000
77: 8.696967 16.000000
78: 8.325346 11.790000
79: 9.403848 11.360000
80: 8.798800 6.100000
81: 12.544653 23.250000
82: 8.912503 19.880000
83: 11.742251 15.380000
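The listing above pairs each of the 84 test records with the network's prediction (first number) and the recorded hourly wage (second number). A sketch of how such a listing and a summary error could be produced, continuing the hypothetical model and variable names from the sketch above (test_data and test_targets are again random stand-ins, since the real file is not shown):

    # Continuation of the sketch above (reuses `model` and `n_features`); the
    # test arrays are random stand-ins for the 84 real test records.
    test_data = np.random.rand(84, n_features)
    test_targets = 5.0 + 20.0 * np.random.rand(84)

    predictions = model.predict(test_data).flatten()

    # Same "index: predicted actual" format as the listing above.
    for i, (pred, actual) in enumerate(zip(predictions, test_targets)):
        print('%d: %f %f' % (i, pred, actual))

    # Summary error metrics over the test set.
    mae = np.mean(np.abs(predictions - test_targets))
    rmse = np.sqrt(np.mean((predictions - test_targets) ** 2))
    print('test MAE:  %.3f  test RMSE: %.3f' % (mae, rmse))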