Experimental data: cnnlstm_mape

2006-03-04

pred:

array([7958.4155, 7751.268 , 7527.7954, 7201.982 , 6891.795 , 6662.3774,
       6567.015 , 6512.1772, 6521.0806, 6697.8184, 6918.5425, 7376.8047,
       7873.748 , 8257.072 , 8664.629 , 8909.608 , 9099.593 , 9343.218 ,
       9468.084 , 9583.169 , 9606.54  , 9620.329 , 9640.951 , 9613.612 ,
       9600.95  , 9607.041 , 9601.413 , 9596.466 , 9567.325 , 9542.329 ,
       9573.12  , 9588.498 , 9589.878 , 9513.756 , 9474.52  , 9308.211 ,
       9253.097 , 9176.314 , 9170.35  , 9147.754 , 9026.866 , 8784.027 ,
       8507.148 , 8512.837 , 8331.485 , 8332.714 , 8179.672 , 8026.828 ],
      dtype=float32)

true:

array([7763.64667, 7520.06833, 7242.73833, 6931.72833, 6589.28167,
       6371.48167, 6227.40667, 6159.62333, 6142.65167, 6245.355  ,
       6329.72167, 6571.29833, 6780.075  , 7031.895  , 7411.87167,
       7845.47333, 8172.615  , 8561.64667, 8656.84333, 8791.97833,
       8778.925  , 8757.92833, 8746.56167, 8688.47   , 8686.71167,
       8624.335  , 8611.68333, 8566.765  , 8546.89333, 8572.79   ,
       8569.62333, 8589.01833, 8617.93   , 8454.73667, 8506.32167,
       8423.125  , 8387.81833, 8230.435  , 8371.435  , 8415.43167,
       8304.89667, 8093.65667, 7916.065  , 7885.44833, 7734.59333,
       7808.35333, 7578.37667, 7347.17667])
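
The file name refers to MAPE, but the metric itself is never computed anywhere in the log. For reference, a minimal NumPy sketch of how MAPE would be obtained from the pred/true pair above; the arrays are abbreviated to their first three of 48 points.

import numpy as np

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent.
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# Abbreviated copies of the arrays above (first 3 of 48 points).
pred = [7958.4155, 7751.268, 7527.7954]
true = [7763.64667, 7520.06833, 7242.73833]
print(f"MAPE = {mape(true, pred):.2f}%")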

blue: forecast

[Figure 1]

[Figure 2]
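
The two figures referenced above are not preserved in this dump; judging from the note, they compare the forecast (drawn in blue) with the true series. A minimal matplotlib sketch of such a comparison plot, again using abbreviated copies of the arrays above:

import numpy as np
import matplotlib.pyplot as plt

# First 5 of the 48 points from the pred/true arrays above.
pred = np.array([7958.4155, 7751.268, 7527.7954, 7201.982, 6891.795])
true = np.array([7763.64667, 7520.06833, 7242.73833, 6931.72833, 6589.28167])

plt.plot(true, label="true")
plt.plot(pred, label="forecast", color="blue")  # blue curve = forecast, per the note above
plt.xlabel("time step")
plt.ylabel("value")
plt.legend()
plt.show()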

2010-03-04

Training samples: 331
Model: "model_5"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_6 (InputLayer)         [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_9 (Conv1D)            (None, 48, 128)           512       
_________________________________________________________________
conv1d_10 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_4 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_13 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_14 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_15 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_5 (Dense)              (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
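
The summary above can be rebuilt as follows. This is a sketch, not the original code: the kernel size of 1 for both Conv1D layers and the pool size of 48 are inferred from the printed parameter counts and output shapes, and the ReLU activations are assumptions.

from tensorflow.keras import layers, models

def build_cnn_lstm(n_steps=48, n_features=3):
    inp = layers.Input(shape=(n_steps, n_features))                 # (None, 48, 3)
    x = layers.Conv1D(128, kernel_size=1, activation="relu")(inp)   # (None, 48, 128), 512 params
    x = layers.Conv1D(128, kernel_size=1, activation="relu")(x)     # (None, 48, 128), 16,512 params
    x = layers.MaxPooling1D(pool_size=n_steps)(x)                   # (None, 1, 128)
    x = layers.LSTM(128, return_sequences=True)(x)                  # (None, 1, 128), 131,584 params
    x = layers.LSTM(64, return_sequences=True)(x)                   # (None, 1, 64), 49,408 params
    x = layers.LSTM(48)(x)                                          # (None, 48), 21,696 params
    out = layers.Dense(48)(x)                                       # (None, 48), 2,352 params
    return models.Model(inp, out)

model = build_cnn_lstm()
model.summary()  # should reproduce the 222,064-parameter summary above
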
Train on 331 samples, validate on 143 samples
Epoch 1/30
331/331 [==============================] - 3s 10ms/sample - loss: 0.1349 - accuracy: 0.0997 - val_loss: 0.1059 - val_accuracy: 0.1189
Epoch 2/30
331/331 [==============================] - 0s 275us/sample - loss: 0.0722 - accuracy: 0.1178 - val_loss: 0.0305 - val_accuracy: 0.1189
Epoch 3/30
331/331 [==============================] - 0s 212us/sample - loss: 0.0206 - accuracy: 0.1178 - val_loss: 0.0081 - val_accuracy: 0.1189
Epoch 4/30
331/331 [==============================] - 0s 205us/sample - loss: 0.0068 - accuracy: 0.0544 - val_loss: 0.0051 - val_accuracy: 0.1538
Epoch 5/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0051 - accuracy: 0.1027 - val_loss: 0.0047 - val_accuracy: 0.0490
Epoch 6/30
331/331 [==============================] - 0s 196us/sample - loss: 0.0048 - accuracy: 0.0272 - val_loss: 0.0044 - val_accuracy: 0.0350
Epoch 7/30
331/331 [==============================] - 0s 199us/sample - loss: 0.0046 - accuracy: 0.0151 - val_loss: 0.0043 - val_accuracy: 0.0140
Epoch 8/30
331/331 [==============================] - 0s 196us/sample - loss: 0.0045 - accuracy: 0.0423 - val_loss: 0.0043 - val_accuracy: 0.1189
Epoch 9/30
331/331 [==============================] - 0s 190us/sample - loss: 0.0045 - accuracy: 0.1057 - val_loss: 0.0043 - val_accuracy: 0.1189
Epoch 10/30
331/331 [==============================] - 0s 212us/sample - loss: 0.0045 - accuracy: 0.0665 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 11/30
331/331 [==============================] - 0s 221us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 12/30
331/331 [==============================] - 0s 218us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 13/30
331/331 [==============================] - 0s 209us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 14/30
331/331 [==============================] - 0s 205us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 15/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 16/30
331/331 [==============================] - 0s 209us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 17/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 18/30
331/331 [==============================] - 0s 199us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 19/30
331/331 [==============================] - 0s 205us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 20/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 21/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 22/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 23/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 24/30
331/331 [==============================] - 0s 202us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 25/30
331/331 [==============================] - 0s 212us/sample - loss: 0.0045 - accuracy: 0.0785 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 26/30
331/331 [==============================] - 0s 205us/sample - loss: 0.0045 - accuracy: 0.0574 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 27/30
331/331 [==============================] - 0s 205us/sample - loss: 0.0045 - accuracy: 0.0574 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 28/30
331/331 [==============================] - 0s 199us/sample - loss: 0.0045 - accuracy: 0.0574 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 29/30
331/331 [==============================] - 0s 199us/sample - loss: 0.0045 - accuracy: 0.0574 - val_loss: 0.0043 - val_accuracy: 0.0699
Epoch 30/30
331/331 [==============================] - 0s 187us/sample - loss: 0.0045 - accuracy: 0.0574 - val_loss: 0.0043 - val_accuracy: 0.0699
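
The compile/fit call is not shown in the log. Below is a hedged sketch of how a run like the one above could be launched, reusing build_cnn_lstm from the sketch earlier and placeholder data of the right shapes. The MSE loss and Adam optimizer are assumptions (only epochs=30 and the 331/143 train/validation split come from the log), and the accuracy column is the stock Keras classification metric, which is not meaningful for a regression target.

import numpy as np

model = build_cnn_lstm()  # architecture sketch from above

# Placeholder scaled data with the shapes used in this run:
# 331 training and 143 validation windows, 48 steps x 3 features -> 48 targets.
X_train, y_train = np.random.rand(331, 48, 3), np.random.rand(331, 48)
X_val,   y_val   = np.random.rand(143, 48, 3), np.random.rand(143, 48)

# Loss and optimizer are assumptions; 'accuracy' is kept only because it
# appears in the log output, not because it is meaningful here.
model.compile(optimizer="adam", loss="mse", metrics=["accuracy"])
model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=30, batch_size=32)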

[Figure 3]

pred
[8010.635  7809.1787 7599.0825 7302.1855 6991.1035 6733.2344 6591.7183
 6527.3687 6531.6743 6670.868  6870.7544 7306.865  7780.8086 8179.305
 8601.343  8851.851  9067.619  9320.703  9440.045  9570.206  9602.607
 9615.256  9646.539  9607.961  9620.904  9622.764  9601.794  9606.234
 9587.057  9560.2705 9572.083  9583.597  9581.549  9531.673  9499.453
 9366.619  9340.422  9293.003  9290.076  9223.878  9086.14   8861.275
 8592.88   8551.889  8357.849  8352.641  8208.616  8078.3433]
true
[ 7723.29  7567.75  7305.22  7004.35  6772.37  6579.59  6498.34  6493.7
  6575.3   6798.57  7057.49  7710.26  8257.28  8719.72  9137.33  9268.66
  9383.15  9618.23  9762.4   9840.94  9900.17  9917.14  9956.33  9918.44
  9973.25 10046.95 10062.29 10109.92 10086.19 10085.09 10105.69 10110.17
 10110.66  9997.81  9930.54  9655.64  9558.58  9484.46  9598.29  9603.49
  9384.25  9011.44  8689.19  8711.33  8509.65  8401.14  8252.09  8051.83]

2010-04-04

Training samples: 100
Model: "model_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_7 (InputLayer)         [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_11 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_12 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_5 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_16 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_17 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_18 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_6 (Dense)              (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 100 samples, validate on 43 samples
Epoch 1/30
100/100 [==============================] - 4s 36ms/sample - loss: 0.1144 - accuracy: 0.0100 - val_loss: 0.0954 - val_accuracy: 0.0698
Epoch 2/30
100/100 [==============================] - 0s 440us/sample - loss: 0.1058 - accuracy: 0.0200 - val_loss: 0.0833 - val_accuracy: 0.0698
Epoch 3/30
100/100 [==============================] - 0s 450us/sample - loss: 0.0906 - accuracy: 0.0200 - val_loss: 0.0631 - val_accuracy: 0.0698
Epoch 4/30
100/100 [==============================] - 0s 420us/sample - loss: 0.0664 - accuracy: 0.0200 - val_loss: 0.0397 - val_accuracy: 0.0698
Epoch 5/30
100/100 [==============================] - 0s 350us/sample - loss: 0.0419 - accuracy: 0.0200 - val_loss: 0.0289 - val_accuracy: 0.0698
Epoch 6/30
100/100 [==============================] - 0s 310us/sample - loss: 0.0301 - accuracy: 0.0200 - val_loss: 0.0222 - val_accuracy: 0.0233
Epoch 7/30
100/100 [==============================] - 0s 330us/sample - loss: 0.0209 - accuracy: 0.0000e+00 - val_loss: 0.0128 - val_accuracy: 0.0233
Epoch 8/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0133 - accuracy: 0.2000 - val_loss: 0.0085 - val_accuracy: 0.1628
Epoch 9/30
100/100 [==============================] - 0s 320us/sample - loss: 0.0104 - accuracy: 0.2700 - val_loss: 0.0070 - val_accuracy: 0.1628
Epoch 10/30
100/100 [==============================] - 0s 310us/sample - loss: 0.0088 - accuracy: 0.2100 - val_loss: 0.0067 - val_accuracy: 0.0465
Epoch 11/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0079 - accuracy: 0.0700 - val_loss: 0.0070 - val_accuracy: 0.0233
Epoch 12/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0078 - accuracy: 0.0400 - val_loss: 0.0068 - val_accuracy: 0.0233
Epoch 13/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0077 - accuracy: 0.0400 - val_loss: 0.0065 - val_accuracy: 0.0000e+00
Epoch 14/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0077 - accuracy: 0.0200 - val_loss: 0.0063 - val_accuracy: 0.0000e+00
Epoch 15/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0076 - accuracy: 0.0300 - val_loss: 0.0063 - val_accuracy: 0.1860
Epoch 16/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0075 - accuracy: 0.1700 - val_loss: 0.0063 - val_accuracy: 0.1628
Epoch 17/30
100/100 [==============================] - 0s 310us/sample - loss: 0.0074 - accuracy: 0.2700 - val_loss: 0.0062 - val_accuracy: 0.1628
Epoch 18/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0073 - accuracy: 0.1600 - val_loss: 0.0061 - val_accuracy: 0.1860
Epoch 19/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0073 - accuracy: 0.2300 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 20/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 21/30
100/100 [==============================] - 0s 310us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0061 - val_accuracy: 0.1628
Epoch 22/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0061 - val_accuracy: 0.1628
Epoch 23/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 24/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 25/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 26/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0061 - val_accuracy: 0.1628
Epoch 27/30
100/100 [==============================] - 0s 310us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 28/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 29/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 30/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0073 - accuracy: 0.2700 - val_loss: 0.0061 - val_accuracy: 0.1628

[Figure 4]

pred
[7349.497  7179.998  7020.6714 6816.061  6613.731  6393.7188 6249.8213
 6175.4307 6158.613  6246.8555 6362.7847 6669.811  6999.348  7368.3564
 7751.2993 8045.674  8360.4    8637.023  8807.22   8977.349  9048.192
 9079.813  9111.7295 9080.084  9068.504  9079.433  9045.489  9030.003
 8980.91   8978.036  8964.549  8970.938  8978.405  8977.747  9007.412
 8994.505  9147.034  9234.686  9187.403  9044.471  8913.489  8701.031
 8471.27   8348.808  8121.     8157.4746 8010.4307 7913.78  ]
true
[7044.52 6840.32 6683.7  6475.78 6243.97 6047.35 5957.95 5921.97 5890.06
 5893.66 5925.86 6033.7  6097.41 6290.21 6567.28 6901.81 7219.55 7515.17
 7723.48 7855.98 7920.14 7902.77 7929.26 7823.64 7844.46 7776.33 7739.9
 7615.65 7533.99 7493.34 7416.12 7454.71 7473.31 7505.29 7565.4  7641.99
 7884.96 8240.28 8210.25 7974.3  7811.36 7668.71 7614.29 7492.89 7400.21
 7369.07 7394.38 7306.22]

2010-12-02

Training samples: 256
Model: "model_7"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_8 (InputLayer)         [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_13 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_14 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_6 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_19 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_20 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_21 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_7 (Dense)              (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 256 samples, validate on 110 samples
Epoch 1/30
256/256 [==============================] - 4s 14ms/sample - loss: 0.1284 - accuracy: 0.0078 - val_loss: 0.1102 - val_accuracy: 0.0091
Epoch 2/30
256/256 [==============================] - 0s 242us/sample - loss: 0.0958 - accuracy: 0.0078 - val_loss: 0.0600 - val_accuracy: 0.0091
Epoch 3/30
256/256 [==============================] - 0s 231us/sample - loss: 0.0428 - accuracy: 0.0078 - val_loss: 0.0253 - val_accuracy: 0.0091
Epoch 4/30
256/256 [==============================] - 0s 203us/sample - loss: 0.0182 - accuracy: 0.0078 - val_loss: 0.0091 - val_accuracy: 0.0091
Epoch 5/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0072 - accuracy: 0.0078 - val_loss: 0.0048 - val_accuracy: 0.0182
Epoch 6/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0052 - accuracy: 0.0195 - val_loss: 0.0045 - val_accuracy: 0.0182
Epoch 7/30
256/256 [==============================] - 0s 215us/sample - loss: 0.0048 - accuracy: 0.0234 - val_loss: 0.0041 - val_accuracy: 0.0364
Epoch 8/30
256/256 [==============================] - 0s 207us/sample - loss: 0.0045 - accuracy: 0.0117 - val_loss: 0.0038 - val_accuracy: 0.0364
Epoch 9/30
256/256 [==============================] - 0s 211us/sample - loss: 0.0043 - accuracy: 0.1055 - val_loss: 0.0038 - val_accuracy: 0.1364
Epoch 10/30
256/256 [==============================] - 0s 195us/sample - loss: 0.0042 - accuracy: 0.0977 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 11/30
256/256 [==============================] - 0s 188us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0182
Epoch 12/30
256/256 [==============================] - 0s 188us/sample - loss: 0.0042 - accuracy: 0.0664 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 13/30
256/256 [==============================] - 0s 203us/sample - loss: 0.0042 - accuracy: 0.0586 - val_loss: 0.0037 - val_accuracy: 0.0364
Epoch 14/30
256/256 [==============================] - 0s 207us/sample - loss: 0.0042 - accuracy: 0.0312 - val_loss: 0.0037 - val_accuracy: 0.0364
Epoch 15/30
256/256 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.0625 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 16/30
256/256 [==============================] - 0s 203us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 17/30
256/256 [==============================] - 0s 207us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 18/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 19/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 20/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 21/30
256/256 [==============================] - 0s 203us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 22/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 23/30
256/256 [==============================] - 0s 203us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 24/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 25/30
256/256 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 26/30
256/256 [==============================] - 0s 207us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 27/30
256/256 [==============================] - 0s 195us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 28/30
256/256 [==============================] - 0s 184us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 29/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909
Epoch 30/30
256/256 [==============================] - 0s 199us/sample - loss: 0.0042 - accuracy: 0.0781 - val_loss: 0.0037 - val_accuracy: 0.0909

[Figure 5]

pred
[7923.696  7740.4424 7512.9272 7214.1787 6914.0386 6676.3384 6528.638
 6453.856  6466.6724 6612.4575 6805.5674 7223.514  7664.2925 8048.367
 8458.108  8704.794  8927.65   9173.978  9301.087  9425.839  9456.668
 9467.464  9475.861  9464.989  9450.09   9445.564  9436.48   9437.943
 9406.843  9396.314  9406.29   9415.468  9424.467  9367.473  9346.866
 9223.6045 9201.393  9156.106  9155.85   9101.848  8980.621  8743.551
 8489.39   8446.115  8266.466  8273.215  8120.446  7979.259 ]
true
[7656.58 7479.25 7253.48 6997.73 6697.31 6495.14 6385.37 6388.82 6469.15
 6701.75 6914.02 7488.23 8049.46 8442.19 8901.89 9133.85 9224.   9456.13
 9599.82 9696.93 9716.15 9683.64 9701.91 9698.27 9640.22 9691.94 9685.57
 9730.46 9666.57 9489.31 9536.09 9513.78 9505.3  9472.51 9471.54 9214.27
 9103.36 8967.74 8939.74 9037.84 8926.12 8672.38 8349.17 8373.84 8233.03
 8222.77 8067.77 7873.98]

2010-12-03

Training samples: 338
Model: "model_8"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_9 (InputLayer)         [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_15 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_16 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_7 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_22 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_23 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_24 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_8 (Dense)              (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 338 samples, validate on 145 samples
Epoch 1/30
338/338 [==============================] - 4s 12ms/sample - loss: 0.1289 - accuracy: 0.0592 - val_loss: 0.0994 - val_accuracy: 0.0897
Epoch 2/30
338/338 [==============================] - 0s 243us/sample - loss: 0.0673 - accuracy: 0.0828 - val_loss: 0.0326 - val_accuracy: 0.0897
Epoch 3/30
338/338 [==============================] - 0s 284us/sample - loss: 0.0202 - accuracy: 0.0799 - val_loss: 0.0092 - val_accuracy: 0.0069
Epoch 4/30
338/338 [==============================] - 0s 210us/sample - loss: 0.0071 - accuracy: 0.0533 - val_loss: 0.0050 - val_accuracy: 0.1379
Epoch 5/30
338/338 [==============================] - 0s 183us/sample - loss: 0.0049 - accuracy: 0.1479 - val_loss: 0.0047 - val_accuracy: 0.0138
Epoch 6/30
338/338 [==============================] - 0s 195us/sample - loss: 0.0046 - accuracy: 0.0473 - val_loss: 0.0045 - val_accuracy: 0.0690
Epoch 7/30
338/338 [==============================] - 0s 186us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 8/30
338/338 [==============================] - 0s 186us/sample - loss: 0.0043 - accuracy: 0.0266 - val_loss: 0.0043 - val_accuracy: 0.0276
Epoch 9/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0118 - val_loss: 0.0043 - val_accuracy: 0.0138
Epoch 10/30
338/338 [==============================] - 0s 210us/sample - loss: 0.0043 - accuracy: 0.0266 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 11/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 12/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 13/30
338/338 [==============================] - 0s 201us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 14/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 15/30
338/338 [==============================] - 0s 195us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 16/30
338/338 [==============================] - 0s 210us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 17/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 18/30
338/338 [==============================] - 0s 201us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 19/30
338/338 [==============================] - 0s 216us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 20/30
338/338 [==============================] - 0s 222us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 21/30
338/338 [==============================] - 0s 210us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 22/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 23/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 24/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 25/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 26/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 27/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 28/30
338/338 [==============================] - 0s 204us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 29/30
338/338 [==============================] - 0s 201us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690
Epoch 30/30
338/338 [==============================] - 0s 207us/sample - loss: 0.0043 - accuracy: 0.0799 - val_loss: 0.0043 - val_accuracy: 0.0690

[Figure 6]

pred
[8029.077  7832.7837 7621.4766 7335.655  7043.4087 6779.6904 6602.6035
 6527.119  6526.795  6643.554  6819.8306 7246.94   7681.114  8106.089
 8522.493  8767.697  9014.915  9283.784  9399.581  9527.989  9572.119
 9575.893  9587.795  9572.996  9570.831  9570.688  9558.398  9551.962
 9541.78   9503.843  9522.485  9532.224  9535.44   9463.853  9446.416
 9334.089  9329.337  9317.327  9303.091  9227.795  9094.39   8865.05
 8602.331  8546.524  8357.236  8373.456  8222.707  8099.8384]
true
[ 7699.36  7537.08  7392.4   7066.04  6756.79  6593.92  6474.21  6476.41
  6564.8   6786.13  7053.51  7583.67  8147.22  8321.58  8765.5   9016.14
  9362.55  9787.3   9932.43 10048.65 10067.54 10063.7  10123.17 10090.39
 10065.33 10045.28 10056.09 10027.39 10000.66  9980.53  9951.47  9906.39
  9815.59  9628.56  9459.65  9173.19  8980.21  8857.04  8848.53  8871.45
  8792.61  8551.15  8375.27  8451.91  8385.28  8377.46  8299.01  8014.12]

2010-12-04

Training samples: 296
Model: "model_9"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_10 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_17 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_18 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_8 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_25 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_26 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_27 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_9 (Dense)              (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 296 samples, validate on 128 samples
Epoch 1/30
296/296 [==============================] - 4s 13ms/sample - loss: 0.1391 - accuracy: 0.0304 - val_loss: 0.1114 - val_accuracy: 0.0547
Epoch 2/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0845 - accuracy: 0.0372 - val_loss: 0.0381 - val_accuracy: 0.0547
Epoch 3/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0287 - accuracy: 0.0372 - val_loss: 0.0139 - val_accuracy: 0.0547
Epoch 4/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0098 - accuracy: 0.0372 - val_loss: 0.0065 - val_accuracy: 0.0000e+00
Epoch 5/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0058 - accuracy: 0.0236 - val_loss: 0.0049 - val_accuracy: 0.0156
Epoch 6/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0049 - accuracy: 0.1520 - val_loss: 0.0045 - val_accuracy: 0.1875
Epoch 7/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0048 - accuracy: 0.1318 - val_loss: 0.0045 - val_accuracy: 0.0078
Epoch 8/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0047 - accuracy: 0.0135 - val_loss: 0.0043 - val_accuracy: 0.0312
Epoch 9/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.0743 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 10/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.0811 - val_loss: 0.0043 - val_accuracy: 0.0547
Epoch 11/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.0372 - val_loss: 0.0043 - val_accuracy: 0.0156
Epoch 12/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.0878 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 13/30
296/296 [==============================] - 0s 179us/sample - loss: 0.0046 - accuracy: 0.1351 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 14/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1419 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 15/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.1486 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 16/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 17/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 18/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 19/30
296/296 [==============================] - 0s 196us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 20/30
296/296 [==============================] - 0s 189us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 21/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 22/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 23/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 24/30
296/296 [==============================] - 0s 189us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 25/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 26/30
296/296 [==============================] - 0s 210us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 27/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 28/30
296/296 [==============================] - 0s 189us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 29/30
296/296 [==============================] - 0s 186us/sample - loss: 0.0046 - accuracy: 0.1385 - val_loss: 0.0043 - val_accuracy: 0.0625
Epoch 30/30
296/296 [==============================] - 0s 182us/sample - loss: 0.0046 - accuracy: 0.1318 - val_loss: 0.0043 - val_accuracy: 0.0625

[Figure 7]

pred
[7958.0576 7759.34   7539.0767 7212.7817 6907.1313 6678.9155 6552.6836
 6511.8564 6528.0137 6714.446  6925.062  7394.0566 7890.836  8268.435
 8681.149  8906.056  9120.451  9388.612  9510.767  9614.055  9668.039
 9675.767  9709.273  9691.607  9687.969  9700.235  9685.791  9709.01
 9681.103  9660.03   9686.174  9686.691  9696.266  9629.146  9574.18
 9401.561  9327.356  9242.614  9240.816  9213.33   9083.897  8838.83
 8564.487  8539.373  8357.995  8344.04   8199.021  8029.7124]
true
[7855.8  7695.78 7386.18 7085.98 6814.34 6615.06 6463.05 6418.7  6422.93
 6515.57 6520.47 6708.32 7025.89 7360.76 7733.97 8128.16 8451.18 8810.48
 8874.04 8992.48 8987.8  8916.44 8932.77 8875.14 8795.33 8741.9  8755.06
 8739.13 8734.63 8716.26 8664.24 8683.48 8658.94 8507.08 8543.99 8514.78
 8485.41 8352.18 8330.75 8346.79 8305.49 8117.21 7998.15 7955.43 7823.99
 7858.36 7662.11 7422.66]

2010-12-05

Training samples: 131
Model: "model_10"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_11 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_19 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_20 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_9 (MaxPooling1 (None, 1, 128)            0         
_________________________________________________________________
lstm_28 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_29 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_30 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_10 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 131 samples, validate on 57 samples
Epoch 1/30
131/131 [==============================] - 4s 29ms/sample - loss: 0.1047 - accuracy: 0.0000e+00 - val_loss: 0.0989 - val_accuracy: 0.0175
Epoch 2/30
131/131 [==============================] - 0s 275us/sample - loss: 0.0932 - accuracy: 0.0153 - val_loss: 0.0816 - val_accuracy: 0.0175
Epoch 3/30
131/131 [==============================] - 0s 290us/sample - loss: 0.0726 - accuracy: 0.0153 - val_loss: 0.0552 - val_accuracy: 0.0175
Epoch 4/30
131/131 [==============================] - 0s 252us/sample - loss: 0.0469 - accuracy: 0.0153 - val_loss: 0.0359 - val_accuracy: 0.0175
Epoch 5/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0310 - accuracy: 0.0153 - val_loss: 0.0230 - val_accuracy: 0.0175
Epoch 6/30
131/131 [==============================] - 0s 275us/sample - loss: 0.0186 - accuracy: 0.0305 - val_loss: 0.0138 - val_accuracy: 0.1228
Epoch 7/30
131/131 [==============================] - 0s 283us/sample - loss: 0.0115 - accuracy: 0.1069 - val_loss: 0.0101 - val_accuracy: 0.1228
Epoch 8/30
131/131 [==============================] - 0s 275us/sample - loss: 0.0085 - accuracy: 0.1069 - val_loss: 0.0083 - val_accuracy: 0.0000e+00
Epoch 9/30
131/131 [==============================] - 0s 275us/sample - loss: 0.0072 - accuracy: 0.0229 - val_loss: 0.0078 - val_accuracy: 0.0000e+00
Epoch 10/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0069 - accuracy: 0.0611 - val_loss: 0.0077 - val_accuracy: 0.0877
Epoch 11/30
131/131 [==============================] - 0s 252us/sample - loss: 0.0069 - accuracy: 0.1221 - val_loss: 0.0077 - val_accuracy: 0.0702
Epoch 12/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0068 - accuracy: 0.0611 - val_loss: 0.0075 - val_accuracy: 0.0702
Epoch 13/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0067 - accuracy: 0.0611 - val_loss: 0.0074 - val_accuracy: 0.2105
Epoch 14/30
131/131 [==============================] - 0s 267us/sample - loss: 0.0065 - accuracy: 0.0687 - val_loss: 0.0074 - val_accuracy: 0.0351
Epoch 15/30
131/131 [==============================] - 0s 267us/sample - loss: 0.0065 - accuracy: 0.0153 - val_loss: 0.0074 - val_accuracy: 0.0351
Epoch 16/30
131/131 [==============================] - 0s 275us/sample - loss: 0.0065 - accuracy: 0.0992 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 17/30
131/131 [==============================] - 0s 283us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 18/30
131/131 [==============================] - 0s 283us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 19/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 20/30
131/131 [==============================] - 0s 267us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 21/30
131/131 [==============================] - 0s 252us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 22/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0065 - accuracy: 0.1069 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 23/30
131/131 [==============================] - 0s 252us/sample - loss: 0.0065 - accuracy: 0.1069 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 24/30
131/131 [==============================] - 0s 252us/sample - loss: 0.0065 - accuracy: 0.1069 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 25/30
131/131 [==============================] - 0s 237us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 26/30
131/131 [==============================] - 0s 244us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 27/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 28/30
131/131 [==============================] - 0s 244us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 29/30
131/131 [==============================] - 0s 267us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877
Epoch 30/30
131/131 [==============================] - 0s 260us/sample - loss: 0.0065 - accuracy: 0.1298 - val_loss: 0.0073 - val_accuracy: 0.0877

[Figure 8]

pred
[7468.0015 7285.739  7111.7686 6854.704  6618.753  6416.906  6289.9585
 6213.242  6205.834  6275.3574 6364.411  6588.532  6818.9316 7098.265
 7404.6235 7734.5225 8051.6675 8384.511  8600.304  8783.04   8846.781
 8889.047  8908.35   8886.795  8877.337  8869.199  8844.105  8823.986
 8800.736  8796.681  8797.683  8815.177  8827.009  8812.868  8855.23
 8831.113  8929.532  8987.874  8969.738  8892.95   8802.79   8597.65
 8361.995  8250.179  8041.98   8008.451  7846.6953 7713.182 ]
true
[7214.25 7077.09 6917.48 6689.79 6477.29 6276.32 6197.87 6187.15 6118.35
 6181.4  6171.95 6235.12 6437.94 6705.62 7056.8  7474.55 7829.64 8123.71
 8328.35 8516.35 8567.91 8617.11 8614.07 8563.02 8574.88 8522.64 8432.17
 8448.92 8383.17 8395.01 8431.68 8447.05 8493.35 8393.05 8468.13 8468.94
 8448.69 8355.89 8419.57 8511.51 8562.32 8415.76 8136.01 8077.22 7828.02
 7755.79 7510.79 7312.72]

2010-12-06

Training samples: 166
Model: "model_11"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_12 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_21 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_22 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_10 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_31 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_32 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_33 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_11 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 166 samples, validate on 72 samples
Epoch 1/30
166/166 [==============================] - 3s 19ms/sample - loss: 0.1174 - accuracy: 0.0120 - val_loss: 0.0989 - val_accuracy: 0.0000e+00
Epoch 2/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0984 - accuracy: 0.0120 - val_loss: 0.0699 - val_accuracy: 0.0000e+00
Epoch 3/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0617 - accuracy: 0.0602 - val_loss: 0.0356 - val_accuracy: 0.0972
Epoch 4/30
166/166 [==============================] - 0s 223us/sample - loss: 0.0335 - accuracy: 0.1265 - val_loss: 0.0249 - val_accuracy: 0.0000e+00
Epoch 5/30
166/166 [==============================] - 0s 223us/sample - loss: 0.0189 - accuracy: 0.0120 - val_loss: 0.0116 - val_accuracy: 0.0000e+00
Epoch 6/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0107 - accuracy: 0.0120 - val_loss: 0.0090 - val_accuracy: 0.0139
Epoch 7/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0084 - accuracy: 0.0060 - val_loss: 0.0083 - val_accuracy: 0.0139
Epoch 8/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0077 - accuracy: 0.0843 - val_loss: 0.0081 - val_accuracy: 0.2361
Epoch 9/30
166/166 [==============================] - 0s 223us/sample - loss: 0.0074 - accuracy: 0.1687 - val_loss: 0.0075 - val_accuracy: 0.2361
Epoch 10/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0073 - accuracy: 0.1506 - val_loss: 0.0073 - val_accuracy: 0.0278
Epoch 11/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0070 - accuracy: 0.0120 - val_loss: 0.0074 - val_accuracy: 0.0278
Epoch 12/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0070 - accuracy: 0.0120 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 13/30
166/166 [==============================] - 0s 235us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0072 - val_accuracy: 0.0972
Epoch 14/30
166/166 [==============================] - 0s 223us/sample - loss: 0.0069 - accuracy: 0.1265 - val_loss: 0.0073 - val_accuracy: 0.0139
Epoch 15/30
166/166 [==============================] - 0s 247us/sample - loss: 0.0069 - accuracy: 0.0060 - val_loss: 0.0073 - val_accuracy: 0.0000e+00
Epoch 16/30
166/166 [==============================] - 0s 235us/sample - loss: 0.0069 - accuracy: 0.0120 - val_loss: 0.0072 - val_accuracy: 0.0000e+00
Epoch 17/30
166/166 [==============================] - 0s 223us/sample - loss: 0.0069 - accuracy: 0.0120 - val_loss: 0.0073 - val_accuracy: 0.0000e+00
Epoch 18/30
166/166 [==============================] - 0s 235us/sample - loss: 0.0069 - accuracy: 0.0181 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 19/30
166/166 [==============================] - 0s 241us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0072 - val_accuracy: 0.0417
Epoch 20/30
166/166 [==============================] - 0s 247us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 21/30
166/166 [==============================] - 0s 247us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 22/30
166/166 [==============================] - 0s 235us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 23/30
166/166 [==============================] - 0s 241us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 24/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 25/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 26/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 27/30
166/166 [==============================] - 0s 241us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 28/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 29/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417
Epoch 30/30
166/166 [==============================] - 0s 229us/sample - loss: 0.0069 - accuracy: 0.0361 - val_loss: 0.0073 - val_accuracy: 0.0417

[Figure 9]

pred
[7354.5415 7182.1494 7023.1646 6786.9297 6575.576  6383.8877 6265.299
 6217.471  6225.808  6324.8413 6465.7065 6781.5767 7123.324  7488.344
 7885.414  8183.9663 8475.529  8767.779  8939.7705 9089.954  9161.981
 9205.077  9236.669  9222.855  9224.209  9227.723  9209.451  9200.444
 9182.419  9181.256  9187.504  9200.718  9202.423  9174.872  9196.694
 9130.557  9190.921  9206.005  9180.409  9075.652  8950.305  8722.285
 8462.343  8361.873  8143.2144 8144.484  7988.845  7862.783 ]
true
[ 7163.05  7026.16  6907.79  6608.58  6414.14  6387.98  6332.67  6344.56
  6390.92  6646.32  6932.74  7525.56  8191.29  8655.    9120.43  9367.3
  9579.16  9913.47 10099.72 10200.93 10286.04 10252.95 10332.86 10340.67
 10322.2  10308.35 10352.2  10331.84 10309.15 10244.07 10274.3  10298.24
 10257.93 10166.79  9939.89  9578.21  9406.9   9190.17  9094.8   9258.12
  9218.7   9014.7   8716.05  8661.35  8453.62  8341.08  8195.69  7920.77]

2010-12-07

Training samples: 249
Model: "model_12"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_13 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_23 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_24 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_11 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_34 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_35 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_36 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_12 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 249 samples, validate on 107 samples
Epoch 1/30
249/249 [==============================] - 3s 12ms/sample - loss: 0.1597 - accuracy: 0.0442 - val_loss: 0.1422 - val_accuracy: 0.0000e+00
Epoch 2/30
249/249 [==============================] - 0s 221us/sample - loss: 0.1295 - accuracy: 0.0000e+00 - val_loss: 0.0909 - val_accuracy: 0.0000e+00
Epoch 3/30
249/249 [==============================] - 0s 221us/sample - loss: 0.0688 - accuracy: 0.0040 - val_loss: 0.0425 - val_accuracy: 0.0000e+00
Epoch 4/30
249/249 [==============================] - 0s 221us/sample - loss: 0.0294 - accuracy: 0.0080 - val_loss: 0.0129 - val_accuracy: 0.0000e+00
Epoch 5/30
249/249 [==============================] - 0s 221us/sample - loss: 0.0093 - accuracy: 0.0602 - val_loss: 0.0061 - val_accuracy: 0.0935
Epoch 6/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0057 - accuracy: 0.0723 - val_loss: 0.0053 - val_accuracy: 0.0280
Epoch 7/30
249/249 [==============================] - 0s 233us/sample - loss: 0.0049 - accuracy: 0.0120 - val_loss: 0.0048 - val_accuracy: 0.0280
Epoch 8/30
249/249 [==============================] - 0s 225us/sample - loss: 0.0046 - accuracy: 0.0120 - val_loss: 0.0044 - val_accuracy: 0.0280
Epoch 9/30
249/249 [==============================] - 0s 225us/sample - loss: 0.0042 - accuracy: 0.0402 - val_loss: 0.0042 - val_accuracy: 0.0280
Epoch 10/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0042 - accuracy: 0.1365 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 11/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0042 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 12/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1847 - val_loss: 0.0041 - val_accuracy: 0.2523
Epoch 13/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1968 - val_loss: 0.0041 - val_accuracy: 0.2523
Epoch 14/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1968 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 15/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 16/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1566 - val_loss: 0.0041 - val_accuracy: 0.2523
Epoch 17/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1968 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 18/30
249/249 [==============================] - 0s 221us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 19/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 20/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 21/30
249/249 [==============================] - 0s 221us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 22/30
249/249 [==============================] - 0s 229us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 23/30
249/249 [==============================] - 0s 213us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 24/30
249/249 [==============================] - 0s 225us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 25/30
249/249 [==============================] - 0s 241us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 26/30
249/249 [==============================] - 0s 229us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 27/30
249/249 [==============================] - 0s 241us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 28/30
249/249 [==============================] - 0s 225us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 29/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121
Epoch 30/30
249/249 [==============================] - 0s 217us/sample - loss: 0.0041 - accuracy: 0.1807 - val_loss: 0.0041 - val_accuracy: 0.1121

[Figure 10]

pred
[ 7950.193   7752.6665  7509.118   7175.239   6858.6357  6656.558
  6547.883   6516.3267  6567.1587  6757.177   7002.997   7536.818
  8058.975   8442.772   8856.219   9099.122   9299.801   9566.696
  9714.75    9822.131   9881.78    9907.257   9956.303   9942.771
  9949.823   9984.191  10001.972  10012.741  10009.141   9993.645
 10024.178  10048.361  10039.312   9939.174   9861.79    9632.454
  9504.164   9341.0205  9316.258   9310.65    9220.555   8952.673
  8659.018   8647.736   8458.635   8429.165   8263.59    8075.7935]
true
[ 7766.2   7617.96  7374.47  7021.17  6807.34  6672.05  6590.83  6575.89
  6639.22  6807.09  7069.38  7606.75  8210.51  8694.29  9132.02  9381.14
  9588.18  9836.12  9983.45 10149.58 10162.29 10159.52 10194.68 10203.19
 10242.13 10289.77 10324.48 10372.78 10388.92 10338.68 10375.31 10367.85
 10403.75 10288.48 10120.55  9782.41  9533.65  9305.94  9190.81  9279.69
  9226.3   9068.28  8717.32  8689.82  8425.41  8369.89  8204.88  7994.47]

2010-12-09

Training samples: 220
Model: "model_13"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_14 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_25 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_26 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_12 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_37 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_38 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_39 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_13 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 220 samples, validate on 95 samples
Epoch 1/30
220/220 [==============================] - 4s 18ms/sample - loss: 0.1735 - accuracy: 0.0455 - val_loss: 0.1582 - val_accuracy: 0.0000e+00
Epoch 2/30
220/220 [==============================] - 0s 214us/sample - loss: 0.1397 - accuracy: 0.0000e+00 - val_loss: 0.1031 - val_accuracy: 0.0000e+00
Epoch 3/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0733 - accuracy: 0.0000e+00 - val_loss: 0.0429 - val_accuracy: 0.0000e+00
Epoch 4/30
220/220 [==============================] - 0s 236us/sample - loss: 0.0320 - accuracy: 0.0227 - val_loss: 0.0217 - val_accuracy: 0.0211
Epoch 5/30
220/220 [==============================] - 0s 218us/sample - loss: 0.0119 - accuracy: 0.0136 - val_loss: 0.0078 - val_accuracy: 0.0211
Epoch 6/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0062 - accuracy: 0.0591 - val_loss: 0.0073 - val_accuracy: 0.0316
Epoch 7/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0057 - accuracy: 0.0364 - val_loss: 0.0069 - val_accuracy: 0.0316
Epoch 8/30
220/220 [==============================] - 0s 218us/sample - loss: 0.0052 - accuracy: 0.0318 - val_loss: 0.0066 - val_accuracy: 0.0526
Epoch 9/30
220/220 [==============================] - 0s 236us/sample - loss: 0.0047 - accuracy: 0.1091 - val_loss: 0.0062 - val_accuracy: 0.1789
Epoch 10/30
220/220 [==============================] - 0s 232us/sample - loss: 0.0045 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 11/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0044 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.0842
Epoch 12/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0043 - accuracy: 0.0955 - val_loss: 0.0060 - val_accuracy: 0.0632
Epoch 13/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0043 - accuracy: 0.0818 - val_loss: 0.0060 - val_accuracy: 0.0632
Epoch 14/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1864 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 15/30
220/220 [==============================] - 0s 218us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 16/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 17/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 18/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 19/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 20/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 21/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 22/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 23/30
220/220 [==============================] - 0s 227us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 24/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 25/30
220/220 [==============================] - 0s 232us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 26/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 27/30
220/220 [==============================] - 0s 236us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 28/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 29/30
220/220 [==============================] - 0s 218us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
Epoch 30/30
220/220 [==============================] - 0s 223us/sample - loss: 0.0043 - accuracy: 0.1909 - val_loss: 0.0060 - val_accuracy: 0.1789
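
The layer names and parameter counts in the summaries above (222,064 parameters in every run) are consistent with the Keras (TensorFlow 2.x) definition sketched below. This is a reconstruction, not the original script: the Conv1D kernel size of 1 follows from the 512 / 16,512 parameter counts, the pool size of 48 from the (None, 1, 128) pooling output, and the activations, optimizer and loss are assumptions (only the 'accuracy' metric is visible in the log).

from tensorflow.keras import layers, models

def build_cnn_lstm(n_steps=48, n_features=3):
    # CNN-LSTM matching the printed summaries (222,064 trainable parameters).
    inp = layers.Input(shape=(n_steps, n_features))                 # (None, 48, 3)
    x = layers.Conv1D(128, kernel_size=1, activation="relu")(inp)   # 512 params
    x = layers.Conv1D(128, kernel_size=1, activation="relu")(x)     # 16,512 params
    x = layers.MaxPooling1D(pool_size=n_steps)(x)                   # (None, 1, 128)
    x = layers.LSTM(128, return_sequences=True)(x)                  # 131,584 params
    x = layers.LSTM(64, return_sequences=True)(x)                   # 49,408 params
    x = layers.LSTM(48)(x)                                          # 21,696 params
    out = layers.Dense(48)(x)                                       # 2,352 params
    model = models.Model(inp, out)
    # Optimizer and loss are assumed; metrics=['accuracy'] matches the logged metric.
    model.compile(optimizer="adam", loss="mse", metrics=["accuracy"])
    return model

model = build_cnn_lstm()
model.summary()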

[Figure 11]

pred
[ 8016.648   7805.978   7565.331   7230.298   6919.1523  6718.987
  6612.395   6582.264   6642.0503  6825.507   7071.503   7592.791
  8105.245   8503.763   8918.699   9190.54    9414.387   9704.001
  9857.393   9987.211  10060.728  10099.812  10158.307  10168.561
 10185.43   10232.804  10262.596  10288.372  10289.455  10274.563
 10311.14   10325.333  10309.139  10204.239  10100.845   9852.403
  9704.055   9517.541   9475.692   9465.053   9359.09    9088.985
  8783.88    8770.908   8567.945   8523.855   8342.21    8139.722 ]
true
[ 7851.48  7690.51  7506.92  7194.91  6880.82  6697.45  6590.59  6611.04
  6667.32  6994.37  7313.09  7882.93  8454.02  8989.95  9480.06  9754.4
  9945.64 10205.44 10561.68 10735.26 10858.15 10972.1  11077.14 11138.77
 11105.14 11229.09 11354.06 11342.74 11301.26 11240.85 11358.58 11354.11
 11337.73 11077.   10948.37 10644.45 10397.09 10175.02 10005.1  10062.6
  9950.54  9583.81  9215.56  9072.67  8819.12  8642.37  8404.12  8150.69]

 12-10

141
Model: "model_14"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_15 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_27 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_28 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_13 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_40 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_41 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_42 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_14 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 141 samples, validate on 61 samples
Epoch 1/30
141/141 [==============================] - 3s 20ms/sample - loss: 0.1910 - accuracy: 0.0213 - val_loss: 0.1854 - val_accuracy: 0.0164
Epoch 2/30
141/141 [==============================] - 0s 277us/sample - loss: 0.1731 - accuracy: 0.0426 - val_loss: 0.1569 - val_accuracy: 0.0164
Epoch 3/30
141/141 [==============================] - 0s 248us/sample - loss: 0.1370 - accuracy: 0.0426 - val_loss: 0.1055 - val_accuracy: 0.0164
Epoch 4/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0827 - accuracy: 0.0426 - val_loss: 0.0546 - val_accuracy: 0.0164
Epoch 5/30
141/141 [==============================] - 0s 248us/sample - loss: 0.0446 - accuracy: 0.0426 - val_loss: 0.0349 - val_accuracy: 0.0164
Epoch 6/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0265 - accuracy: 0.0426 - val_loss: 0.0170 - val_accuracy: 0.0656
Epoch 7/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0123 - accuracy: 0.0638 - val_loss: 0.0112 - val_accuracy: 0.0164
Epoch 8/30
141/141 [==============================] - 0s 305us/sample - loss: 0.0089 - accuracy: 0.0000e+00 - val_loss: 0.0095 - val_accuracy: 0.0164
Epoch 9/30
141/141 [==============================] - 0s 284us/sample - loss: 0.0071 - accuracy: 0.0142 - val_loss: 0.0080 - val_accuracy: 0.0000e+00
Epoch 10/30
141/141 [==============================] - 0s 284us/sample - loss: 0.0062 - accuracy: 0.0496 - val_loss: 0.0076 - val_accuracy: 0.0820
Epoch 11/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0058 - accuracy: 0.1560 - val_loss: 0.0073 - val_accuracy: 0.2295
Epoch 12/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0055 - accuracy: 0.2057 - val_loss: 0.0072 - val_accuracy: 0.2295
Epoch 13/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0054 - accuracy: 0.2057 - val_loss: 0.0070 - val_accuracy: 0.2295
Epoch 14/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0053 - accuracy: 0.2057 - val_loss: 0.0068 - val_accuracy: 0.2295
Epoch 15/30
141/141 [==============================] - 0s 270us/sample - loss: 0.0051 - accuracy: 0.1135 - val_loss: 0.0068 - val_accuracy: 0.0164
Epoch 16/30
141/141 [==============================] - 0s 248us/sample - loss: 0.0051 - accuracy: 0.0426 - val_loss: 0.0067 - val_accuracy: 0.0164
Epoch 17/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0051 - accuracy: 0.0355 - val_loss: 0.0067 - val_accuracy: 0.0820
Epoch 18/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0051 - accuracy: 0.0851 - val_loss: 0.0067 - val_accuracy: 0.0820
Epoch 19/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0051 - accuracy: 0.0851 - val_loss: 0.0067 - val_accuracy: 0.0820
Epoch 20/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0050 - accuracy: 0.1773 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 21/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 22/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 23/30
141/141 [==============================] - 0s 270us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 24/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2295
Epoch 25/30
141/141 [==============================] - 0s 255us/sample - loss: 0.0050 - accuracy: 0.1631 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 26/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 27/30
141/141 [==============================] - 0s 270us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 28/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 29/30
141/141 [==============================] - 0s 262us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
Epoch 30/30
141/141 [==============================] - 0s 248us/sample - loss: 0.0050 - accuracy: 0.2340 - val_loss: 0.0067 - val_accuracy: 0.2131
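
The train/validation counts printed across these runs (331/143, 220/95, 141/61, 286/123, 140/60, 100/43, 219/95) are all consistent with a 70/30 split, so the fit call probably looked like the sketch below. The batch size is not shown, and X/y here are placeholder arrays of the right shapes rather than the real data.

import numpy as np

# Placeholder data with the shapes used in these runs; the real X/y come from
# the (unshown) preprocessing step. 315 samples split as 220 train + 95 validation.
X = np.random.rand(315, 48, 3).astype("float32")
y = np.random.rand(315, 48).astype("float32")

model = build_cnn_lstm()            # builder from the sketch above
history = model.fit(
    X, y,
    epochs=30,                      # matches the 30-epoch logs
    validation_split=0.3,           # reproduces the 220/95-style splits
    verbose=1,
)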

[Figure 12]

pred
[ 8100.379   7865.2554  7619.7754  7276.385   6973.5347  6773.982
  6673.3857  6649.74    6698.5664  6889.2485  7125.77    7637.74
  8140.2935  8540.239   8965.953   9274.014   9529.929   9845.902
 10007.664  10151.896  10240.472  10280.724  10356.417  10382.71
 10418.244  10478.212  10513.614  10547.599  10557.547  10550.114
 10585.956  10600.247  10574.437  10452.327  10347.073  10070.917
  9900.528   9676.019   9598.458   9578.768   9478.127   9207.914
  8900.773   8874.115   8658.45    8589.455   8392.926   8180.1553]
true
[ 7885.41  7664.28  7440.68  7107.98  6876.49  6682.53  6607.52  6631.62
  6694.8   6931.47  7230.92  7782.65  8384.73  8785.7   9295.35  9540.97
  9715.03  9909.28 10038.93 10118.85  9867.44  9968.02 10288.14 10332.53
 10314.62 10365.4  10391.8  10378.99 10404.25 10357.03 10454.88 10463.79
 10407.06 10267.94 10102.7   9729.57  9550.78  9345.59  9231.96  9241.71
  9302.13  9111.91  8834.14  8776.39  8604.26  8438.    8267.3   7948.22]

12-11

286
Model: "model_15"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_16 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_29 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_30 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_14 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_43 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_44 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_45 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_15 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 286 samples, validate on 123 samples
Epoch 1/30
286/286 [==============================] - 4s 15ms/sample - loss: 0.1541 - accuracy: 0.0874 - val_loss: 0.1345 - val_accuracy: 0.0976
Epoch 2/30
286/286 [==============================] - 0s 238us/sample - loss: 0.1053 - accuracy: 0.0979 - val_loss: 0.0608 - val_accuracy: 0.0976
Epoch 3/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0416 - accuracy: 0.0979 - val_loss: 0.0240 - val_accuracy: 0.2195
Epoch 4/30
286/286 [==============================] - 0s 231us/sample - loss: 0.0151 - accuracy: 0.1958 - val_loss: 0.0077 - val_accuracy: 0.2195
Epoch 5/30
286/286 [==============================] - 0s 241us/sample - loss: 0.0069 - accuracy: 0.1329 - val_loss: 0.0049 - val_accuracy: 0.0813
Epoch 6/30
286/286 [==============================] - 0s 255us/sample - loss: 0.0055 - accuracy: 0.0455 - val_loss: 0.0045 - val_accuracy: 0.0000e+00
Epoch 7/30
286/286 [==============================] - 0s 252us/sample - loss: 0.0051 - accuracy: 0.0070 - val_loss: 0.0043 - val_accuracy: 0.0000e+00
Epoch 8/30
286/286 [==============================] - 0s 252us/sample - loss: 0.0049 - accuracy: 0.0070 - val_loss: 0.0042 - val_accuracy: 0.0000e+00
Epoch 9/30
286/286 [==============================] - 0s 248us/sample - loss: 0.0048 - accuracy: 0.0699 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 10/30
286/286 [==============================] - 0s 248us/sample - loss: 0.0047 - accuracy: 0.0734 - val_loss: 0.0040 - val_accuracy: 0.2195
Epoch 11/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1958 - val_loss: 0.0040 - val_accuracy: 0.2195
Epoch 12/30
286/286 [==============================] - 0s 238us/sample - loss: 0.0047 - accuracy: 0.1958 - val_loss: 0.0040 - val_accuracy: 0.2195
Epoch 13/30
286/286 [==============================] - 0s 231us/sample - loss: 0.0047 - accuracy: 0.1958 - val_loss: 0.0040 - val_accuracy: 0.2195
Epoch 14/30
286/286 [==============================] - 0s 224us/sample - loss: 0.0047 - accuracy: 0.1853 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 15/30
286/286 [==============================] - 0s 227us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 16/30
286/286 [==============================] - 0s 224us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 17/30
286/286 [==============================] - 0s 231us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 18/30
286/286 [==============================] - 0s 234us/sample - loss: 0.0047 - accuracy: 0.1818 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 19/30
286/286 [==============================] - 0s 241us/sample - loss: 0.0047 - accuracy: 0.1818 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 20/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 21/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 22/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 23/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 24/30
286/286 [==============================] - 0s 259us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 25/30
286/286 [==============================] - 0s 241us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 26/30
286/286 [==============================] - 0s 238us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 27/30
286/286 [==============================] - 0s 245us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 28/30
286/286 [==============================] - 0s 241us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 29/30
286/286 [==============================] - 0s 231us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463
Epoch 30/30
286/286 [==============================] - 0s 238us/sample - loss: 0.0047 - accuracy: 0.1538 - val_loss: 0.0040 - val_accuracy: 0.1463

[Figure 13]

pred
[8003.1416 7789.0356 7551.5386 7223.4604 6904.8945 6691.962  6576.3594
 6549.2715 6590.569  6776.643  6999.612  7522.885  8020.3594 8403.926
 8824.659  9078.801  9282.94   9558.638  9687.96   9799.985  9857.573
 9878.223  9918.466  9909.272  9913.162  9938.115  9952.026  9969.912
 9952.414  9943.165  9985.121  9994.311  9988.686  9893.041  9827.561
 9627.074  9507.73   9362.     9333.248  9337.677  9232.361  8970.84
 8682.373  8675.927  8469.995  8448.734  8279.1045 8104.3647]
true
[7708.56 7473.72 7213.36 6893.25 6629.62 6442.55 6346.8  6324.64 6325.32
 6422.41 6432.17 6642.   6960.28 7320.87 7700.4  8026.47 8340.17 8599.07
 8607.47 8766.32 8726.94 8725.8  8739.91 8730.05 8676.62 8706.94 8673.62
 8689.01 8722.71 8758.53 8800.54 8825.8  8841.9  8718.52 8692.14 8643.35
 8510.24 8348.76 8257.95 8244.88 8309.18 8222.97 8068.83 7946.74 7815.45
 7753.92 7586.01 7321.49]
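
The figure placeholders in each block presumably showed the predicted and actual 48-point curves overlaid. A minimal matplotlib sketch of such a plot (not the original plotting code) looks like this:

import matplotlib.pyplot as plt

def plot_day(pred, true, title=""):
    # Overlay a 48-point forecast and its ground truth for one day.
    steps = range(len(true))
    plt.figure(figsize=(8, 4))
    plt.plot(steps, true, marker=".", label="true")
    plt.plot(steps, pred, marker=".", label="pred")
    plt.xlabel("time step (48 per day)")
    plt.ylabel("value")
    plt.title(title)
    plt.legend()
    plt.tight_layout()
    plt.show()

# e.g. plot_day(pred, true, title="12-11") with the arrays printed above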

 12-12

140
Model: "model_16"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_17 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_31 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_32 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_15 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_46 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_47 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_48 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_16 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 140 samples, validate on 60 samples
Epoch 1/30
140/140 [==============================] - 3s 22ms/sample - loss: 0.1152 - accuracy: 0.0000e+00 - val_loss: 0.1050 - val_accuracy: 0.0833
Epoch 2/30
140/140 [==============================] - 0s 243us/sample - loss: 0.1046 - accuracy: 0.1000 - val_loss: 0.0893 - val_accuracy: 0.0333
Epoch 3/30
140/140 [==============================] - 0s 236us/sample - loss: 0.0841 - accuracy: 0.0429 - val_loss: 0.0613 - val_accuracy: 0.0333
Epoch 4/30
140/140 [==============================] - 0s 243us/sample - loss: 0.0535 - accuracy: 0.0429 - val_loss: 0.0372 - val_accuracy: 0.0333
Epoch 5/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0343 - accuracy: 0.0429 - val_loss: 0.0272 - val_accuracy: 0.0333
Epoch 6/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0221 - accuracy: 0.0429 - val_loss: 0.0154 - val_accuracy: 0.0000e+00
Epoch 7/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0134 - accuracy: 0.0143 - val_loss: 0.0116 - val_accuracy: 0.0000e+00
Epoch 8/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0107 - accuracy: 0.0429 - val_loss: 0.0097 - val_accuracy: 0.0000e+00
Epoch 9/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0088 - accuracy: 0.0357 - val_loss: 0.0089 - val_accuracy: 0.0000e+00
Epoch 10/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0081 - accuracy: 0.0643 - val_loss: 0.0085 - val_accuracy: 0.0000e+00
Epoch 11/30
140/140 [==============================] - 0s 271us/sample - loss: 0.0077 - accuracy: 0.0714 - val_loss: 0.0082 - val_accuracy: 0.1833
Epoch 12/30
140/140 [==============================] - 0s 243us/sample - loss: 0.0077 - accuracy: 0.1357 - val_loss: 0.0082 - val_accuracy: 0.0333
Epoch 13/30
140/140 [==============================] - 0s 264us/sample - loss: 0.0076 - accuracy: 0.0214 - val_loss: 0.0082 - val_accuracy: 0.0333
Epoch 14/30
140/140 [==============================] - 0s 264us/sample - loss: 0.0075 - accuracy: 0.0214 - val_loss: 0.0082 - val_accuracy: 0.0333
Epoch 15/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0075 - accuracy: 0.0214 - val_loss: 0.0080 - val_accuracy: 0.0333
Epoch 16/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0074 - accuracy: 0.0214 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 17/30
140/140 [==============================] - 0s 264us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 18/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 19/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0074 - accuracy: 0.1214 - val_loss: 0.0080 - val_accuracy: 0.0833
Epoch 20/30
140/140 [==============================] - 0s 264us/sample - loss: 0.0074 - accuracy: 0.0929 - val_loss: 0.0080 - val_accuracy: 0.0833
Epoch 21/30
140/140 [==============================] - 0s 271us/sample - loss: 0.0074 - accuracy: 0.0929 - val_loss: 0.0080 - val_accuracy: 0.0833
Epoch 22/30
140/140 [==============================] - 0s 279us/sample - loss: 0.0074 - accuracy: 0.0929 - val_loss: 0.0080 - val_accuracy: 0.0833
Epoch 23/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0074 - accuracy: 0.1429 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 24/30
140/140 [==============================] - 0s 271us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 25/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 26/30
140/140 [==============================] - 0s 250us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 27/30
140/140 [==============================] - 0s 257us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 28/30
140/140 [==============================] - 0s 243us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 29/30
140/140 [==============================] - 0s 236us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333
Epoch 30/30
140/140 [==============================] - 0s 229us/sample - loss: 0.0074 - accuracy: 0.1500 - val_loss: 0.0080 - val_accuracy: 0.1333

[Figure 14]

pred
[7369.9585 7200.9834 7028.761  6761.938  6531.9375 6364.6978 6268.64
 6232.3154 6252.644  6370.538  6502.516  6804.556  7121.7554 7435.1484
 7800.4897 8127.2964 8407.719  8717.083  8905.924  9054.688  9125.381
 9162.089  9204.826  9209.171  9215.575  9228.746  9223.391  9230.476
 9224.799  9234.984  9263.72   9284.136  9293.766  9248.098  9252.37
 9147.094  9106.476  9030.707  9013.392  8997.841  8918.276  8694.034
 8413.436  8341.0625 8138.644  8098.1133 7925.825  7763.4053]
true
[7107.15 6905.44 6685.05 6420.49 6235.78 6065.24 5992.03 5974.2  5968.51
 6016.01 6005.39 6104.04 6310.36 6617.23 6986.72 7313.32 7618.65 7902.21
 8118.4  8240.1  8269.32 8339.51 8393.03 8496.69 8481.54 8521.15 8493.49
 8519.37 8552.41 8632.61 8756.16 8774.66 8778.73 8775.16 8886.59 8863.23
 8849.15 8777.13 8628.47 8714.02 8820.83 8704.29 8432.96 8294.08 8008.26
 7824.74 7572.87 7315.62]

 12-13

100
Model: "model_17"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_18 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_33 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_34 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_16 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_49 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_50 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_51 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_17 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 100 samples, validate on 43 samples
Epoch 1/30
100/100 [==============================] - 3s 30ms/sample - loss: 0.1430 - accuracy: 0.0200 - val_loss: 0.1326 - val_accuracy: 0.0698
Epoch 2/30
100/100 [==============================] - 0s 270us/sample - loss: 0.1336 - accuracy: 0.0000e+00 - val_loss: 0.1189 - val_accuracy: 0.0698
Epoch 3/30
100/100 [==============================] - 0s 260us/sample - loss: 0.1171 - accuracy: 0.0000e+00 - val_loss: 0.0952 - val_accuracy: 0.0698
Epoch 4/30
100/100 [==============================] - 0s 260us/sample - loss: 0.0900 - accuracy: 0.0000e+00 - val_loss: 0.0634 - val_accuracy: 0.0698
Epoch 5/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0577 - accuracy: 0.0400 - val_loss: 0.0409 - val_accuracy: 0.0000e+00
Epoch 6/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0393 - accuracy: 0.0400 - val_loss: 0.0342 - val_accuracy: 0.0000e+00
Epoch 7/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0311 - accuracy: 0.0400 - val_loss: 0.0226 - val_accuracy: 0.0000e+00
Epoch 8/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0194 - accuracy: 0.0400 - val_loss: 0.0135 - val_accuracy: 0.0000e+00
Epoch 9/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0122 - accuracy: 0.0400 - val_loss: 0.0096 - val_accuracy: 0.0000e+00
Epoch 10/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0090 - accuracy: 0.0400 - val_loss: 0.0074 - val_accuracy: 0.0000e+00
Epoch 11/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0069 - accuracy: 0.0400 - val_loss: 0.0067 - val_accuracy: 0.0000e+00
Epoch 12/30
100/100 [==============================] - 0s 290us/sample - loss: 0.0063 - accuracy: 0.0800 - val_loss: 0.0069 - val_accuracy: 0.0233
Epoch 13/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0064 - accuracy: 0.0100 - val_loss: 0.0067 - val_accuracy: 0.0233
Epoch 14/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0062 - accuracy: 0.0100 - val_loss: 0.0065 - val_accuracy: 0.0233
Epoch 15/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0061 - accuracy: 0.0100 - val_loss: 0.0063 - val_accuracy: 0.0000e+00
Epoch 16/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0060 - accuracy: 0.0200 - val_loss: 0.0062 - val_accuracy: 0.0000e+00
Epoch 17/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0059 - accuracy: 0.0100 - val_loss: 0.0062 - val_accuracy: 0.0233
Epoch 18/30
100/100 [==============================] - 0s 300us/sample - loss: 0.0058 - accuracy: 0.0100 - val_loss: 0.0062 - val_accuracy: 0.0233
Epoch 19/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0058 - accuracy: 0.0900 - val_loss: 0.0061 - val_accuracy: 0.1628
Epoch 20/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.1700 - val_loss: 0.0060 - val_accuracy: 0.1628
Epoch 21/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.1600 - val_loss: 0.0060 - val_accuracy: 0.0698
Epoch 22/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0057 - accuracy: 0.0300 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 23/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 24/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 25/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 26/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 27/30
100/100 [==============================] - 0s 280us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0000e+00
Epoch 28/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.0400 - val_loss: 0.0061 - val_accuracy: 0.0465
Epoch 29/30
100/100 [==============================] - 0s 270us/sample - loss: 0.0057 - accuracy: 0.1100 - val_loss: 0.0061 - val_accuracy: 0.0465
Epoch 30/30
100/100 [==============================] - 0s 260us/sample - loss: 0.0057 - accuracy: 0.1100 - val_loss: 0.0061 - val_accuracy: 0.1628

[Figure 15]

pred
[7325.5684 7168.5063 7002.6133 6755.878  6544.169  6401.701  6299.653
 6283.3438 6315.724  6475.9585 6681.091  7138.377  7583.3677 8009.2446
 8450.552  8738.074  8983.585  9264.755  9423.79   9569.858  9637.64
 9665.849  9721.047  9722.32   9724.004  9768.039  9787.563  9786.6
 9788.899  9771.91   9786.657  9809.235  9812.62   9767.638  9721.595
 9554.51   9504.496  9381.819  9338.447  9277.919  9150.202  8891.067
 8593.608  8530.443  8333.852  8312.344  8154.748  8011.0864]
true
[ 7169.06  7028.37  6882.56  6675.72  6509.74  6433.63  6337.44  6391.6
  6430.49  6699.06  6909.91  7506.36  8082.77  8525.79  9019.16  9284.05
  9491.5   9730.94  9878.81  9983.39 10091.39 10110.35 10147.24 10175.82
 10173.61 10206.99 10277.8  10316.92 10343.71 10320.83 10394.41 10425.31
 10389.64 10280.01 10109.29  9720.02  9469.66  9099.85  8935.96  8966.7
  9020.98  8850.09  8575.67  8547.27  8319.89  8234.43  8131.17  7867.71]
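
The loss values (roughly 0.004 to 0.008) together with predictions printed back in the original units suggest the targets were scaled to [0, 1] for training and inverse-transformed afterwards. A hedged sketch with scikit-learn's MinMaxScaler (the actual scaling step is not shown in the log):

import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Placeholder values; the real series would be fit the same way.
raw = np.array([7883.01, 7697.8, 7454.8, 7124.84]).reshape(-1, 1)

scaler = MinMaxScaler()
scaled = scaler.fit_transform(raw)           # values in [0, 1], matching the loss magnitudes
restored = scaler.inverse_transform(scaled)  # back to the original scale for the pred arrays
print(scaled.ravel(), restored.ravel())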

 12-16

219
Model: "model_18"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_19 (InputLayer)        [(None, 48, 3)]           0         
_________________________________________________________________
conv1d_35 (Conv1D)           (None, 48, 128)           512       
_________________________________________________________________
conv1d_36 (Conv1D)           (None, 48, 128)           16512     
_________________________________________________________________
max_pooling1d_17 (MaxPooling (None, 1, 128)            0         
_________________________________________________________________
lstm_52 (LSTM)               (None, 1, 128)            131584    
_________________________________________________________________
lstm_53 (LSTM)               (None, 1, 64)             49408     
_________________________________________________________________
lstm_54 (LSTM)               (None, 48)                21696     
_________________________________________________________________
dense_18 (Dense)             (None, 48)                2352      
=================================================================
Total params: 222,064
Trainable params: 222,064
Non-trainable params: 0
_________________________________________________________________
Train on 219 samples, validate on 95 samples
Epoch 1/30
219/219 [==============================] - 3s 14ms/sample - loss: 0.1732 - accuracy: 0.0046 - val_loss: 0.1575 - val_accuracy: 0.0000e+00
Epoch 2/30
219/219 [==============================] - 0s 215us/sample - loss: 0.1380 - accuracy: 0.0000e+00 - val_loss: 0.1020 - val_accuracy: 0.0000e+00
Epoch 3/30
219/219 [==============================] - 0s 206us/sample - loss: 0.0732 - accuracy: 0.0000e+00 - val_loss: 0.0443 - val_accuracy: 0.0842
Epoch 4/30
219/219 [==============================] - 0s 206us/sample - loss: 0.0333 - accuracy: 0.1050 - val_loss: 0.0212 - val_accuracy: 0.0842
Epoch 5/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0123 - accuracy: 0.1005 - val_loss: 0.0098 - val_accuracy: 0.1895
Epoch 6/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0076 - accuracy: 0.1918 - val_loss: 0.0077 - val_accuracy: 0.1895
Epoch 7/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0055 - accuracy: 0.1826 - val_loss: 0.0066 - val_accuracy: 0.0526
Epoch 8/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0048 - accuracy: 0.0594 - val_loss: 0.0064 - val_accuracy: 0.0947
Epoch 9/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0046 - accuracy: 0.0776 - val_loss: 0.0062 - val_accuracy: 0.0947
Epoch 10/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0044 - accuracy: 0.0776 - val_loss: 0.0060 - val_accuracy: 0.0947
Epoch 11/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0043 - accuracy: 0.0868 - val_loss: 0.0060 - val_accuracy: 0.0526
Epoch 12/30
219/219 [==============================] - 0s 206us/sample - loss: 0.0042 - accuracy: 0.1233 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 13/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0042 - accuracy: 0.1826 - val_loss: 0.0059 - val_accuracy: 0.2947
Epoch 14/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0042 - accuracy: 0.1735 - val_loss: 0.0059 - val_accuracy: 0.2947
Epoch 15/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0042 - accuracy: 0.1872 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 16/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 17/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.0947
Epoch 18/30
219/219 [==============================] - 0s 224us/sample - loss: 0.0042 - accuracy: 0.1005 - val_loss: 0.0059 - val_accuracy: 0.0947
Epoch 19/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1233 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 20/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 21/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 22/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 23/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 24/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 25/30
219/219 [==============================] - 0s 219us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 26/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 27/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 28/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 29/30
219/219 [==============================] - 0s 215us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
Epoch 30/30
219/219 [==============================] - 0s 210us/sample - loss: 0.0042 - accuracy: 0.1918 - val_loss: 0.0059 - val_accuracy: 0.1895
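
The input shape (48, 3) and the 48-value output imply that each sample is a 48-step window of three features predicting the next 48 values, although the three input channels themselves are not shown anywhere in the log. A purely illustrative windowing sketch under that assumption:

import numpy as np

def make_windows(features, target, n_in=48, n_out=48):
    # Hypothetical windowing: 48 steps x 3 features in, next 48 target values out.
    # features has shape (T, 3) and target shape (T,); the real channels are unknown.
    X, y = [], []
    for i in range(len(target) - n_in - n_out + 1):
        X.append(features[i:i + n_in])
        y.append(target[i + n_in:i + n_in + n_out])
    return np.array(X, dtype="float32"), np.array(y, dtype="float32")

# Random placeholder series of the right shapes.
T = 500
feats = np.random.rand(T, 3)
targ = np.random.rand(T)
X, y = make_windows(feats, targ)
print(X.shape, y.shape)     # (405, 48, 3) (405, 48)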

[Figure 16]

pred
[ 8010.2812  7795.0703  7564.422   7225.707   6919.785   6712.253
  6607.061   6586.321   6634.1426  6830.8213  7080.0835  7604.8574
  8131.903   8530.349   8947.871   9210.863   9431.09    9719.999
  9876.513   9997.637  10072.125  10105.951  10164.23   10174.132
 10195.184  10236.515  10265.087  10291.155  10299.1875 10285.224
 10323.4375 10334.313  10327.895  10218.452  10122.957   9863.107
  9719.938   9526.191   9484.489   9480.636   9370.866   9098.046
  8800.417   8780.977   8572.676   8534.667   8355.627   8160.165 ]
true
[ 7883.01  7697.8   7454.8   7124.84  6924.24  6775.86  6658.5   6700.81
  6729.53  6968.24  7165.2   7764.98  8301.17  8652.23  9112.    9473.62
  9749.28 10047.51 10199.95 10421.91 10504.25 10467.54 10474.48 10425.8
 10434.05 10387.62 10255.64 10079.61 10005.6   9878.18  9705.36  9582.39
  9615.68  9572.51  9594.13  9403.11  9213.98  8993.86  8949.15  8978.68
  8994.21  8781.87  8530.68  8483.23  8245.1   8215.07  8083.06  7847.47]

 12-17

[Figure 17]

pred
[7953.6826 7764.2075 7525.9487 7193.199  6884.777  6641.638  6540.9766
 6501.481  6541.193  6738.235  6985.7256 7513.686  8028.8506 8388.727
 8816.574  9045.857  9238.705  9511.588  9636.9    9736.143  9784.97
 9792.08   9819.831  9803.825  9798.889  9808.947  9821.329  9825.073
 9804.305  9783.639  9813.6455 9838.6045 9832.7295 9741.81   9677.087
 9479.081  9365.7705 9237.208  9227.098  9215.348  9102.898  8842.39
 8565.39   8563.052  8382.6455 8378.317  8209.04   8044.8027]
true
[7655.29 7456.22 7226.49 6964.52 6727.05 6546.17 6485.81 6449.02 6545.72
 6793.83 7014.99 7557.14 8063.95 8404.65 8775.34 9023.61 9192.9  9427.04
 9507.14 9528.89 9494.33 9470.4  9437.37 9383.72 9375.24 9385.37 9357.67
 9340.12 9305.39 9321.9  9260.88 9277.83 9270.07 9165.36 9076.46 8773.58
 8617.1  8501.93 8360.43 8420.27 8542.9  8353.83 8175.34 8191.65 8097.69
 8080.61 7946.84 7741.89]

12-18

[Figure 18]

pred
[7865.2236 7677.1333 7456.1416 7154.1284 6850.285  6626.4424 6489.7876
 6419.662  6430.68   6571.7866 6756.8037 7149.8203 7573.453  7936.213
 8327.337  8582.044  8804.69   9057.268  9184.601  9315.578  9339.636
 9345.116  9357.485  9340.651  9316.209  9318.807  9310.985  9302.852
 9274.648  9256.042  9284.229  9294.672  9316.4    9251.395  9233.683
 9121.313  9090.658  9043.04   9055.084  9015.442  8912.394  8674.6455
 8423.779  8391.762  8221.767  8217.742  8056.0825 7911.022 ]
true
[7536.2  7320.33 7097.55 6795.59 6507.03 6277.95 6164.62 6115.29 6125.64
 6234.22 6269.43 6477.05 6708.51 7009.05 7362.78 7761.16 8027.79 8333.36
 8363.98 8438.6  8425.72 8417.9  8372.52 8343.37 8242.85 8191.22 8085.53
 8095.87 8022.95 8001.18 7968.75 7915.96 7924.51 7878.38 7902.41 7880.48
 7825.89 7750.67 7784.41 7887.17 8004.28 7936.47 7801.48 7805.52 7691.82
 7689.13 7490.91 7292.07]

 11-19

[Figure 19]

pred
[7967.773  7780.1353 7578.1963 7293.3896 6998.7544 6734.992  6570.999
 6492.248  6488.9717 6597.7173 6776.7847 7177.6133 7602.7163 8011.8896
 8439.06   8677.58   8916.004  9165.357  9291.33   9420.282  9458.338
 9475.561  9491.976  9461.701  9454.392  9461.733  9439.987  9437.684
 9413.306  9397.864  9409.572  9406.601  9421.963  9364.303  9351.47
 9237.589  9257.813  9243.763  9238.902  9152.531  9017.155  8782.299
 8539.245  8467.33   8282.052  8310.971  8160.825  8034.3843]
true
[7540.35 7358.8  7176.6  6890.91 6623.68 6442.35 6355.86 6334.19 6432.11
 6616.44 6924.92 7509.92 8086.14 8486.14 8901.87 9028.32 9097.9  9257.81
 9311.52 9341.68 9330.75 9252.11 9174.87 9296.21 9184.21 9127.45 9104.33
 9037.21 8908.58 8776.77 8776.51 8841.54 8851.47 8798.11 8733.41 8542.93
 8544.56 8468.44 8507.41 8575.36 8434.61 8201.8  7990.06 8040.75 7989.99
 8099.74 8030.74 7861.39]

 11-20

[Figure 20]

pred
[7658.116  7461.5674 7246.294  6941.2827 6662.401  6459.3604 6351.2134
 6302.537  6319.0854 6437.2007 6557.7256 6861.8833 7187.0464 7481.572
 7855.0415 8174.0874 8437.384  8719.736  8881.039  9015.198  9058.812
 9069.197  9085.506  9077.307  9070.739  9071.924  9054.071  9046.765
 9033.168  9035.119  9065.985  9087.845  9114.164  9062.605  9075.727
 8981.485  8927.712  8852.261  8839.681  8849.244  8807.555  8598.439
 8350.126  8318.162  8157.8555 8114.532  7946.8    7772.266 ]
true
[7698.96 7498.86 7281.45 6936.39 6676.8  6396.83 6314.37 6267.42 6277.04
 6376.66 6391.01 6606.38 6879.9  7200.98 7572.37 7989.62 8264.51 8421.6
 8495.47 8544.23 8481.18 8466.99 8380.54 8307.48 8208.27 8103.64 8019.13
 8052.48 7998.17 7995.55 7981.04 8033.03 8018.03 7973.22 7977.28 7982.32
 7905.84 7847.67 7901.97 7984.08 7928.23 7755.08 7653.65 7576.1  7483.01
 7566.4  7456.93 7278.91]

 11-21

[Figure 21]

pred
[7335.7456 7160.5    7008.125  6778.8853 6570.6484 6347.791  6231.724
 6143.876  6145.4414 6211.49   6343.6177 6595.224  6878.6084 7233.2134
 7594.5986 7895.8594 8212.867  8482.55   8653.988  8810.846  8902.306
 8922.246  8942.578  8907.583  8874.332  8876.025  8836.406  8817.289
 8785.119  8764.53   8757.     8757.213  8775.332  8762.126  8826.847
 8838.953  8965.05   9024.0205 8996.978  8893.513  8795.878  8592.542
 8351.354  8233.171  8034.6646 8053.8403 7903.154  7786.9644]
true
[7090.83 6986.06 6760.39 6527.12 6294.61 6139.96 6051.96 6005.29 5980.31
 6036.   6021.97 6129.97 6259.3  6515.63 6865.08 7278.56 7571.23 7880.97
 8005.78 8125.06 8128.52 8082.69 8052.26 8077.55 8058.23 7995.52 7957.18
 7926.25 7888.81 7942.18 7995.95 8030.31 8015.66 7924.17 8007.57 8083.29
 8066.84 8028.86 8022.77 8140.54 8081.94 7891.74 7735.92 7670.76 7536.78
 7442.11 7272.63 7106.86]

11-22

[Figure 22]

pred
[7326.1045 7127.653  7001.1797 6760.385  6587.647  6357.289  6190.7124
 6182.9023 6210.8623 6298.155  6419.5874 6756.05   7086.5205 7490.285
 7928.6377 8116.4995 8491.8    8780.574  8960.463  9121.699  9203.788
 9246.216  9218.818  9276.804  9183.858  9252.85   9255.539  9259.722
 9255.282  9210.178  9192.959  9289.753  9175.495  9242.834  9230.407
 9194.7    9254.693  9265.059  9271.058  9078.975  8944.088  8751.835
 8494.521  8412.966  8142.4116 8209.342  8005.757  7942.297 ]
true
[6965.14 6852.4  6719.32 6534.4  6381.95 6278.91 6218.12 6246.06 6341.62
 6626.39 6874.83 7428.62 7990.92 8429.33 8758.04 9021.75 9174.34 9379.13
 9501.78 9662.41 9684.4  9700.28 9750.52 9687.44 9643.01 9673.54 9754.38
 9766.75 9736.53 9736.18 9713.49 9726.78 9691.35 9706.76 9565.39 9381.76
 9175.94 8997.81 8910.26 8961.54 8881.13 8608.68 8331.93 8326.58 8210.31
 8095.65 7990.08 7809.96]

11-23

219

[Figure 23]

pred
[7920.0327 7727.453  7498.005  7185.2114 6878.8813 6647.562  6509.2666
 6460.4077 6487.224  6645.835  6849.7173 7306.449  7780.017  8165.4346
 8575.308  8824.612  9033.209  9284.637  9409.561  9526.89   9556.26
 9569.138  9598.808  9577.252  9565.444  9579.406  9571.225  9572.5205
 9553.926  9536.858  9558.368  9571.38   9585.237  9507.746  9473.623
 9326.139  9278.574  9203.591  9185.488  9143.21   9029.894  8775.27
 8515.4    8487.965  8315.512  8309.635  8150.843  7995.8057]
true
[7676.74 7488.34 7284.98 6955.98 6695.74 6565.37 6467.37 6419.44 6535.9
 6693.26 6915.04 7480.59 8073.68 8496.22 8859.66 9040.25 9218.58 9422.82
 9549.47 9631.52 9691.09 9694.25 9714.88 9721.51 9793.06 9760.63 9741.99
 9808.03 9826.22 9778.31 9820.92 9880.44 9872.48 9783.38 9720.19 9433.17
 9303.74 9061.45 9019.82 9066.22 8970.96 8717.53 8420.32 8422.73 8254.52
 8194.7  8091.06 7880.16]

 11-24

345

[Figure 24]

pred
[8004.741  7836.458  7603.202  7314.055  7014.709  6749.5996 6595.0283
 6525.7344 6519.7524 6669.235  6852.7886 7258.991  7708.597  8117.655
 8555.022  8792.569  9031.577  9288.715  9415.42   9547.265  9597.754
 9636.981  9641.2705 9629.287  9617.367  9627.957  9622.893  9613.755
 9593.704  9573.136  9595.104  9597.0205 9601.174  9541.243  9515.239
 9390.875  9377.776  9336.862  9309.265  9246.4795 9109.968  8886.664
 8630.598  8561.573  8355.295  8379.891  8237.742  8093.9297]
true
[ 7719.93  7555.17  7363.01  7047.69  6746.96  6545.98  6463.73  6456.23
  6490.03  6737.62  6967.2   7567.01  8137.38  8556.23  8966.92  9197.37
  9357.05  9614.87  9787.37  9877.89  9929.24  9991.64 10069.66 10115.05
 10063.61 10120.29 10125.07 10195.46 10183.76 10173.81 10198.73 10228.02
 10180.88 10039.71  9911.18  9599.43  9435.21  9241.65  9194.97  9289.21
  9153.82  8898.44  8589.48  8558.74  8384.7   8342.53  8145.04  7909.66]

11-27

 288

[Figure 25]

pred
[7978.2227 7762.1753 7535.09   7190.468  6873.919  6672.794  6571.751
 6531.8135 6578.506  6758.31   6998.714  7524.4707 8029.8286 8400.463
 8815.995  9061.025  9268.963  9531.245  9665.14   9779.905  9835.957
 9847.31   9885.594  9884.188  9881.698  9909.587  9928.226  9932.07
 9915.323  9912.073  9937.959  9955.917  9946.1455 9853.647  9794.922
 9575.495  9459.749  9313.353  9293.79   9295.67   9187.785  8927.393
 8635.85   8632.782  8445.346  8413.299  8259.434  8080.411 ]
true
[7775.14 7621.58 7352.58 7055.24 6785.56 6631.69 6515.74 6447.65 6450.86
 6504.6  6583.8  6820.93 7074.62 7480.55 7848.92 8329.09 8608.05 8961.58
 9017.24 9193.34 9212.99 9222.8  9235.72 9268.29 9166.05 9208.1  9229.47
 9191.43 9211.02 9105.96 9056.33 9102.75 9107.86 8917.   8822.83 8712.19
 8554.41 8419.46 8377.18 8537.98 8434.52 8251.53 8073.58 8015.31 7965.3
 7921.39 7696.89 7437.95]

 11-28

165

[Figure 26]

pred
[7622.089  7433.514  7227.8193 6961.206  6706.786  6493.731  6359.1245
 6290.1704 6273.657  6345.671  6424.8203 6650.1357 6900.9785 7216.72
 7585.784  7916.0874 8222.613  8535.923  8728.1875 8892.483  8960.711
 8996.679  9024.557  9025.254  9023.9    9029.322  9012.389  9007.668
 8991.39   8992.334  9005.018  9012.712  9023.311  8988.801  9014.838
 8950.001  8952.094  8919.148  8881.271  8848.96   8795.48   8595.546
 8356.173  8273.891  8101.968  8070.0713 7900.942  7747.8354]
true
[7239.79 7042.03 6899.01 6665.75 6421.44 6247.71 6241.86 6191.66 6158.63
 6208.94 6166.45 6243.24 6440.46 6751.08 7064.03 7437.11 7727.5  7984.53
 8152.66 8279.98 8321.88 8311.75 8375.56 8283.17 8275.1  8161.08 8094.9
 8105.09 8096.23 8120.49 8081.84 8109.79 8197.5  8205.53 8266.91 8327.65
 8263.45 8156.93 8184.06 8324.47 8254.33 8071.07 7900.53 7789.66 7572.89
 7470.29 7255.79 7066.29]

 11-29

128

[Figure 27]

pred
[7354.361  7192.362  7026.5654 6798.1035 6588.299  6385.22   6259.58
 6211.2954 6221.9624 6327.96   6477.664  6815.372  7176.6865 7559.209
 7974.569  8252.226  8548.225  8815.762  8991.436  9139.384  9207.862
 9247.233  9289.931  9257.233  9256.092  9270.941  9253.998  9245.9795
 9226.661  9231.085  9231.876  9241.795  9247.777  9219.032  9247.188
 9173.0205 9248.124  9260.431  9225.302  9114.342  8992.472  8779.633
 8508.688  8398.28   8183.7065 8188.5264 8042.2544 7928.874 ]
true
[6992.16 6843.24 6716.09 6553.76 6352.7  6281.24 6249.09 6297.8  6281.89
 6527.21 6776.26 7331.34 7877.59 8387.56 8791.79 9068.34 9195.86 9357.81
 9482.36 9557.18 9506.76 9494.25 9547.87 9446.35 9366.44 9367.7  9388.01
 9300.02 9268.14 9263.74 9228.2  9271.1  9342.75 9400.46 9359.57 9250.46
 9093.62 8921.77 8819.34 8845.23 8627.71 8368.82 8059.55 8030.84 7823.82
 7809.72 7743.59 7599.99]

 11-30

148

[Figure 28]

pred
[7841.304  7647.1177 7420.666  7113.2344 6810.6753 6582.8794 6456.209
 6397.4062 6414.656  6564.199  6744.2104 7152.8975 7581.1265 7936.505
 8336.705  8600.568  8823.452  9084.268  9217.222  9339.518  9379.849
 9387.69   9405.911  9389.394  9383.671  9394.063  9385.995  9378.815
 9359.981  9353.944  9380.819  9390.11   9409.569  9350.326  9320.608
 9178.522  9126.161  9050.496  9030.646  9005.092  8924.07   8697.836
 8448.34   8419.407  8254.421  8237.489  8079.025  7921.785 ]
true
[7486.74 7365.91 7196.1  6904.89 6589.38 6427.24 6344.64 6309.51 6446.28
 6672.93 6930.36 7518.19 8137.44 8502.02 8907.83 9053.67 9127.21 9290.15
 9398.54 9471.39 9518.63 9512.94 9475.56 9524.75 9459.6  9450.12 9433.43
 9424.38 9422.29 9362.09 9414.97 9402.16 9475.24 9436.82 9434.39 9152.26
 9108.13 8907.29 8900.34 8932.27 8759.71 8487.   8211.04 8264.15 8009.86
 8072.83 7969.69 7781.7 ]
