Time Series Prediction with an RNN in Keras

Reference (with modifications): Using an RNN Model in Keras for Time Series Prediction

Data download: click to download

Data description:

The data records the number of bikes available for rent at one station of the Suzhou public bike-sharing system, sampled once per minute.

The data format is:

num weekday hour
0 5 17
1 5 17
1 5 17
2 5 17
4 5 17
5 5 17
5 5 17
5 5 17
4 5 17

Code:

import csv

import numpy as np
from keras.layers import Dense, Dropout, LSTM, Activation
from keras.models import Sequential

def load_data(path):
    with open(path, 'r') as f:
        lines = csv.reader(f, delimiter=',')
        bike = []
        next(lines)  # skip the header row
        for line in lines:
            bike.append(float(line[0]))

    return bike
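To sanity-check the parsing logic in `load_data`, here is a minimal sketch that feeds it the same three-column format (num, weekday, hour) from an in-memory string instead of a file:

```python
import csv
import io

# Hypothetical two-row sample in the same format as bike_rnn.csv.
sample = "num,weekday,hour\n0,5,17\n1,5,17\n"
lines = csv.reader(io.StringIO(sample), delimiter=',')
next(lines)  # skip the header row, as load_data does
bike = [float(line[0]) for line in lines]
print(bike)  # [0.0, 1.0]
```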


# load the data
data = load_data('bike_rnn.csv')

seq_len = 20
result = []
for index in range(len(data) - seq_len):
    result.append(data[index:index + seq_len])
result = np.array(result)

# simple preprocessing: center the data by subtracting the mean
result_mean = result.mean()
result -= result_mean
print(result.shape)
# (45929, 20)
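The loop above builds overlapping sliding windows: every consecutive run of `seq_len` values becomes one row. A minimal sketch on a hypothetical six-value series with `seq_len = 4` shows why the result has `len(data) - seq_len` rows:

```python
import numpy as np

# Hypothetical tiny series and a shorter window for illustration.
data = [0.0, 1.0, 1.0, 2.0, 4.0, 5.0]
seq_len = 4
windows = np.array([data[i:i + seq_len] for i in range(len(data) - seq_len)])
print(windows.shape)  # (2, 4): 6 - 4 = 2 windows of length 4
print(windows)
```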

# train/validation split: 90% for training, 10% for validation
train_size = int(round(result.shape[0] * 0.9))
print(train_size)
# 41336
train = result[:train_size]
np.random.shuffle(train)
trainx = train[:, :-1]
trainy = train[:, -1]
testx = result[train_size:, :-1]
testy = result[train_size:, -1]
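The indexing above turns each window into an (input, target) pair: the first `seq_len - 1` values are the model input and the final value is the label to predict. A sketch with hypothetical windows:

```python
import numpy as np

# Two hypothetical windows of length 4.
windows = np.array([[0.0, 1.0, 1.0, 2.0],
                    [1.0, 1.0, 2.0, 4.0]])
x = windows[:, :-1]  # all but the last value -> model input
y = windows[:, -1]   # last value -> prediction target
print(x.shape, y.shape)  # (2, 3) (2,)
```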

# reshape to the 3-D input format LSTM expects: (sample_size, time_steps, input_dim)
trainx = np.reshape(trainx, (trainx.shape[0], trainx.shape[1], 1))
testx = np.reshape(testx, (testx.shape[0], testx.shape[1], 1))
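Keras LSTM layers consume 3-D tensors of shape (samples, time_steps, features); since each time step here carries a single value, the last axis is 1. A minimal sketch with hypothetical data:

```python
import numpy as np

# 2 hypothetical samples of 3 time steps each, one feature per step.
x = np.array([[0.0, 1.0, 1.0],
              [1.0, 2.0, 4.0]])
x3d = np.reshape(x, (x.shape[0], x.shape[1], 1))
print(x3d.shape)  # (2, 3, 1)
```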

# define the model
model = Sequential()
model.add(LSTM(50, input_shape=(None, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(1))
model.add(Activation('linear'))
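As a rough check on model size, the parameter counts of the layers above can be computed by hand with the standard LSTM formula, 4 * ((input_dim + units) * units + units), plus the usual weights-and-bias count for the Dense layer (a back-of-the-envelope sketch, not Keras output):

```python
# Standard LSTM parameter count: 4 gates, each with input weights,
# recurrent weights, and a bias.
def lstm_params(input_dim, units):
    return 4 * ((input_dim + units) * units + units)

p1 = lstm_params(1, 50)    # first LSTM: input_dim=1, 50 units
p2 = lstm_params(50, 100)  # second LSTM: input_dim=50, 100 units
p3 = 100 * 1 + 1           # Dense(1): 100 weights + 1 bias
print(p1, p2, p3)  # 10400 60400 101
```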

# compile and train the model
model.compile(loss='mse', optimizer='rmsprop')
model.fit(trainx, trainy, epochs=10, batch_size=100, validation_data=(testx, testy))

# Train on 41336 samples, validate on 4593 samples
# Epoch 1/10
# 41336/41336 [==============================] - 18s 444us/step - loss: 3.4548 - val_loss: 0.7029
# Epoch 2/10
# 41336/41336 [==============================] - 20s 485us/step - loss: 1.4621 - val_loss: 0.6527
# Epoch 3/10
# 41336/41336 [==============================] - 23s 546us/step - loss: 1.3706 - val_loss: 0.6739
# Epoch 4/10
# 41336/41336 [==============================] - 23s 546us/step - loss: 1.3354 - val_loss: 0.4942
# Epoch 5/10
# 41336/41336 [==============================] - 21s 511us/step - loss: 1.3052 - val_loss: 0.4900
# Epoch 6/10
# 41336/41336 [==============================] - 21s 518us/step - loss: 1.2928 - val_loss: 0.5053
# Epoch 7/10
# 41336/41336 [==============================] - 19s 471us/step - loss: 1.2629 - val_loss: 0.5168
# Epoch 8/10
# 41336/41336 [==============================] - 21s 504us/step - loss: 1.2635 - val_loss: 0.5228
# Epoch 9/10
# 41336/41336 [==============================] - 18s 446us/step - loss: 1.2602 - val_loss: 0.5764
# Epoch 10/10
# 41336/41336 [==============================] - 20s 486us/step - loss: 1.2583 - val_loss: 0.5141
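Note that the model is trained on centered data, so its predictions are in centered units; to recover actual bike counts, add `result_mean` back. A minimal sketch with hypothetical values (the real `result_mean` comes from the preprocessing step above):

```python
import numpy as np

result_mean = 7.2                      # hypothetical mean from preprocessing
pred_centered = np.array([-2.1, 0.4])  # hypothetical model outputs
pred = pred_centered + result_mean     # back to actual bike counts
print(pred)
```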


