Simple Linear Regression with Keras

The full code, for reference:

import keras
import matplotlib.pyplot as plt
import numpy as np
from keras.layers import Dense
from keras.models import Sequential

print("keras version=>", keras.__version__)
# keras version=> 2.3.1

########### Generate data ###########
np.random.seed(0)

trainx = np.linspace(-0.6, 0.6, 1000)
noise = np.random.normal(0, 0.0002, trainx.shape)
trainy = np.multiply(trainx, 3) + 0.75 + noise

validx = np.random.random((200,))
validy = np.multiply(validx, 3) + 0.75

plt.scatter(trainx, trainy)
plt.show()

print("input data shape: trainx=>{},trainy=>{},validx=>{},validy=>{}".
      format(trainx.shape, trainy.shape, validx.shape, validy.shape))
# input data shape: trainx=>(1000,),trainy=>(1000,),validx=>(200,),validy=>(200,)
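Before handing the data to Keras, it is worth sanity-checking it with NumPy's closed-form least-squares solution (a quick sketch, not part of the Keras workflow; variable names mirror the script above):

```python
import numpy as np

np.random.seed(0)
trainx = np.linspace(-0.6, 0.6, 1000)
noise = np.random.normal(0, 0.0002, trainx.shape)
trainy = np.multiply(trainx, 3) + 0.75 + noise

# Stack a column of ones so the bias is solved for alongside the slope
A = np.stack([trainx, np.ones_like(trainx)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, trainy, rcond=None)
print('w=', w, 'b=', b)  # both should be very close to 3 and 0.75
```

This gives the target the trained network should converge to.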

# Define the model
model = Sequential()
model.add(Dense(units=1, input_dim=1, use_bias=True))
print(model.summary())

# Model: "sequential_1"
# _________________________________________________________________
# Layer (type)                 Output Shape              Param #
# =================================================================
# dense_1 (Dense)              (None, 1)                 2
# =================================================================
# Total params: 2
# Trainable params: 2
# Non-trainable params: 0
# _________________________________________________________________
# None
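The two parameters in the summary follow directly from the layer shapes: a Dense layer with input_dim=1 and units=1 has a (1, 1) kernel plus a (1,) bias. A quick count:

```python
# Dense parameter count: kernel (input_dim x units) + bias (units)
input_dim, units = 1, 1
kernel_params = input_dim * units  # 1
bias_params = units                # 1, since use_bias=True
total_params = kernel_params + bias_params
print(total_params)  # matches "Total params: 2" in the summary
```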

# Compile and train the model
model.compile(loss='mse', optimizer='sgd')
model.fit(trainx, trainy, epochs=100, batch_size=10, validation_data=(validx, validy))
# Train on 1000 samples, validate on 200 samples
# Epoch 1/100
# 1000/1000 [==============================] - 0s 95us/step - loss: 0.6641 - val_loss: 1.4196
# Epoch 2/100
# 1000/1000 [==============================] - 0s 63us/step - loss: 0.3261 - val_loss: 0.7654
# Epoch 3/100
# 1000/1000 [==============================] - 0s 62us/step - loss: 0.1998 - val_loss: 0.4592
# Epoch 4/100
# 1000/1000 [==============================] - 0s 62us/step - loss: 0.1234 - val_loss: 0.2831
# ...
# ...
# Epoch 96/100
# 1000/1000 [==============================] - 0s 62us/step - loss: 3.9039e-08 - val_loss: 8.0054e-11
# Epoch 97/100
# 1000/1000 [==============================] - 0s 62us/step - loss: 3.9039e-08 - val_loss: 2.0959e-10
# Epoch 98/100
# 1000/1000 [==============================] - 0s 59us/step - loss: 3.9045e-08 - val_loss: 8.2652e-11
# Epoch 99/100
# 1000/1000 [==============================] - 0s 47us/step - loss: 3.9048e-08 - val_loss: 9.7926e-11
# Epoch 100/100
# 1000/1000 [==============================] - 0s 47us/step - loss: 3.9051e-08 - val_loss: 2.0124e-10

W, b = model.layers[0].get_weights()
print('W=', W, 'b=', b)
# W= [[2.9999964]] b= [0.7499877]
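What `optimizer='sgd'` with `loss='mse'` does here can be reproduced in plain NumPy: mini-batch gradient descent on the mean squared error. The sketch below assumes a learning rate of 0.01 (the default of Keras' SGD in this version) and processes batches in order without shuffling, so it is an approximation of the Keras run, yet it recovers essentially the same W and b:

```python
import numpy as np

np.random.seed(0)
x = np.linspace(-0.6, 0.6, 1000)
y = np.multiply(x, 3) + 0.75 + np.random.normal(0, 0.0002, x.shape)

w, b = 0.0, 0.0
lr = 0.01                                 # assumed: Keras SGD default learning rate
for epoch in range(100):                  # epochs=100, as in model.fit above
    for i in range(0, len(x), 10):        # batch_size=10 (no shuffling in this sketch)
        xb, yb = x[i:i+10], y[i:i+10]
        err = (w * xb + b) - yb           # prediction error on the batch
        w -= lr * 2 * np.mean(err * xb)   # d(mse)/dw
        b -= lr * 2 * np.mean(err)        # d(mse)/db

print('w=', w, 'b=', b)  # close to W=3, b=0.75 found by Keras
```

This makes it clear that the single Dense unit is nothing more than y = W*x + b fitted by gradient descent.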



