
Getting Started with Deep Learning (Part 1)

I recently came across Keras, a deep learning framework that is very easy to get started with. It is extremely beginner-friendly, truly living up to "deep learning for humans". Below I record the basic steps for building the simplest single-neuron model.

Single-Variable Linear Regression

Data Preparation

  1. Import the required modules (see the snippet after this list)

  2. Randomly generate data with numpy.random
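For reference, the imports assumed throughout this post are roughly the following (a sketch; whether you use the standalone keras package or tensorflow.keras depends on your installation):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import keras
from keras import layers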

x = np.linspace(0, 100, 30)                    # 30 evenly spaced points in [0, 100]
y = 3 * x + 7 + np.random.randint(0, 30, 30) * 6   # y = 3x + 7 plus random integer noise for each point

Scatter Plot
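A quick sketch of how to draw the scatter plot with matplotlib:

plt.scatter(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.show()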

Building the Model

model = keras.Sequential()               # a Sequential (layer-by-layer) model
model.add(layers.Dense(1, input_dim=1))  # one Dense unit: y_pred = w*x + b
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 1)                 2
=================================================================
Total params: 2
Trainable params: 2
Non-trainable params: 0
_________________________________________________________________

Compiling the Model

model.compile(
    optimizer="adam",  # Adam optimizer
    loss="mse"         # mean squared error loss
)

Training the Model

model.fit(x, y, epochs=9000)
Epoch 7524/9000
30/30 [==============================] - 0s 100us/step - loss: 2890.6289
Epoch 7525/9000
30/30 [==============================] - 0s 66us/step - loss: 2890.5745
Epoch 7526/9000
30/30 [==============================] - 0s 66us/step - loss: 2890.5193
Epoch 7527/9000
30/30 [==============================] - 0s 100us/step - loss: 2890.4651

Comparing the Results

plt.scatter(x, y, c='r')         # original data points in red
plt.plot(x, model.predict(x))    # fitted line predicted by the model
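Since the data was generated from y = 3*x + 7, it is also possible to inspect the learned parameters (a minimal sketch; get_weights() returns the kernel and bias of the Dense layer):

w, b = model.get_weights()
print(w[0][0], b[0])  # the slope should be close to 3; the intercept absorbs the 7 plus the mean of the noise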

Multivariable Linear Regression

Reading and Preprocessing the Data with pandas

data = pd.read_csv("./dataset/线性回归.csv")   # load the dataset
x = data[data.columns[1:-1]]                   # feature columns (all but the first and last)
y = data.iloc[:, -1]                           # the last column is the target

Building the Model

model = keras.Sequential()
model.add(layers.Dense(1, input_dim=3))  # y_pred = w1*x1 + w2*x2 + w3*x3 + b
model.summary()
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 1)                 4
=================================================================
Total params: 4
Trainable params: 4
Non-trainable params: 0
_________________________________________________________________

Compiling the Model

model.compile(
    optimizer='adam',
    loss='mse'
)

Training the Model

model.fit(x, y, epochs=2000)
Epoch 1/2000
200/200 [==============================] - 0s 30us/step - loss: 2.8109
Epoch 2/2000
200/200 [==============================] - 0s 35us/step - loss: 2.8055
Epoch 3/2000
200/200 [==============================] - 0s 40us/step - loss: 2.8167
Epoch 4/2000
200/200 [==============================] - 0s 55us/step - loss: 2.8635
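After training, the model can be used for prediction on the same three feature columns (a minimal sketch; the first five rows are used purely for illustration):

pred = model.predict(x.iloc[:5])   # predictions for the first five samples
print(pred.flatten())
print(y.iloc[:5].values)           # the corresponding true targets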