Please implement a linear regression model for me using knet.
import knet as kn
import numpy as np
# Build the dataset
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
y = np.array([[3], [4], [5], [6], [7]])  # shape (5, 1) to match the model's output
# Build the model
model = kn.nn.Sequential(kn.nn.Linear(2, 1))
# Define the loss function and optimizer
loss_fn = kn.mse_loss
optimizer = kn.optim.SGD(model.parameters(), lr=0.01)
# Train the model
for epoch in range(100):
    # Forward pass
    y_pred = model(X)
    # Compute the loss
    loss = loss_fn(y_pred, y)
    # Backward pass and parameter update
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Print the loss every 10 epochs
    if epoch % 10 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, 100, loss.item()))
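The loop above relies on knet's autograd to compute the gradients. As a sanity check, here is a sketch of what the same training procedure computes when the MSE gradients are derived by hand in plain NumPy (no knet required; the data and learning rate mirror the code above, and the exact numbers may differ slightly from a knet run):

```python
import numpy as np

# Same toy data as the knet example above.
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]], dtype=float)
y = np.array([[3], [4], [5], [6], [7]], dtype=float)

# Parameters of a single linear layer: y_hat = X @ w + b.
w = np.zeros((2, 1))
b = 0.0
lr = 0.01  # same learning rate as the SGD optimizer above

for epoch in range(100):
    y_pred = X @ w + b            # forward pass
    err = y_pred - y
    loss = np.mean(err ** 2)      # MSE loss
    # Gradients of the MSE loss with respect to w and b, derived by hand.
    grad_w = 2.0 * (X.T @ err) / len(X)
    grad_b = 2.0 * float(np.mean(err))
    # Plain SGD update (the role of optimizer.step() above).
    w -= lr * grad_w
    b -= lr * grad_b

print('final training loss: {:.4f}'.format(loss))
```

Because the targets are exactly linear in the features, the loss shrinks toward zero as training progresses.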
# Test the model
X_test = np.array([[6, 7], [7, 8], [8, 9], [9, 10], [10, 11]])
y_test = np.array([[8], [9], [10], [11], [12]])  # shape (5, 1) to match the model's output
with kn.no_grad():
    y_pred = model(X_test)
    test_loss = loss_fn(y_pred, y_test)
    print('Test Loss: {:.4f}'.format(test_loss.item()))
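If knet itself is unavailable, the fit can also be verified in closed form: ordinary least squares on the same data has an exact solution, computable with plain NumPy. This is a verification sketch, not part of the knet API:

```python
import numpy as np

# Same training data as above; the targets are exactly linear in the features.
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]], dtype=float)
y = np.array([3, 4, 5, 6, 7], dtype=float)

# Append a bias column of ones and solve the least-squares problem directly.
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])
coef, _, rank, _ = np.linalg.lstsq(X_aug, y, rcond=None)

y_fit = X_aug @ coef
mse = np.mean((y_fit - y) ** 2)
print('coefficients:', coef)
print('training MSE: {:.6f}'.format(mse))
```

The MSE here is essentially zero, since the data are exactly linear; note that the two features are collinear (the second column is always the first plus one), so `lstsq` returns the minimum-norm solution among the many exact fits.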