PyTorch provides the torchvision package, which ships common datasets such as MNIST and CIFAR-10. Example code:
import torchvision

# train=True selects the training split, train=False the test split.
# On first use, download=True fetches the data; once it is cached locally, download=False is enough.
train_set = torchvision.datasets.MNIST(root='/Users/l/PycharmProjects/pythonProjectTest/mr_liu', train=True, download=True)
test_set = torchvision.datasets.MNIST(root='/Users/l/PycharmProjects/pythonProjectTest/mr_liu', train=False, download=True)
train_set = torchvision.datasets.CIFAR10(root='/Users/l/PycharmProjects/pythonProjectTest/mr_liu', train=True, download=True)
test_set = torchvision.datasets.CIFAR10(root='/Users/l/PycharmProjects/pythonProjectTest/mr_liu', train=False, download=True)
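A quick sanity check on what was loaded (a minimal sketch; index 0 is just an arbitrary sample, and without a transform the datasets return PIL images):
# Each torchvision dataset behaves like a sequence of (sample, label) pairs.
print(len(train_set))      # number of samples in the split
img, label = train_set[0]  # img is a PIL.Image when no transform is given
print(img.size, label)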
A sigmoid function is bounded, monotonically increasing, and saturating. The most typical example is the logistic function σ(x) = 1 / (1 + e^(-x)), which is why the logistic function is often simply called "the sigmoid function".
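A minimal sketch of the saturation behaviour using torch.sigmoid (the sample points are arbitrary):
import torch

x = torch.tensor([-10.0, -1.0, 0.0, 1.0, 10.0])
print(torch.sigmoid(x))
# ≈ tensor([4.5398e-05, 2.6894e-01, 5.0000e-01, 7.3106e-01, 9.9995e-01])
# outputs stay in (0, 1) and flatten out (saturate) as |x| grows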
Logistic regression model code:
import torch
import matplotlib.pyplot as plt

# prepare dataset
x_data = torch.Tensor([[1.0], [2.0], [3.0]])
y_data = torch.Tensor([[0], [0], [1]])
# design model using class
class LogisticRegressionModel(torch.nn.Module):
    def __init__(self):
        super(LogisticRegressionModel, self).__init__()
        self.linear = torch.nn.Linear(1, 1)  # one input feature -> one output logit

    def forward(self, x):
        # torch.sigmoid replaces the deprecated F.sigmoid; it maps the logit into (0, 1)
        y_pred = torch.sigmoid(self.linear(x))
        return y_pred
model = LogisticRegressionModel()
# construct loss and optimizer
# By default the loss is averaged over elements; with reduction='sum' it is accumulated instead
# (size_average=False was the old, now-deprecated spelling of the same option).
criterion = torch.nn.BCELoss(reduction='sum')
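# With reduction='sum', BCELoss computes
#   loss = -sum_n [ y_n * log(y_pred_n) + (1 - y_n) * log(1 - y_pred_n) ],
# i.e. the binary cross-entropy accumulated over the mini-batch.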
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# training cycle forward, backward, update
num_epochs = 1000
losses = []
for epoch in range(num_epochs):
    y_pred = model(x_data)            # forward pass
    loss = criterion(y_pred, y_data)
    print(epoch, loss.item())
    losses.append(loss.item())

    optimizer.zero_grad()             # clear gradients accumulated from the previous step
    loss.backward()                   # backward pass
    optimizer.step()                  # update w and b
print('w = ', model.linear.weight.item())
print('b = ', model.linear.bias.item())
x_test = torch.Tensor([[4.0]])
y_test = model(x_test)
print('y_pred = ', y_test.item())  # predicted probability that y = 1
plt.plot(range(num_epochs), losses)
plt.ylabel('Loss')
plt.xlabel('epoch')
plt.show()
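To turn the predicted probability into a hard class label, a common choice (not part of the original code) is to threshold at 0.5:
with torch.no_grad():  # no gradients needed for inference
    prob = model(torch.Tensor([[4.0]])).item()
predicted_class = 1 if prob > 0.5 else 0
print(prob, predicted_class)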