Loss Functions
1. Classification loss functions
1) Cross-entropy loss (CrossEntropyLoss), 2) Binary cross-entropy loss (BCELoss)
2. Regression loss functions
1) MAE, mean absolute error loss. 2) MSE, mean squared error loss 3) Smooth L1 loss
import torch
# CrossEntropyLoss takes raw logits and class-index targets
y_true = torch.tensor([1, 2])  # class index for each sample
y_pred = torch.tensor([[0.1, 0.7, 0.2], [0.1, 0.3, 0.6]])
loss_fn = torch.nn.CrossEntropyLoss()
loss = loss_fn(y_pred, y_true)
print(loss, 'cross-entropy loss')
tensor(0.8106) cross-entropy loss
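Under the hood, `CrossEntropyLoss` is equivalent to applying `log_softmax` to the logits and then computing the negative log-likelihood of the target class. A small sketch verifying this decomposition on the same inputs:

```python
import torch

logits = torch.tensor([[0.1, 0.7, 0.2], [0.1, 0.3, 0.6]])
targets = torch.tensor([1, 2])

# CrossEntropyLoss == LogSoftmax followed by NLLLoss
log_probs = torch.log_softmax(logits, dim=1)
manual = torch.nn.NLLLoss()(log_probs, targets)
builtin = torch.nn.CrossEntropyLoss()(logits, targets)
print(torch.allclose(manual, builtin))  # True
```

Because the softmax is folded into the loss, the model's last layer should output raw scores, not probabilities.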
# BCELoss expects probabilities in [0, 1] (e.g. sigmoid outputs)
y_true = torch.tensor([0, 1, 0], dtype=torch.float32)
y_pred = torch.tensor([0.1, 0.7, 0.2], requires_grad=True, dtype=torch.float32)
loss_fn = torch.nn.BCELoss()
loss = loss_fn(y_pred, y_true)
print(loss, 'binary cross-entropy loss')
tensor(0.2284, grad_fn=<BinaryCrossEntropyBackward0>) binary cross-entropy loss
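In practice, `BCEWithLogitsLoss` is usually preferred over `BCELoss`: it fuses the sigmoid into the loss for better numerical stability and takes raw scores directly. A minimal sketch, with logits chosen here so their sigmoids roughly match the probabilities above:

```python
import torch

# logits whose sigmoids are approximately [0.1, 0.7, 0.2]
logits = torch.tensor([-2.197, 0.847, -1.386])
y_true = torch.tensor([0., 1., 0.])

loss_fused = torch.nn.BCEWithLogitsLoss()(logits, y_true)
loss_split = torch.nn.BCELoss()(torch.sigmoid(logits), y_true)
print(torch.allclose(loss_fused, loss_split, atol=1e-6))  # True
```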
y_true = torch.tensor([1, 1, 1], dtype=torch.float32)
y_pred = torch.tensor([1.2, 1.7, 1.9], requires_grad=True)
loss_fn = torch.nn.L1Loss()
loss = loss_fn(y_pred, y_true)
print(loss, 'MAE, mean absolute error')
tensor(0.6000, grad_fn=<MeanBackward0>) MAE, mean absolute error
y_true = torch.tensor([1, 1, 1], dtype=torch.float32)
y_pred = torch.tensor([1.2, 1.7, 1.9], requires_grad=True)
loss_fn = torch.nn.MSELoss()
loss = loss_fn(y_pred, y_true)
print(loss, 'MSE, mean squared error')
tensor(0.4467, grad_fn=<MseLossBackward0>) MSE, mean squared error
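Both regression losses reduce to one-line tensor expressions, so the printed values can be checked by hand. On the same data, MSE weights the large error (0.9) much more heavily than MAE does, which is why it is more sensitive to outliers:

```python
import torch

y_true = torch.tensor([1., 1., 1.])
y_pred = torch.tensor([1.2, 1.7, 1.9])

err = y_pred - y_true
mae = err.abs().mean()    # (0.2 + 0.7 + 0.9) / 3 = 0.6
mse = (err ** 2).mean()   # (0.04 + 0.49 + 0.81) / 3 ≈ 0.4467
print(mae.item(), mse.item())
```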
y_true = torch.tensor([1, 1, 1], dtype=torch.float32)
y_pred = torch.tensor([1.2, 1.7, 1.9], requires_grad=True)
loss_fn = torch.nn.SmoothL1Loss()
loss = loss_fn(y_pred, y_true)
print(loss, 'Smooth L1 loss')
tensor(0.2233, grad_fn=<SmoothL1LossBackward0>) Smooth L1 loss
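Smooth L1 combines the two regression losses: it is quadratic (MSE-like) for errors smaller than the `beta` threshold and linear (MAE-like) beyond it, so large errors are penalized less harshly than with MSE. A small sketch showing both regimes with the default `beta=1.0`:

```python
import torch

loss_fn = torch.nn.SmoothL1Loss(beta=1.0)

# |error| < beta: quadratic branch, 0.5 * error**2 / beta
small = loss_fn(torch.tensor([0.5]), torch.tensor([0.0]))  # 0.5 * 0.5**2 = 0.125
# |error| >= beta: linear branch, |error| - 0.5 * beta
large = loss_fn(torch.tensor([3.0]), torch.tensor([0.0]))  # 3.0 - 0.5 = 2.5
print(small.item(), large.item())
```

In the example above, all three errors (0.2, 0.7, 0.9) fall in the quadratic regime, which is why the result 0.2233 is exactly half the MSE value.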