How to implement a custom loss function in PyTorch?
Here is the code for the loss function (log_loss is the loss from sklearn). I have a problem with backward: I can't figure out how to implement it (I already tried doing it through an autograd Function).
from torch.autograd.function import Function
from sklearn.metrics import log_loss
import torch

class custom_loss:#(Function):
    def __init__(self, preds, targets):
        self.preds = preds
        self.targets = targets
    #@staticmethod
    def forward(self):
        log_loss_ = 0
        #self.preds.save_for_backward(self.targets)
        #gt = np.array(gt)
        for i in range(10):
            # accumulate sklearn's log_loss over the 10 target columns
            log_loss_ += log_loss(self.targets[:, i], self.preds[:, i])
        return torch.tensor(log_loss_ / 10, requires_grad=True)
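A minimal sketch of one way this is often handled, assuming preds are probabilities in (0, 1) and targets are 0/1 labels, both of shape (N, 10) (the function name torch_log_loss and the eps clamp value are assumptions, not part of the original code): if the log loss is written with torch operations instead of sklearn's log_loss, autograd keeps the computation graph and computes the backward pass itself, so no manual backward is needed.

import torch

def torch_log_loss(preds, targets, eps=1e-7):
    # Column-wise binary log loss built from torch ops, so autograd
    # tracks the graph and backward() works automatically.
    preds = preds.clamp(eps, 1 - eps)  # avoid log(0)
    per_column = -(targets * preds.log() + (1 - targets) * (1 - preds).log()).mean(dim=0)
    return per_column.mean()  # average over the 10 columns, like log_loss_ / 10

# Usage (hypothetical model output):
# preds = torch.sigmoid(model(x))
# loss = torch_log_loss(preds, targets)
# loss.backward()

For comparison, wrapping the sklearn result in torch.tensor(..., requires_grad=True) creates a new leaf tensor that is disconnected from preds, which is why no gradient flows back to the model in the original version.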