Python
SemenAnnigilator, 2021-10-17 20:59:29

How do I implement a custom loss function in PyTorch?

Here is the code for the loss function (log_loss is the loss from sklearn). My problem is with backward: I can't figure out how to implement it (I have already tried doing it through an autograd Function).

import torch
from sklearn.metrics import log_loss
from torch.autograd.function import Function

class custom_loss:  # (Function):
    def __init__(self, preds, targets):
        self.preds = preds
        self.targets = targets

    # @staticmethod
    def forward(self):
        log_loss_ = 0
        # self.preds.save_for_backward(self.targets)

        # average sklearn's log_loss over the 10 target columns
        for i in range(10):
            log_loss_ += log_loss(self.targets[:, i], self.preds[:, i])

        # wrapping the result in a new tensor detaches it from the graph,
        # so requires_grad=True alone does not give a usable backward
        return torch.tensor(log_loss_ / 10, requires_grad=True)
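
For reference, a minimal sketch of a version that needs no hand-written backward at all, assuming preds are probabilities in (0, 1) and targets are 0/1 labels of shape (batch, 10) (the name custom_log_loss is mine): the loss is written entirely with torch operations, so autograd derives the gradient automatically.

import torch

def custom_log_loss(preds, targets, eps=1e-7):
    # clamp probabilities to avoid log(0), mirroring sklearn's clipping
    preds = preds.clamp(eps, 1 - eps)
    # element-wise binary cross-entropy, averaged over batch and columns
    loss = -(targets * preds.log() + (1 - targets) * (1 - preds).log())
    return loss.mean()

# usage: the returned tensor stays on the autograd graph, so .backward() works
preds = torch.rand(32, 10, requires_grad=True)
targets = torch.randint(0, 2, (32, 10)).float()
custom_log_loss(preds, targets).backward()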

1 answer
xandox, 2021-10-19

Why not just use the built-in one? https://pytorch.org/docs/stable/generated/torch.nn...
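
The link in the answer is cut off, so the exact class is an assumption; if the intended built-in loss is, for example, torch.nn.BCELoss, the whole custom class above collapses to a few lines, with backward already provided:

import torch
import torch.nn as nn

criterion = nn.BCELoss()  # expects probabilities and 0/1 float targets
preds = torch.rand(32, 10, requires_grad=True)
targets = torch.randint(0, 2, (32, 10)).float()
loss = criterion(preds, targets)
loss.backward()  # gradients flow without a custom Function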
