PyTorch: custom loss doesn't backpropagate

Backend · Unanswered · 0 replies · 1165 views
走了就别回头了 2021-01-18 19:36

We wrote a custom loss in PyTorch. The problem is that even if we write:

...
loss = Variable(loss, requires_grad=True)
return loss

the gradient still doesn't backpropagate.
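The likely cause is the `Variable(loss, requires_grad=True)` wrapper itself: constructing a new `Variable` (deprecated since PyTorch 0.4) from an existing tensor detaches it from the autograd graph, so `backward()` has nothing to propagate through. A minimal sketch, using a hypothetical mean-squared-error loss built only from differentiable tensor ops:

```python
import torch

def custom_loss(pred, target):
    # Build the loss from differentiable tensor operations only.
    # Do NOT re-wrap the result (e.g. Variable(loss, requires_grad=True));
    # that creates a new leaf tensor detached from the graph.
    return torch.mean((pred - target) ** 2)

x = torch.randn(4, 3, requires_grad=True)
target = torch.zeros(4, 3)

loss = custom_loss(x, target)
loss.backward()
# x.grad is now populated, because the graph from x to loss is intact
```

If the loss already comes out with `requires_grad=False`, the fix is to trace back and remove whatever detaches the graph (`.item()`, `.detach()`, `.numpy()`, or building the loss from plain Python numbers), not to force `requires_grad=True` afterwards.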
