How to update a parameter by hand during training without disturbing autograd in PyTorch?

Frontend · Unresolved · 0 replies · 950 views
轻奢々 2021-01-15 00:46

Here is my code,

import torch

class model(torch.nn.Module):
    def __init__(self, config):
        super(model, self).__init__()
        self.memory_matrix = torch.zeros(  # the post is truncated here
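The post is cut off before the question body, but the title together with the `memory_matrix` setup points at a common pattern: register the hand-updated tensor as a buffer (so the optimizer never sees it) and mutate it in place under `torch.no_grad()` (so autograd never records the update). A minimal sketch, assuming hypothetical sizes and a hypothetical `Model`/`proj` in place of the truncated original:

```python
import torch

class Model(torch.nn.Module):
    def __init__(self, mem_slots=8, mem_dim=4):  # sizes are illustrative
        super().__init__()
        # register_buffer keeps the tensor on the module (it moves with
        # .to()/.cuda() and is saved in state_dict) but excludes it from
        # model.parameters(), so no optimizer step will touch it
        self.register_buffer("memory_matrix", torch.zeros(mem_slots, mem_dim))
        self.proj = torch.nn.Linear(mem_dim, mem_dim)

    def forward(self, x):
        # reading the buffer is fine: it has requires_grad=False,
        # so it contributes no gradient path
        out = self.proj(x) + self.memory_matrix.mean(dim=0)
        # hand-written update, done outside the autograd graph
        with torch.no_grad():
            self.memory_matrix.mul_(0.9).add_(0.1 * x.mean(dim=0))
        return out

model = Model()
x = torch.randn(2, 4)
loss = model(x).sum()
loss.backward()  # the in-place buffer update is invisible to autograd
```

If the tensor must also be learned by the optimizer and only occasionally overwritten by hand, the same `torch.no_grad()` block with an in-place `copy_()` on the `nn.Parameter` works: the assignment is not recorded in the graph, and subsequent forward passes use the new value.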


        