What does the parameter retain_graph mean in the Variable's backward() method?
Question: I'm going through the neural transfer PyTorch tutorial and am confused about the use of retain_variable (deprecated, now referred to as retain_graph). The code example shows:

```python
import torch.nn as nn

class ContentLoss(nn.Module):

    def __init__(self, target, weight):
        super(ContentLoss, self).__init__()
        # Detach the target so it is treated as a constant, not part of the graph
        self.target = target.detach() * weight
        self.weight = weight
        self.criterion = nn.MSELoss()

    def forward(self, input):
        self.loss = self.criterion(input * self.weight, self.target)
        self.output = input
        return self.output

    def backward(self, retain_variables=True):
        # Backpropagate the content loss, keeping the graph for further passes
        self.loss.backward(retain_variables=retain_variables)
        return self.loss
```
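For context, here is a minimal sketch (not from the tutorial, and using the modern tensor API rather than the old Variable wrapper) of what retain_graph does: by default, PyTorch frees the computation graph after backward() runs, so a second backward() call on the same graph fails unless the first call passed retain_graph=True.

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = (x * 2).sum()

# First backward pass: retain_graph=True keeps the graph's intermediate
# buffers alive so we can backpropagate through it again.
y.backward(retain_graph=True)

# Second backward pass on the same graph. Without retain_graph=True above,
# this would raise a RuntimeError about backwarding through the graph a
# second time.
y.backward()

# Gradients accumulate across the two passes: 2 + 2 = 4 per element.
print(x.grad)
```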