PyTorch Framework Learning (4): autograd and Logistic Regression
## Table of Contents

1. The torch.autograd automatic differentiation system
2. Logistic regression

## 1. The torch.autograd automatic differentiation system

`torch.autograd.backward`

Purpose: automatically compute gradients.

- `tensors`: the tensors to differentiate, e.g. a loss
- `retain_graph`: keep the computation graph after the backward pass (needed if you call `backward()` on the same graph more than once)
- `create_graph`: build a graph of the derivative computation itself, which enables higher-order derivatives
- `grad_tensors`: weights for multiple gradients (when several losses contribute to the gradient, this sets the relative weight of each loss)

```python
import torch

w = torch.tensor([1.], requires_grad=True)
x = torch.tensor([2.], requires_grad=True)

a = torch.add(w, x)    # a = w + x = 3
b = torch.add(w, 1)    # b = w + 1 = 2
y0 = torch.mul(a, b)   # y0 = (w + x) * (w + 1)
y1 = torch.add(a, b)   # y1 = (w + x) + (w + 1)

loss = torch.cat([y0, y1], dim=0)      # two "losses" stacked together
grad_tensors = torch.tensor([1., 2.])  # weight y0 by 1 and y1 by 2

loss.backward(gradient=grad_tensors)
print(w.grad)  # tensor([9.]): dy0/dw * 1 + dy1/dw * 2 = 5 * 1 + 2 * 2 = 9
```
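To illustrate the `retain_graph` flag mentioned above: by default PyTorch frees the computation graph after `backward()`, so a second backward pass through the same graph raises an error unless the first call passed `retain_graph=True`. A minimal sketch (the tensor values here are illustrative, not from the original post):

```python
import torch

w = torch.tensor([1.], requires_grad=True)
y = w * w  # y = w^2, so dy/dw = 2w

# Keep the graph alive so we can backpropagate through it again.
y.backward(retain_graph=True)
print(w.grad)  # tensor([2.])

# Gradients accumulate across backward calls, so clear them first.
w.grad.zero_()
y.backward()   # succeeds only because the graph was retained above
print(w.grad)  # tensor([2.])
```

Without `retain_graph=True` on the first call, the second `y.backward()` would fail with a "Trying to backward through the graph a second time" runtime error.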
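Similarly, the `create_graph` option is what makes higher-order derivatives possible: it records the gradient computation itself in a graph, so the gradient can be differentiated again. A small sketch using `torch.autograd.grad` (the function and example values are my own illustration, not from the original post):

```python
import torch

x = torch.tensor([3.], requires_grad=True)
y = x ** 2  # y = x^2

# First derivative: dy/dx = 2x = 6.
# create_graph=True keeps grad1 itself differentiable.
grad1 = torch.autograd.grad(y, x, create_graph=True)[0]
print(grad1)  # tensor([6.], grad_fn=...)

# Second derivative: d(2x)/dx = 2.
grad2 = torch.autograd.grad(grad1, x)[0]
print(grad2)  # tensor([2.])
```

Note that `torch.autograd.grad` returns the gradients directly as a tuple instead of accumulating them into `.grad`, which is convenient for this kind of chained differentiation.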