The following is a feed-forward network that uses the torch.nn.functional module in PyTorch:
    import torch.nn as nn
    import torch.nn.functional as F

    class newNetwork(nn.Module):
        def __init__(self):
            super().__init__()
            # layer sizes are illustrative; the original snippet was truncated
            self.fc1 = nn.Linear(784, 256)
            self.fc2 = nn.Linear(256, 10)

        def forward(self, x):
            # stateless activation applied functionally instead of as a module
            x = F.relu(self.fc1(x))
            return self.fc2(x)
There is no functional difference between the two. The functional form is arguably more concise and easier to write, and the reason "object" (module) versions of pure (i.e. stateless) functions like ReLU and Sigmoid exist is to allow their use in constructs like nn.Sequential.
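A minimal sketch of this equivalence: the module version nn.ReLU() can be placed inside nn.Sequential, while the functional F.relu produces the same result when applied by hand in a forward pass (layer sizes here are arbitrary, chosen for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Module version: the stateless ReLU can be listed as a layer in nn.Sequential
seq = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

x = torch.randn(3, 4)

# Functional equivalent of the same forward pass, reusing seq's weights
h = F.relu(seq[0](x))
y = seq[2](h)

assert torch.allclose(seq(x), y)  # identical outputs
```

Since ReLU holds no parameters, nn.ReLU() and F.relu are interchangeable; the module wrapper exists purely so the activation can participate in module containers like nn.Sequential.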