In PyTorch, computing the softmax activation can be skipped when using nn.CrossEntropyLoss.

Select the best option from below

a) True

b) False

1 Answer

a) True

nn.CrossEntropyLoss combines LogSoftmax and NLLLoss in one operation, so it expects raw, unnormalized logits as input. Applying softmax before the loss is unnecessary (and applying it would give incorrect gradients, since the loss would then take a log-softmax of already-normalized probabilities).
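A minimal sketch illustrating the point: the model's raw logits go straight into nn.CrossEntropyLoss, which matches the equivalent manual log-softmax + NLLLoss computation. The tensor values here are arbitrary examples.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# nn.CrossEntropyLoss applies LogSoftmax + NLLLoss internally,
# so the model should output raw logits, not softmax probabilities.
criterion = nn.CrossEntropyLoss()

logits = torch.tensor([[2.0, 0.5, -1.0],
                       [0.1, 1.5, 0.3]])   # raw scores (arbitrary example values)
targets = torch.tensor([0, 1])             # ground-truth class indices

loss = criterion(logits, targets)

# Equivalent manual computation: log-softmax followed by NLLLoss.
manual = nn.NLLLoss()(F.log_softmax(logits, dim=1), targets)
assert torch.isclose(loss, manual)
```

Because the log-softmax is fused into the loss, it is also more numerically stable than computing softmax separately and taking its log.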