Computing softmax activation can be skipped when using nn.CrossEntropyLoss.

Select the best option from the choices below.

a) True

b) False

1 Answer

a) True

nn.CrossEntropyLoss combines LogSoftmax and NLLLoss in a single operation, so the model should output raw logits and no explicit softmax layer is needed before the loss.
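A minimal sketch illustrating the point (tensor shapes and values are illustrative only): passing raw logits to nn.CrossEntropyLoss gives the same result as applying log-softmax explicitly and then NLLLoss.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)            # raw model outputs: batch of 4, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # ground-truth class indices

# Correct usage: pass the logits directly; no softmax layer is applied beforehand.
loss = nn.CrossEntropyLoss()(logits, targets)

# Equivalent computation done explicitly with log-softmax followed by NLLLoss.
manual = nn.NLLLoss()(torch.log_softmax(logits, dim=1), targets)

print(loss.item(), manual.item())     # both print the same value
```

Adding a softmax layer before nn.CrossEntropyLoss would apply the normalization twice and distort the gradients, which is why the softmax step can (and should) be skipped.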
