
Getting wrong output while calculating Cross entropy loss using pytorch

Hi guys, I calculated the cross-entropy loss using PyTorch with Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]]) and label = torch.tensor([0, 0]). The output should be 0, but I got tensor(0.5514). Can anyone please explain why it comes out as 0.55 instead of 0? Code for reference:

1 Answer


Yes, you are getting the correct output.

import torch
Input = torch.tensor([[1.,0.0,0.0],[1.,0.0,0.0]])
label = torch.tensor([0, 0])

print(torch.nn.functional.cross_entropy(Input,label))
# tensor(0.5514)

torch.nn.functional.cross_entropy combines log_softmax and nll_loss in a single function.

It is equivalent to:

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)

Code for reference:

print(torch.nn.functional.softmax(Input, 1).log())
# tensor([[-0.5514, -1.5514, -1.5514],
#         [-0.5514, -1.5514, -1.5514]])

print(torch.nn.functional.log_softmax(Input, 1))
# tensor([[-0.5514, -1.5514, -1.5514],
#         [-0.5514, -1.5514, -1.5514]])

print(torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label))
# tensor(0.5514)

Now you can see that:

torch.nn.functional.cross_entropy(Input, label)

is equal to:

torch.nn.functional.nll_loss(torch.nn.functional.log_softmax(Input, 1), label)
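
To see where 0.5514 comes from (and why it is not 0): cross_entropy treats Input as raw logits, not probabilities, so it first applies softmax. A minimal sketch of the arithmetic; the confident tensor below is just a made-up example to show the loss approaching 0:

import torch
import torch.nn.functional as F

Input = torch.tensor([[1., 0., 0.], [1., 0., 0.]])
label = torch.tensor([0, 0])

# cross_entropy softmaxes the logits internally:
# softmax([1, 0, 0]) ≈ [0.5761, 0.2119, 0.2119]
probs = F.softmax(Input, dim=1)
print(probs[0])            # tensor([0.5761, 0.2119, 0.2119])
print(-probs[0, 0].log())  # tensor(0.5514) -> the loss you saw

# The loss only approaches 0 when the logit of the true class dominates,
# e.g. a (hypothetical) strongly confident prediction:
confident = torch.tensor([[100., 0., 0.], [100., 0., 0.]])
print(F.cross_entropy(confident, label))  # tensor(0.)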

  • Thanks for your detailed explanation. One more thing: can we use softmax at the last layer if we use cross entropy for loss calculation?
    – Sukesh Ram
    Commented Jan 30, 2023 at 7:55
  • I think we can... but please double check, I am not sure. Also, if this solution answers your question, please accept it as an answer.
    Commented Jan 30, 2023 at 8:24
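
Regarding the comment above about using softmax at the last layer: since cross_entropy already applies log_softmax internally, adding an explicit softmax before it softmaxes the scores twice and changes the loss, so the usual practice is to feed raw logits. A minimal sketch (reusing the same Input and label from the answer) illustrating the difference:

import torch
import torch.nn.functional as F

Input = torch.tensor([[1., 0., 0.], [1., 0., 0.]])
label = torch.tensor([0, 0])

# Intended usage: raw logits go straight into cross_entropy.
print(F.cross_entropy(Input, label))                    # tensor(0.5514)

# If the model ends with a softmax layer, cross_entropy softmaxes the
# probabilities a second time, so the loss comes out different.
print(F.cross_entropy(F.softmax(Input, dim=1), label))  # a larger value (≈ 0.87)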
