
Binary Cross Entropy Vs Categorical Cross Entropy With 2 Classes

When considering the problem of classifying an input into one of two classes, 99% of the examples I saw used a NN with a single output and a sigmoid activation, followed by a binary cross-entropy loss.

Solution 1:

If you use softmax on top of a two-output network, you get an output that is mathematically equivalent to using a single output with a sigmoid on top. Do the math and you'll see.
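A quick numerical sketch of that equivalence (the logit values here are arbitrary, chosen just for illustration): the softmax probability of class 1 in a two-logit network equals the sigmoid of the difference of the two logits.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - np.max(z))  # shift for numerical stability
    return e / e.sum()

# Arbitrary logits produced by a two-output network.
z1, z2 = 1.7, -0.4

# Probability of class 1 under the two-output softmax...
p_two = softmax(np.array([z1, z2]))[0]

# ...equals the sigmoid of the logit difference, i.e. what a
# single-output network with logit (z1 - z2) would produce.
p_one = sigmoid(z1 - z2)

print(p_two, p_one)  # identical up to floating-point error
```

Algebraically: softmax([z1, z2])[0] = e^{z1} / (e^{z1} + e^{z2}) = 1 / (1 + e^{-(z1 - z2)}) = sigmoid(z1 - z2).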

In practice, from my experience, if you look at the raw "logits" of the two-output net (before softmax), you'll see that one is exactly the negative of the other. This is a result of the gradients pulling each neuron in exactly the opposite direction.

Therefore, since both approaches are equivalent, and the single-output configuration has fewer parameters and requires fewer computations, it is more advantageous to use a single output with a sigmoid on top.
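To make the parameter-count difference concrete, here is a minimal sketch assuming a hypothetical final hidden layer of 128 units feeding the output layer:

```python
# Hypothetical size of the last hidden layer (assumption for illustration).
hidden = 128

# Single sigmoid output: one weight vector plus one bias.
params_single = hidden * 1 + 1

# Two-output softmax head: two weight vectors plus two biases.
params_double = hidden * 2 + 2

print(params_single, params_double)  # the softmax head doubles the output-layer parameters
```

The saving is only in the output layer, so it is small in absolute terms, but it comes for free given the mathematical equivalence.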
