Cross-entropy comes from the field of information theory. It quantifies the difference between two probability distributions when we want to compare them.
In machine learning, cross-entropy is used to compare the actual and predicted outcomes.
The cross-entropy between the two distributions is defined as

H(A, B) = -\sum_{x} p(x) \log q(x)

In the above equation, the sum runs over every possible value x; p(x) is the probability of x under A, the real-world (actual) distribution, and q(x) is the probability of x under B, the projected (predicted) distribution.
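To make the formula concrete, here is a minimal sketch in Python using only the standard library; the distributions p and q below are made-up example values, not taken from the text above.

import math

def cross_entropy(p, q):
    # H(A, B) = -sum over x of p(x) * log q(x); skip terms where p(x) = 0
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

# Hypothetical discrete distributions over three values of x
p = [0.10, 0.40, 0.50]  # real-world (actual) distribution A
q = [0.80, 0.15, 0.05]  # projected (predicted) distribution B

print(cross_entropy(p, q))  # about 2.28: q matches p poorly
print(cross_entropy(p, p))  # about 0.94: equals the entropy of p itself

Note that H(A, B) is smallest when the predicted distribution matches the actual one, which is why minimizing it pushes predictions toward the real-world distribution.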
Cross-entropy is commonly used as a loss function when optimizing classification models.
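In a classification setting, the actual outcome is typically represented as a one-hot vector over the classes and the prediction is the model's output probabilities (for example, from a softmax). The sketch below averages the cross-entropy over a small batch; the labels and predicted probabilities are hypothetical example values.

import math

def categorical_cross_entropy_loss(y_true, y_pred, eps=1e-12):
    # Average of -sum(t * log(p)) over the batch; eps guards against log(0)
    total = 0.0
    for true_vec, pred_vec in zip(y_true, y_pred):
        total += -sum(t * math.log(max(p, eps)) for t, p in zip(true_vec, pred_vec))
    return total / len(y_true)

# Hypothetical batch: two samples, three classes
y_true = [[0, 1, 0], [1, 0, 0]]              # actual outcomes (one-hot)
y_pred = [[0.1, 0.8, 0.1], [0.6, 0.3, 0.1]]  # predicted probabilities

print(categorical_cross_entropy_loss(y_true, y_pred))  # about 0.37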