Cross-entropy is an information-theoretic measure of the difference between two probability distributions. In machine learning it is widely used as a loss function for classification models, especially neural networks, where it compares the predicted class probabilities against the true labels.
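For discrete distributions, the cross-entropy of a predicted distribution $q$ relative to a true distribution $p$ is $H(p, q) = -\sum_{x} p(x)\,\log q(x)$; when $p$ is a one-hot label, this reduces to $-\log q(\text{correct class})$. As a concrete illustration, here is a minimal NumPy sketch of that formula (the function name and the example inputs are illustrative, not from the original text):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) between two discrete distributions.

    p: true distribution (e.g. one-hot labels), shape (n_classes,)
    q: predicted distribution, shape (n_classes,), entries summing to 1
    eps: small constant to avoid log(0)
    """
    q = np.clip(q, eps, 1.0)       # guard against log(0)
    return -np.sum(p * np.log(q))  # H(p, q) = -sum_x p(x) log q(x)

# One-hot true label (class 1) vs. a model's predicted probabilities:
p = np.array([0.0, 1.0, 0.0])
q = np.array([0.1, 0.7, 0.2])
print(cross_entropy(p, q))         # -log(0.7) ~= 0.357
```

A lower value means the predicted distribution places more probability on the true class, which is why minimizing cross-entropy trains a classifier to make confident, correct predictions.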