Source: wikibot/cross-entropy
= Cross-entropy
{wiki=Cross-entropy}
Cross-entropy is a measure from information theory that quantifies the difference between two probability distributions: a true distribution p and an approximating distribution q. For discrete distributions it is defined as H(p, q) = -Σ_x p(x) log q(x). It is commonly used in machine learning as a loss function for classification models, notably neural networks, where it measures how well the predicted class probabilities match the true labels.
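As a minimal sketch of the definition above, the following Python function computes H(p, q) = -Σ_x p(x) log q(x) for two discrete distributions given as lists of probabilities (the function name and the example values are illustrative, not from the source):

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum over x of p(x) * log(q(x)), in nats (natural log).
    # Terms with p(x) = 0 contribute nothing, so they are skipped,
    # which also avoids log(0) when q(x) = 0 at those points.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# Classification example: a one-hot true label (class 0) against a
# model's predicted probabilities. The loss reduces to -log q(true class).
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
print(cross_entropy(p, q))  # -log(0.7) ≈ 0.357
```

With a one-hot p, only the true class's predicted probability matters, which is why this loss is often called negative log-likelihood in the classification setting.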