Source: /cirosantilli/entropy

= Entropy
{wiki}

OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?

The original notion of entropy, and the first one you should study, is the <Clausius entropy>.

For entropy in chemistry see: <entropy of a chemical reaction>{child}.

* https://www.youtube.com/watch?v=0-yhZFDxBh8 The Unexpected Side of Entropy by Daan Frenkel (2021)

\Video[https://www.youtube.com/watch?v=rBPPOI5UIe0]
{title=The Biggest Ideas in the Universe | 20. Entropy and Information by <Sean Carroll> (2020)}
{description=In usual Sean Carroll fashion, it glosses over the subject, but this one might still be worth watching. It mentions four possible definitions of entropy: Boltzmann, Gibbs, Shannon (<information theory>) and <John von Neumann> (<quantum mechanics>).}
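As a tiny numerical taste of two of those definitions (Shannon and Boltzmann; the function names and the coin example are our own illustration, not from the video):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has exactly 1 bit of entropy; a biased coin has less.
fair = shannon_entropy([0.5, 0.5])      # 1.0 bit
biased = shannon_entropy([0.9, 0.1])    # about 0.469 bits
certain = shannon_entropy([1.0])        # 0 bits: no surprise at all

# Boltzmann entropy S = k_B * ln(W) for W equally likely microstates.
K_B = 1.380649e-23  # Boltzmann constant in J/K, exact since the 2019 SI redefinition

def boltzmann_entropy(n_microstates):
    return K_B * math.log(n_microstates)
```

Note how the two formulas have the same shape: for `W` equally likely microstates, each with probability `1/W`, the Shannon sum reduces to `log(W)`, and Boltzmann's formula is just that times `k_B`.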