= Entropy
{wiki}
OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?
The original notion of entropy, and the first one you should study, is the <Clausius entropy>.
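As a quick numerical anchor, the Clausius definition relates an infinitesimal entropy change to reversible heat flow at temperature $T$:
$$
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
$$
E.g. melting 1 kg of ice at 273.15 K with a latent heat of fusion of about 334 kJ/kg gives $\Delta S \approx 334000 / 273.15 \approx 1223 \, \mathrm{J/K}$.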
For entropy in chemistry see: <entropy of a chemical reaction>.
\Video[https://www.youtube.com/watch?v=0-yhZFDxBh8]
{title=The Unexpected Side of Entropy by Daan Frenkel}
{description=2021.}
\Video[https://www.youtube.com/watch?v=rBPPOI5UIe0]
{title=The Biggest Ideas in the Universe | 20. Entropy and Information by <Sean Carroll> (2020)}
{description=In the usual <Sean Carroll> fashion, it glosses over the subject, but this one might be worth watching. It mentions four possible definitions of entropy: Boltzmann, Gibbs, Shannon (<information theory>) and <John von Neumann> (<quantum mechanics>).}
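To make the relation between those definitions concrete, here is a minimal Python sketch (function names are just illustrative, not from any library) checking that the Gibbs entropy $-k_B \sum_i p_i \ln p_i$ reduces to the Boltzmann entropy $k_B \ln \Omega$ when all $\Omega$ microstates are equally likely, and that the Shannon entropy is the same sum in bits, without the $k_B$ prefactor:
``
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_entropy(p):
    # Shannon entropy in bits of a discrete probability distribution.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    # Gibbs entropy S = -k_B * sum(p_i * ln(p_i)), in J/K.
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

def boltzmann_entropy(omega):
    # Boltzmann entropy S = k_B * ln(Omega) for Omega equally likely microstates.
    return k_B * math.log(omega)

omega = 8
uniform = [1 / omega] * omega
print(shannon_entropy(uniform))    # 3.0 bits
print(gibbs_entropy(uniform))      # ~2.871e-23 J/K
print(boltzmann_entropy(omega))    # same as Gibbs for the uniform case: k_B * ln(8)
``
The von Neumann entropy $-\mathrm{Tr}(\rho \ln \rho)$ is the quantum analogue of the Gibbs formula, with the probability distribution replaced by a <density matrix>.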
* https://www.quantamagazine.org/what-is-entropy-a-measure-of-just-how-little-we-really-know-20241213/ "What Is Entropy? A Measure of Just How Little We Really Know" on <Quanta Magazine> attempts to make the point that entropy is observer-dependent. TODO details on that.