Considering e.g. Newton's laws of motion, you take a solution that is a function of time x(t), e.g. the positions of many point particles, and if you reverse the speeds of all particles, then x(-t) is also a solution.
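This reversibility can be checked numerically. A minimal sketch, assuming a single particle under constant gravity (a setup chosen here just for simplicity): integrate forward with a time-symmetric scheme, flip the velocity, integrate again for the same number of steps, and we land back at the start.

```python
g = -9.8   # constant acceleration (m/s^2)
dt = 1e-3  # time step (s)
x, v = 0.0, 10.0  # initial position and speed

def step(x, v, dt):
    # Velocity Verlet step; this scheme is time-symmetric,
    # which makes the reversal exact up to floating point noise.
    v_half = v + 0.5 * g * dt
    x = x + v_half * dt
    v = v_half + 0.5 * g * dt
    return x, v

for _ in range(1000):
    x, v = step(x, v, dt)
v = -v  # reverse the speed
for _ in range(1000):
    x, v = step(x, v, dt)
print(abs(x) < 1e-6)  # back at the initial position
```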
Computable number Updated 2025-07-16
Are there only boring examples of taking an uncomputable language and converting it into a number?
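The standard construction can be sketched in code. The idea: encode a language (a set of naturals) as the binary digits of a real number. If the membership test is decidable, every digit, and hence the number, is computable; plugging in an undecidable predicate such as "Turing machine n halts" yields an uncomputable number. The prime example below is an illustrative stand-in for any decidable language.

```python
from fractions import Fraction

def number_of_language(member, n_digits):
    # Sum of member(i) * 2**-(i+1), truncated after n_digits digits.
    return sum(Fraction(1, 2 ** (i + 1)) for i in range(n_digits) if member(i))

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# The "prime constant": membership is decidable, so the number is computable.
x = number_of_language(is_prime, 8)
print(float(x))  # 0.20703125
```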
Mortal matrix problem Updated 2025-07-16
One of the simplest undecidable problems to state.
The reason it is undecidable is that each matrix can be repeated any number of times in the product, so there isn't a finite number of possibilities to check.
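The semi-decidable side of the problem is easy to sketch: brute-force all products of the given matrices (with repetition) up to a fixed word length and check for the zero matrix. The matrices below are made-up test data; the point of undecidability is precisely that no finite bound on the word length suffices in general.

```python
from itertools import product

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def is_mortal_up_to(matrices, max_len):
    # Try every word of matrices up to max_len; return True if some
    # product is the zero matrix.
    n = len(matrices[0])
    zero = [[0] * n for _ in range(n)]
    for length in range(1, max_len + 1):
        for word in product(matrices, repeat=length):
            acc = word[0]
            for m in word[1:]:
                acc = mat_mul(acc, m)
            if acc == zero:
                return True
    return False

A = [[0, 1], [0, 0]]  # nilpotent: A*A is the zero matrix
B = [[1, 0], [0, 1]]  # identity
print(is_mortal_up_to([A, B], 3))  # True
```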
Operon Updated 2025-07-16
Sequence of genes under a single promoter. For an example, see the E. coli K-12 MG1655 operon thrLABC.
A single operon may produce multiple different transcription units depending on certain conditions, see: operon vs transcription unit.
Pali Canon Updated 2025-07-16
Anything that is not in the Pali Canon has basically zero chance of having come from Buddha or his immediate followers.
Special unitary group Updated 2025-07-16
The complex analogue of the special orthogonal group, i.e. the subgroup of the unitary group whose elements have determinant exactly equal to 1, rather than an arbitrary complex number with absolute value 1 as in the general unitary group.
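A numerical sketch of the membership test: a generic SU(2) element can be written [[a, -conj(b)], [b, conj(a)]] with |a|^2 + |b|^2 = 1; we verify unitarity (U* U = I) and det U = 1, and show that multiplying by a phase keeps the matrix unitary but breaks the determinant condition.

```python
import numpy as np

def in_special_unitary(U, tol=1e-9):
    # U must be unitary and have determinant exactly 1.
    n = U.shape[0]
    unitary = np.allclose(U.conj().T @ U, np.eye(n), atol=tol)
    det_one = abs(np.linalg.det(U) - 1) < tol
    return unitary and det_one

a, b = (1 + 1j) / 2, (1 - 1j) / 2  # |a|^2 + |b|^2 = 1/2 + 1/2 = 1
U = np.array([[a, -b.conjugate()], [b, a.conjugate()]])
print(in_special_unitary(U))  # True

# A global phase keeps it in U(2) but takes it out of SU(2):
print(in_special_unitary(np.exp(1j * 0.3) * U))  # False
```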
This book is very widely used in courses; as of 2020 it had become kind of the default book.
Unfortunately, this approach bores Ciro Santilli to death. Or perhaps it is just too advanced for him to appreciate. One of the two.
800+ pages.
Ape Updated 2025-07-16
Ape subclade Updated 2025-07-16
Entropy Updated 2025-07-16
OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?
The original notion of entropy, and the first one you should study, is the Clausius entropy.
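One concrete number from the Clausius definition dS = dQ/T: the entropy change of water heated reversibly at constant pressure, which integrates to S = m·c·ln(T2/T1). The mass and temperatures below are made up for the example.

```python
import math

m = 1.0        # kg of water (arbitrary example value)
c = 4186.0     # J/(kg*K), specific heat capacity of water
T1, T2 = 293.15, 373.15  # K: heating from 20 C to 100 C

# Integrate dS = dQ/T with dQ = m*c*dT:
delta_S = m * c * math.log(T2 / T1)
print(delta_S)  # entropy change in J/K, roughly 1e3
```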
Video 1. The Unexpected Side of Entropy by Daan Frenkel (2021). Source.
Video 2. The Biggest Ideas in the Universe | 20. Entropy and Information by Sean Carroll (2020). Source. In usual Sean Carroll fashion, it glosses over the subject. This one might be worth watching. It mentions 4 possible definitions of entropy: Boltzmann, Gibbs, Shannon (information theory) and John von Neumann (quantum mechanics).
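Two of those definitions connect in a way that is easy to verify numerically: the Gibbs/Shannon entropy S = -Σ p_i ln p_i reduces to Boltzmann's S = ln W (in units of k_B) when all W microstates are equally likely.

```python
import math

def gibbs_entropy(probs):
    # Gibbs/Shannon entropy in natural units (k_B = 1, natural log).
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 16
uniform = [1 / W] * W
print(math.isclose(gibbs_entropy(uniform), math.log(W)))  # True
```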
enwiki-latest-categorylinks.sql Updated 2025-07-16
