Entropy
OK, can someone please just stop the philosophy and give numerical predictions of how entropy helps you predict the future?
The original notion of entropy, and the first one you should study, is the Clausius entropy.
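For reference (a standard textbook statement, not spelled out in the original text), the Clausius definition says that for a reversible process the entropy change is:

$$\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}$$

i.e. the heat exchanged reversibly divided by the absolute temperature at which it is exchanged.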
Video 1. The Unexpected Side of Entropy by Daan Frenkel (2021). Source.
Video 2. The Biggest Ideas in the Universe | 20. Entropy and Information by Sean Carroll (2020). Source. In usual Sean Carroll fashion, it glosses over the subject. This one might be worth watching. It mentions 4 possible definitions of entropy: Boltzmann, Gibbs, Shannon (information theory) and John von Neumann (quantum mechanics).
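To make at least one of those definitions numerically concrete, here is a minimal Python sketch of the Shannon entropy of a discrete distribution (an illustration added here, not taken from the video):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```

The Gibbs entropy has the same mathematical form up to Boltzmann's constant, which is one reason the definitions are usually presented side by side.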
Lattice-based cryptography
Quantum memory
TODO clear example and application.
Representation theory
Basically, a "representation" means associating to each group element an invertible matrix, i.e. a matrix in (possibly some subset of) $GL(n)$, in such a way that matrix multiplication mirrors the group operation.
Or in other words, associating to the more abstract notion of a group more concrete objects with which we are familiar (e.g. a matrix).
Each such matrix then represents one specific element of the group.
This is basically what everyone does (or should do!) when starting to study Lie groups: we start looking at matrix Lie groups, which are very concrete.
Or more precisely, mapping each group element to a linear map over some vector space (which can be represented by a matrix in finite dimension), in a way that respects the group operations: if $\rho$ denotes the representation, then $\rho(g_1 g_2) = \rho(g_1)\,\rho(g_2)$ for all group elements $g_1, g_2$.
As shown in Physics from Symmetry by Jakob Schwichtenberg (2015).
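As a minimal concrete sketch of the idea (my own illustration, not an example from the book): the cyclic group Z/4 can be represented by powers of a 90-degree rotation matrix, and one can check numerically that the map respects the group operation.

```python
import numpy as np

# Representation of the cyclic group Z/4: send the generator to a
# 90-degree rotation matrix; the element k then maps to its k-th power.
g = np.array([[0, -1],
              [1,  0]])
rho = {k: np.linalg.matrix_power(g, k) for k in range(4)}

# The defining property of a representation: rho(a + b) = rho(a) rho(b),
# where + is the group operation (addition mod 4 here).
for a in range(4):
    for b in range(4):
        assert np.array_equal(rho[(a + b) % 4], rho[a] @ rho[b])
print("rho respects the group operation: a representation of Z/4 in GL(2, R)")
```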
Bibliography: