Entropy
Entropy is an ontic measure of the number of physically accessible microstates compatible with a system’s macroscopic constraints.
In dynamical terms, entropy characterizes the statistical structure of state space under a fixed dynamical evolution: it describes how many alternative trajectories remain accessible given noise, coupling, and constraint. Entropy does not drive motion or cause change; it constrains what typical evolutions look like.
As constraints relax or noise dominates, increasingly many microstates and trajectories become accessible. Under these conditions, distinctions are less likely to persist, and records become unstable unless additional energetic work is performed.
Maintaining persistent distinctions therefore requires entropy export to an environment. Entropy is not reduced in isolation; it is displaced.
Through this role, entropy:
- limits the counterfactual stability of distinctions and records,
- defines the background pressure against which persistence must be achieved,
- constrains which explanations remain valid over time.
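The microstate-counting definition above can be sketched numerically. This is a minimal illustration, not from the sources cited below: the function name and the example microstate counts are assumptions chosen for clarity.

```python
import math

def boltzmann_entropy(omega: int, k_b: float = 1.380649e-23) -> float:
    """Boltzmann entropy S = k_B * ln(Omega), where Omega is the number
    of microstates compatible with the macroscopic constraints."""
    return k_b * math.log(omega)

# Relaxing a constraint so that the number of accessible microstates
# doubles raises entropy by exactly k_B * ln 2 -- one "bit" of
# distinction that must now be actively maintained or is lost.
s_constrained = boltzmann_entropy(2**10)
s_relaxed = boltzmann_entropy(2**11)
delta_s = s_relaxed - s_constrained  # k_B * ln 2
```

The point of the sketch is that entropy here is a count over possibilities, not a force: nothing in the function "drives" the system anywhere.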
Distillation
Entropy measures how easily a system forgets its past.
Why it matters
- Distinction: High entropy destabilizes partitions of state space unless constraints suppress transitions.
- Persistence: Metastability exists only relative to entropic pressure and transition rates.
- Records: Memory requires continuous work against entropy via constrained coupling and dissipation.
- RDD coherence: Guards against treating entropy as disorder, fate, or a final state rather than as a dynamical constraint on what can last.
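The "Persistence" and "Records" points above can be made concrete with a two-state record subject to per-step flip noise, a standard toy model sketched here under assumed names and parameters (nothing in the sources specifies this model):

```python
def record_fidelity(eps: float, steps: int) -> float:
    """Probability that a binary record still reads its initial value
    after `steps` time steps, given a per-step flip probability `eps`
    (entropic noise uncompensated by constraint or work)."""
    p = 1.0  # initially certain of the recorded value
    for _ in range(steps):
        # the record flips with probability eps at each step
        p = p * (1 - eps) + (1 - p) * eps
    return p

# With eps > 0 and no work done to suppress transitions, fidelity
# relaxes toward 1/2: the record becomes indistinguishable from noise.
f_noiseless = record_fidelity(0.0, 100)
f_noisy = record_fidelity(0.1, 50)
```

In closed form the fidelity is 1/2 + (1/2)(1 − 2·eps)^steps, which makes the claim quantitative: metastability of a record exists only relative to the transition rate eps and the time over which persistence is demanded.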
Links
Related atoms
Used in molecules
Conflicts with
- Interpretations of entropy as disorder, decay, or teleological drive.
- Views that treat entropy as a final state rather than a constraint on dynamical explanations.
- Accounts that conflate entropy with energy loss or information destruction in isolation.
Sources
- Sharp, K., & Matschinsky, F. (2015). Translation of Ludwig Boltzmann's Paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium" (Wien. Ber. 1877, 76:373–435; reprinted in Wiss. Abhandlungen, Vol. II, reprint 42, pp. 164–223, Barth, Leipzig, 1909). Entropy, 17(4), 1971–2009. https://doi.org/10.3390/e17041971
  - Key: @sharpTranslationLudwigBoltzmanns2015
  - Use here: Introduces entropy as a measure of microstate multiplicity.
- Sethna, J. P. (2021). Statistical Mechanics: Entropy, Order Parameters, and Complexity.
  - Key: @sethnaEntropyOrderParameters2021
  - Use here: Modern treatment of entropy as state-space volume under constraints.
- Parrondo, J. M. R., Horowitz, J. M., & Sagawa, T. (2015). Thermodynamics of information. Nature Physics, 11, 131–139.
  - Key: @parrondoThermodynamicsInformation2015
  - Use here: Links entropy, information, and the cost of maintaining records.
Re-contextualization Log
- 2025-12-29
  - context: Explanatory audit revealed entropy being read as a causal driver or terminal state rather than a dynamical constraint.
  - effect: refined
  - note: Reframed entropy explicitly as a constraint on accessible trajectories under fixed dynamics; clarified its role as pressure against persistence and record stability, not as a force, disorder, or end state.
- 2025-12-26
  - context: Alignment with Constraint and Persistent Distinction atoms.
  - effect: clarified
  - note: Removed rigidity and end-state framing; redefined entropy as a dynamical pressure expanding accessible state space and destabilizing distinctions absent compensating constraints.