Entropy

Entropy is an ontic measure of the number of physically accessible microstates compatible with a system’s macroscopic constraints.
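
In the standard Boltzmann formulation (a textbook identity, not language from this note), the count is explicit: for a macrostate compatible with Ω microstates,

```latex
S = k_B \ln \Omega
```

where k_B is Boltzmann's constant, so more accessible microstates means higher entropy.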

In dynamical terms, entropy characterizes the statistical structure of state space under a fixed dynamical evolution: it describes how many alternative trajectories remain accessible given noise, coupling, and constraint. Entropy does not drive motion or cause change; it constrains what typical evolutions look like.
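
A toy numerical sketch of this reading (illustrative only; the walker ensemble, step rule, and checkpoints are assumptions, not from the source): the Shannon entropy of an ensemble of unbiased random walks, all prepared in the same state, grows as noise opens up more states to typical trajectories.

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_entropy(positions):
    """Shannon entropy (bits) of the empirical distribution over states."""
    _, counts = np.unique(positions, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# 10,000 walkers prepared in one microstate: a sharp, fully constrained state.
walkers = np.zeros(10_000, dtype=int)

for t in range(1, 1001):
    walkers += rng.choice([-1, 1], size=walkers.size)  # unbiased noise, no drive
    if t in (1, 10, 100, 1000):
        print(f"t={t:4d}  states occupied={np.unique(walkers).size:4d}  "
              f"entropy={ensemble_entropy(walkers):5.2f} bits")
```

Nothing pushes the walkers anywhere in particular; the entropy growth (roughly ½·log₂ t for a diffusive walk) only describes how the set of typical trajectories spreads.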

As constraints relax or noise dominates, increasingly many microstates and trajectories become accessible. Under these conditions, distinctions are less likely to persist, and records become unstable unless additional energetic work is performed.
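
The instability of records under noise can be made concrete with a one-bit toy model (again an assumed setup, not from the source): a stored ±1 value that flips with probability p each step loses its correlation with the initial value as (1 − 2p)^t, so absent corrective work the distinction decays exponentially.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 0.05, 100_000           # per-step flip probability; copies of the record

bits = np.ones(n, dtype=int)   # every copy stores the distinction "+1"
for t in range(1, 201):
    bits[rng.random(n) < p] *= -1   # noise flips each copy with probability p
    if t in (1, 50, 100, 200):
        print(f"t={t:3d}  <bit>={bits.mean():+.3f}  "
              f"(1-2p)^t={(1 - 2 * p) ** t:+.3f}")
```

Refreshing the bit against this decay is exactly the "additional energetic work" referred to above.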

Maintaining persistent distinctions therefore requires entropy export to an environment. Entropy is not reduced in isolation; it is displaced.
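
In second-law terms (a standard statement, not specific to this note), displacement is the only option: lowering a system's entropy requires the environment to absorb at least as much, and Landauer's bound gives the per-bit version for erased records:

```latex
\Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{env}} \;\ge\; 0
\qquad\Longrightarrow\qquad
\Delta S_{\mathrm{sys}} < 0 \ \text{only if}\ \Delta S_{\mathrm{env}} \ge -\Delta S_{\mathrm{sys}}

Q_{\mathrm{env}} \;\ge\; k_B T \ln 2 \quad \text{per erased bit (Landauer)}
```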

Through this role, entropy:

- counts the microstates compatible with a given macrostate
- marks out which trajectories are typical, without driving or causing motion
- sets the cost of keeping distinctions and records stable against noise
- is displaced into an environment rather than destroyed in place

Distillation

Entropy measures how easily a system forgets its past.


Why it matters

Persistent distinctions (records, memories, stable structure) exist only where entropy can be exported to an environment; entropy therefore sets the physical cost of keeping anything distinguishable over time.

Used in

molecules

Conflicts with


Sources


Re-contextualization Log