Entropy Revisited

Posted: August 26, 2019 at 5:51 pm

Before starting the recent collage explorations, I had been doing more reading and thinking about entropy; my notes follow. I also got a chance to watch a lecture that Sarah Dunsiger sent on entropy and emergence, and I’ve included my notes on that below as well.

Entropy

  • Natural log of the number of possible states of a system, multiplied by a constant: S = k ln(W)
  • The log reduces very large numbers and does little to small ones (e.g. ln(10^7) ≈ 16, ln(10^3) ≈ 7)
  • The constant (Boltzmann’s constant) is a very small number (~1.38e-23 J/K)
  • So entropy is a small representation of really large numbers of possible states.
  • All possible 640×480 8-bit images: ~5e12 possible states, for an ‘entropy’ of 4.04e-22 (a numeric check appears in the sketch after this list). (Does it make any sense to think of the entropy of an image?? An image is not dynamical; entropy is about dynamics, not structure.)
  • Second Law of Thermodynamics:
    • Entropy of a closed system never decreases (the number of possible states only increases, until equilibrium at maximum entropy); see the two-box sketch after this list
    • Entropy of an open system may decrease if the entropy of its environment increases (its number of possible states may shrink while the number of states in the environment grows)
  • Is entropy about the propagation of energy? Does a system with more energy have more states? If it has more states, does it lose that energy to the environment (increasing the number of the environment’s states)?
  • Is there some analogy in ML? Could the energy be the state of excitement of the initial conditions? The rate of learning?
  • More entropy means more complexity, because more information is needed to specify which of its potential states a system is in.
  • This seems more about the constraints of the system than the specific energy states.
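
To check the arithmetic above, here is a minimal Python sketch of S = k·ln(W). How the 5e12 figure for the image was reached is my assumption (640×480 pixels times 2^24 colour values); the tiny entropy value follows from the state count either way.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(num_states):
        # S = k_B * ln(W): entropy from a count of possible states
        return k_B * math.log(num_states)

    # The log reduces huge numbers and does less to smaller ones:
    print(math.log(1e7))  # ~16.1
    print(math.log(1e3))  # ~6.9

    # Image example: assuming 5e12 counts 640*480 pixels * 2**24 colour
    # values (my reading, not stated above). The number of *distinct*
    # 8-bit greyscale images would be far larger: 256**(640*480).
    states = 640 * 480 * 2**24        # ~5.15e12
    print(boltzmann_entropy(states))  # ~4.04e-22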
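
And to make the Second Law bullets concrete, a toy two-box gas picture (my own illustration, not from my reading): count the ways W = C(N, n) that N particles can split with n on the left side. ln(W) peaks at the even split, so the mixed state is the maximum-entropy, equilibrium one.

    import math

    def ln_W(N, n_left):
        # W = C(N, n_left): the number of ways to choose which of the
        # N particles sit in the left half (math.comb needs Python 3.8+)
        return math.log(math.comb(N, n_left))

    N = 100
    for n_left in (0, 10, 25, 50):
        print(n_left, round(ln_W(N, n_left), 1))
    # 0  -> 0.0   (all particles on one side: a single arrangement)
    # 50 -> ~66.8 (the even split: equilibrium, maximum entropy)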

Sarah Papers

  • order can be introduced from entropy alone
  • order from disorder?
  • the whole often resembles the part (chiral particles make chiral structures)

Entropy and Emergence (Video Lecture)

  • entropy as a measure of what you don’t know about the state of a system
  • fewer states mean more certainty, because there are fewer possibilities.
  • a high-entropy system is random / has many states and no constraint.
  • entropy as the minimum number of binary questions one must ask to fully determine the system.
  • random needs every question, whereas a pattern can be compressed into fewer questions; see the sketch below
  • Key take-away: entropy does not indicate disorder because a system may have more ordered states than disordered states.
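
A minimal sketch of the binary-questions idea in Python (the distributions are made-up examples): Shannon entropy, H = -Σ p·log2(p), is the average number of yes/no questions needed to pin down the state.

    import math

    def shannon_bits(probs):
        # H = -sum(p * log2 p): average yes/no questions to identify the state
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Random: 8 equally likely states, so every question is needed.
    print(shannon_bits([1/8] * 8))                # 3.0 (= log2 of 8)

    # Patterned: one state dominates, so on average it compresses
    # to under one question.
    print(shannon_bits([0.9, 0.05, 0.03, 0.02]))  # ~0.62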