Entropy
June 09, 2020
Entropy can be thought of as a measure of disorder, uncertainty, or even surprise.
If you have shied away from entropy in the past, you’re not alone, in at least two senses.
First, we observe that all things in nature eventually increase in entropy, which is to say that they decay; living systems (like you), however, appear at first to be a bit of an exception, since they resist this tendency by repeatedly reducing entropy, at least locally.
Second, Claude Shannon also tried to shy away from it, initially wanting to call the concept “information” or “uncertainty” rather than “entropy” in information theory:
> My greatest concern was what to call it. I thought of calling it “information,” but the word was overly used, so I decided to call it “uncertainty.” When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.”
The details of this anecdote are debated, but the point stands: Entropy is counter-intuitive, possibly because we’re more likely to think about measuring things in terms of order rather than in terms of _dis_order.
More on entropy: it was originally a concept from thermodynamics, but was later applied to information theory, and it has found many other uses as well.
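To make the idea of entropy as a measure of uncertainty or surprise a little more concrete, here is a minimal sketch of Shannon entropy in Python (not from the original post; the function name `shannon_entropy` is just an illustrative choice). A fair coin, whose next flip is maximally uncertain, has higher entropy than a heavily biased coin, whose next flip is barely surprising at all.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain, so it carries 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is far less surprising: about 0.08 bits.
print(shannon_entropy([0.99, 0.01]))  # ~0.081
```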