What Is Entropy?

sebg | 33 points

Entropy can also be understood as uniformity.

For example, when you add a bit of milk to a cup of black coffee, at first you'll see the white milk and black coffee as separate. But with a single stir (or just over time), they mix together and become a uniform blend—the entropy has increased, the uniformity has grown. You can't just stir it the opposite way and magically separate the milk from the coffee again.

Over time, an abandoned house gradually turns into dust and rubble—a more uniform state. Entropy, or this increasing uniformity, always grows over time. To me, uniformity feels more "ordered," so in that sense, entropy is a measure of order, or how mixed things are.
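The mixing intuition above can be made quantitative with Shannon entropy. This is a toy illustration, not a thermodynamic calculation: coarse-grain the cup into a handful of regions (the number 8 is an arbitrary choice here) and track what fraction of the milk sits in each one. Knowing exactly where the milk is corresponds to zero bits; a uniform spread maximizes the entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Coarse-grain the cup into 8 regions; track the fraction of milk in each.
# Just after pouring: all the milk sits in one region.
unmixed = [1.0, 0, 0, 0, 0, 0, 0, 0]

# Fully stirred: milk spread evenly across every region.
mixed = [1 / 8] * 8

h_unmixed = shannon_entropy(unmixed)  # 0 bits: we know exactly where the milk is
h_mixed = shannon_entropy(mixed)      # 3 bits: the maximum for 8 regions
```

Stirring moves the distribution from the zero-entropy state toward the uniform one, and no stir takes it back, which is the one-way behavior described above.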

Citizen_Lame | 2 months ago

>The state of unknown information is 23 bits per molecule

This raises a follow-up question: how many bits per molecule are known, or at least presumably measurable?

And in what form is that "known" information encoded, or how is it otherwise measured?

Fascinating
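For scale, the quoted figure can be recovered from a tabulated molar entropy with a unit conversion. This sketch assumes the "23 bits" refers to hydrogen gas, whose standard molar entropy is about 130.7 J/(mol·K); the identity used is S = N_A · k_B · ln(2) · (bits per molecule), so bits = S_molar / (R ln 2).

```python
import math

# Gas constant R = N_A * k_B, in J/(mol*K).
R = 8.314462618

def bits_per_molecule(molar_entropy):
    """Convert a molar entropy in J/(mol*K) to Shannon bits per molecule."""
    return molar_entropy / (R * math.log(2))

# Standard molar entropy of hydrogen gas (H2): about 130.7 J/(mol*K).
h2_bits = bits_per_molecule(130.7)  # roughly 23 bits per molecule
```

The "known" part, by contrast, is whatever a macroscopic measurement pins down (temperature, pressure, volume), which is only a handful of numbers for the whole sample, not per molecule.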

AndrewKemendo | 2 months ago

I just love John Baez's stuff; it hits the perfect balance between physics and math.

fregus | 2 months ago