[Image: an abstract depiction of entropy]

ENTROPY

The concept of entropy is famously abstract - but entropy is a real, physical quantity, in terms of both thermodynamics and computation.

What is entropy? It’s a physical quantity – the amount of disorder in a system, or a description of the possible states the system can occupy. It’s a computational quantity – a distribution over possible system states is, after all, a quantity of information. And it’s a thermodynamic quantity – a measure of a system’s inefficiency, the amount of energy that is not available to do work. So entropy is at once physical, computational, and thermodynamic. But how are all these ideas related?


“Energy cannot be created or destroyed” is how many people know the first law of thermodynamics. An important clarification is that while the total amount of energy remains constant, it can change forms. For example, burning a log converts the chemical energy stored in the wood into heat and light. The ash left behind looks very different from the log you started with, but no energy was destroyed – it was simply dissipated into the environment. Entropy appears in the second law of thermodynamics, which states that the entropy of an isolated system never decreases – or, put another way, disorder always tends to increase.


Think of a time you left a glass of ice water on your room-temperature kitchen counter and, by the time you came back, the ice had melted. Now try to think of a time you left a glass of water on the counter and came back to find it frozen. Of course, only one of these scenarios happens in real life – and that perfectly illustrates the second law of thermodynamics. But how is a liquid considered “more disordered” than ice? Ice is a crystal, which means it has a highly ordered structure, with its molecules neatly lined up. The liquid, meanwhile, is composed of molecules that float around freely and bump into each other. There are many more possible system states in the liquid – hence, more entropy.


It’s useful to think of entropy in a different, more mathematical way – using an analogy. 


We can imagine six dogs in an enclosed dog park. These dogs are free to move about the park, which is divided into two zones. The zones are equal in size and have no discernible features. This situation has maximal entropy – any arrangement of dogs across the park is equally likely, so we have no way to predict where the dogs will be. Now, let’s imagine that one of the two zones contains a treat dispenser. In this case, it is far more likely that the dogs are congregated in the zone with the dispenser rather than the other. This situation has less entropy – less uncertainty and more predictable outcomes.
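To put rough numbers on that intuition, here is a minimal sketch using the standard entropy formula H = -Σ p·log₂(p) (the same quantity introduced later in this piece as Shannon entropy). The 90/10 split for the treat-dispenser case is an assumed figure, chosen purely for illustration:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# One dog, two identical zones: it is equally likely to be in either one.
featureless_park = [0.5, 0.5]

# Assumed illustration: a treat dispenser makes zone 1 far more likely.
park_with_dispenser = [0.9, 0.1]

print(entropy(featureless_park))     # 1.0 bit   -> maximal uncertainty
print(entropy(park_with_dispenser))  # ~0.47 bits -> less uncertainty, more predictable
```

With a featureless park the entropy works out to a full bit – we know nothing about where a given dog will be – while the treat dispenser cuts it to roughly half a bit.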


[Image: five golden retriever puppies sitting next to one pit bull puppy]

Let’s go back to the situation where both zones of the park are completely equivalent and the six dogs are free to move between them. There may be one dog in the first zone and five in the second, two in the first and four in the second, and so on. Counting all of the ways the dogs can spread themselves across the park gives 462 possible configurations, and the most probable arrangement – accounting for 21.6% of all configurations – is half of the dogs in one zone and half in the other.
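The piece doesn’t spell out how those configurations are counted, but one model that reproduces the 462 and 21.6% figures treats the dogs as interchangeable and each zone as three spots that any number of dogs can share. A brute-force count under that assumption:

```python
from itertools import product

DOGS = 6
SPOTS_PER_ZONE = 3  # assumption: each zone is treated as three interchangeable spots

# A configuration records how many dogs sit at each of the six spots
# (the dogs themselves are treated as interchangeable).
configs = [c for c in product(range(DOGS + 1), repeat=2 * SPOTS_PER_ZONE) if sum(c) == DOGS]

# Configurations with an even 3-3 split between the two zones.
even_split = [c for c in configs if sum(c[:SPOTS_PER_ZONE]) == DOGS // 2]

print(len(configs))                              # 462 possible configurations
print(round(len(even_split) / len(configs), 3))  # 0.216 -> about 21.6% of them
```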

 

Returning to the liquid and ice example: there are many more ways to arrange the molecules of liquid water than the molecules of ice, because a liquid has more freedom of movement. It is therefore far more statistically probable for ice to become liquid than the reverse. Again, this is not to say that entropy can never decrease - but for a system's entropy to decrease, energy must be spent doing work on it (think of the freezer that refreezes the water), and that energy is ultimately dissipated as heat elsewhere.
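This counting argument is exactly what Boltzmann's entropy formula from statistical thermodynamics (not quoted in the original piece, but standard) makes precise:

S = k_B · ln W

where W is the number of microscopic configurations available to the system and k_B is Boltzmann's constant. More ways to arrange the molecules of the liquid means a larger W, and therefore a larger entropy S.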

[Image: An example of maximum entropy.]

The concept of entropy as a thermodynamic quantity helped establish entropy as a mathematical value, and this laid the groundwork for the idea of entropy as a computational quantity in the field of information theory. Information theory is the study of the quantification, storage, and transfer of information. In this context, entropy is often described as a measure of uncertainty or “surprise.” The more surprising an outcome is, the more information you gain from learning it – and entropy measures the average amount of surprise across all the possible outcomes.

 

Let’s return to the dog park to understand this concept. If we again have six dogs in the park and all six are golden retrievers, it would be unsurprising to learn that the man standing next to you owns a golden retriever. The answer is completely expected – there is no other possible outcome – so it provides no new information. However, if there were five golden retrievers and one pit bull, you would expect that the man most likely owns one of the golden retrievers, and it would be mildly surprising to learn that he actually owns the pit bull. And if there were six different breeds, you would be equally surprised no matter which dog the man owned, and any answer would provide new information.
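Here is a small sketch that puts numbers on those three scenarios, using the standard surprisal (-log₂ p) and average-surprisal (entropy) formulas. The assumption that the man is equally likely to own any of the six dogs in the park is mine, purely for illustration:

```python
from math import log2

def surprisal(p):
    """Information gained from seeing an outcome of probability p, in bits."""
    return -log2(p)

def entropy(probabilities):
    """Average surprisal over all possible outcomes, in bits."""
    return sum(p * surprisal(p) for p in probabilities if p > 0)

# Assumption: the man is equally likely to own any one of the six dogs.
six_goldens  = [6/6]        # only one possible answer: "a golden retriever"
five_and_one = [5/6, 1/6]   # golden retriever vs. the lone pit bull
six_breeds   = [1/6] * 6    # six different breeds, all equally likely

print(surprisal(1/6))        # ~2.58 bits: learning he owns the pit bull is genuinely surprising
print(entropy(six_goldens))  # 0.0 bits: the answer was certain, so it tells you nothing
print(entropy(five_and_one)) # ~0.65 bits
print(entropy(six_breeds))   # ~2.58 bits: the maximum-uncertainty case
```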

 

In the context of information theory, the greater the entropy, the greater the amount of information carried by the message – in other words, the more surprising the message is, the more you gain from it. And there is a limit to how much information can be transferred over a channel in a given amount of time: its capacity, which is closely tied to the channel’s bandwidth.
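For a concrete sense of that limit: Shannon's channel-capacity theorem (not derived here) says that a noisy channel of bandwidth B with signal-to-noise ratio S/N can carry at most

C = B · log₂(1 + S/N)

bits per second, tying the everyday word "bandwidth" to a hard limit on information transfer.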

 

This type of entropy is called Shannon entropy, after the pioneer of information theory, Claude Shannon. Shannon entropy is usually measured in bits – binary, yes-or-no outcomes, such as whether the man owns a golden retriever or not. But there is another kind of entropy that is not restricted to classical outcomes. This is called von Neumann entropy, after John von Neumann, a pioneer of both quantum theory and modern computing. Von Neumann entropy extends the idea to quantum states, which can hold far richer kinds of uncertainty – in the spirit of our analogy, not just which dog the man owns, but all the dogs’ possible positions and momenta at once. That sort of high-dimensional distribution of possibilities is useful when computing with complex data.
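As a minimal sketch of how von Neumann entropy is actually computed – S(ρ) = -Tr(ρ log₂ ρ), evaluated from the eigenvalues of a density matrix ρ – here is a NumPy example; the two single-qubit states are my own choices for illustration:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho * log2(rho)), computed from the eigenvalues of the density matrix rho."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop numerical zeros
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

# A pure qubit state |0><0|: the state is completely known, so the entropy is zero.
pure_state = np.array([[1, 0],
                       [0, 0]], dtype=complex)

# A maximally mixed qubit: a 50/50 blend of |0> and |1>, carrying one full bit of entropy.
mixed_state = np.array([[0.5, 0.0],
                        [0.0, 0.5]], dtype=complex)

print(von_neumann_entropy(pure_state))   # 0.0
print(von_neumann_entropy(mixed_state))  # 1.0
```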

 

When we discuss entropy as a higher-dimensional probability distribution, we are discussing not only one axis of uncertainty – like the location of each dog in the dog park, or the spin-up/spin-down state of a qubit – but many possible positions, orbital configurations, and other possible features of a qubit. And we are not only discussing the computational power of such a broad distribution of possible system states – we are also discussing the energy that must be spent to create this entropy. And so it is useful to consider entropy as both a computational quantity and a thermodynamic quantity. By doing so, we can start to consider how energy-efficient computation works in practice.
