Stochastic thermodynamics finds exceptions to the law of entropy
Entropy is only highly likely, not guaranteed, to increase.
Thermodynamics is one of the most venerable physical theories. 18th and 19th century physicists like Carnot, Clausius, Gibbs, Helmholtz, and Boltzmann established it as a cornerstone of how macroscopic systems made of many, many constituent particles behave, how heat is transported from one system to another, and how engines do work and with what efficiency.
It is from thermodynamics that we get laws like: there are no perpetual motion machines (machines with efficiency of 100% or more, or with non-zero efficiency in thermal equilibrium).
Unlike the laws of mechanics or many field theories, however, thermodynamics has always been recognized as containing both strict and de facto laws. The primary laws are:
Energy can be neither created nor destroyed. (Strict)
Entropy can never decrease. (De Facto)
There are qualifiers to these. For the first law, the system must be closed. For the second, we are talking about what happens when two or more isolated systems are brought into contact.
The conservation of energy is a consequence of the time-translation symmetry of our universe: all times in physical laws can be shifted by a constant amount without changing the physics. We get that from Emmy Noether’s theorem. No known physical law breaks the conservation of energy.
The second law is a “de facto” law of physics. I use this term where others might use the word “statistical”, because calling it statistical undersells it: the probabilities involved are incomparably more lopsided than in statistical fields like sociology. Until about 25 years ago, a violation of the 2nd law was thought to be all but impossible.
Carnot’s theorem of engine efficiency derives from these laws. A consequence of the theorem is that if you have two systems at the same temperature (in equilibrium), the efficiency is 0%: nothing moves. Since all connected systems eventually equilibrate thermally (because every engine’s efficiency is below 100%), all engines run down. There is no perpetuum mobile, no perpetual motion machine.
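As a reminder, the standard textbook statement of Carnot’s bound (not taken from the article’s sources) for an engine running between a hot reservoir at temperature T_h and a cold one at T_c is:

```latex
% Carnot's bound on engine efficiency:
\eta \;\le\; \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h}
% When T_c = T_h (thermal equilibrium), the bound is 0: no work can be done.
```

Setting the two temperatures equal makes the bound zero, which is the “nothing moves” statement above.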
A problem with the 2nd law of thermodynamics is that it conflicts with another, strict law of physics: all microscopic physical processes are time reversible. Any trajectory a particle traces forwards in time can be traced backwards the same way it came.
If that is so, then why would a law that implies the existence of irreversible processes like the 2nd law also exist?
Physicists wrestled with that question for decades until 1993, when Denis Evans and his collaborators proposed a potential resolution called the fluctuation theorem. The fluctuation theorem provides an exception to the 2nd law of thermodynamics that admits reversibility even in so-called irreversible processes.
Essentially, all it says is that an increase in entropy becomes exponentially more probable the longer the duration of time considered. Thus, the 2nd law is no longer strict, just exponentially probable: the probability of observing entropy decrease shrinks exponentially fast, which makes entropy increase appear to be a virtual certainty.
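In symbols, the fluctuation theorem can be written as follows, where Σ̄_t is the entropy production rate averaged over a duration t:

```latex
% Fluctuation theorem (Evans-Searles form): the odds of producing entropy
% at rate A over a time t, versus consuming it at the same rate, grow
% exponentially with both A and t.
\frac{P(\bar{\Sigma}_t = A)}{P(\bar{\Sigma}_t = -A)} = e^{A t}
```

For macroscopic systems observed over long times the right-hand side is astronomically large, which is why entropy decrease is never seen at everyday scales, while tiny systems over short times can and do fluctuate “backwards”.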
This theorem, which has since been derived from mechanics, opened the door to a new science, stochastic thermodynamics, and in the last twenty years it has been demonstrated experimentally multiple times, each time with more startling results.
The most recent demonstration, published this month by a team at the University of Arkansas, is perhaps the most amazing: power generation from the “out-of-plane” thermal motion of a sheet of graphene (that all-powerful substance: a single atomic layer of graphite, i.e., carbon).
While at first glance this looks like a perpetual motion machine, the theory behind it is sound. The basic idea is that any system exchanges heat, and with it entropy, with its environment (also called a thermal bath, because it is a reservoir at a particular temperature). While the contact between a system and its bath is at equilibrium, making exchanges of energy in either direction equally likely, the rest of the system can be far from equilibrium. This means that over short durations of time, systems can do useful work as they relax to equilibrium.
Yet systems never stay in perfect equilibrium if they are very small, made of a few individual particles. Instead, they undergo random fluctuations that can push them away from equilibrium. Such is the case with graphene, which experiences fluctuations from quantum effects even in a perfect vacuum.
While the authors of the study claim in the news media that they could harvest power from thermal fluctuations, they have not yet demonstrated that they can store the energy. Instead, their experiment dissipated it through a resistor as heat. Time will tell whether genuine power can be harvested this way. Mother Nature will take her due in the end; the question is whether something useful can be done before that.
They plan to store the energy in a tiny capacitor in their next experiment, with the hope of releasing it to do useful work at a later time.
One way to think about this is that you are not really creating a perpetual motion machine. Rather, you are extracting energy from the variance around the thermodynamic mean (average) of a system. In other words, by causing a system to return to equilibrium faster after some fluctuation pushes it away, one can do useful work.
The science behind stochastic thermodynamics goes far beyond harvesting energy from Brownian motion, however. It is a fundamentally more sophisticated and coherent theory of thermodynamics, one that takes into account exactly how systems reach equilibrium, how they stay in it, how they can be knocked out of it, and what happens when systems are connected to many reservoirs of heat rather than one, as in the case of graphene being connected to its thermal reservoir and a circuit.
Stochastic thermodynamics combines two essential ideas: ordinary statistical mechanics, which underlies classical thermodynamics, and Markov processes, which are critical to understanding non-equilibrium statistical processes like engines and control systems.
While equilibrium statistical mechanics ignores time, Markov processes are a way of representing how stochastic processes (anything that obeys a random rather than deterministic law) change with time. Combining the two, we arrive at a description of dynamic processes that may, over long periods of time, have stationary statistical states (be in thermal equilibrium), but over short periods have a non-zero (though exponentially small) probability of deviating from equilibrium. These deviations can do useful work on another system.
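To make the idea of a stationary statistical state concrete, here is a minimal sketch: a two-state Markov chain with made-up transition probabilities (a toy model, not the graphene system), started far from equilibrium and evolved until its distribution stops changing.

```python
import numpy as np

# Toy two-state system with hypothetical per-step transition
# probabilities: P[i, j] = probability of hopping from state i to j.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

p = np.array([1.0, 0.0])   # start far from equilibrium: all in state 0
for _ in range(50):        # evolve the distribution one step at a time
    p = p @ P

# The chain relaxes to its stationary distribution pi = (0.75, 0.25),
# which no further evolution changes: pi @ P == pi.
print(p)  # ≈ [0.75, 0.25]
```

The stationary state plays the role of thermal equilibrium: once reached, the statistics are time-independent, even though individual trajectories keep hopping between states.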
One of the key properties of Markov-process descriptions of equilibrium systems is called detailed balance, which derives from something called a master equation. A master equation represents how a stochastic system transitions from one state to the next. Detailed balance is the idea that for any two states m and m′, the probability flow from m to m′ equals the probability flow from m′ to m; i.e., once you account for how often the system occupies each state, transitions are just as likely one way as the other. This is a fundamental statement of equilibrium.
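Detailed balance can be checked numerically. This sketch (same kind of toy two-state chain as above, with hypothetical numbers) computes the stationary distribution and verifies that the probability flux between the two states balances exactly.

```python
import numpy as np

# Toy chain: P[m, m'] = probability of transitioning m -> m' per step.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Stationary distribution pi: the left eigenvector of P with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Detailed balance: pi[m] * P[m, m'] == pi[m'] * P[m', m] for every pair.
flux_01 = pi[0] * P[0, 1]   # probability flow 0 -> 1 per step
flux_10 = pi[1] * P[1, 0]   # probability flow 1 -> 0 per step
print(flux_01, flux_10)     # equal fluxes: detailed balance holds
```

Note that the raw transition probabilities differ (0.1 versus 0.3); it is the occupation-weighted fluxes that balance, which is exactly the distinction the definition above turns on.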
In order for a Markov process to do work, it has to violate detailed balance for some short amount of time. Violations of detailed balance manifest as a force because they increase the likelihood of a state transition in one direction versus another. This idea is fundamental to non-equilibrium processes, but we now know it applies to “equilibrium” processes as well, because, in reality, there is no such thing as equilibrium at all time scales, i.e., equilibrium is not scale invariant.
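A broken-detailed-balance system can be sketched just as easily. In this toy example (hypothetical rates, not from the paper), three states sit on a ring and hopping is biased clockwise, as if a small force pushed the system around; the result is a steady non-zero probability current even though the occupation probabilities are stationary.

```python
import numpy as np

# Three states on a ring with a clockwise bias: hop clockwise with
# probability 0.2 per step, counter-clockwise with only 0.05.
P = np.zeros((3, 3))
for i in range(3):
    P[i, (i + 1) % 3] = 0.2    # clockwise hop
    P[i, (i - 1) % 3] = 0.05   # counter-clockwise hop
    P[i, i] = 0.75             # stay put

# By symmetry the stationary distribution is uniform...
pi = np.array([1/3, 1/3, 1/3])

# ...yet the net probability current around the ring is non-zero:
current = pi[0] * P[0, 1] - pi[1] * P[1, 0]
print(current)  # 0.05: detailed balance is broken
```

The non-zero current is the Markov-chain picture of the “force” described above: a stationary state that nonetheless circulates, which is impossible when detailed balance holds.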
This theory helped the University of Arkansas team show that a freestanding sheet of graphene undergoing Brownian motion can generate power, which is then piped through a pair of diodes to convert the alternating current into two direct currents and dissipated in a resistor. The idea is that if Brownian motion can generate a DC current like this, it can be dissipated in something else, possibly at a later time, like powering a microchip or a small LED.
Indeed, this is not the first time a microscopic “engine” has been observed experimentally. In 2016, a Spanish team published an investigation of a Brownian engine in which thermal fluctuations created deviations from the 2nd law of thermodynamics, allowing work to be done even in thermal equilibrium. They observed a single trapped polystyrene particle immersed in water and saw how it could violate Carnot’s law of engine efficiency (essentially extracting energy in thermal equilibrium).
From this, it may be possible to create large arrays of nano-scale engines to produce energy, though the power and expense required to trap these particles may exceed what they generate. One would need a passive trap, and graphene perhaps offers the answer in that its atoms are held in place by its carbon lattice. The ability to generate this power in a vacuum may also make it applicable to space technology, where a vacuum is readily available. Practical applications may be far off, however. The power generated in the experiment is in the picowatts (trillionths of a watt), so it is highly unlikely that your next iPhone will generate its own power from graphene.
Evans, Denis J., E. G. D. Cohen, and G. P. Morriss. “Probability of second law violations in shearing steady states.” Physical Review Letters 71.15 (1993): 2401–2404.
Thibado, P. M., et al. “Fluctuation-induced current from freestanding graphene.” Physical Review E 102.4 (2020): 042101.
Martínez, Ignacio A., et al. “Brownian Carnot engine.” Nature Physics 12.1 (2016): 67–70.