Published in Physics World, 1 May 2014
To many physicists, “Tsallis entropy” has been a revolution in statistical mechanics. To others, it is merely a useful fitting technique. Jon Cartwright tries to make sense of this world of disorder
Physics may aim for simplicity, yet the world it describes is a mess. There is disorder wherever we look, from an ice cube melting to the eventual fate of the cosmos. Of course, physicists are well aware of that untidiness and have long used the concept of “entropy” as a measure of disorder. One of the pillars of physical science, entropy can be used to calculate the efficiency of heat engines, the direction of chemical reactions and how information is generated. It even offers an explanation for why time flows forwards, not backwards.
Our definition of entropy is expressed by one of the most famous formulae in physics, and dates back over a century to the work of the Austrian physicist Ludwig Boltzmann and the American chemist J Willard Gibbs. For more than 20 years, however, the Greek-born physicist Constantino Tsallis, who is based at the Brazilian Centre for Physics Research (CBPF) in Rio de Janeiro, has been arguing that entropy is in need of some refinement. The situation, according to Tsallis, is rather like Newtonian mechanics – a theory that works perfectly until speeds approach that of light, at which point Einstein’s special theory of relativity must take over.
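For reference, the formula in question is the Boltzmann-Gibbs entropy which, in standard notation (not spelled out in the article itself), reads

S_{BG} = -k_B \sum_i p_i \ln p_i,

where p_i is the probability of finding the system in microstate i and k_B is Boltzmann's constant; for W equally likely microstates it reduces to Boltzmann's celebrated S = k_B \ln W.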
Likewise, says Tsallis, entropy – as defined by Boltzmann and Gibbs – works perfectly, but only within certain limits. If a system is out of equilibrium or its component states depend strongly on one another, he believes an alternative definition should take over. Known as “Tsallis entropy” or “non-additive entropy”, it was first proposed by Tsallis himself in a 1988 paper (J. Stat. Phys. 52 479) that has gone on to become the most cited article written by a scientist (or group of scientists) based in Brazil. So far it has clocked more than 3200 citations, according to the Thomson Reuters Web of Science.
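In the 1988 paper the alternative entropy is written in terms of an "entropic index" q:

S_q = k \, \frac{1 - \sum_i p_i^q}{q - 1},

which recovers the Boltzmann-Gibbs expression in the limit q \to 1. The label "non-additive" reflects the fact that, for two independent subsystems A and B,

S_q(A+B) = S_q(A) + S_q(B) + \frac{(1-q)}{k}\, S_q(A)\, S_q(B),

so the entropies of the parts no longer simply add unless q = 1.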
To many who study statistical mechanics, Tsallis entropy makes for a much broader view of how disorder arises in macroscopic systems. “Tsallis entropy provides a remarkable breakthrough in statistical mechanics, thermodynamics and related areas,” says applied mathematician Thanasis Fokas at the University of Cambridge in the UK. In fact, Fokas goes as far as saying that subsequent work motivated by Tsallis’s discovery has been “a new paradigm in theoretical physics”.
Tsallis entropy has, though, been divisive, with a significant number of physicists believing he has not uncovered anything more general at all. But the voices of these detractors are fast being lost in the crowd of support, with Tsallis’s original paper being applied to everything from magnetic resonance imaging to particle physics. So are these applications exploiting a truly revolutionary theory? Or to put it another way: is Tsallis to Boltzmann and Gibbs what Einstein was to Newton? […]