Brown University’s Harrison E. Farnsworth Professor of Physics, J. Michael Kosterlitz, recently published his first paper in the Proceedings of the National Academy of Sciences of the United States of America (PNAS) since becoming a member of the National Academy in April of 2017.

Kosterlitz’s article, “Global potential, topology, and pattern selection in a noisy stabilized Kuramoto–Sivashinsky equation,” authored with collaborators from the Shanghai Center for Quantitative Life Sciences, Yong-Cong Chen, Chunxiao Shi, Xiaomei Zhu and Ping Ao, posits a novel theoretical framework for understanding how the transition from a disordered to an ordered state in complex systems proceeds through nonlinear steps.

In 2016, Kosterlitz, along with David J. Thouless and F. Duncan M. Haldane, received the Nobel Prize in Physics for work using the mathematical field of topology to examine phase transitions in exotic states of matter. In this paper, Kosterlitz and his collaborators utilize topology to investigate systems that are out of equilibrium.

Kosterlitz describes the equilibrium state as the final, or “boring” state of a system, but notes that “systems that are out of equilibrium are much harder to deal with because you have to discuss them in terms of the dynamics of their evolution towards some equilibrium state, assuming you know what that is.” He notes that “in the real world, most systems are out of equilibrium; systems are constantly evolving towards some equilibrium state.”

Kosterlitz was particularly interested in certain experimental systems that “appear to come eventually to some stationary time-independent state.” He clarifies, “it’s not equilibrium, it’s a driven-out-of-equilibrium state, but it seems to be time-independent.” He wondered if “maybe the same thing happens in these driven-out-of-equilibrium systems when they come to a unique final stationary state.”

Working with Brown Physics graduate student Saloni Saxena, Kosterlitz wanted to test his hypothesis out on a simple model system to see if this speculation could be right. “There's no reason for it to be right, it’s just that in some experiments it seems to happen, and I just wanted to test it out, and it seemed to work for a very simple system.”

In discussions with his collaborators at the Shanghai Center for Quantitative Life Sciences, Kosterlitz learned “they had some clever analytical methods of dealing with the same type of problem but in much simpler systems with only a few variables.” Kosterlitz “wanted to see if the same thing happened for more complex systems.”

To do so, Kosterlitz insisted on incorporating stochastic, or random, noise to mimic real-life systems. “Every real system is subject to some random perturbation whether it’s fluctuations in temperature or a truck driving by the laboratory.” By way of example, Kosterlitz notes how if you have a slightly sticky ball that slides down a hill, “it will get stuck in the first local minimum it gets to.” In order to get it out of that state, “you need some random noise or fluctuations. It's like if you shake the system eventually it will jiggle out of it and end up in the deepest minimum. But it won’t stay there forever because with some new fluctuation it gets excited out of that state.”
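The sticky-ball picture can be sketched numerically. The following is a generic illustration, not code from the paper: overdamped Langevin dynamics on a hypothetical tilted double-well potential, where noise first jiggles the ball out of the shallow minimum and then, over long times, lets it spend most of its time near the deeper one. The potential, noise strength, and step count are all illustrative choices.

```python
import numpy as np

# Illustrative tilted double-well potential V(x) = x^4/4 - x^2/2 + 0.25*x.
# The minimum near x ~ -1.1 is much deeper than the one near x ~ 0.84.
def force(x):
    # Deterministic force F = -dV/dx
    return -(x**3 - x + 0.25)

def fraction_in_deep_well(x0=1.0, steps=200_000, dt=0.01,
                          noise=0.5, seed=0):
    """Euler-Maruyama integration of dx = F(x) dt + noise dW,
    starting in the shallow well; returns the fraction of steps
    spent on the deep-well side (x < 0)."""
    rng = np.random.default_rng(seed)
    x = x0
    deep_steps = 0
    for _ in range(steps):
        x += force(x) * dt + noise * np.sqrt(dt) * rng.standard_normal()
        if x < 0:
            deep_steps += 1
    return deep_steps / steps

frac = fraction_in_deep_well()
print(f"fraction of time near the deeper minimum: {frac:.2f}")
```

Even though the ball starts in the shallow minimum, the noise kicks it over the barrier, and it spends the bulk of the run near the deepest minimum, occasionally getting excited back out, just as the quote describes.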

Kosterlitz was curious if it was possible to show that a system would spend most of its time in a particular state. Along with his collaborators, he managed to construct a formula that put that into a mathematical language, and in the particular system they looked at “it happened to fit.” For future work, Kosterlitz would like to see if the same holds true for more complicated systems that do come to a stationary state.

By: Pete Bilderback