The biophysicist Jeremy England made waves in 2013 with a new theory that cast the origin of life as an inevitable outcome of thermodynamics. His equations suggested that under certain conditions, groups of atoms will naturally restructure themselves so as to burn more and more energy, facilitating the incessant dispersal of energy and the rise of “entropy” or disorder in the universe. England said this restructuring effect, which he calls dissipation-driven adaptation, fosters the growth of complex structures, including living things. The existence of life is no mystery or lucky break, he told Quanta in 2014, but rather follows from general physical principles and “should be as unsurprising as rocks rolling downhill.”

Since then, England, a 35-year-old associate professor at the Massachusetts Institute of Technology, has been testing aspects of his idea in computer simulations. The two most significant of these studies were published this month - the more striking result in the Proceedings of the National Academy of Sciences (PNAS) and the other in Physical Review Letters (PRL). The outcomes of both computer experiments appear to back England’s general thesis about dissipation-driven adaptation, though the implications for real life remain speculative.

“This is obviously a pioneering study,” Michael Lässig, a statistical physicist and quantitative biologist at the University of Cologne in Germany, said of the PNAS paper written by England and an MIT postdoctoral fellow, Jordan Horowitz. It’s “a case study about a given set of rules on a relatively small system, so it’s maybe a bit early to say whether it generalizes,” Lässig said. “But the obvious interest is to ask what this means for life.”

The paper strips away the nitty-gritty details of cells and biology and describes a simpler, simulated system of chemicals in which it is nonetheless possible for exceptional structure to spontaneously arise - the phenomenon that England sees as the driving force behind the origin of life. “That doesn’t mean you’re guaranteed to acquire that structure,” England explained. The dynamics of the system are too complicated and nonlinear to predict what will happen.

The simulation involved a soup of 25 chemicals that react with one another in myriad ways. Energy sources in the soup’s environment facilitate or “force” some of these chemical reactions, just as sunlight triggers the production of ozone in the atmosphere and the chemical fuel ATP drives processes in the cell. Starting with random initial chemical concentrations, reaction rates and “forcing landscapes” - rules that dictate which reactions get a boost from outside forces and by how much - the simulated chemical reaction network evolves until it reaches its final, steady state, or “fixed point.”

Often, the system settles into an equilibrium state, where it has a balanced concentration of chemicals and reactions that just as often go one way as the reverse. This tendency to equilibrate, like a cup of coffee cooling to room temperature, is the most familiar outcome of the second law of thermodynamics, which says that energy constantly spreads and the entropy of the universe always increases. (The second law is true because there are more ways for energy to be spread out among particles than to be concentrated, so as particles move around and interact, the odds favor their energy becoming increasingly shared.)

But for some initial settings, the chemical reaction network in the simulation goes in a wildly different direction: it evolves to fixed points far from equilibrium, where it vigorously cycles through reactions by harvesting the maximum energy possible from the environment. These cases “might be recognized as examples of apparent fine-tuning” between the system and its environment, Horowitz and England write, in which the system finds “rare states of extremal thermodynamic forcing.”
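To make the setup described above concrete, here is a minimal toy sketch of a driven chemical reaction network evolved to a fixed point. Every specific here is an assumption for illustration only, not taken from the Horowitz–England paper: 6 chemicals instead of 25, simple reversible pairwise conversions with mass-action kinetics, a random “forcing landscape” modeled as a boost to some forward rates, and plain Euler integration until concentrations stop changing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (invented for illustration, NOT the paper's actual model):
# reversible conversions chemical[r] <-> chemical[p] with mass-action
# kinetics.  A random "forcing landscape" boosts some forward rates,
# standing in for an external energy source driving those reactions.
N_CHEMICALS = 6
N_REACTIONS = 8

reactants = rng.integers(0, N_CHEMICALS, N_REACTIONS)
products = rng.integers(0, N_CHEMICALS, N_REACTIONS)
k_fwd = rng.uniform(0.1, 1.0, N_REACTIONS)   # base forward rates
k_rev = rng.uniform(0.1, 1.0, N_REACTIONS)   # base reverse rates

# Forcing landscape: a random subset of reactions gets a forward boost.
forced = rng.random(N_REACTIONS) < 0.3
k_fwd = np.where(forced, 5.0 * k_fwd, k_fwd)

def step(conc, dt=0.01):
    """One explicit-Euler step of the mass-action rate equations."""
    dconc = np.zeros_like(conc)
    for r, p, kf, kr in zip(reactants, products, k_fwd, k_rev):
        flux = kf * conc[r] - kr * conc[p]   # net forward flux
        dconc[r] -= flux
        dconc[p] += flux
    return conc + dt * dconc

def run_to_fixed_point(conc, tol=1e-10, max_steps=200_000):
    """Iterate until concentrations stop changing (a 'fixed point')."""
    for _ in range(max_steps):
        new = step(conc)
        if np.max(np.abs(new - conc)) < tol:
            return new
        conc = new
    return conc

conc0 = rng.uniform(0.5, 2.0, N_CHEMICALS)   # random initial soup
fixed = run_to_fixed_point(conc0)
```

Because this toy network is linear, it always relaxes to a steady state; the interesting behavior in the actual study, where some random settings land in fixed points far from equilibrium that cycle vigorously through forced reactions, requires the nonlinear dynamics the article mentions, which this sketch deliberately omits.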