Why do we want more and more money, regardless of how much we already have? Why do we hate to be manipulated and to lose? Why do twenty percent of the people own eighty percent of the wealth? Why, in most languages, does the most common word appear twice as often as the second most common word? Why does the digit "1" appear in company balance sheets six and a half times more often than the digit "9"? Why does nature hate "bubbles"?

The cause of all these phenomena is the very same law that makes water flow from high to low and heat flow from a hot place to a cold one. This law, which for historical reasons is called the Second Law of Thermodynamics, states that there is a never-decreasing quantity called entropy. Entropy represents the uncertainty of a system in hypothetical equilibrium, in which everybody and everything have equal opportunities but slim chances to win; in other words, the majority have little and a few have a lot.

The book describes the historical evolution of the understanding of entropy, alongside the biographies of the scientists who contributed to its definition and to the exploration of its effects in numerous domains, including the exact sciences, communication theory, economics, and sociology. It should be of interest to a broad audience, from scientists, engineers, and students to the general public.
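The "six and a half times" figure for leading digits is the ratio predicted by Benford's law, under which the probability that a number's first digit is d equals log10(1 + 1/d). A minimal sketch of that calculation (the function name `benford` is illustrative, not from the book):

```python
import math

def benford(d):
    """Benford's law: probability that the leading digit is d."""
    return math.log10(1 + 1 / d)

# Ratio of how often "1" leads compared with "9".
ratio = benford(1) / benford(9)
print(round(ratio, 2))  # ≈ 6.58, i.e. roughly six and a half times
```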