Notes on 《增长的本质》 (Why Information Grows)

《Why Information Grows: The Evolution of Order, from Atoms to Economies》

INTRODUCTION: FROM ATOMS TO PEOPLE TO ECONOMIES

But why were Shannon and Weaver so eager to divorce information from meaning? Their reasons were both technical and philosophical. On the technical side, Shannon was interested in the construction of machines that could help communicate information regardless of the meaning of the message. Mixing information and meaning obfuscated the engineering problem. On the philosophical side, Shannon and Weaver understood that their use of the words information and meaning referred to concepts that were fundamentally different. Humans, and some machines, have the ability to interpret messages and infuse them with meaning. But what travels through the wires or electromagnetic waves is not that meaning. It is simpler. It is just information.

It is hard for us humans to separate information from meaning because we cannot help interpreting messages. We infuse messages with meaning automatically, fooling ourselves into believing that the meaning of a message is carried in the message. But it is not. This is only an illusion. Meaning is derived from context and prior knowledge. Meaning is the interpretation that a knowledge agent, such as a human, gives to a message, but it is different from the physical order that carries the message, and different from the message itself. Meaning emerges when a message reaches a life form or a machine with the ability to process information; it is not carried in the blots of ink, sound waves, beams of light, or electric pulses that transmit information.
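A small sketch of this distinction (my own illustration, not from the book): the very same string of bits can be "interpreted" as text or as a number depending on the receiver, while its Shannon information depends only on the bits themselves.

```python
import math

# The same physical message: a fixed sequence of bits.
bits = "01001000 01101001".replace(" ", "")

# Interpretation 1: read the bytes as ASCII text -> "Hi"
as_text = "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

# Interpretation 2: read the same bits as an unsigned integer -> 18537
as_number = int(bits, 2)

# The information content depends only on the bits, not on which
# "meaning" a receiver assigns to them.
p1 = bits.count("1") / len(bits)
entropy_per_bit = -p1 * math.log2(p1) - (1 - p1) * math.log2(1 - p1)

print(as_text, as_number, round(entropy_per_bit, 3))
```

The two interpretations differ completely, yet the message on the wire, and hence its information, is identical.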

--------------------------------------------- Divider ---------------------------------------------
PART I Bits in Atoms

Chapter 1 The Secret to Time Travel

Knowledge and knowhow are two fundamental capacities that relate to computation, and both are crucial for the accumulation of information in the economy and society. Yet knowledge and knowhow are not the same.

Simply put, knowledge involves relationships or linkages between entities. These relationships are often used to predict the outcomes of events without having to act them out. For instance, we know that tobacco use increases the likelihood of lung cancer, and we can use that linkage to anticipate the consequences of tobacco use without the need to use tobacco ourselves.

Knowhow is different from knowledge because it involves the capacity to perform actions, which is tacit. For example, most of us know how to walk, even though we do not know how we walk. Most of us know how to identify and label objects in an image, even though we do not know how we accomplish those perceptual and verbal tasks. Most of us know how to recognize objects from different angles, identify faces, digest food, and recognize emotions, even though we cannot explain how we do it. We can do these tasks, however, because we have knowhow. Knowhow is the tacit computational capacity that allows us to perform actions, and it is accumulated at both the individual and collective levels.

The tacit nature of knowhow seems strange, as it makes us feel like automatons that are unaware of what we are doing. Yet there is nothing strange in that. As Marvin Minsky, one of the fathers of artificial intelligence, once said: “No computer has ever been designed that is ever aware of what it’s doing; but most of the time, we aren’t either.”

Chapter 2 The Body of the Meaningless

As Manfred Eigen, the winner of the 1967 Nobel Prize in Chemistry, remarked: “Entropy refers to an average of (physical) states, information to a particular (physical) state.”
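Eigen's distinction can be made concrete with a small Python sketch (my own example, using made-up probabilities): entropy averages the surprisal over all states, while information, in his sense, attaches to one particular state.

```python
import math

# A biased two-state toy system: an ordered state with probability 0.9
# and a disordered state with probability 0.1.
p = {"ordered": 0.9, "disordered": 0.1}

# Entropy is an AVERAGE over all (physical) states...
entropy = -sum(q * math.log2(q) for q in p.values())

# ...while surprisal (information) belongs to a PARTICULAR state.
surprisal = {state: -math.log2(q) for state, q in p.items()}

print(round(entropy, 3), round(surprisal["disordered"], 3))
```

The rare disordered state carries far more surprisal (about 3.32 bits) than the system's average entropy (about 0.47 bits), which is exactly the average-versus-particular contrast Eigen points to.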

========

But what are the properties of information-rich states? And how can we use knowledge about their properties to identify them? One important characteristic of information-rich states is that these involve both long-range and short-range correlations. In the case of the Rubik’s cube these correlations are conspicuous: when the cube is perfectly ordered, each color is surrounded by as many neighbors of the same color as possible. Yet correlations are conspicuous not only in man-made objects, like a Rubik’s cube, but also in nature.
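As a rough illustration (my own toy example, not the book's), we can count same-color neighbor pairs on a single 3x3 face: the ordered face maximizes its short-range correlations, while a scrambled arrangement of the same stickers does not.

```python
import random

def same_color_neighbor_pairs(face):
    """Count adjacent (horizontal + vertical) sticker pairs sharing a color."""
    n = 3
    pairs = 0
    for r in range(n):
        for c in range(n):
            if c + 1 < n and face[r][c] == face[r][c + 1]:
                pairs += 1
            if r + 1 < n and face[r][c] == face[r + 1][c]:
                pairs += 1
    return pairs

# A solved face: every neighbor pair matches (12 out of 12).
solved = [["R"] * 3 for _ in range(3)]

# The same number of stickers, but of three colors and in random positions.
random.seed(0)
stickers = list("RRRGGGBBB")
random.shuffle(stickers)
scrambled = [stickers[i:i + 3] for i in range(0, 9, 3)]

print(same_color_neighbor_pairs(solved), same_color_neighbor_pairs(scrambled))
```

No arrangement of three colors can match all twelve pairs, so the correlated, single-color face is detectably more ordered than any scramble.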

========

Finally, I will connect the multiplicity-of-states definition of entropy with our ability to process information (that is, compute). As we saw in the Rubik’s cube example, information-rich states are hard to find, not only because they are rare but also because there are few paths leading to them. That’s why we equate the ability of someone to solve a Rubik’s cube with a form of intelligence, since those who know how to solve a Rubik’s cube get credit for finding these rare paths (or memorizing the rules to find them).
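The rarity of information-rich states can be illustrated with a simple multiplicity count (a toy example of mine, using two-colored tiles rather than a Rubik's cube): ordered configurations are a vanishing fraction of all configurations.

```python
from math import comb

# 20 tiles, each black or white. A "state" is one particular coloring.
n = 20
total_states = 2 ** n            # 1,048,576 possible configurations
ordered_states = 2               # all-white or all-black
half_and_half = comb(n, n // 2)  # 184,756 ways to be perfectly mixed

# Ordered states are rare: roughly 2 in a million.
print(ordered_states / total_states)
```

With only 20 tiles the odds of stumbling on an ordered state are about two in a million; for a Rubik's cube, with about 4.3 x 10^19 configurations, the odds are astronomically worse, which is why finding the rare paths to order reads as intelligence.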

Chapter 3 The Eternal Anomaly

The irreversibility of time is the mechanism that brings order out of chaos.
—ILYA PRIGOGINE

========

Prigogine produced many important insights, but the one that is of concern to us here is the idea that information emerges naturally in the steady states of physical systems that are out of equilibrium. That statement, which summarizes the physical origins of information, sounds awfully complicated. Yet if we go carefully through a sequence of examples, we will realize that it is not.

========

The fact that out-of-equilibrium systems are characterized by information-rich steady states helps us understand where information comes from. In an out-of-equilibrium system, such as Earth, the emergence of information is expected. It is no longer an anomaly. The bad news, however, is that entropy is always lurking on the borders of information-rich anomalies, waiting to devour these anomalies as soon as it gets the chance. Bathtub whirlpools vanish as soon as we put the stopper back in the drain or run out of water. This might lead us to think that the universe is quick at taking away the information-rich steady states that out-of-equilibrium systems give us for free. Yet information has found ways to fight back. As a result, we live on a planet where information is “sticky” enough to be recombined and created. This stickiness, which is essential for the emergence of life and economies, also hinges on additional fundamental physical properties.

The first mechanism that contributes to the stickiness of information involves the idea of thermodynamic potentials. Once again this sounds complicated, but it is not. What we need to know here is that the steady states of physical systems can be described as minima of mathematical functions, which are known as thermodynamic potentials. We are all familiar with the basic idea of potentials from high school physics, since we know that marbles end up at the bottom of bowls because in that state there is a minimum of potential energy. Now, the thing is that not all of the steady states of physical systems minimize energy. Many steady states minimize or maximize other quantities (for example, a gas sitting quietly in a box maximizes entropy). Yet we do not need to describe all of these quantities here, since we are interested primarily in the potentials that rule out-of-equilibrium systems. So what is the potential that out-of-equilibrium systems such as our bathtub whirlpool minimize? In 1947 Prigogine showed that the steady state of out-of-equilibrium systems minimizes the production of entropy. What this means is that out-of-equilibrium systems self-organize into steady states in which order emerges spontaneously, minimizing the destruction of information.
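The marble-in-a-bowl picture can be sketched numerically (my own illustration; the potential here is ordinary energy, standing in only by analogy for Prigogine's entropy-production potential): overdamped dynamics slide the system downhill until it settles at the minimum of the potential, which is its steady state.

```python
# A marble in a bowl: the steady state sits at the minimum of a potential.
# Here V(x) = (x - 2)^2, so the minimum is at x = 2. We follow the
# overdamped dynamics dx/dt = -V'(x) in small discrete steps.

def V_prime(x):
    """Derivative of the toy potential V(x) = (x - 2)^2."""
    return 2 * (x - 2)

x = 10.0                          # start far from the minimum
for _ in range(1000):
    x -= 0.01 * V_prime(x)        # move downhill along the potential gradient

print(round(x, 4))                # -> 2.0 (approximately)
```

Each step multiplies the distance to the minimum by 0.98, so the marble converges to x = 2 regardless of where it starts; a steady state is the attractor of the downhill dynamics.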

========

From a physical perspective, both proteins and DNA are technically crystals; more precisely, they are aperiodic crystals (structures that do not repeat themselves but contain long-range correlations). Think of a sheet of music where the same four notes repeat over and over. The information carried by that sheet would be minimal compared to one in which variations and departures are prevalent. Schrödinger understood that aperiodicity was needed to store information, since a regular crystal would be unable to carry much information: “The gene is most certainly not just a homogeneous drop of liquid. It is probably a large protein molecule, in which every atom, every radical, every heterocyclic ring plays an individual role, more or less different from that played by any of the other similar atoms, radicals, or rings.” According to Schrödinger, the phenomenon of life hinged on both the aperiodicity of biological molecules and their solid, crystalline nature. The aperiodicity was essential for the molecule to embody information, and the solid nature of the molecule was essential for this information to last.
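Schrödinger's point about repetitive versus aperiodic structure can be illustrated using compressed size as a rough proxy for information content (my own sketch, not from the book): the four-notes-on-repeat "sheet of music" compresses to almost nothing, while a varied sequence of the same length does not.

```python
import random
import zlib

# A periodic "crystal": the same four notes repeated over and over.
periodic = ("CDEF" * 250).encode()            # 1000 characters

# An "aperiodic crystal": varied, non-repeating structure of equal length.
random.seed(42)
aperiodic = "".join(random.choice("CDEFGAB") for _ in range(1000)).encode()

# Compressed size approximates how much information each sequence carries.
size_periodic = len(zlib.compress(periodic))
size_aperiodic = len(zlib.compress(aperiodic))
print(size_periodic, size_aperiodic)
```

The periodic sequence shrinks to a few dozen bytes, because "CDEF, 250 times" says it all, while the aperiodic sequence resists compression: aperiodicity is what leaves room for information.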

So by combining the ideas of Prigogine and Schrödinger, we can understand where information comes from (the steady state of nonequilibrium systems) and why it sticks around (because it is stored in solids). The poetic oddity of this combination is that it tells us that our universe is both frozen and dynamic. From a physical perspective, a solid is “frozen” because its structure is stable to the thermal fluctuations of the environment.

========

Highly interacting out-of-equilibrium systems, whether they are trees reacting to the change of seasons or chemical systems processing information about the inputs they receive, teach us that matter can compute. These systems tell us that computation precedes the origins of life just as much as information does. The chemical changes encoded by these systems are modifying the information encoded in these chemical compounds, and therefore they represent a fundamental form of computation. Life is a consequence of the ability of matter to compute.

========

So time is irreversible in a statistical system because the chaotic nature of systems of many particles implies that an infinite amount of information would be needed to reverse the evolution of the system. This also means that statistical systems cannot go backward because there are an infinite number of paths that are compatible with any present. As statistical systems move forward, they quickly forget how to go back. This infiniteness is what Prigogine calls the entropy barrier, and it is what provides a perspective of time that is not spatialized like the theories of time advanced by Newton and Einstein. For Prigogine, the past is not just unreachable; it simply does not exist. There is no past, although there was a past. In our universe, there is no past, and no future, but only a present that is being calculated at every instant. This instantaneous nature of reality is deep because it helps us connect statistical physics with computation. The instantaneous universe of Prigogine implies that the past is unreachable because it is incomputable at the micro level. Prigogine’s entropy barrier forbids the present to evolve into the past, except in idealized systems, like a pendulum or planetary orbits, which look the same going forward and backward (when there is no dissipation involved).
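A minimal illustration of the entropy barrier (my own toy example, not Prigogine's): under a many-to-one update rule, distinct pasts collapse onto the same present, so the dynamics cannot be run backward without extra information.

```python
# A toy irreversible dynamic on the digits 0..9: doubling modulo 10.
# The map is many-to-one, so the present does not determine the past.

def step(x):
    """One forward time step; several inputs share each output."""
    return (2 * x) % 10

# Two different initial conditions...
a, b = 3, 8
print(step(a), step(b))   # both arrive at 6: the present has "forgotten" its past
```

Seeing the state 6 tells you nothing about whether the system came from 3 or from 8; in a real statistical system the number of compatible pasts is effectively infinite, which is the barrier that forbids reversal.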

We started this chapter by asking ourselves about the irreversibility of time and the origins of information. We learned that when put together, these questions pose a puzzle, since time moves from order to disorder even though the complexity of our world is seen to increase. The universal increase of entropy appears to contradict the growth of information, but in fact it does not, since the universe has some tricks up its sleeve that allow information to emerge in well-defined pockets. These are pockets where free energy is abundant, but also where the range of temperatures is mild enough for solids to exist, as information lasts longer when preserved in solids.

The thermodynamics of the universe I described in this chapter help us understand the circumstances under which information is allowed to emerge. Yet the ability of the universe to beget the complexity we see out our windows is not an immediate consequence of these simple mechanisms. For information to truly grow, the universe needed one more trick. This is the ability of matter to compute.

This computational ability of matter, which can be embodied both in simple chemical systems and in complex life-forms such as trees or us, is the key capacity that allows information to grow explosively in the pocket of the universe we call home. This computational capacity, and its relationship to humans and the networks that humans form, will be the focus of the third part of the book, which explores the ability of systems to accumulate knowledge and knowhow. We will need this computational capacity, and the constraints defined by its human embodiment, to explain the growth of information in society.

--------------------------------------------- Divider ---------------------------------------------
PART II Crystallized Imagination

Chapter 4 Out of Our Heads!

Making a strong distinction between the generation of value and the appropriation of monetary compensation helps us understand the difference between wealth and economic development. In fact, the world has many countries that are rich but still have underdeveloped economies. This is a distinction that we will explore in detail in Part IV. But making this distinction, which comes directly from the idea of crystallized imagination, helps us see that economic development is based not on the ability of a pocket of the economy to consume but on the ability of people to turn their dreams into reality. Economic development is not the ability to buy but the ability to make.

Chapter 5 Amplifiers

Crystallizing our thoughts into tangible and digital objects is what allows us to share our thoughts with others. Otherwise, our thoughts are trapped in the prison of our minds. A musician records her music as a way to perfect her art, but also as a way of creating copies of her mind that can be shared with others and that can survive her. Without these copies her talents would be trapped in her body, inaccessible to others. We crystallize imagination to make copies of our thoughts and share them with others. This makes crystallizing imagination the essence of creative expression.

But does this mean that products are simply a form of communication? Not so fast. Our ability to crystallize imagination into products, although expressive, is different from our ability to verbally articulate ideas. An important difference is that products can augment our capacities in ways that narrative descriptions cannot. Talking about toothpaste does not help you clean your teeth, just as talking about the chemistry of gasoline will not fill up your car with gas. It is the toothpaste’s embodiment of the practical uses of knowledge, knowhow, and imagination, not a narrative description of them, that endows other people with those practical uses. Without this physical embodiment the practical uses of knowledge and knowhow cannot be transmitted. Crystallizing imagination is therefore essential for sharing the practical uses of the knowledge that we accumulate in our mind. Without our ability to crystallize imagination, the practical uses of knowledge would not exist, because that practicality does not reside solely in the idea but hinges on the tangibility of the implementation. Once again, the physicality of products—whether tangible or digital—augments us. And it is through this augmentation that products help us communicate something that words cannot: the practical uses of knowhow, imagination, and knowledge.

Emphasizing the ability of products to augment human capacities can help us refine what we understand as the economy. It helps us see the economy not as the careful management of resources, the wealth of a nation, or a network of financial transactions, but as a system that amplifies the practical uses of knowledge and knowhow through the physical embodiment of information and the context-specific properties that this information helps carry. This is an interpretation of the economy as a knowledge and knowhow amplifier, or a knowledge and knowhow amplification engine: a complex sociotechnical system able to produce physical packages containing the information needed to augment the humans who participate in it. Ultimately, the economy is the collective system by which humans make information grow.

========

Norbert Wiener, the father of cybernetics, understood that the ability to embody information outside our bodies is not unique to our species. In fact, our ability to print information in our environment makes us similar to other eusocial species, such as ants. Single ants are not very clever, but their ability to deposit information in the form of pheromones can make ant colonies extremely savvy. Thanks to their ability to deposit information in their physical environment, ants can solve difficult problems of transportation, construction, ventilation, and routing. Humans have a similar capacity. Yet instead of leaving behind pheromones, we leave behind physical instantiations of imaginary objects, such as wrenches, screwdrivers, dishwashers, pyramids, chairs, and beer bottles. The ability to deposit imaginary information in our environment is key for our species’ ability to create societies and economies that are significantly more complex than those of ants. Unlike ants, we embody information not just to communicate but also to augment one another’s capacities by making available—through objects—the practical uses of knowledge, knowhow, and imagination.

--------------------------------------------- Divider ---------------------------------------------
PART III The Quantization of Knowhow
As we will see in the next pages, the challenge of economic development is constrained not only by the duality between matter and information but also by the duality between systems and computation. In society, the latter is the duality between networks of people and their capacity to process information, which we know as knowledge and knowhow.

Economic systems, just like all natural systems, have an ability to produce information that is constrained by the systems’ computational capacity. For information to grow in the economy, the computational capacity of the economy needs to grow as well. Increasing the computational capacity of economic systems, however, is not easy, since the growth of an economy’s computational capacity is constrained by the ability of people to embody knowledge and knowhow in networks of people. So to understand the growth of information in the economy we need to understand the mechanisms that limit people’s ability to form the networks they need to accumulate volumes of knowledge and knowhow that transcend a person’s individual capacity.

Chapter 6 This Time, It’s Personal

Hiring a musician by picking up a random person from the street is a bad idea because even though the information available in books can help us speed up the accumulation of knowledge and knowhow, knowledge and knowhow are not present in books. For instance, a book can tell us how to position our bodies for karate moves. But I would not recommend that you jump into the ring of an ultimate fighting event if your only fighting experience comes from reading some karate books. Knowhow, in particular, resides primarily in humans’ nervous systems. It is the instinctive way in which the musician plays guitar, the fluidity with which the artist draws, and the dexterity with which the truck driver backs up an eighteen-wheeler. It is not in books.

Getting knowledge inside a human’s nervous system is not easy because learning is both experiential and social. To say that learning is social means that people learn from people: children learn from their parents and employees learn from their coworkers (I hope). The social nature of learning makes the accumulation of knowledge and knowhow geographically biased. People learn from people, and it is easier for people to learn from others who are experienced in the tasks they want to learn than from people with no relevant experience in that task.

Ultimately, the experiential and social nature of learning not only limits the knowledge and knowhow that individuals can achieve but also biases the accumulation of knowledge and knowhow toward what is already available in the places where these individuals reside. This implies that the accumulation of knowledge and knowhow is geographically biased.

========

personbyte

We can simplify this discussion by defining the maximum amount of knowledge and knowhow that a human nervous system can accumulate as a fundamental unit of measurement. We call this unit a personbyte, and define it as the maximum knowledge and knowhow carrying capacity of a human.

Chapter 7 Links Are Not Free

firmbyte

The limited proliferation of megafactories like the Rouge implies that there must be mechanisms that limit the size of the networks we call firms and make it preferable to disaggregate production into networks of firms. This also suggests the existence of a second quantization limit, which we will call the firmbyte. It is analogous to the personbyte, but instead of describing the distribution of knowledge and knowhow among people, it describes their distribution across a network of firms.

The factors that limit the size of firms—and imply a second quantization threshold—have been studied extensively in a branch of the academic literature known as transaction cost theory or new institutional economics. Additionally, the factors that limit the size of the networks humans form—whether firms or not—have been studied extensively by the sociologists, political scientists, and economists working on social capital and social networks. Since this is an extensive literature, I will review the basics of the new institutional economics in this chapter and leave the discussion of social capital theories for the next chapter.

Transaction cost theory, or new institutional economics, is the branch of economics that studies the costs of transactions and the institutions that people develop to govern them. In simpler terms, it is the branch studying the cost of economic links and the ways in which people organize to deal with commercial interactions.

========

Coase’s explanation of the boundaries of a firm was brilliant and simple. It was based on the idea that economic transactions are costly and not as fluid as the cheerleaders of the price mechanism religiously believed. Often, market transactions require negotiations, drafting of contracts, setting up inspections, settling disputes, and so on. These transaction costs can help us understand the boundary of the firm, since according to Coase, a parsimonious way of understanding the islands of central planning that we know as firms is to search for the point at which the cost of transactions taking place internally within the firm equals the cost of market transactions. When the external transactions become less costly than the internal transactions, firms stop growing, since it is better for them to buy things from the market than to produce these internally.
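Coase's boundary condition can be sketched with made-up numbers (illustrative only; the cost functions are my own invention): the firm keeps internalizing activities while producing one more internally is cheaper than buying it on the market, and stops growing at the point where the two costs meet.

```python
# Toy Coasean boundary: internal coordination costs rise with firm size,
# while the per-transaction cost of using the market stays flat.

def internal_cost(n_activities):
    """Cost of coordinating the n-th activity inside the firm (toy numbers)."""
    return 4 + 0.5 * n_activities   # bureaucracy grows with scale

MARKET_COST = 7.0                    # flat cost of one market transaction

n = 0
while internal_cost(n + 1) < MARKET_COST:
    n += 1                           # bring one more activity inside the firm

print(n)   # the firm stops growing at the Coasean boundary
```

With these numbers the firm internalizes five activities: the sixth would cost 7.0 internally, no cheaper than the market, so production beyond that point is disaggregated into a network of firms.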

Chapter 8 In Links We Trust

Trust, which is an essential form of social capital, is the “glue” needed to form and maintain large networks. It is different from the knowledge and knowhow that we accumulate in these networks.9 Ultimately, this makes the knowledge and knowhow accumulated in networks a different factor of production than the trust, or social capital, that enables the formation of the networks where this knowledge is accumulated.

--------------------------------------------- Divider ---------------------------------------------
PART IV The Complexity of the Economy

Chapter 11 The Marriage of Knowledge, Knowhow, and Information

We have noted that information and knowhow are clearly distinct concepts. Information refers to the order embodied in codified sequences, such as those found in music or DNA, while knowledge and knowhow refer to the ability of a system to process information. Examples of knowhow are found in the biological networks that perform photosynthesis, the process by which plants harvest carbon from the air—or, more fancifully, the human networks that perform “autosynthesis,” the process by which groups of humans manufacture cars out of minerals.

Knowhow and information are distinct, but they are also intimately connected. The ability of a system to pack knowhow depends largely on the fluidity with which it can use information to reconstruct the dynamic networks it needs to accumulate that knowhow. A seed is a perfect example of this. It is a package containing both the knowhow and the information needed to create a plant, such as a tree. The development of a tree is nothing other than the majestic unpacking of knowhow facilitated by genetic information. A seed unpacking into a tree unpacks the knowhow needed to perform photosynthesis, to build the structures that will transport nutrients and water from the ground to the leaves, and to defend itself against pests. A seed unpacking into a tree is an example of knowhow and information being unpacked into a structure that is more complex than the one that begot it—the tree has the ability to perform functions that were absent in the seed.

========

The intimate connection between the information that is coded in DNA and the knowhow that is embodied in a seed’s network of biological interactions provides a highly efficient mode of knowhow reproduction and diffusion. Under the right conditions a few seeds can grow into a forest, a queen bee can give rise to a colony, and a few rabbits can take over Australia. Yet the marriage between knowhow and information that permeates biology is lacking in the systems of people and products that we know as the economy.

--------------------------------------------- Divider ---------------------------------------------
PART V Epilogue

Chapter 12 The Evolution of Physical Order, from Atoms to Economies

Energy is needed for information to emerge, and solids are needed for information to endure. But for the growth of information to explode, we need one more ingredient: the ability of matter to compute.

The fact that matter can compute is one of the most amazing facts of the universe. Think about it: if matter could not compute, there would be no life. Bacteria, plants, and you and I are all, technically, computers. Our cells are constantly processing information in ways that we poorly understand. As we saw earlier, the ability of matter to compute is a precondition for life to emerge. It also signifies an important point of departure in our universe’s ability to beget information. As matter learns to compute, it becomes selective about the information it accumulates and the structures it replicates. Ultimately, it is the computational capacities of matter that allow information to experience explosive growth.

Out-of-equilibrium systems, solids, and the computational abilities of matter help us understand the growth and presence of information in our universe. These three mechanisms help matter cheat the steady march of entropy, not universally but in well-defined pockets such as a cell, a human, a city, or a planet. But to bring these ideas into our modern reality, we need to recast them in the language of humans, societies, and economies. The growth of information in the economy is still the result of these basic mechanisms. But in these large-scale social and economic systems these mechanisms take new forms.

Our world is populated by structures that are more complex than whirlpools and proteins. These include people and objects. People are the ultimate incarnation of the computational capacities of matter. We embody the capacity to compute, as we organize our brain and our society to beget new forms of information. Objects are where we deposit information. They allow us to communicate messages and coordinate our social and professional activities, but more importantly, they allow us to transmit the practical uses of knowledge and knowhow.

The economy of early hominids and that of twenty-first-century society have enormous differences, but they do share one important feature: in both of these economies, humans accumulate information in objects. Our world is different from that of early hominids only in the way in which atoms are arranged. The objects of today—the arrangements of atoms—are what make our world essentially different from the one in which our ancestors evolved.

========

Ultraorthodox interpretations of the economy would argue that this computer self-organizes to an optimal state thanks to the price system. In reality, however, the economic computer is much clunkier than that.

Economies are embedded in social and professional networks that predate and constrain economic activity. These networks are important because they are the only structures that we have available to accumulate large volumes of knowledge and knowhow. Yet, as Granovetter, Putnam, and Fukuyama showed us, the sizes, shapes, and evolution of these networks are constrained by historical and institutional factors, from a society’s level of trust to the relative importance we give to family relationships.

So the social and economic problem that we are truly trying to solve is that of embodying knowledge and knowhow in networks of humans. By doing so, we are evolving the computational capacities of our species, and ultimately helping information grow.

So the growth of information in the economy, which is ultimately the essence of economic growth, results from the coevolution of our species’ collective computational capacities and the augmentations provided by the crystals of imagination that we are able to make. Crystals of imagination, from airplanes to toothpaste, amplify the practical uses of the knowledge, knowhow, and imagination of our society, augmenting our capacities to create new forms of information. Moreover, these objects allow us to form networks that embody an increasing amount of knowledge and knowhow, helping us increase our capacity to collectively process information.

Our need to form networks, however, emerges from one important consideration: the limited ability of humans to embody knowledge and knowhow. To fight our individual limitations we need to collaborate.

The personbyte theory suggests a relationship between the complexity of an economic activity and the size of the social and professional network needed to execute it. Activities that require more personbytes of knowledge and knowhow need to be executed by larger networks. This relationship helps explain the structure and evolution of our planet’s industrial structures. The personbyte theory implies that (1) simpler economic activities will be more ubiquitous, (2) that diversified economies will be the only ones capable of executing complex economic activities, (3) that countries will diversify toward related products, and (4) that over the long run a region’s level of income will approach the complexity of its economy, which we can approximate by looking at the mix of products produced and exported by a region, since products inform us about the presence of knowledge and knowhow in a region. All of these predictions are empirically testable and are consistent with the available data.
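Predictions (1) and (2) above can be illustrated with a toy country-product matrix (invented countries and data, purely for illustration): a country's diversity is its row sum, a product's ubiquity its column sum, and the two measures relate as the personbyte theory predicts.

```python
# Toy country-product matrix: 1 means the country exports the product
# competitively. Invented data, not real trade figures.
M = {
    "Diversistan": {"shirts": 1, "wine": 1, "engines": 1, "jets": 1},
    "Midland":     {"shirts": 1, "wine": 1, "engines": 1, "jets": 0},
    "Monoland":    {"shirts": 1, "wine": 0, "engines": 0, "jets": 0},
}
products = ["shirts", "wine", "engines", "jets"]

# Diversity: how many products a country makes (row sums).
diversity = {c: sum(row.values()) for c, row in M.items()}

# Ubiquity: how many countries make a product (column sums).
ubiquity = {p: sum(M[c][p] for c in M) for p in products}

# Prediction 1: simple products (shirts) are ubiquitous; complex ones (jets) are rare.
# Prediction 2: only the most diversified country makes the least ubiquitous product.
print(diversity, ubiquity)
```

In this toy matrix shirts are made everywhere while jets are made only in the most diversified economy; the real Economic Complexity Index iterates this diversity-ubiquity calculation on actual export data.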

So in the world of atoms and economies the growth of information hinges on the eternal dance between information and computation. This dance is powered by the flow of energy, the existence of solids, and the computational abilities of matter. The flow of energy drives self-organization, but it also fuels the ability of matter to compute. Solids, on the other hand, from proteins to buildings, help order endure. Solids minimize the need for energy to produce order and shield information from the steady march of entropy. Yet the queen of the ball is the emergence of collective forms of computation, which are ubiquitous on our planet. Our cells are networks of proteins, which form organelles and signaling pathways that help them decide when to divide, differentiate, and even die. Our society is also a collective computer, which is augmented by the products we produce to compute new forms of information.

========

As the universe moves on and entropy continues to increase, our planet continues its rebellious path marked by pockets that are rich in information. Enslaved by the growth of order, we form social relationships, make professional alliances, beget children, and, of course, laugh and cry. But we often lose sight of the beauty of information. We get lost in the urgency of the moment, as our minds race like whirlpools and our lives compute forward in a universe that has no past. We worry about money and taxes instead of owning the responsibility of perpetuating this godless creation: a creation that grew from modest physical principles and which has now been bestowed on us.
