diff --git a/contents/english/03-01-living-in-a-plural-world.md b/contents/english/03-01-living-in-a-plural-world.md
index f5306e96..5520b9fd 100644
--- a/contents/english/03-01-living-in-a-plural-world.md
+++ b/contents/english/03-01-living-in-a-plural-world.md
@@ -4,46 +4,24 @@
> (A)re…atoms independent elements of reality? No…as quantum theory shows: they are defined by their…interactions with the rest of the world…(Q)uantum physics may just be the realization that this ubiquitous relational structure of reality continues all the way down…Reality is not a collection of things, it’s a network of processes. — Carlo Rovelli, 2022[^RelationalReality]
-     Technology follows science. Thus, if we are to offer a different vision of the future of technology from Abundance Technocracy (AT) and Entrepreneurial Sovereignty (ES), we need to understand what is at the root of their understanding of science, what this might miss, and how correcting this can open new horizons. To do so, we now explore the philosophy of science behind these approaches and explore how in both the natural and social sciences the advances of the last century arose from moving beyond the limits of these perspectives to a plural, networked, relational, multiscale understanding of the reality we live in.
+     Technology follows science. And so if we want to understand Plurality as a vision of _what our world could become_, we need to start by understanding Plurality as a perspective on _how the world already is_. The Abundance Technocracy (AT) and Entrepreneurial Sovereignty (ES) perspectives, which we critiqued because of their over-emphasis on one particular way of solving social problems (a global expert class in the former case, and entrepreneurs and corporations in the latter case), also have a long history of being tied to what we consider overly simplistic analogies to science.
-### Atoms and the universe
+Technocracy has a long history of being justified by science and rationality. The idea of “scientific management” (a.k.a. Taylorism) that became popular in the early 1900s was justified by drawing analogies between social systems and simple mathematical models, with logic and reason as the ways of thinking about them. High modernism in architecture is similarly inspired by the beauty of geometry. Entrepreneurial sovereignty also borrows heavily from physics and other sciences: just as particles “take the path of least action” and evolution maximizes fitness, economic agents “maximize utility”. Every phenomenon in the world, from human societies to the motion of the stars, can ultimately be reduced to these laws.
-     The simplest and most naïve way to think about science is what might be called “objectivist”, “rationalist” or, as we will dub it, “monist atomism”[^MonistAtomism]. The physical world has an objective state and obeys an ultimately quite simple set of laws, waiting to be discovered. These can be stated in mathematical terms and dictate the deterministic evolution of one state into another through the collision of atoms. Because these laws and the mathematical truths they obey are unitary and universal, everything that ever will happen can be predicted from the current state of the world. These laws are often expressed in “goal-seeking” or “teleological” terms: particles “take the path of least action”, chemical compounds “minimize free energy”, evolution maximizes fitness, economic agents “maximize utility”.
Every phenomenon in the world, from human societies to the motion of the stars, can ultimately be reduced to these laws. All one needs to do — in this frame — is have sufficient computational power/intelligence, sufficiently precise observations, and the courage to strip away one’s superstitions/social constructs/biases and one will be, essentially, gods, omniscient, and possibly omnipotent.
+These approaches have achieved great successes that cannot be ignored. Newtonian mechanics explained a range of phenomena and helped inspire the technologies of the industrial revolution. Darwinism is the foundation of modern biology. Economics has been the most influential of the social sciences on public policy. And the Church-Turing vision of “general computation” helped inspire the idea of general-purpose computers that are so broadly used today. But there are also limits to the power of each of these sciences, as we have increasingly discovered over the past century. Gödel’s Theorem undermined the unity and completeness of mathematics, and a range of non-Euclidean geometries are now critical to science. Symbiosis, ecology, and extended evolutionary synthesis undermined “survival of the fittest” as the central biological paradigm. Neuroscience has been reimagined around networks and emergent capabilities.
-     The pattern of such thinking runs through almost every scientific field at some point in its development. Euclidean geometry, which aspires to deduce nearly all mathematical facts from a small set of axioms and concepts, and Newtonian mechanics, which describe the relationship between the motion of an object and the forces acting on it, are perhaps the most famous examples. In biology, the simple version of Darwinism focuses on the survival of the fittest species, with individual animals (or in later versions “selfish genes”) constantly struggling against each other to survive [^NaturalSelection]. In (primitive) neuroscience (especially phrenology), atoms are regions of the brain, each undertaking an atomic function that together add up to thought. In psychology, behaviorism saw thought as reducible to stimuli and response. In economics, the atoms are the self-interested individuals (or sometimes firms) of economic theory, each seeking their own advantage in the market. In computer science, the Church-Turing Thesis sees all possible operations as reducible to a series of operations on an idealized computer called a “Turing Machine”.
+Plurality similarly looks at social systems from multiple perspectives and appreciates that any single perspective has limits to its power to explain the world. A corporation can be viewed as a player in a bigger game, but a corporation is simultaneously itself a game, where employees, shareholders, management, and customers are all players, and whose outcomes often do not look anything like a coherent utility function. What's more, the abstraction often leaks: individual employees of a corporation are often influenced through their _other_ relationships with the outside world, and not through the corporation itself. Countries too are both games and players, and there too we cannot cleanly separate actions between countries from actions within a country: the writing of this very book is a complicated mix of both in multiple ways.
-     Whatever their limits, these approaches have achieved great successes that cannot be ignored. Newtonian mechanics explained a range of phenomena and helped inspire the technologies of the industrial revolution.
Darwinism is the foundation of modern biology. Economics has been the most influential of the social sciences on public policy. And the Church-Turing vision of “general computation” helped inspire the idea of general-purpose computers that are so broadly used today.
+Plurality is thus heavy with analogies to the natural sciences: it uses many of them precisely because it understands the limits of relying too heavily on any single one. We can give a few examples.
-     They are also the foundation of the Abundance Technocracy (AT) and Entrepreneurial Sovereignty (ES) worldviews we discussed in the last chapter, though each emphasizes a different aspect. AT focuses on the unity of reason and science inherent in monism and seeks to similarly rationalize social life, harnessing technology. ES focuses on the fragmentation intrinsic to atomism and seeks to model “natural laws” for the interaction of these atoms (like natural selection and market processes). In this sense, while ES and AT seem opposite, they are opposites within an aligned scientific worldview.
+### Mathematics
+19th century mathematics saw the rise of formality: being precise and rigorous about the definitions and properties of the mathematical structures we use, so as to avoid inconsistencies and mistakes. At the beginning of the 20th century, there was a hope that mathematics could be “solved”, perhaps even by giving a precise algorithm for determining the truth or falsity of any mathematical claim. 20th century mathematics, on the other hand, was characterized by much more uncertainty.
-     For all that shared worldview has inspired, the science of the 20th century showed its limitations. Relativity and even more quantum mechanics upended the Newtonian universe. Gödel’s Theorem and a variety of following works undermined the unity and completeness of mathematics and a range of non-Euclidean geometries are now critical to science. Symbiosis, ecology, and extended evolutionary synthesis undermined “survival of the fittest” as the central biological paradigm. Neuroscience has been reimagined around networks and emergent capabilities, which in turn have become conceptually central to modern computation. Critical to all these developments are ideas such as “complexity”, “emergence”, “networks”, and “collective intelligence” that challenge the elegance of monist atomism.
-
-
-### Complexity and emergence
-
-     The central idea of complexity science is that reduction of many natural phenomena to their atomic components (what we can call “reductionism”), even when conceptually possible, is often counterproductive. At the same time, studying complex systems as a single unit is often uninformative or impossible. Instead, structures (e.g. molecules, organisms, ecosystems, weather systems, societies) emerge from “atoms” at a range of (intersecting) scales that can be understood most usefully at least in part according to their own principles and laws rather than those governing their underlying components. Some of the common core arguments for “complexity”, or what we will call “pluralism”, in all the domains it is applied include:
-
-- Computational complexity: Even when reductionism is feasible in principle/theory, the computation required to predict higher-level phenomena based on their components (its computational complexity) is so large that performing it is unlikely to be practically relevant.
In fact, in some cases, it can be proven that the required computation would consume far more resources than could possibly be recovered through the understanding gained by such a reduction. This often makes the theoretical possibility of such reduction irrelevant and creates a strong practical barrier to reduction. -- Sensitivity, chaos, and irreducible uncertainty: To make matters worse, many even relatively simple systems have been shown to exhibit “chaotic” behavior. A system is chaotic if a tiny change in the initial conditions translates into radical shifts in its eventual behavior after an extended time has elapsed. The most famous example is weather systems, where it is often said that a butterfly flapping its wings can make the difference in causing a typhoon half-way across the world weeks later. In the presence of such chaotic effects, attempts at prediction via reduction require unachievable degrees of precision. To make matters worse, there are often hard limits to how much precision is feasible, as precise instruments often interfere with the systems, they measure in ways that can lead to important changes due to the sensitivity mentioned previously. The most absolute version of this is the Heisenberg Uncertainty Principle, which puts physical upper limits on measurement precision based on this logic. In the computational view of the universe, the principle of computational irreducibility (whereby certain computations (processes in the world) can only be known by carrying out the computation[^ComputationalIrreducibility]) articulates this challenge or impossibility of reduction. -- Multiscale organization: While some might take the above observations as a council of scientific despair, an alternative is to view it as a reason to expect a diversity of analytic/scientific approaches to be fruitful under different conditions, at different scales of analysis and in ways that will intersect with each other. Indeed, in evolution natural selection is known to operate at multiple levels with major evolutionary transitions occurring from the individual to groups that become a new, higher-level organism[^MultilevelSelection]. In this view, it is natural to seek to characterize these different approaches, their “scope conditions” (viz. when they are likely to be most useful), how they can interact with each other and to consider this sort of approach as a core part of the scientific endeavor. -- Relationality: Multiscale organization implies many imperfectly commensurable ways of knowing. But if these could each be sliced into distinct scientific spheres, could monist atomism still prevail within each of several scientific fields? Yet a critical element of complexity is that phenomena at different scales often determine the interactions between and even constitute the nature of items at other scales. Units at smaller scales, for example, may have their identities and the rules they obey constituted by the larger units they in turn make up. While approximations ignoring these interactions may be useful for some phenomena, it is frequently important to trace down these dependencies in other contexts and ensure one accounts for them. -- Embedded causality: As a result of the preceding points, causation can rarely be understood completely or exhaustively in a reductive manner, where the explanation of higher-level phenomena is reduced to simpler or more atomic components. 
Instead, while specific causal arrows may follow such a pattern, others in the same system will take an opposite form, where the behavior of “atoms” is explained by the way they are situated in larger systems. Causal analysis will thus have quasi-“circular” elements that form equilibria and independent causation will usually emerge from forces within these equilibria, rather than by predictable reduction to a constant set of atomic “unmoved movers”. - -     Together these elements constitute a basic reimagining of the scientific project compared to monist atomism. In monist atomism, the search for scientific truth and explanation resembles something of a process of digging from different start points on a planet’s surface towards its core: people may start from many different points, but as they strip away falsehood, superstition, error, and misunderstanding, they will all find the same underlying core of truth, reducing everything they see to the same fundamental elements. - -     In the plural view, the almost the exact opposite metaphor applies: the scientific pursuit resembles the building of structures outward from the surface of a planet. While these structures might initially crowd and compete, if they grow far enough out the space they have to fill expands into the infinite void beyond. As these structure branch out, they diversify and fragment, making the possibilities for them to interact and recombine ever richer and yet the potential of their converging to a single outcome ever more remote. Furthermore, each of these recombinations can, roughly as in sexual reproduction, form new structures that themselves extend further off on their own trajectories. Progress is complexity, diversification, and intersectional recombination. - - -     While this plural vision doesn’t offer the hope of final or absolute truth that monist atomism does, it offers something perhaps as hopeful: an infinite vista of potential progress, expanding rather than contracting as it moves on. As the scientific revolutions of the 20th century so dramatically illustrated, shifting to such a plural perspective spells not the end of scientific progress, but rather an explosion of its possibilities. - -### The plurality of scientific revolutions - -     The twentieth century, and in particularly the Golden Age highlighted in the previous chapter, was the most rapid period of scientific and technological advance in human history. These advances happened in a range of disparate fields, but one common thread runs through most: the transcendence of monist atomism and the embrace of the plural. We will illustrate this with examples from mathematics, physics, biology to neuroscience. - -**Mathematics** - -     Perhaps the most surprising reach of pluralism has been into the structure of truth and thought itself. The gauntlet for twentieth century mathematics was thrown down by David Hilbert, who saw a complete and unified mathematical structure within grasp around the same time that Lord Kelvin saw the passing of the closing of the frontier in physics. Yet while the century began with Bertrand Russell and Alfred North Whitehead’s famous attempt to place all of mathematics on the grounds of a single axiomatic system, developments from that starting point have been quite opposite. Rather than reaching a single truth from which all else followed, mathematics shattered into a thousand luminous fragments. 
+- **Gödel's theorem**: a number of mathematical results from the early 20th century, most notably Gödel's theorem, showed that there are fundamental and irreducible limits to what mathematics can fully resolve: any consistent set of axioms rich enough to describe arithmetic leaves some true statements unprovable. Similarly, Church proved that some mathematical problems were “undecidable” by computational processes. This dashed the dream of reducing all of mathematics to computations on basic axioms.
+- **Computational complexity**: Even when reductionism is feasible in principle/theory, the computation required to predict higher-level phenomena based on their components (its computational complexity) is so large that performing it is unlikely to be practically relevant. In fact, in some cases, it is believed that the required computation would consume far more resources than could possibly be recovered through the understanding gained by such a reduction. In many real-world cases, the problem at hand maps onto a well-studied computational problem whose exact, “optimal” algorithm takes an exponentially large amount of time, so good-enough “heuristic” algorithms get used in practice.
+- **Sensitivity, chaos, and irreducible uncertainty**: Many even relatively simple systems have been shown to exhibit “chaotic” behavior. A system is chaotic if a tiny change in the initial conditions translates into radical shifts in its eventual behavior after an extended time has elapsed. The most famous example is weather systems, where it is often said that a butterfly flapping its wings can make the difference in causing a typhoon half-way across the world weeks later. In the presence of such chaotic effects, attempts at prediction via reduction require unachievable degrees of precision. To make matters worse, there are often hard limits to how much precision is feasible, as precise instruments often interfere with the systems they measure in ways that can lead to important changes due to the sensitivity mentioned previously.
+- **Fractals**: many mathematical structures have been shown to have similar patterns at very different scales. A good example of this is the Mandelbrot set, generated by repeatedly squaring then adding the same offset to a complex number (a short sketch of this iteration follows Figure 1 below):
     Geometry and topology, once the province of Euclidean certainties, turned out to admit endless variations, just as the certainties of a flat earth vanished with circumnavigation. Axiomatic systems went from the hope for complete mathematical systems to being proven, by Kurt Gödel, Paul Cohen, and others to be inherently unable to resolve some mathematical problems and necessarily incomplete. Alonzo Church showed that other mathematical questions were undecidable by any computational process. Even the pure operations of logic and mathematics, it thus turned out, were nearly as plural as the fields of science we discussed above. To illustrate:
@@ -51,66 +29,40 @@
**Figure 1: The Mandelbrot Set (characterizing the chaotic behavior of simple quadratic functions depending on parameter values in the function) shown at two scales. Source: Wikipedia (left) and Stack Overflow (right).**
-- Church proved that some mathematical problems were “undecidable” by computational processes and subsequent work in complexity theory has shown that even when mathematical problems might be in principle decidable, the computational complexity of arriving at such an answer is often immense. This dashed the dream of reducing all of mathematics to computations on basic axioms.
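To make the **Fractals** point concrete, here is a minimal sketch of the iteration that generates Figure 1 (the region, resolution, and iteration cap are arbitrary illustrative choices of ours, not part of the original text): a point `c` belongs to the Mandelbrot set when repeatedly squaring and adding the same offset `c` never escapes to infinity.

```python
# Minimal sketch of the iteration behind the Mandelbrot set shown in Figure 1:
# a point c is "inside" if z -> z*z + c never escapes past |z| = 2.
# Region, resolution, and the iteration cap are arbitrary illustrative choices.

MAX_ITER = 100

def escape_time(c: complex) -> int:
    """Iterations of z -> z*z + c before |z| exceeds 2 (MAX_ITER if it never does)."""
    z = 0j
    for i in range(MAX_ITER):
        z = z * z + c
        if abs(z) > 2:
            return i
    return MAX_ITER

# Coarse ASCII rendering of the region [-2, 1] x [-1.25, 1.25].
for row in range(24):
    y = 1.25 - row * (2.5 / 23)
    line = ""
    for col in range(64):
        x = -2.0 + col * (3.0 / 63)
        line += "#" if escape_time(complex(x, y)) == MAX_ITER else " "
    print(line)
```

Sampling a much smaller window with the same loop keeps revealing comparably intricate boundary structure, which is the sense in which the set shows similar patterns at very different scales.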
-- Chaos has proven inherent even to many very simple mathematical problems. Perhaps the most famous example involves the behavior of the complex numbers of iterated application of quadratic polynomials. The behavior of such iterations turns out to form such intricate and rich patterns that characterizing them has become the source of “fractal art” as shown in Figure 1. These structures illustrate that even solutions to apparently “obvious” mathematical questions may depend on infinitely intricate details, that dazzle even our senses with their richness.
-- While mathematics is not primarily concerned with phenomena well described by scales, the above phenomena have implied that rather than collapsing into a single field, twentieth century mathematics blossomed into an incredible diversity of subfields and sub-subfields, covering a range of phenomena. Geometry alone has a dozen major subfields from topology to projective geometry, studying radically different and only loosely intersecting elements of what was once a single, highly axiomatic, and largely closed set of phenomena.
-- Relationality is a fundamental aspect of mathematics, as it concerns the study of the relationships between objects and the structures that emerge from those relationships. In mathematics, different branches are often interconnected, and insights from one area can be applied to another. For instance, algebraic structures are ubiquitous in many branches of mathematics, and they provide a language for expressing and exploring relationships between mathematical objects. Moreover, the study of topology is based on understanding the relationships between shapes and their properties. The mix of diversity and interconnectedness is perhaps the defining feature of modern mathematics
-- Again, while “causation” is not quite the right way to understand pure mathematics, one of the most remarkable features of this modern field is its opposition to the reductionist approach, where seemingly simple questions are reduced to axioms and everything filters down through these. Perhaps the most famous example is Fermat’s Last Theorem, the claim by a the 17th century mathematician to have proven that a simple equation admits no whole number solutions. The eventual proof in the 1990s by Andrew Wiles building off centuries of intervening mathematics involved a range of techniques (especially related to so-called “elliptic curves”) developed for other purposes far more apparently advanced that the statement itself. The same is believed to be true of many other unsolved mathematical problems, such as the Riemann Hypothesis.
+- **Relationality in mathematics**: different branches of mathematics are often interconnected, and insights from one area can be applied to another. For instance, algebraic structures are ubiquitous in many branches of mathematics, and they provide a language for expressing and exploring relationships between mathematical objects. The study of algebraic geometry connects these structures to geometry. Moreover, the study of topology is based on understanding the relationships between shapes and their properties. The mix of diversity and interconnectedness is perhaps the defining feature of modern mathematics.
-     Many of these advances in pure mathematics have remained puzzles of curiosity and toys of the mind. Yet many of these apparently abstruse ideas have helped transform modern technology.
The same elliptic curves that were central to Wiles’s proof are the foundation of one of the leading approaches to public key cryptography, given the intractability of certain solutions to problems involving them. Other advanced mathematics has proven core to the design of computer circuitry, medical image analysis, civil and aeronautical engineering, and more. Each of these applications depends on wildly different and only occasionality tangential areas of mathematics, rather than on the monolithic and integrated theory that Hilbert, Russell, and Whitehead once dreamed of.
+### Physics
+At the end of the 19th century, Lord Kelvin infamously proclaimed that “There is nothing new to discover in physics now.” The next century proved, on the contrary, to be the most fertile and revolutionary in the history of the field.
-     In short, in sharp contrast to the monist atomist vision, the world-defining science and technology built on it in the twentieth century arose from their diversity: fields of knowledge proliferated and speciated, and each field internally, like a fractal, mirrored the same richness. The closer we looked into each area, the greater intricacy revealed itself. Surprising connections and relationships have emerged, but have only added to the complexity, rather than implying “unity”.
+* **Einstein's theories of relativity** overturned the simplicity of Euclidean geometry and Newtonian dynamics of colliding billiard balls as a guide to understanding the physical world at very large scales. When objects travel at large fractions of the speed of light, very different rules start describing their behavior.
+* **Quantum mechanics and string theory** similarly showed that classical physics is insufficient at very small scales. Bell's Theorem demonstrated clearly that quantum physics cannot even be fully described as a consequence of probability theory and locally hidden information: rather, a particle can be in a combination (or “superposition”) of two states at the same time, and those two states can even _cancel each other out_.
+* **“Heisenberg’s Uncertainty Principle”** puts a firm limit on the combined precision with which the position and momentum of a particle can even be measured.
+* **The three body problem**, now famous after its central role in Liu Cixin's science-fiction series, shows that the interaction of even three bodies under simple Newtonian physics is chaotic enough that its future behavior cannot be predicted with simple mathematical formulas (the toy simulation after this list illustrates the point). However, we still regularly solve _trillion-body problems_ well enough for everyday use by using seventeenth-century abstractions such as “temperature” and “pressure”.
-     Structures at every level of intersecting scale and described from the perspective of every way of knowing have proven important to progress: nuclear bombs reshape human societies, setting off environmental changes that reshape weather, twisting human psychology and feeding into the designs of computational systems that help cure disease and so on.
-
-**Physics**
-
-Pluralism is perhaps least surprising in biological systems; we can see the complexity of these all around us in everyday life. More surprising, perhaps, is the way in which 20th century physics revealed that these principles go “all the way down”, to the heart of the physical sciences that Newtonian monist atomism pioneered.
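To make the three body problem bullet concrete, here is a toy numerical sketch of ours (the masses, starting positions, time step, and crude integrator are all arbitrary illustrative choices, not a serious celestial-mechanics code): two copies of the same system start with one coordinate differing by a billionth of a unit, and the gap between the two runs typically grows by many orders of magnitude.

```python
# Toy illustration of sensitivity to initial conditions ("chaos") in a
# three-body system. Two copies of the same planar system start with one
# coordinate nudged by 1e-9; we then track how far the copies drift apart.
# This is a crude demonstration, not a serious orbital integrator.

G = 1.0           # gravitational constant, arbitrary units
SOFTENING = 1e-6  # avoids numerical blow-up during very close encounters

def accelerations(pos, masses):
    """Newtonian gravitational acceleration on each body from all the others."""
    acc = []
    for i, (xi, yi) in enumerate(pos):
        ax = ay = 0.0
        for j, (xj, yj) in enumerate(pos):
            if i == j:
                continue
            dx, dy = xj - xi, yj - yi
            r3 = (dx * dx + dy * dy + SOFTENING) ** 1.5
            ax += G * masses[j] * dx / r3
            ay += G * masses[j] * dy / r3
        acc.append((ax, ay))
    return acc

def step(pos, vel, masses, dt):
    """One semi-implicit Euler step: update velocities, then positions."""
    acc = accelerations(pos, masses)
    vel = [(vx + ax * dt, vy + ay * dt) for (vx, vy), (ax, ay) in zip(vel, acc)]
    pos = [(x + vx * dt, y + vy * dt) for (x, y), (vx, vy) in zip(pos, vel)]
    return pos, vel

masses = [1.0, 1.0, 1.0]
pos_a = [(-1.0, 0.0), (1.0, 0.0), (0.0, 0.5)]
pos_b = [(-1.0, 0.0), (1.0, 0.0), (0.0, 0.5 + 1e-9)]  # tiny nudge to one body
vel_a = [(0.0, -0.3), (0.0, 0.3), (0.3, 0.0)]
vel_b = list(vel_a)

dt, steps = 0.001, 20000
for n in range(1, steps + 1):
    pos_a, vel_a = step(pos_a, vel_a, masses, dt)
    pos_b, vel_b = step(pos_b, vel_b, masses, dt)
    if n % 5000 == 0:
        gap = max(((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
                  for (xa, ya), (xb, yb) in zip(pos_a, pos_b))
        print(f"t = {n * dt:5.1f}   largest separation between the two runs = {gap:.2e}")
```

By contrast, the "trillion-body" gases mentioned above are tractable precisely because we stop tracking individual trajectories and summarize them with aggregate quantities like temperature and pressure.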
- -At the end of the 19th century, Lord Kelvin infamously proclaimed that “There is nothing new to discover in physics now.” The next century proved, on the contrary, to be the most fertile and revolutionary in the history of the field. Relativity (special and especially general), quantum mechanics, and to a lesser extent thermodynamics/information theory and string theory upended the Newtonian universe, showing that the simple linear-time, Euclidean-space objective reality of colliding billiard balls was at best an approximation valid in familiar conditions. The (post-)modern physics that emerged from these revolutions beautifully illustrates pluralism in science, illustrating how pluralism is, as suggested by the epigraph from prominent physicist Carlo Rovelli, baked into the very fabric of reality. - -- Computational complexity is the core reason for the field of thermodynamics and its many offshoots. In fact, the field of information theory so core to computer science is built almost entirely on top of concepts derived from thermodynamics. The impossibility of simulating the action of billions of sub-units (e.g., molecules in a gas or compound, electrons in a wire, etc.) implies the need for thermodynamic techniques describing the average behavior of these sub-units. -- The ideas of sensitivity, chaos, and irreducible uncertainty originate or at least achieved their first intellectual prominence in physics. The simplest example of a chaotic system is three comparably sized bodies acting under gravitational forces. The behavior of smoke, of ocean currents, of weather, and many more all exhibit chaos and sensitivity. And, as noted above, the most canonical and best-established example of irreducible uncertainty is “Heisenberg’s Uncertainty Principle”, under which the quantum nature of reality puts a firm upper limit on the precision with which the velocity and position of a particle can be measured. -- For both these reasons, modern physics is organized according to the study of a wide range of different scales, illustrated by the famous “scales of the universe” walk at New York’s Hayden planetarium that takes visitors from quarks through atoms, molecules, chemical compounds, objects, planets, stars, star systems, galaxies, etc. While all systems in theory obey the same set of underlying physical laws, the physics at each scale is radically different, as different forces and phenomena are dominant and in fact, physics at the smallest scales (quantum) has yet to be reconciled with those at the largest (general relativity). -- Perhaps the most striking and consistent feature of the revolutions in twentieth century physics was the way they upset assumptions about a fixed and objective external world. Relativity showed how time, space, acceleration, and even gravity were functions of the relationship among objects, rather than absolute features of an underlying reality. Quantum physics went even further, showing that even these relative relationships are not fixed until observed and thus are fundamentally interactions rather than objects, as highlighted by Rovelli above. His interpretations of more recent developments pull ideas of time and space further apart. -- Given the diversity of levels of reality, causation in physics is profoundly embedded, shifting and cycling across scales at dizzying speeds. Atomic interactions, carefully constructed by sentient beings harnessing nano-scale computing, can trigger explosions that destabilize a planet. 
Collisions between stars can lead to a collapse of a microscopic blackhole that becomes the center of a galaxy.
+Perhaps the most striking and consistent feature of the revolutions in twentieth century physics was the way they upset assumptions about a fixed and objective external world. Relativity showed how time, space, acceleration, and even gravity were functions of the relationship among objects, rather than absolute features of an underlying reality. Quantum physics went even further, showing that even these relative relationships are not fixed until observed and thus are fundamentally interactions rather than objects. Thus, modern science often consists of mixing and matching different disciplines to understand different aspects of the physical world at different scales.
     The applications of this rich and plural understanding of physical reality are at the very core of the tragedies of the twentieth century. Great powers harnessed the power of the atom to shape world affairs. Global corporations powered unprecedented communications and intelligence by harnessing their understanding of quantum physics to pack ever-tinier electronics into the palms of their customers’ hands. The burning of wood and coal by millions of families has become the cause of ecological devastation, political conflict, and world-spanning social movements based on information derived from microscopic sensors scattered around the world.
-
-**Biology**
+### Biology
If the defining idea of 19th century macrobiology (concerning advanced organisms and their interactions) was “natural selection”, the defining idea of the 20th century analog was “ecosystems”. Where natural selection emphasized the “Darwinian” competition for survival in the face of scarce resources, the ecosystem view (closely related to the idea of “extended evolutionary synthesis”) emphasizes:
-- The persistent inability to form effective models of animal behavior on reductive concepts, such as behaviorism, neuroscience, and so forth, illustrating computational complexity.
-• The ways in which systems of many diverse organisms (“ecosystems”) can exhibit features similar to multicellular life (homeostasis, fragility to destruction or over propagation of internal components, etc.) illustrating sensitivity and chaos.
-- The emergence of higher-level organisms through the cooperation of simpler ones (e.g., multicellular life as cooperation among single-celled organisms or “eusocial” organisms like ants from individual insects) and the potential for mutation and selection to occur at all these levels, illustrating multi-scale organization.
-- The diversity of interactions between different species, including traditional competition or predator and prey relationships, but also a range of “mutualism”, where organisms depend on services provided by other organisms and help sustain them in turn, exemplifying entanglement, and relationality.
-- The recognition of genetics as coding only a portion of these behaviors and of “epigenetics” or other environmental features to play important roles in evolution and adaptation, illustrating embedded causality.
-
+- **Limits to predictability of models**: we have continued to discover limits in our ability to make effective models of animal behavior that are based on reductive concepts, such as behaviorism, neuroscience, and so forth, illustrating computational complexity.
+- **Similarities between organisms and ecosystems**: we have discovered that systems of many diverse organisms (“ecosystems”) can exhibit features similar to those of multicellular life (homeostasis, fragility to destruction or over propagation of internal components, etc.), illustrating sensitivity and chaos.
+- **Higher-level organisms** that operate through the cooperation of simpler ones (e.g., multicellular life as cooperation among single-celled organisms or “eusocial” organisms like ants from individual insects). A particular property of the evolution of these organisms is the potential for mutation and selection to occur at all these levels, illustrating multi-scale organization.
+- **The diversity of cross-species interactions**, including traditional competition or predator and prey relationships, but also a range of “mutualism”, where organisms depend on services provided by other organisms and help sustain them in turn, exemplifying entanglement and relationality (a minimal predator-prey sketch appears at the end of this section).
+- **Epigenetics**: we have discovered that genetics codes only a portion of these behaviors, and “epigenetics” or other environmental features play important roles in evolution and adaptation, illustrating embedded causality.
     This shift wasn’t simply a matter of scientific theory. It led to some of the most important shifts in human behavior and interaction with nature of the 20th century. In particular, the environmental movement and the efforts it created to protect ecosystems, biodiversity, the ozone layer, and the climate all emerged from and have relied heavily on this science of “ecology”, to the point where this movement is often given that label.
-     While this point is easiest to illustrate with macrobiology, as it is more familiar to the public, the same lesson applies perhaps even more dramatically to microbiology (the study of the inner workings of life in complex organisms). That field has moved from a focus on individual organs and the mechanical study of genetic expression to a “systems” approach, integrating action on a range of scales and according to many different systems of natural laws. This may be best illustrated by focusing on perhaps the most complex and mysterious biological system of all, the human brain.
-
-**Neuroscience**
-    Modern neuroscience emerged from two critical discoveries about the functioning of brains. First, in the late 19th century, Camillo Golgi, Santiago Ramón y Cajal, and collaborators isolated neurons and their electrical activations as the fundamental functional unit of the brain. This analysis was refined into clear physical models by the work of Hodgkin and Huxley, who built and tested in on animals their electrical theories of nervous communication. Second, and more diffusely, a rich and nuanced picture emerged over the course of the twentieth century complicating the traditional view, often derided as “phrenology” that each brain function was physically localized to one region of the brain. Instead, while researchers like Paul Broca found important evidence of physical localization of some functions by studying brain lesion patients, a variety of other evidence including mathematical modeling, brain imaging, and single-neuron activation experiments suggested that many if not most brain functions are distributed across regions of the brain, emerging from patterns of interactions rather than primarily physical localization.
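As a concrete, deliberately simplified illustration of the relational point in the list above, here is a minimal sketch of the classic Lotka-Volterra predator-prey model; the parameter values and starting populations are arbitrary choices of ours, picked only to make the oscillation visible.

```python
# Minimal Lotka-Volterra predator-prey sketch: neither population's trajectory
# can be understood in isolation, only through the relationship between them.
# All parameter values below are arbitrary and purely illustrative.

alpha, beta = 1.1, 0.4    # prey birth rate; rate at which predation removes prey
delta, gamma = 0.1, 0.4   # predator growth per prey eaten; predator death rate

prey, predators = 10.0, 2.0
dt = 0.001

for n in range(60001):
    if n % 10000 == 0:
        print(f"t = {n * dt:4.0f}   prey = {prey:7.2f}   predators = {predators:6.2f}")
    # Euler step of the coupled equations: the interesting terms are the
    # products prey * predators, i.e. the interaction itself.
    d_prey = (alpha * prey - beta * prey * predators) * dt
    d_pred = (delta * prey * predators - gamma * predators) * dt
    prey += d_prey
    predators += d_pred
```

Mutualism can be sketched with the same machinery by letting each species' growth rate rise with the other's abundance; in both cases the boom-and-bust cycles and co-dependence live in the coupling terms rather than in either species on its own.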
- -    The understanding that emerged from these findings was that of a “network” of “neurons”, each obeying relatively simple rules for activation based on inputs, and updating the underlying connections based on co-occurrence. Again, the themes of pluralism emerge elegantly (THIS COULD USE SOME HARD LOOK BY REAL NEURO FOLKS): - -- Of all fields, neuroscience showed most sharply the bounds imposed by computational complexity. As early as the late 1950s, researchers beginning with Frank Rosenblatt built the first “artificial neural network” models of the brain and hoped to simulate a full human brain within a few years, only to discover that task was computationally many decades off if ever attainable, forcing a great diversification of ways (both model-based and experiment-based) for studying the brain. -- EXAMPLE OF SENSITIVITY AND CHAOS IN BRAIN HERE. SOME FIRST THOUGHTS IN FOOTNOTE, BUT THESE NEED TO BE BROUGHT TOGETHER MORE COHERENTLY AND CONSISTENTLY WITH OTHER MATERIAL TO BELONG HERE[^NeuroscienceComplexity] -- The wide-ranging investigation of different forms of partial physical localization and interaction centers around multiscale organization, where some phenomena are localized to very small structures (a few physically proximate neurons), while others are distributed over large brain regions, but not the entirety of the brain and others still are physically distributed but appear to be localized, at different scales, to various consistent networks of brain activity. -- The Hebbian model of connections, where they are strengthened by repeated co-firing, is perhaps one of the most elegant illustrations of the idea of “relationality” in science, closely paralleling the way we typically imagine human relationships developing. -- Neuroscience also elegantly illustrates embedded causality. Brain structure is famously plastic to learning and what is learned depends heavily on the social contexts that humans inhabit and construct as well as on the nutrients human economic and social system provide to brains. Thus, the higher-level phenomena (societies, relationships, economies, educational systems), which one might hope to help explain with features of human neuropsychology, are some of the central factors that shape the nature and function of those brains. Causation thus traces a classic circular pattern across levels. - -     Modern neuroscience has transformed this understanding into a range of applications: treatments of patients with damaged brains, development of psychiatric medicine, some treatments and interventions based on transcranial stimulation and other brain activation approaches, and more. Yet the most transformative technologies inspired by neuroscience have been at least partly digital, rather than purely biomedical. Neuroscience is increasingly central to two of the more exotic and exciting areas of digital technology development: brain-computer interfaces and the use of brain organoids as a substrate for computation. +### Neuroscience -     Most pervasively, the “neural network” architecture inspired by early mathematical models of the brain has become the foundation of the recent advances in “artificial intelligence”. Networks of trillions of nodes, each operating on fairly simple principles inspired by neurons of activation triggered by crossing a threshold determined by a linear combination of inputs, are the backbone of the “foundation models” such as BERT and the GPT models. 
These have taken the world by storm in the past half-decade and increasingly dominated the headlines in the last two years. All the critical features of neuroscience discussed above, and of pluralism more broadly (e.g., multiscale organization, relationality, embedded causation), manifest in the operation of these systems.
+    Modern neuroscience started in the late 19th century, when Camillo Golgi, Santiago Ramón y Cajal, and collaborators isolated neurons and their electrical activations as the fundamental functional unit of the brain. This analysis was refined into clear physical models by the work of Hodgkin and Huxley, who built their electrical theories of nervous communication and tested them on animals. More recently, however, we have seen a series of discoveries that put chaos and complexity theory at the core of how the brain functions:
+* **Distribution of brain functions**: mathematical modeling, brain imaging, and single-neuron activation experiments suggested that many if not most brain functions are distributed across regions of the brain, emerging from patterns of interactions rather than primarily physical localization.
+* **The Hebbian model of connections**, in which connections between neurons are strengthened by repeated co-firing, is perhaps one of the most elegant illustrations of the idea of “relationality” in science, closely paralleling the way we typically imagine human relationships developing.
+* **Study of artificial neural networks**: As early as the late 1950s, researchers beginning with Frank Rosenblatt built the first “artificial neural network” models of the brain. Neural networks have become the foundation of the recent advances in “artificial intelligence”. Networks with billions or trillions of connections between simple nodes, each node operating on principles loosely inspired by neurons (activation triggered when a weighted combination of inputs crosses a threshold), are the backbone of “foundation models” such as BERT and the GPT models.

### From science to society