The Status of Emergence, roundtable

Friday, October 24, 2003

Society for Literature and Science 17th Annual Conference, Austin, TX

October 23-26, 2003

The Status of Emergence Roundtables: Victoria Alexander (organizer/chair), Susan Oyama, Katherine Hayles, John Johnston, and Eve Keller.

Introduction by Victoria N. Alexander

The Status of Emergence I: Theory and Representation

The gap between the arts and sciences is due to the different forms of knowledge they pursue. The arts investigate the realm of subjective knowledge; the sciences seek to establish objective knowledge. Artists tend to look at meaning holistically. Scientists have argued that all phenomena can be analyzed by breaking them down into their component parts and discovering the physical laws that govern each part. Scientists are reductionists; artists are anti-reductionists. (This is a huge generalization. I asked the man on the street his opinion.)

The complexity sciences abandon the classical tradition of reductionism. These sciences study the irreducible aspects of natural processes. They are like artists in that they are interested in “emergent” phenomena, which are more than the sum of the parts. Mind as opposed to brain, the kind of self-organized complexity that characterizes life, and the meaning(s) of a literary work are three examples of emergent phenomena.

Such emergence is called epistemological emergence: it appears to be impossible to understand the “global” behavior of a complex system by analyzing the “local” behavior of its individual parts. Thus, complexity scientists study and compare the qualitative behaviors of different dynamical systems as wholes.[1] A case for ontological emergence, by contrast, would require a quantitative definition of emergent properties.

So, even though some scientists have adopted an “artistic” approach to understanding complex phenomena, the opposition between subjective (qualitative) and objective (quantitative) forms of knowledge in the sciences persists. Complexity science remains a fringe science.

You might say that Developmental Systems Theory and Constructivism both take “artistic” approaches to understanding emergence. Their methods agree in many ways with complexity science.

The Developmental Systems approach to evolutionary theory defines an organism holistically, as an emergent phenomenon. Development is deterministic, but the results do not come directly from the initial conditions because, during the process of development, interactions generate effective factors that make perfect prediction impossible.

Developmental Systems researchers don’t only count genes that are passed on. They count whole systems or “repeated assemblies.” They believe that determining whether or not a gene has survived through successive generations is not sufficient to conclude that there is continuity in the changing population.

To define a developing system as a system is a messier project than counting genes. Gene-centered critics might claim that defining developing systems is a qualitative evaluation, whereas counting genes is a quantitative one. An observer looking at a system tries to define the roles or functions of the parts within the whole. Functions depend on context and perspective. So the problems encountered in arguing for the Developmental Systems approach to questions of biological nature are similar to those complexity scientists encounter in arguing for the objectivity of emergent phenomena.

Constructivists argue that mental representations emerge from interactions between brain processes and the environment. The observer’s hardware and prior experience influence the way an object is perceived. As Prof. Hayles puts it, constructions result from “self-organizing, transformative processes.”

If Constructivists, Developmental Systems Theorists, and complexity scientists believe that emergent properties can’t be quantified (in the usual reductive way), then we may align them with post-classical science and subjectivism.

But Prof. Oyama does not advocate an “anything goes” theory of biological nature. She says it is not a solution to say “that we have no nature, for this is apt to be taken as a claim that we can be anything, or else as a denial of within-species commonalities.”

Likewise Prof. Hayles argues for a notion of “constrained constructivism.” Two species with different tools of perception and different kinds of processes used to transform and understand reality are still able to communicate. Despite differences, “constraints operate universally to eliminate certain configurations from the range of possible answers.” According to Prof. Hayles, constrained constructivism “rescues scientific inquiry from solipsism and radical subjectivism.”

Meanwhile, however, the science wars between humanities scholars and conservative scientists continue. Objectivity remains normal science’s defining standard and goal. Conservative scientists assume that emergent phenomena can be reduced; we just haven’t figured out how yet. Without quantitative analysis, you can’t make a case for ontological emergence.

The resolution of this anomaly won’t come about with one side or the other winning.  The solution will dispose of the need to see things in terms of a strict subjective/objective dichotomy. I believe this is what scholars like Profs. Oyama and Hayles are doing. I also think that their work can be compared to very recent work in complexity science.

In the past, efforts have been made to quantify complexity by comparing complex systems as wholes, but if wholes are irreducible and depend on context and function, these measurements are subjective interpretations. However, theoretical physicists now claim it is possible to describe emergent phenomena quantitatively rather than qualitatively.  They look for the local rules of development, not the overall pattern.

In a beating heart, the fairly regular beat is the overall pattern. It is not a direct result of the sum of activities of the individual cells. Heart muscle is made up of cells that each pulse randomly (spontaneously according to its own internal state). But since cells are correlated (i.e. the activity of one cell affects its neighbor’s internal state), nonlinear relationships come to exist within a living heart.[2] This means that localized groups of cells (ensembles) self-organize and start pulsing in a rule-governed manner. The rule might be described by a structurally complex string: on, on, off, on, off, on, wildcard, on, on, off (repeat),  which includes opportunities for indeterminate activity. These complex behaviors travel through the heart tissue in waves resulting in a fairly regular periodic beat that can speed up and slow down depending on the body’s needs. The rule itself is not confined to particular cells. While the rule remains stable, there is fluidity in the ensemble borders. A dynamically stable rule results in an adaptable heartbeat. The adaptable heartbeat is an example of self-organization, of emergent behavior, that appears to be purposeful or designed in such a way as to create and then to sustain a cohesive and irreducible whole by sometimes resisting change and sometimes adapting to it.
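The mechanism described above can be made concrete in a few lines of code. The sketch below is my own illustration, not a model drawn from the roundtable or from Winfree’s work: it treats each cell as a simple pulse-coupled unit with its own charging rate and lets a firing cell nudge its immediate neighbors. All names and numerical parameters (N_CELLS, THRESHOLD, COUPLING, and so on) are assumptions chosen only to show how spontaneous local activity plus correlation can tend toward clustered, roughly periodic collective firing.

```python
# A minimal sketch (illustrative assumptions only) of correlated, spontaneously
# pulsing cells organizing toward a collective beat.

import random

N_CELLS = 50       # cells arranged in a ring, each coupled to its two neighbors
THRESHOLD = 1.0    # a cell fires ("on") when its internal state reaches this value
COUPLING = 0.15    # how strongly a firing cell nudges its neighbors
STEPS = 200

# Each cell starts with a random internal state and a slightly different
# intrinsic charging rate, so in isolation the cells would pulse out of sync.
state = [random.random() for _ in range(N_CELLS)]
rate = [0.05 + 0.02 * random.random() for _ in range(N_CELLS)]

firing_counts = []  # number of cells that fire at each time step

for t in range(STEPS):
    fired = []
    # Spontaneous local dynamics: every cell charges according to its own rate.
    for i in range(N_CELLS):
        state[i] += rate[i]
        if state[i] >= THRESHOLD:
            fired.append(i)
    # Correlation: each firing cell resets and nudges its two neighbors,
    # which biases nearby cells toward firing together in later steps.
    for i in fired:
        state[i] = 0.0
        for j in ((i - 1) % N_CELLS, (i + 1) % N_CELLS):
            state[j] = min(THRESHOLD, state[j] + COUPLING)
    firing_counts.append(len(fired))

# Early firings tend to be scattered; as neighbors pull one another into step,
# firings tend to cluster into larger near-simultaneous groups.
print("firings per step, first 20 steps:", firing_counts[:20])
print("firings per step, last 20 steps: ", firing_counts[-20:])
```

Comparing the early and late firing counts gives a toy version of the wave-like coordination described above: local ensembles fall into step even though no cell follows a global plan.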

This example is easily translated into any group of correlated cells: traders in a financial market, ants in a colony, neurons firing in a human brain, or individual artists working in a single genre. The degree of complexity of the rule that governs local behavior can be quantified.  However, the self-organized behavior of the heart as a whole remains irreducible and unpredictable. The local rule is an “objective” measure of complexity in the sense that it is generated by the system it is defining (i.e. it is the intrinsic computation being performed by the system), and it is subjective in the sense that some places in the string are more meaningful than others. For instance, “off” is always followed by “on,” but “on” may be followed by “on” or “off” depending on where a cell is in the string. In other words, context matters. This approach, called Computational Mechanics,[3] developed by Jim Crutchfield, allows one to measure with relative objectivity the structural complexity of an emergent whole. The study of emergence can move into accepted science, while changing the definition of what is acceptable in science.
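To suggest how such a local rule might be measured, here is a small sketch of my own. It is only a caricature of the idea behind Computational Mechanics, not Crutchfield’s actual reconstruction algorithm: it generates a long symbol string from the illustrative on/off rule above (filling the wildcard position randomly), groups length-k histories that predict the next symbol in roughly the same way, and reports the number of such predictive “states” together with a crude, entropy-based stand-in for statistical complexity. The rule template, the history length k, and the rounding used to merge histories are all assumptions made for illustration.

```python
# A rough caricature (my own, not Crutchfield's algorithm) of grouping histories
# by their predictive consequences and scoring the resulting structure.

from collections import defaultdict
from math import log2
import random

def generate_string(length):
    """Emit symbols from the illustrative heart rule above:
    on, on, off, on, off, on, wildcard, on, on, off (repeat),
    with 'on' = '1', 'off' = '0', and the wildcard filled in randomly."""
    template = ["1", "1", "0", "1", "0", "1", "*", "1", "1", "0"]
    return "".join(
        random.choice("01") if template[i % 10] == "*" else template[i % 10]
        for i in range(length)
    )

def predictive_states(s, k=3):
    """Group length-k histories by the (rounded) distribution of the symbol
    that follows them -- a crude stand-in for causal-state reconstruction."""
    next_counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(s) - k):
        next_counts[s[i:i + k]][s[i + k]] += 1
    states = defaultdict(list)
    for history, nxt in next_counts.items():
        total = sum(nxt.values())
        signature = tuple(sorted((sym, round(n / total, 1)) for sym, n in nxt.items()))
        states[signature].append(history)
    return states, next_counts

string = generate_string(5000)
states, next_counts = predictive_states(string, k=3)

# A rough stand-in for statistical complexity: the entropy of the state
# distribution, with each state weighted by how often its histories occur.
weights = {sig: sum(sum(next_counts[h].values()) for h in hists)
           for sig, hists in states.items()}
total = sum(weights.values())
complexity = -sum((w / total) * log2(w / total) for w in weights.values() if w)

print("predictive states found:", len(states))
print("crude statistical complexity:", round(complexity, 2), "bits")
```

The point of the sketch is only that the measure is computed from the string itself, the system’s own behavior, rather than from an outside description of the whole; that is the sense in which the text above calls such a measure relatively objective.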

If the objectivity of emergent phenomena can be proved, then we may be entering the post post-classical era of science, in which the interests of artists and scientists will not seem so opposed.

Profs. Hayles, Oyama, Johnston, and Keller have each contributed to the interrogation of dichotomies: subjective/objective, artificial/natural, machine/organism, nature/nurture, mind/matter. With a theory of emergence, now authorized by science, we understand why these dichotomies don’t make sense. Mind is qualitatively different from the materiality from which it originates, but it is not disembodied or separate from that matter.

The Status of Emergence II: Scholarship and Popularization

In “The Art of Fiction” (1884), Henry James argues that the dynamical interaction of impressions acquired over one’s lifetime gives one the “power to trace the implication of things, to judge the whole piece by the pattern” (53), an idea remarkably similar to recent research in complexity science, which seeks to understand emergent properties, the global pattern, by discovering the underlying local rules governing the parts. Many artists, such as James, have insisted that the whole is more than the sum of the parts, and that what a thing is, is given in its dynamical interactions. We might argue then, as Eve Keller has, that new discoveries about the way intentionality emerges from distributed parts embedded in the environment are not so new, but rather a rediscovery of old ideas. These ideas, she notes, were ousted by Enlightenment reductionism.

John Johnston has argued that early cybernetics research was undervalued in its day. He shows how the early cyberneticists succeeded in breaking down the dichotomy between man and machine by creating machines that behaved in an organic way, with distributed parts functioning dynamically to give rise to self-organized behavior. Cybernetics was abandoned in favor of Artificial Intelligence, a linear, strictly mechanistic approach to mimicking intentional behavior.

Last session, we discussed how the subjectivity/objectivity dichotomy might be overcome. In this session, we will consider the trends in science that help create the dichotomies between art & science, subject & object, man & machine, nature & artifice.

In this session, we will also take up a more practical issue for those of us working in art-science relations. The most productive art-science collaborations would involve scientists who use aesthetics to understand natural processes, rather than using art merely to illustrate scientific concepts. Likewise, humanities scholars should focus on art and artists that use science not just as a subject but as a tool for understanding the artistic process. Only if this two-way interaction is accomplished will general audiences be persuaded of the relevance of art-science explorations. Today’s roundtables are, I think, setting an example of this kind of interaction.

We must also try to make discussions of the art/science interface accessible to general audiences. Much of the content is fairly challenging, and we can’t expect audiences to want to make that extra effort unless they are interested in the issues. The content of art-science programs has to be compelling. For example, we can show that the complexity sciences relate, at a very basic and profound emotional level, to major philosophical questions about the nature of selfhood: Is human thought “something more” that cannot be reduced to mechanistic description? Do we have free will, or is our behavior uniquely predetermined? Do artists intentionally create works of art, or are their works merely produced by the cultural/social/material environment in which they work?

Today we’re going to be comparing two notions of selfhood, one associated with Enlightenment Humanism and the other associated with emergence and the posthuman. Profs. Keller and Johnston have both investigated notions of selfhood as conceived by cybernetics researchers, insofar as those researchers attempted to make life-like machines, to define what it is to be alive, and to mimic intelligent behavior.

Is selfhood an emergent form, in the way in which we defined emergence last session? Or is selfhood given a priori, predetermined by initial conditions, as the Humanists would have it?

The humanist subject, in Keller’s words, “compared what it encountered in the world to an abstract conceptual model of that world built into its central control system.” The posthuman self, in contrast, involves distributed control. Individual parts interact with the environment, resulting in an overall self-organized behavior. In Johnston’s words, cybernetic self-organization is “fully embodied and situated in the world.”

According to a strictly mechanistic view of the cosmos, the concept of a self is an illusion. If humanism and posthumanism argue that there is a self, then they are both anti-reductive, supposing that a self exists that is something more than the sum of the parts. The question that distinguishes the two views of selfhood, humanism and posthumanism, is this: where, when, and how does the “more” in “more than the sum of the parts” originate?

Is it pre-existing, or separate from matter? Variations of the idea of a localized control center or plan are found in a mind that controls the activity of a human, a God that controls the cosmos, a seed that determines a tree, and DNA that directs the development of an organism. In this view, humans are unique among animals in their possession of a self, a directing mind somehow distinct from matter.

Or is the “more” in the “more than the sum of its parts” a product of the interaction of parts? Does an intentional self emerge from anarchy? In this view of selfhood, humans are not unique in their possession of intention. Even things, such as robots, and even events can be intentional.

Susan Oyama was trained in Harvard University’s Department of Social Relations, and she has retained her early commitment to interdisciplinary scholarship. She is best known for her writings on the nature/nurture opposition, the notion of genetic information, and alternative understandings of the processes of development and evolution. Her publications include Evolution’s Eye and The Ontogeny of Information; she is co-editor of Cycles of Contingency. She is presently Professor Emerita at John Jay College and the CUNY Graduate School.

Katherine Hayles is Professor of English and Design/Media Arts at the University of California, Los Angeles. Author of such celebrated books as Chaos Bound and How We Became Posthuman, she is currently working on two books: Virtual Bodies: Evolving Materiality in Cybernetics, Literature, and Information, a history of cybernetics from 1945 to the present that relates it to poststructural critical theory and contemporary literature, and Riding the Cusp: The Interplay between Narrative and Formalisms, a collection of essays showing the importance of narrative in a series of scientific sites, from game theory to sociobiology and artificial life.

Eve Keller is Associate Professor of English at Fordham University. Her research focuses on the interanimation of early modern medicine and emerging notions of the self. Her work has been published in Milton Quarterly, Prose Studies, ELH, 18th Century Studies, and elsewhere. She is currently completing a book manuscript called “Generating Subjectivity: The Rhetorics of Reproduction in Early Modern England.”

John Johnston, Emory University, specializes in modern/postmodern fiction, British and American poetry, and critical theory. He is the author of Carnival of Repetition: Gaddis’s The Recognitions and Postmodern Theory and Information Multiplicity, Literature, Media, Information Systems. He is currently working on two monographs. The first, entitled The Lure of the Post-Natural: Information Machines in Contemporary Culture, is a study of how the new information machines first introduced by Alan Turing cross and erode conceptual boundaries at the conjunctions of information processing and dynamic systems theory, with chapters on psychoanalysis and cybernetics, Artificial Life, evolution and complex systems theory, machinic vision, distributed systems in the new AI and CogSci, and new narratives of the Internet (collective intelligence, the virtual, the post-human). The second, entitled How It Changed: New Media and Complexity in Contemporary Science Fiction, is a study of the new sites and agents of change, and particularly the role of new media, in contemporary science fiction. The study will argue that this fiction constitutes a new Bakhtinian chronotope in which humanity splits and fractalizes and new forms of identity appear; it will focus on the human/machine, human/virus, and human/alien interfaces across a range of novels by Philip K. Dick, Rudy Rucker, William Gibson, Bruce Sterling, Pat Cadigan, Michael Swanwick, Richard Calder, Greg Egan, Neal Stephenson, Linda Nagata, and others.

[1] Michael Silberstein and John McGeever, “The Search for Ontological Emergence,” Philosophical Quarterly 49 (1999): 182-200.

[2] Arthur T. Winfree, The Geometry of Biological Time (New York: Springer-Verlag, 1980).

[3] Computational Mechanics seeks to discover and quantify structure using a combination of computation theory, statistical inference, and information theory; it seeks to detect the intrinsic computation being performed by the system. By contrast, measures such as the entropy rate and the Kolmogorov complexity rate do not measure pattern, structure, correlation, or organization.
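To illustrate that last point, here is a small sketch of my own under simple assumptions: it estimates the entropy rate of two strings from block entropies and shows that an unstructured coin-flip string scores highest even though it has no organization at all, while a rigidly repeating string scores much lower. The number tracks randomness, not structure.

```python
# A small illustration (my own, simple assumptions) that the entropy rate
# rewards randomness rather than measuring structure or organization.

from collections import Counter
from math import log2
import random

def block_entropy_rate(s, k=8):
    """Estimate the entropy rate as H(length-k blocks) / k, in bits per symbol."""
    blocks = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    total = sum(blocks.values())
    h = -sum((n / total) * log2(n / total) for n in blocks.values())
    return h / k

n = 20000
periodic = ("110101" * n)[:n]                                # organized, fully predictable
coin_flips = "".join(random.choice("01") for _ in range(n))  # unorganized, unpredictable

# The coin flips score near 1 bit/symbol; the periodic string scores far lower.
print("periodic string:", round(block_entropy_rate(periodic), 3), "bits/symbol")
print("coin flips     :", round(block_entropy_rate(coin_flips), 3), "bits/symbol")
```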