Evolutionary Programming IV, Proceedings Of The Fourth Annual Conference On Evolutionary Programming. John R. McDonnell, Robert Reynolds and David Fogel, eds. A Bradford Book, MIT Press, Cambridge, 1995. Pages 319-331.

Ethnography of Artificial Culture: Specifications, Prospects, and Constraints

Nicholas Gessler

 

Abstract

In a recent article I discussed some of the possible research benefits for anthropology of applying the computational paradigm of artificial life (AL) to the scientific study of cultural evolution. I referred to this program as artificial culture (AC). Culture, in this view, comprises not only the cognitive processes of individuals (what they think), but also their behaviors (what they do) and the products of their labors (what they make). AC comprises a population of individual agents, each with its own sensors, cognizers, and actuators, interacting with other agents, with products of their own manufacture, and with an external world containing objects and limited resources situated on a grid. All inanimate objects are given a materiality constrained by a physics, and animate objects are further constrained by a metabolism. Further specifications, prospects, and constraints are discussed here for a simplified implementation called T.I.E.R.S. (Trade, Information Exchange, and Risk Sharing). Along with robot sociology, this strategy may help to clarify the role of the individual in society, the dynamics of cooperation and competition, and the general epistemology of emergence.

 

1 BACKGROUND

The continuing evolution of economical high-speed computers has made it possible to extend the domain of scientific modeling in ways never before imagined. Capitalizing on a continual state of revolution in computer science, particularly in evolutionary algorithms, multi-agent simulation, and the emergent phenomena of artificial life, artificial culture (Gessler 1994a, note 1) is poised at the convergence of at least three major intellectual traditions. 1) The science fiction vision of evolving sentient beings and societies inside of computers. 2) The search for scientific approaches to complex adaptive phenomena. 3) The quest for a theory of culture emphasizing the role of materiality in directing cultural change. The challenge is before us, open to those who are willing to breach the confines of traditional disciplinary domains to explore a new uncharted frontier.

 

Exhibit A - Science Fiction

Personetics... A specific type of personoid activity serves as a triggering mechanism, setting in motion a production process that will gradually augment and define itself; in other words, the world surrounding these beings takes on an unequivocalness only in accordance with their own behavior... As hundreds of experiments have shown, groups numbering from four to seven personoids are optimal, at least for the development of speech and typical exploratory activity, and also for 'culturization.' On the other hand, phenomena corresponding to social processes on a larger scale require larger groups. At present it is possible to 'accommodate' up to one thousand personoids, roughly speaking, in a computer universum of fair capacity... studies of this type, belong... to a separate and independent discipline - sociodynamics. (Lem 1978, 167ff.)

 

Exhibit B - Science

The external world exists in its own right, and that includes the properties of the archaeological record... It is the availability of the external world, regardless of the character of our cognitive devices, that makes it possible for science to work. We can learn the limitations of our own ideas, as science has demonstrated over and over again, through skillful interaction with the world of experience, the external world. (Binford 1986, 403.)

Explanation... (may proceed) without the necessity for absolute, universal laws of behavior. (Read 1986, 11.)

 

Exhibit C - Anthropology and Archaeology

American archaeology is anthropology or it is nothing. (Binford 1962.)

Behavior is the working out of deeper, structuring properties, hence the focus on the regularity of behavior and behavioral products characteristic of much of scientific archaeology incorrectly directs attention away from the structuring processes towards their consequences. (Read 1986, 16.)

The dynamics between individual action - which, ultimately, is the source for societal attributes measured at a more summary level - and group properties, including societal organization and cultural systems, is a constant problem that archaeological theorizing has not adequately addressed. (Read 1990, 50.)

(The ethnographer) becomes increasingly dependent upon informants to provide him or her with information regarding their knowledge and beliefs in terms of which the local people operate... (Ethnographers) are still not operating in a scientific role. Instead, they have adopted the role of intercultural translators... (This) has compelled many social researchers to rely on their informants to create their data. In turn, these same informants guide the interpretation and ultimately mediate the understanding of the data. What ethnographers report is not data but information, the intellectualized expression of experience. (Binford 1986, 395-96.)

In fact, I believe we can profitably do without the concept "culture," since it appears to be unoperational in analysis. (Hill 1977, 103.)

Under the direct influence of postmodern philosophers and literary critics such as Paul De Man, Jacques Derrida, and Michel Foucault, interpretationist anthropologists have adopted an increasingly arrogant and intolerant rhetoric aimed at ridding anthropology of all vestiges of scientific "totalizing" paradigms. (Harris 1994, 65.)

 

The Challenge Before Us

As should be clear from the statements above, the struggle for a science of culture is not a goal shared by all anthropologists. Still, there are those of us who persist (Gessler 1989) despite the many countertrends in contemporary anthropology. The road to a science of culture by way of artificial culture appears to hold out promise for advances along two major fronts of scientific inquiry, notably cultural materialism (after Marvin Harris) and processual archaeology (after Lewis Binford). AC offers hope for the development of an operational definition and analysis of culture.

 

2 PHILOSOPHY OF ARTIFICIAL CULTURE

There are two likely paths for philosophers to follow in their encounters with artificial life: They can see it as a new way of doing philosophy, or simply as a new object worthy of philosophical attention using traditional methods. Is artificial life best seen as a new philosophical method or a new phenomenon? There is a case to be made for each alternative, but I urge philosophers to take the leap and consider the first to be more important and promising. (Dennett 1994, 291.)

There is only a weak analogy between the philosophies of artificial intelligence and artificial life (Dyer 1994a, Keeley 1995). The latter has its own distinctive implicit philosophy which it shares with artificial culture. The epistemology is based upon a modeling relationship, the quest for a congruence between our representational models and external world phenomena. These representations may vary in complexity, understandability, ease-of-use, and predictive power. There is a hierarchy of possible representations, from the most intuitively understandable and simple to the most incomprehensible and complex. Cognitive representations are largely understandable, personal, and certainly portable, but are limited in power, although varying in complexity. They are bounded by the host organism’s physical body. Formal representations may extend the boundaries outside of the host organism’s body to writing and diagrammatic notation. Static artifactual models are extra-somatic artifacts which may be either graphic or sculptural (e.g. an architect’s model). Dynamic artifactual models are artifacts incorporating mechanisms (e.g. model vehicles). Computational models may embrace more than the above, becoming largely counter-intuitive (not understandable), impersonal, and not portable, but they may nevertheless be extremely powerful embodiments of theory in calculating engines (e.g. computer simulations). Highly immersive models, such as virtual realities, are computational representations which dominate the sensorium through visual and other sensory interfaces. It is among massively parallel computational models, broadly defined, that we see the highest promise (Resnick 1994). In the near future, additional opportunities may arise from mixed computational strategies, including hybrid digital and analog circuitry (in robotics) and molecular and biological structures.

The goals for this new anthropological paradigm are four. 1) Discover the minimal primitives required to produce interesting and suggestive global behaviors. 2) Explore the relationship between perturbations at various levels and the resultant global behaviors. 3) Produce global behaviors which will pass the culturization test. 4) Add insight to a growing interest in a comprehensive theory of emergence.

The methodology is to construct an artificial culture (AC), which like a natural culture (NC) may be subjected to all the traditional operations of archaeology and ethnography, with all of their methodological difficulties, advantages and shortcomings. Although an AC may defy understanding in ways similar to a NC, at least an AC may be "captured" and scrutinized in ways that are impossible for its natural counterpart. Unlike a NC, an AC may be "rewound" and played again, with all of its parameters and states knowable to the anthropologist. Unlike a NC, an AC may be played out with alternate scenarios, and explored with a full range of "what if" experiments. An AC may further be given characteristics not found in any contemporary NC, such as agents with different cognitive abilities and now-extinct environments, so that problems of prehistory (which constitutes 99% of our hominid evolution) may be more fully analyzed and synthesized.
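
As a concrete illustration of what "rewinding" and "what if" experimentation might look like computationally, the following sketch (in Python) replays a run from a stored random seed and then re-runs it under an alternate scenario. The function run_world, its parameters, and its placeholder dynamics are hypothetical stand-ins of my own, not part of any existing implementation.

    # Minimal sketch: deterministic replay and "what if" experiments.
    # run_world and its toy dynamics are illustrative assumptions only.
    import random

    def run_world(seed, params, steps=100):
        """Toy stand-in for an AC run; returns a trace of world states."""
        rng = random.Random(seed)                 # all stochasticity from one seeded source
        state, trace = params["initial_population"], []
        for t in range(steps):
            state = max(0, state + rng.choice([-1, 0, 1]))   # placeholder dynamics
            trace.append(state)
        return trace

    baseline = {"initial_population": 20}
    what_if = {"initial_population": 40}          # an alternate scenario

    # The same seed reproduces the run exactly ("rewinding"); changing only the
    # parameters plays the "what if" scenario against an identical history of
    # random events.
    assert run_world(42, baseline) == run_world(42, baseline)
    print(run_world(42, baseline)[-1], run_world(42, what_if)[-1])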

We live in an emergent world in which quick, small processes give rise to slower, larger behaviors. It is a world in which evolution proceeded from the bottom-up, enabling top-down processes to be nested within previously evolved bottom-up structures. From such processes, there emerge different levels of phenomena, each level seemingly obeying different "laws" or regularities. Constructing a theory of emergence would necessitate linking those "laws" across different levels. The discovery of these levels and their linkages is one goal for a science of emergent complexity. It is a grand narrative that has from time to time been attempted (Miller 1978), and is worth repeated efforts.

The term cyberculture is already encountered loosely in anthropological discourse, but a concise set of meanings is in the process of being articulated (Read & Gessler 1995). Cyberculture may best be conceived as a culture which is mediated in some significant way by computer technology. Cyberculture is the culture of, and the culture in, computational media. It includes the collection of computational representations of culture in which the physics and spatiality of the environment, as well as the cognition and materiality of the actors, have been materially changed. As a subset of computationally mediated culture, artificial culture projects (Gessler 1994a, Karakotsios 1992 and 1994-95), along with related approaches such as Artificial Society (Bankes 1994), Cultural Algorithms (Flannery et al 1989, Reynolds 1994) and others (Lansing & Kremer 1994), may soon attract serious attention within anthropology.

One might envisage, for example, the outlines of another postorganic form of anthropology developing in the context of cyberspace, an anthropology specifically engaged in addressing the problems of engineering cyberspatial forms of intelligence. (Tomas 1992, 33.)

 

3 SPECIFICATIONS FOR ARTIFICIAL CULTURE

Minimally, an architecture for artificial culture should embrace a number of the following primitive objects and functionalities, which are listed hierarchically from the most primitive to the most derived. The derived objects and functionalities should inherit the attributes of all their ontogenic primitives.

Physics should comprise the dimensions of both space and time. Space may be planar or toroidal and measured in continuous or cellular units. Geometries based upon either grid or vector measurements may be mixed or uniform. Time should run forward and be reversible like a video to review and analyze rapidly changing action. Random seeds should be available to ensure the repeatability of experiments. Time steps may vary or be uniform for each object.
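
To make the spatial specification concrete, the sketch below (illustrative only; the grid size and neighborhood are arbitrary assumptions) shows how a toroidal, cellular space can be realized by wrapping coordinates modulo the grid dimensions, so that agents moving off one edge reappear on the opposite edge.

    # Minimal sketch of a toroidal cellular space; dimensions are arbitrary.
    WIDTH, HEIGHT = 50, 50

    def wrap(x, y):
        """Map any coordinate pair onto the torus."""
        return x % WIDTH, y % HEIGHT

    def neighbors(x, y):
        """Cells adjacent to (x, y) under toroidal geometry (Moore neighborhood)."""
        return [wrap(x + dx, y + dy)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]

    print(wrap(51, -1))          # (1, 49): off the right edge and above the top
    print(len(neighbors(0, 0)))  # 8 neighbors, even at a "corner" of the grid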

Objects should include natural inanimate objects, flora, and fauna. Persons (agents or personoids) should include sensors, cognizers, and actuators. Analyzers should include situated and non-situated personoid observers, emergent pattern detectors, and analyzers. Natural Inanimate Objects should include unique identifiers, provenience, mass, size, resource values, and decay functions (e.g. mean-time-to-decay). Natural Flora should include progenitors, and reproductive, metabolic, growth, and vitality functions. Natural Fauna should include limited sensor, cognizer, and actuator functions.
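
One way to realize the requirement that derived objects inherit the attributes of their primitives is a simple class hierarchy. The sketch below is a minimal Python illustration; the attribute names and the decision to derive Fauna from Flora are assumptions of mine rather than part of the specification.

    # Minimal sketch of the object hierarchy; names are illustrative assumptions.
    from dataclasses import dataclass, field
    import itertools

    _ids = itertools.count()

    @dataclass
    class InanimateObject:
        provenience: tuple            # (x, y) location of origin
        mass: float
        size: float
        resource_value: float
        mean_time_to_decay: float
        uid: int = field(default_factory=lambda: next(_ids))   # unique identifier

    @dataclass
    class Flora(InanimateObject):     # inherits materiality from the primitive object
        progenitor: int = -1          # uid of parent plant (-1 = none)
        growth_rate: float = 0.1
        vitality: float = 1.0

    @dataclass
    class Fauna(Flora):               # further adds limited sensing (metabolism inherited)
        sensing_range: float = 2.0

    berry_bush = Flora(provenience=(3, 7), mass=1.0, size=0.5,
                       resource_value=4.0, mean_time_to_decay=200.0)
    print(berry_bush.uid, berry_bush.vitality)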

Persons should include sensors, cognizers, and actuators. Sensors (input to agents) should distinguish exteroceptive (external) from interoceptive (internal) sources. External sensors should include seeing (vision), hearing (sounds and signals), and listening (to speech). Reception should decay as a function of range. Internal sensors (for self-awareness) should include motivational levels for satisfying drives and seeking or avoiding hunger, danger, stress, and sex.

Cognizers (mental culture or "what people think"), a.k.a. strategizers, should make use of a number of algorithms and functionalities. Strategies should explore bottom-up approaches emphasizing situatedness, embodiment, grounded intelligence, and distributed emergence (Brooks 1991). Representations may be variously instantiated as combinations of sign/symbol clusters in n-dimensional attribute-space, which may include either associative (symbolic) or grounded (sensed) referents and whose dimensions may include static or contextually dependent axes. Geographic space may be represented cognitively as either a landmark decision tree or a map projection. Operators and rules may be represented as multiple agents (e.g. Minsky 1986, Steels 1994) or as PROLOG-like or LISP-like functions. Learning may be encouraged by implementing Darwinian or Lamarckian evolutionary strategies on either individuals or populations, mimicking ontogenic and phylogenic processes. In addition to evolutionary programming, genetic algorithms, neural nets, genetic programming with LISP-like trees, and Automatically Defined Functions should be evaluated (Koza 1992 and 1994). The highest goal should be to design an architecture that would encourage and facilitate the spontaneous creation and capture of emergent-pattern-detectors which could then heighten the cognitive abilities of the agent. The emergence of higher-order behaviors involving manipulation or tactical deception would be of particular interest.

Actuators (behavioral culture or "what people do"), a.k.a. agent outputs, should include such abilities as take-object, leave-object, make-object, alter-object, eat, drink, mate-person, challenge-person, fight-person, seek-person, flee-person, make-sound, and speak-language. It is expected that a dynamically evolved, self-defined language would be of primary interest. Its translation into English should also present an interesting challenge. Internal actuators should link the above functionalities to incrementing or decrementing specific motivational drives.
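
The following sketch (Python, purely illustrative) shows one way the sensor / cognizer / actuator decomposition could be wired together in an agent's update cycle. The drive names, the sensing range, and the trivial decision rule are my own assumptions, not part of the specification.

    # Minimal sketch of a person's sense -> cognize -> act cycle.
    import math, random

    class Person:
        def __init__(self, x, y):
            self.x, self.y = x, y
            self.drives = {"hunger": 0.5, "danger": 0.0}     # interoceptors

        def sense(self, world):
            """Exteroceptors: perceive nearby objects, attenuated by range."""
            percepts = []
            for obj in world:
                dist = math.hypot(obj["x"] - self.x, obj["y"] - self.y)
                if dist <= 5.0:                              # reception decays with range
                    percepts.append((obj, dist))
            return percepts

        def cognize(self, percepts):
            """A trivial strategizer: approach the nearest food if hungry."""
            food = [(o, d) for o, d in percepts if o["kind"] == "food"]
            if self.drives["hunger"] > 0.3 and food:
                return ("seek", min(food, key=lambda p: p[1])[0])
            return ("wander", None)

        def act(self, decision):
            """Actuators: behavior feeds back on the internal drives."""
            action, target = decision
            if action == "seek" and target is not None:
                self.x, self.y = target["x"], target["y"]
                self.drives["hunger"] = max(0.0, self.drives["hunger"] - 0.4)  # eat
            else:
                self.x += random.choice([-1, 0, 1])
                self.y += random.choice([-1, 0, 1])

    world = [{"kind": "food", "x": 2, "y": 3}]
    p = Person(0, 0)
    p.act(p.cognize(p.sense(world)))
    print(p.x, p.y, p.drives)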

Artifacts (material culture or "what people make") should include the following: manufactory requirements (manpower and materials); operational requirements (manpower, energy, and contingencies); maintenance requirements (manpower, mean-time-before-failure, and mean-time-to-repair); and functionality (manpower, matter/energy, information utility).
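
A minimal data structure carrying these requirement bundles might look like the following sketch; the field names and the example artifact are illustrative assumptions, not drawn from any implementation.

    # Minimal sketch of an artifact record; names are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Requirements:
        manpower: int = 0
        materials: dict = None                       # e.g. {"wood": 1}
        energy: float = 0.0
        mean_time_before_failure: float = float("inf")
        mean_time_to_repair: float = 0.0

    @dataclass
    class Artifact:
        name: str
        manufactory: Requirements
        operation: Requirements
        maintenance: Requirements
        information_utility: float                   # part of its functionality

    digging_stick = Artifact(
        name="digging stick",
        manufactory=Requirements(manpower=1, materials={"wood": 1}),
        operation=Requirements(manpower=1, energy=0.2),
        maintenance=Requirements(manpower=1, mean_time_before_failure=300.0,
                                 mean_time_to_repair=5.0),
        information_utility=0.1,
    )
    print(digging_stick.name)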

Observers should be implemented to collect data as agents either situated and "within" the simulation or non-situated and "outside" it. They may be visible or invisible, naive, trained, or super-human. Situated observers would include naive agents acting subjectively, as well as trained ethnographers and archaeologists operating more objectively within the culture, interacting with and visible to the other agents in the culture. Non-situated observers would include agents radically "altered" in their functionalities to range from super-human to god-like, thus approaching various idealizations of objective observation, including "the view from nowhere" (Nagel 1986).

Analyzers should include the following: behavior analyzers and pattern recognizers commensurable with those used for NCs (Harris 1964); strategy determiners possibly based upon the notion of intentionality; and narrative generators (Gessler 1994b). Normative analyzers should characterize the ratio of shared to non-shared culture. Information route and mode characterizers should track direct and indirect, materially mediated (stigmergic) information flow within the culture. Interpreters should assess language meaning, sensory groundedness, and symbolic associativity.
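
As one crude illustration of a normative analyzer, the sketch below computes a ratio of shared to non-shared culture across a population, under the simplifying assumption (mine, not the paper's) that each agent's culture can be summarized as a set of traits.

    # Minimal sketch of a normative analyzer over trait sets.
    def shared_ratio(trait_sets):
        shared = set.intersection(*trait_sets)       # traits every agent holds
        union = set.union(*trait_sets)               # traits held by anyone
        non_shared = union - shared
        return len(shared) / len(non_shared) if non_shared else float("inf")

    population = [{"fire", "trade", "taboo_A"},
                  {"fire", "trade"},
                  {"fire", "trade", "taboo_B"}]
    print(shared_ratio(population))   # 2 shared traits / 2 non-shared traits = 1.0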

Visualizers should include run-time visualizations (user analyzed and transient), run-time archiving of visualizations, facilities for instant-replay and time-reversal, and multiple-run visualizers of phase and attribute spaces. Visualizations will be an important element in establishing the culturicity or goodness-of-fit between an AC and a NC.

 

4 PROSPECTS FOR T.I.E.R.S.

T.I.E.R.S. is an acronym for Trade, Information Exchange, and Risk Sharing. It is software under development to explore some of the "primitives" hypothesized to explain the origins of human society and the dynamics of hunter-gatherer-level societies. During the course of cultural evolution, especially during hominidization, the emergence of new food-getting strategies, including trade, information exchange, and risk sharing, has been the focus of numerous theoretical and empirical debates. In simplified form, these strategies are being implemented in software to test their interrelationships.
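
To give a sense of what "simplified form" might mean in practice, the sketch below (Python, my own illustration rather than the actual T.I.E.R.S. code) reduces the three interactions to their barest form, assuming agents hold stocks of two foods and a memory of known resource patches.

    # Minimal sketch of trade, information exchange, and risk sharing.
    import random

    class Forager:
        def __init__(self, name):
            self.name = name
            self.stock = {"meat": random.randint(0, 4), "plants": random.randint(0, 4)}
            self.known_patches = set()

        def trade(self, other):
            """Exchange a surplus good for a scarce one when it helps both parties."""
            if self.stock["meat"] > other.stock["meat"] and \
               other.stock["plants"] > self.stock["plants"]:
                self.stock["meat"] -= 1;  other.stock["meat"] += 1
                other.stock["plants"] -= 1;  self.stock["plants"] += 1

        def exchange_information(self, other):
            """Pool knowledge of resource patches (information exchange)."""
            both = self.known_patches | other.known_patches
            self.known_patches = other.known_patches = both

        def share_risk(self, other):
            """Even out holdings to buffer a bad foraging day (risk sharing)."""
            for good in self.stock:
                total = self.stock[good] + other.stock[good]
                self.stock[good], other.stock[good] = total // 2, total - total // 2

    a, b = Forager("a"), Forager("b")
    a.known_patches.add((3, 7))
    a.trade(b); a.exchange_information(b); a.share_risk(b)
    print(a.stock, b.stock, b.known_patches)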

This implementation offers the promise of evaluating a number of contemporary hypotheses and approaches to understanding the evolution and origins of culture. Among many potential applications are the following: the processes of hominidization; the strategies of reproduction and parental investment; the origins of task differentiation, specialization, professionalization, and class; the dynamics of corporate and kinship groups; the notion of trade as tolerated theft; the balance of exchange and risk-management strategies; the co-evolution (plurality) of adaptational strategies; and finally, the trend towards increasing social complexity from hunter-gatherers, to agriculturists, to the state and civilization.

One of the major issues of contention among anthropologists is the relative importance they attribute to super-structural (cognitive) versus infra-structural (material/energetic) determinants of cultural change and stability. T.I.E.R.S. will attempt to build artificial worlds in which both explanatory strategies can be tested. It will include the exploration of the following language functionalities: language as an externally referential predictive model, as a manipulative device, and as a medium of deception and ambiguity; language as a self-referential construct of associativity; and language as an epiphenomenon, as a post-hoc rationalization of pre-determined behavior. It will include the exploration of the following artifactual functionalities: artifacts as stigmergic repositories of information; artifacts as matter/energy repositories and processors; and artifacts as physical models of external world processes.

The T.I.E.R.S. implementation is an ambitious program, but it appears to be a necessary undertaking. Its orderly development will require the systematic assessment of alternative methods of representing each of the primitives of the simulation. It would seem that only an exhaustive search through the space of possible representations will keep the project from being viewed as a "toy solution waiting for a problem." However, we may find that the problems we encounter in simulating artificial cultures may lead to the recognition of new kinds of data and problems that will need to be extracted from natural cultures.

 

5 MEASURING CULTURICITY

Successful correspondence between the model and its referent may be determined as a function of their similarities, including their primitives, structures, processes, behaviors, and emergences of interest to the researcher. This parallels the argument that a deep analogy may also indicate a contagious analogy between model and referent.

Observing people watching silent visualizations of artificial life (AL), one frequently notices spontaneous outpourings of emotions characteristically reserved for natural life (NL). Without verbal cues, people empathize with the objects of the visualizations. They laugh, they cry, they worry, and they feel the pain that they imagine the objects are feeling. Despite knowing the simulated agents are artificial, observers are led by a powerful feeling that these objects are alive. When forced to describe what they are seeing, viewers are hard pressed not to use a vocabulary rich in metaphors of life. All of these reactions are indications that AL produces visualizations which our sensory and cognitive apparatus have evolved to recognize as belonging to living creatures. The creatures of AL are certainly alive in this sense, but they are alive in a very different way. They are not natural, but artificial. They live not in our familiar world, but in an electronic world where the laws of physics and consequently biology may be redefined at will. In short, they live in their own physics, but they do live. We may make a parallel argument for artificial culture.

We might justifiably ask at what threshold a simulation deserves to be called an AC. We might derive our inspiration from the artificial intelligence community, where the Turing test was proposed to give an operational definition of intelligence: in short, intelligence should recognize intelligence. The test was to determine whether a human could recognize humanness when the obvious material differences in physics between a human and a computer were concealed. To that end, a human and a computer would each be connected to a human observer through a monitor and keyboard passing information in both directions. If the human observer could not distinguish between the human and the computer at the other end, then the computer passed the test.

A modified Turing test has been proposed for artificial life. Known lightheartedly as the purring test (Tilden 1994), it was based upon the principle that life should recognize life: if a cat recognizes an artificial creature as being alive, then it is in some sense alive. In this same vein, I propose the culturicity test as applicable to artificial culture, based upon the principle that a member of a culture should recognize a culture. If we hide the obvious and irrelevant differences in physics between a natural culture and an artificial culture, by viewing the NC through the same visualization which we use for the AC, and if the human observer cannot distinguish between naturally and artificially produced behavior, then we may in the same sense talk about real culture instantiated in both natural and artificial worlds. The culturicity test is passed to the extent that an informed observer interprets the visualizations produced by an AC as being visualizations reduced from a NC. Believability is the test. Based loosely upon a formalization for AL (Dyer 1994b), culturicity (the degree to which an observation exhibits "culture") may be defined as a function of its physics, its structure, its visualization, and its complexity. Physics is the laws of physics, chemistry, and physiology extant in the world. Structure is the suite of primitives used. Visualization is the medium through which that world is viewed. Complexity is the magnitude of the minimal specification of that system.
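
Read schematically rather than as a derivation, this definition might be written as follows; the symbols are my own shorthand, not notation taken from Dyer's formalization.

    % Schematic only; the functional form f is left unspecified.
    % C = culturicity, P = physics, S = structure (suite of primitives),
    % V = visualization medium, K = complexity.
    \[
      C \;=\; f(P, S, V, K),
      \qquad
      K \;=\; \bigl|\text{minimal specification of the system}\bigr|
    \]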

 

6 CONSTRAINTS

Ideally, one would hope to build a computational system in which emergences would be captured and used as primitives for yet higher level emergences (Steels 1991). Although we have not yet ruled out the possibility of creating a computational environment in which an avalanche of emergences will evolve, there may be some serious limitations to that realization.

In the real world, everything is emergent from everything else. Even the primitives of a system under analysis are in fact emergent manifestations of yet deeper primitives hiding beneath. In the natural world these cascading primitives are simply there. They exist. In an artificial world their antecedents need to be specified. We must speculate in advance as to which primitives we will fix as impenetrable, and which primitives we will specify as the result of deeper emergences, but it is difficult to foresee how far we need to take each primitive in order to give rise to a particular culturally emergent phenomenon. This will probably have to be determined by experiment. The current speed, capacity, and power of high performance computers make it feasible to model emergence, even at the expense of having to specify both a spatiality and materiality that are inherently foreign to solid-state electronics.

With the progress being made in robot ethology, a.k.a. autonomous mobile robot collective behavior (Mataric 1993 and 1994, Steels 1994a and 1994b, Kube & Zhang 1994), we may eventually find it easier to model cultural emergence using these hybrid computational/material platforms rather than solely computational platforms. Although robots are materially different from ourselves, they nevertheless share with us a materiality which does not have to be coded into an all-informational, exclusively computational platform. On the other hand, for evolutionary studies, the actual materiality of the robot is an impediment to operations such as mutation and recombination, which may be more easily performed on the virtual materiality of a simulation. For similar reasons, as well as for speed and efficiency, the advent of molecular computers may further accelerate our modeling abilities (see Adleman 1994 and 1995, Birge 1995, Lipton 1994, Mead 1990).

It would be interesting to know how to stimulate and compound emergences. I offer the following hypothesis: the richness of emergence expected from a model is a function of the richness of its embedded primitives. Should this be borne out in practice, then in AC as in AL, the computational limits to the implementation of a vast array of primitives would seem to be our ultimate constraint. Beyond being the subject of new anthropological investigations, artificial culture may significantly reform our outlook, requiring us to change the very questions we ask of the natural and artificial worlds. We may find that AC data are not commensurate with NC data. Relevant NC data may have been overlooked or may be impractical to collect. In this eventuality, I offer the following heuristic: the absence of evidence is not the evidence of absence.

 

NOTE

The term artificial culture was coined in consultation with Michael Dyer, Computer Science, UCLA, during 1993.

 

REFERENCES

Adleman, Leonard M. (1994). Molecular Computation of Solutions to Combinatorial Problems. Science 266, 1021-1024 (11 November 1994).

Adleman, Leonard M. (1995). On Constructing a Molecular Computer. Draft dated 8 January 1995.

Bankes, Steve (1994). Exploring the Foundations of Artificial Societies: Experiments in Evolving Solutions to Iterated N-Player Prisoner’s Dilemma. In Artificial Life IV, Proceedings Of The Fourth International Workshop On The Synthesis And Simulation Of Living Systems, eds. Rodney Brooks and Pattie Maes. MIT Press, Cambridge, 337-342.

Binford, Lewis R. (1962). Archaeology as Anthropology. American Antiquity 28:2, 217-225.

Binford, Lewis R. (1986). Data, Relativism, and Archaeological Science. Man (N.S.) 22, 391-404.

Birge, Robert R. (1995). Protein-Based Computers. Scientific American 272:3, 90-95 (March 1995).

Brooks, Rodney (1991). Intelligence Without Reason. In Proceedings of the Twelfth International Joint Conference on Artificial Intelligence (IJCAI-91), 569-595.

Dennett, Daniel (1994). Artificial Life as Philosophy. Artificial Life 1: 291-292.

Dyer, Michael (1994a). Toward Synthesizing Artificial Neural Networks that Exhibit Cooperative Intelligent Behavior: Some Open Issues in Artificial Life. Artificial Life 1:1/2, 111-134.

Dyer, Michael (1994b). Discussions at the James Reserve Workshop, Idyllwild, California, 4-6 November 1994.

Flannery, Kent V., Joyce Marcus and Robert G. Reynolds (1989). The Flocks of the Wamani: A Study of Llama Herders on the Punas of Ayacucho, Peru. Academic Press, San Diego.

Gessler, Nicholas (1989). The Sciences of Man. In The Outer Shores: Proceedings of the Queen Charlotte Islands First International Scientific Symposium, eds. Geoffrey G.E. Scudder and Nicholas Gessler. Queen Charlotte Islands Museum Press, British Columbia, 195-198.

Gessler, Nicholas (1994a). Artificial Culture. In Artificial Life IV, Proceedings Of The Fourth International Workshop On The Synthesis And Simulation Of Living Systems, eds. Rodney Brooks and Pattie Maes. MIT Press, Cambridge, 430-435.

Gessler, Nicholas (1994b). Automatic Narrative in Artificial Culture. Abstract in The Society for Literature and Science 1994 Conference. SLS, New Orleans.

Harris, Marvin (1964). The Nature of Cultural Things. Random House, New York.

Harris, Marvin (1979). Cultural Materialism: The Struggle for a Science of Culture. Random House, New York.

Harris, Marvin (1994). Cultural Materialism is Alive and Well and Won’t Go Away until Something Better Comes Along. In Assessing Anthropology, ed. Robert Borofsky. McGraw Hill, New York, 62-76.

Hill, James N. (1977). Systems Theory and the Explanation of Change. In Explanation of Prehistoric Change, a School of American Research Book. University of New Mexico, Albuquerque, 59-104.

Karakotsios, Ken (1992). SimLife: The Genetic Playground. Software published by Maxis, Orinda.

Karakotsios, Ken and Michael Bremer (1992). SimLife: The Official Strategy Guide. Prima Publishing, Rocklin.

Karakotsios, Ken (1994-1995). Personal communications by electronic mail, 6 November 1994 and 5 January 1995.

Keeley, Brian L. (1995). Against the Global Replacement: On the Application of the Philosophy of Artificial Intelligence to Artificial Life. In press, The Philosophy of Artificial Life, ed. Margaret Boden. Oxford University Press, Oxford.

Koza, John R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press, Cambridge.

Koza, John R. (1994). Genetic Programming II: Automatic Discovery of Reusable Programs. MIT Press, Cambridge.

Kube, C. Ronald and Hong Zhang (1994). Collective Robotics: From Social Insects to Robots. Adaptive Behavior 2:2, 189-218.

Lansing, J. Stephen and James N. Kremer (1994). Emergent Properties of Balinese Water Temple Networks: Coadaptation on a Rugged Fitness Landscape. In Artificial Life III, ed. Christopher G. Langton. SFI Studies in the Sciences of Complexity, Proceedings Volume XVII, Addison Wesley, Reading, 201-224.

Lem, Stanislaw (1978). Non Serviam. In A Perfect Vacuum. Harcourt, Brace, Jovanovich, San Diego, 167-196.

Lipton, Richard J. (1994). Speeding Up Computations via Molecular Biology. Draft dated 9 December 1994.

Mataric, Maja J. (1993). Designing Emergent Behaviors: From Local Interactions to Collective Intelligence. In From Animals to Animats 2: Proceedings of the Second International Conference on Simulation of Adaptive Behavior (SAB92). MIT Press, Cambridge, 432-441.

Mataric, Maja J. (1994). Learning to Behave Socially. In Dave Cliff, Philip Husbands, Jean-Arcady Meyer, Stewart Wilson, eds. From Animals to Animats 3: Proceedings of the Third International Conference on Simulation of Adaptive Behavior. MIT Press, Cambridge, 453-462.

Mead, Carver (1990). Neuromorphic Electronic Systems. Proceedings of the IEEE 78:10, October 1990, 1629-1636.

Miller, James Grier (1978). Living Systems. McGraw-Hill, New York.

Minsky, Marvin (1986). The Society of Mind. Simon and Schuster, New York.

Nagel, Thomas (1986). The View from Nowhere. Oxford University Press, New York.

Read, Dwight and Nicholas Gessler (1995). Cyberculture. In Encyclopedia of Cultural Anthropology. In press, Human Relations Area Files Inc., New Haven.

Read, Dwight W. (1986). Mathematical Schemata and Archaeological Phenomena: Substantive Representations or Trivial Formalism? Science and Archaeology 28: 16-23.

Read, Dwight W. (1990). The Utility of Mathematical Constructs in Building Archaeological Theory. In Mathematics and Information Science in Archaeology: A Flexible Framework, ed. Albertus Voorrips. Studies in Modern Archaeology 3. Holos, Bonn.

Resnick, Mitchel (1994). Turtles, Termites, and Traffic Jams: Explorations in Massively Parallel Microworlds. MIT Press, Cambridge.

Reynolds, Robert G. (1994). An Introduction to Cultural Algorithms. In Proceedings of the Third Annual Conference on Evolutionary Programming, eds. Anthony V. Sebald & Lawrence J. Fogel. World Scientific, Singapore, 131-139.

Steels, Luc (1991). Towards a Theory of Emergent Functionality. In From Animals to Animats: Proceedings of the First International Conference on Simulation of Adaptive Behavior, eds. Jean-Arcady Meyer and Stewart W. Wilson. MIT Press, Cambridge, 451-461.

Steels, Luc (1994a). Emergent Functionality in Robotic Agents Through On-Line Evolution. In Artificial Life IV, Proceedings Of The Fourth International Workshop On The Synthesis And Simulation Of Living Systems, eds. Rodney Brooks and Pattie Maes. MIT Press, Cambridge, 8-14.

Steels, Luc (1994b). A Case-Study in the Behavior-Oriented Design of Autonomous Agents. In Dave Cliff, Philip Husbands, Jean-Arcady Meyer, Stewart Wilson, eds. From Animals to Animats 3: Proceedings of the Third International Conference on Simulation of Adaptive Behavior. MIT Press, Cambridge, 445-452.

Tilden, Mark (1994). Attribution made during discussions at the James Reserve Workshop, Idyllwild, California, 4-6 November 1994.

Tomas, David (1992). Old Rituals for New Space: Rites de Passage and William Gibson’s Cultural Model of Cyberspace. In CYBERSPACE: FIRST STEPS, ed. Michael Benedikt. MIT Press, Cambridge, 31-47.