Artificial Life

A BIOLOGIST NAMED THOMAS RAY has devised what he believes to be the first truly open-ended digital evolution system. To Ray, allowing the system to find its own fitness, a natural selection as opposed to an artificial one, was key to creating living things on the computer. Ray's own definition of life hinged on that factor: "I would consider a system to be living if it is self-replicating, and capable of open-ended evolution..." he wrote. "Artificial selection can never be as creative as natural selection.... Freely evolving creatures will discover means of mutual exploitation and associated implicit fitness functions that we would never think of."

Ray's interest in the field was not that of a digital alchemist but of a professional biologist. Although he studied chemistry at Florida State University and had been planning to take further undergraduate work in physics and math, Ray became interested in ecology, "sort of in the sixties frame of mind," he explains, somewhat sheepishly. He completed a doctorate in biology at Harvard, and did fieldwork in the rain forests of Costa Rica. But one particular experience from his Cambridge days settled into his mind like a dormant spore. Ray had taken an interest in the Chinese game of Go, and one day in the late 1970s he was the recipient of a remarkable one-on-one deconstruction of the ancient game, from a beak-nosed, ponytailed hacker working at MIT's Artificial Intelligence Lab. To Ray's astonishment this person coolly analyzed the game in biological terms, matter-of-factly mentioning that computer programs could self-replicate. Ray instantly made the connection between self-replication and natural selection and became very excited at the implications.

At the time, Ray's computer experience was insufficient to experiment with the concept. And soon, the pressures of his subsequent passion, rain forest conservation, took precedence. It was not until late in 1989, when Ray had become an assistant professor at the University of Delaware, that the spore revivified. Ray had become familiar with the workings of personal computers, and he also followed the news of computer viruses. For some reason, the words of the mystery Go player tumbled back into his head. Could computer viruses be included among the potential life forms the hacker had postulated? Could he exploit these possibilities to perform a digital form of the Darwinism he had studied so closely this past decade? Ray became determined to find out.

No one else at Delaware was much interested. When Ray brought up the idea at a graduate seminar in ecology, "I was virtually laughed out of the room," he says. Ray's colleagues, who had previously voted him down for tenure, considered the premise wacky. But Ray persisted. Although he had a grant to study tropical ecologies, he neglected the project. Instead he hatched ideas for techniques of stimulating evolution. "It was something that was obsessing me, and I felt I had to go where the flow of my energies were," he says. "Artificial life was the thing that kept me awake at night."

Wondering whether others were similarly impassioned, he posted an inquiry on various computer networks and was led to the proceedings of the first artificial-life conference. They galvanized Ray. He arranged to go to New Mexico to visit Langton, Farmer, and the other T-13 a-life researchers to discuss his idea for an open-ended evolutionary system.

It was a good thing he did. Ray's idea had been to create creatures consisting of computer instructions who would "live" inside the machine's core memory and compete for space in that silicon terrain. A potentially treacherous plan. Although Ray planned to run his experiment in an isolated personal computer labeled "containment facility" and protected with metal bars covering the disk drive and serial port, there was no guarantee that, through negligence or sabotage, his creatures would not be transferred to other computers. If, for instance, they found their way into one of the time-sharing mainframes on the Delaware campus, they could infect other jobs working on the computer or even migrate from that machine to the data highways of the international computer network. Ray's experiment could have been the equivalent of importing a deadly predator to an ecology that had evolved no protection against such an invader. It could be even more destructive than the notorious "Internet Worm" loosed on the computer nets in November 1988 by a mischievous Cornell student, almost exactly a year before Ray's trip to Los Alamos. Unlike the comparatively primitive worm, Ray's organisms would be constantly evolving. Natural selection would favor those organisms most difficult to eradicate, and, like certain insects immune to DDT, mutated variations of Tom Ray's experiment might become permanent, and unwelcome, residents on the computer nets.

Langton and Farmer suggested a modification, based on Turing's perception that any digital computer could emulate any other digital computer. They suggested, in effect, that Ray should create an imaginary computer and simulate its operation within a real computer. That way, his organisms, in their competition for memory space in a virtual computer, could use a nonfunctional computer language, one that worked only in the model. If someone attempted to liberate the creatures and use them outside this theoretical cage, the code would not work.
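
The containment scheme amounts to writing an interpreter for an instruction set that exists nowhere else. A minimal sketch of that idea, in Python and with opcodes invented purely for illustration (this is not Ray's code), shows why such creatures are inert outside the simulator:

    def run(program, steps=100):
        """Interpret a toy instruction set inside a simulated machine."""
        memory = list(program)          # the creature's code lives in simulated memory
        acc, ip = 0, 0                  # accumulator and instruction pointer
        for _ in range(steps):
            op = memory[ip % len(memory)]
            if op == "inc":             # invented opcodes, meaningless to a real CPU
                acc += 1
            elif op == "dec":
                acc -= 1
            elif op == "halt":
                break
            ip += 1
        return acc

    print(run(["inc", "inc", "dec", "halt"]))   # prints 1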

Langton and Farmer warned Ray not to expect too much from his experiment. Ray's ideas of open-ended evolution depended on the creation of viable creatures whose subsequent mutations would drive the system toward a diverse set of more complex creatures. Mutations, however, were more often destructive than beneficial. Although natural organisms, with built-in redundancy, can accommodate occasional mutations, computer programs generally cannot. Non-open-ended simulated evolution systems such as genetic algorithms and Dawkins-style biomorphs avoided this problem by having an outside force weed the population by a predetermined definition of fitness, a neat way to sweep away poorly mutated organisms. Because an open-ended system found its own fitness, Ray would not have that advantage.

But Ray thought he knew the way around the problem. Again, the virtual computer concept was the hero. Because an imaginary computer's machine-language requirements could be made much less exacting than those of a real computer, Ray could devise a specialized use of a computer instruction set that would be more forgiving to mutations. The scheme relied on using what Ray called "electronic templates." These were small blocks of computer instructions contained in each organism; replication occurred when the organism found the opposite template in the environment. Because the environment was well stocked with potential matching templates, even mutated organisms with altered instruction blocks could easily reproduce. In addition, when an organism searched for complementary templates it was in effect examining its environment. Thus Ray's digital organisms had the equivalent of sensory apparatuses. By searching their environment for matching parts, Ray's creatures behaved in the spirit of von Neumann's imaginary kinematic self-reproducing automaton.
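
Published descriptions of Tierra explain that its templates are runs of two kinds of no-operation instruction, and that addressing works by searching outward for the complementary pattern. A rough sketch along those lines, with names and details invented for illustration, might look like this:

    COMPLEMENT = {"nop0": "nop1", "nop1": "nop0"}

    def find_complement(soup, template, start):
        """Search outward from start for the complementary template; return its address."""
        target = [COMPLEMENT[t] for t in template]
        n, size = len(soup), len(target)
        for offset in range(1, n):
            for addr in (start + offset, start - offset):
                if 0 <= addr <= n - size and soup[addr:addr + size] == target:
                    return addr
        return None   # no match found: the organism cannot act on this template

    soup = ["add", "nop0", "nop0", "mov", "jmp", "nop1", "nop1", "copy"]
    print(find_complement(soup, ["nop0", "nop0"], start=1))   # prints 5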

As soon as Ray returned to Delaware, he began creating the artificial environment he would call Tierra. Previous work in open-ended artificial evolution focused on the origin of life in an attempt to evoke the behavior of biology from a prebiotic environment. The archetypical example was the VENUS simulator, codesigned by Steen Rasmussen, the Danish physicist who was part of the Los Alamos T-13 group. Although Ray considered VENUS interesting, he felt that it was unnecessary to begin so early in biological history. "It's based on the physics mentality; the Los Alamos guys want life to evolve virtually from quarks!" he says, the dismissive hyperbole underlining his conviction that his approach is superior. "They want to start with fundamental particles and get life to emerge spontaneously, at the origin-of-life level. What they get are more like molecules than chemistry. There's no individuality. It's a far cry from organisms."

Ray modeled his system on a later stage in life's development, the explosion of biological diversity that signaled the onset of the Cambrian Era, roughly six hundred million years ago. From a relative paucity of phyla, the earth teemed with unprecedented new life forms. Ray believed that his system's exploitation of open-ended evolution, if not providing a similar profusion, would demonstrate the mechanics of that diversification.

The Tierran system was a competition for computer processing time and memory space. Whereas natural organisms drew energy from the sun to maintain their order, the digital organisms within the Tierran environment drew their energy from the virtual computer's central processing unit (CPU) and used that energy to power the equivalent of their own energy centers, virtual CPUs assigned to each organism. The components of the virtual computer (its CPU, memory, and operating-system software) were the environment, and the digital creatures themselves were assembly-language programs that ran on the computer. (Assembly language consists of digital instructions read directly by a computer's central processor.) Like many other digital creatures, the code of Tierran organisms acted both as a genotype, in that the code was copied during reproduction, and as a phenotype, in that the execution of the program performed a function that determined its fitness. Typically, executing the code would cause a creature to be copied from one part of the environment to another. Cell division, or replication, occurred when the move resulted in a daughter cell that required its own slice of CPU time. Essentially, Tierran organisms were genetic replication machines, digital kin to the hypothesized RNA-world life forms that supposedly were the ancestors of all known subsequent forms of life.
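
The dual role of the code can be suggested with a toy sketch whose data structures are entirely invented: executing the genome copies it, instruction by instruction, into newly claimed memory, and the finished copy becomes a daughter cell awaiting its own CPU slice.

    soup = {}            # address -> instruction: the shared memory "soup"
    next_free = [0]      # crude free-memory pointer

    def allocate(size):
        addr = next_free[0]
        next_free[0] += size
        return addr

    def reproduce(genome):
        """Phenotype in action: copy the genome (the genotype) into newly claimed memory."""
        daughter_addr = allocate(len(genome))
        for i, instruction in enumerate(genome):   # the copy loop a real creature executes
            soup[daughter_addr + i] = instruction
        return daughter_addr                        # "cell division": the daughter now needs CPU time

    ancestor = ["find_start", "calc_size", "alloc", "copy_loop", "divide"]
    child = reproduce(ancestor)
    print(child, [soup[child + i] for i in range(len(ancestor))])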

All this took place in a block of computer memory that Ray referred to as "the soup." The creatures living in the soup were arranged in a circular queue, lined up to receive their slice of time from the virtual computer's CPU. A function Ray called the "reaper" made sure the soup did not stagnate and policed the population by lopping off the creatures at the top of a separate, linear, "reaper queue." These were generally the oldest, which climbed up the list simply by aging. However, by successfully executing instructions, organisms could postpone their climb and thus fend off the reaper. Flawed creatures rose quickly up the queue and reached their fatal peaks after a short existence. But even relatively fit creatures could not permanently stave off their rise toward death because newcomers constantly were introduced below them. In Tierra, as on earth, death was inevitable.
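
A simplified sketch of the reaper's bookkeeping, with the queue rules reduced to their essentials and the details assumed rather than taken from Tierra's source, might read:

    reaper_queue = []                  # index 0 is the bottom; the end of the list is the top

    def add_creature(name):
        reaper_queue.insert(0, name)   # newcomers enter at the bottom and age upward

    def penalize(name):
        """A flawed instruction pushes a creature up the queue, toward the reaper."""
        i = reaper_queue.index(name)
        if i < len(reaper_queue) - 1:
            reaper_queue[i], reaper_queue[i + 1] = reaper_queue[i + 1], reaper_queue[i]

    def reap_if_full(soup_used, soup_size, threshold=0.8):
        if reaper_queue and soup_used / soup_size >= threshold:
            return reaper_queue.pop()  # the creature at the top dies
        return None

    for name in ["80aaa-1", "80aaa-2", "79aab-1"]:
        add_creature(name)
    penalize("80aaa-2")
    print(reap_if_full(soup_used=85, soup_size=100))   # prints 80aaa-2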

Evolution in Tierra was driven by several methods of mutation. First, the soup itself was subject to noise, random bit-flipping that Ray considered "analogous to mutations caused by cosmic rays." (This ensured the demise of even superbly adapted creatures, whose high fitness eventually would be worn down by the background bit-flipping.) Ray also implemented mutations during the replication process in order to emulate genetic variation. Finally, there was a form of mutation that sometimes caused random alterations of instructions when the creatures executed their code. The cumulative effect of all these mutations was to vary the Tierran environment and the evolution of its inhabitants each time the program was run; thus Tierra was not a deterministic system but a probabilistic one.
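
The first two sources of mutation lend themselves to a short sketch; the rates and the integer encoding of instructions below are illustrative choices, not Tierra's actual parameters, and flawed execution would amount to the same kind of bit flip applied at execution time.

    import random

    def flip_bit(instruction, width=5):
        """Flip one random bit of an integer-coded instruction."""
        return instruction ^ (1 << random.randrange(width))

    def cosmic_rays(soup, rate=0.0001):
        """Background noise: occasional bit flips anywhere in the soup."""
        for i in range(len(soup)):
            if random.random() < rate:
                soup[i] = flip_bit(soup[i])

    def copy_with_errors(genome, rate=0.001):
        """Replication mutations: each copied instruction may be flipped."""
        return [flip_bit(g) if random.random() < rate else g for g in genome]

    random.seed(1)
    soup = [random.randrange(32) for _ in range(1000)]   # 5-bit "instructions"
    cosmic_rays(soup)
    daughter = copy_with_errors(soup[:80])
    print(daughter[:5])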

For two months Ray programmed furiously, and soon after New Year's Day 1990 he was ready to begin the test runs of Tierra on the high-powered Toshiba laptop he used for development.

The first time Ray ran Tierra, he did not expect much. "The Los Alamos people had told me it was going to be really hard to do what I wanted, that it would take years of work," he recalls.

"I believed that. They told me it wouldn't work with the type of instructions I used, because they're too brittle, mutations would stop the system. I believed that, too, but I wanted to try it, as Chris Langton put it, to find out why it wouldn't work. So when I first ran the system I just wanted to get it working. I figured out how many instructions it would require to replicate, rounded it off, and that was my instruction set. Then I built a creature to test the simulator, a creature that self-replicated and didn't do anything else. 1 thought, 'Okay, I'll get the simulator working, and it'll take me years to get evolution out of the system.'

"But as it turns out, I never had to write another creature."

On January 3, working at night on a table in the bedroom of his apartment while his wife slept, Ray "inoculated" the soup with his single test organism, eighty instructions long. He called it the "ancestor." Its replications took somewhere over eight hundred instruction executions each time. The ancestor and descendants quickly populated the soup, until it was 80 percent full. Once that threshold was attained, the reaper began its grim task and ensured that the population would grow no further.

The experiment proceeded at twelve million instructions per hour. (Later, using more powerful computers, Tierra would run six times faster.) Ray tracked the proceedings on a dynamic bar chart, which identified the organisms and the degree to which they proliferated in the soup. Initially, clones of the ancestor dominated thoroughly; these typically replicated only once before dying. Then mutants began to appear. The first was a strain of creatures seventy-nine instructions long. The horizontal bar on the chart representing those creatures began to pulse, the bar representing the eighty-instruction ancestors shrank, and soon the lower bar inched past the original. Eventually, some bars directly below those two began pulsing, indicating that even smaller mutants had successfully found ways to self-replicate. Ray was thrilled; Tierra was displaying the effects of evolution, as variations on the original were discovering more successful strategies for coping in the environment. The smaller organisms were more successful because their slightly shorter length allowed them to reproduce while occupying less CPU time. (Ray had the option of adjusting the system parameters to reward larger organisms instead of smaller ones.)

Then something very strange happened. In the lower regions of the screen a bar began pulsing. It represented a creature of only forty-five instructions! With so sparse a genome, a creature could not self-replicate on its own in Tierra; the process required a minimal number of instructions, probably, Ray thought, in the low sixties. Yet the bar representing the forty-five-instruction population soon matched the size of the bar for the previously largest creature. In fact, the two seemed to be engaged in a tug-of-war. As one pulsed outward, the other would shrink, and vice versa.

It was obvious what had occurred. A providential mutation had formed a successful parasite. Although the forty-five-instruction organism did not contain all the instructions necessary for replication, it sought out a larger, complete organism as a host and borrowed the host's replication code. Because the parasite had fewer instructions to execute and occupied less CPU time, it had an advantage over complete creatures and proliferated quickly. But the population of parasites had an upper limit. If too successful, the parasites would decimate their hosts, on whom they depended for reproduction. The parasites would suffer periodic catastrophes as they drove out their hosts.
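
An illustrative sketch, with invented instruction names, suggests why the strategy pays: the parasite carries no copy procedure of its own, locates a host's by searching the soup, and executes the borrowed code.

    def borrow_copy_procedure(soup, copy_template):
        """Return the address of someone else's copy procedure, if any host carries it."""
        size = len(copy_template)
        for addr in range(len(soup) - size + 1):
            if soup[addr:addr + size] == copy_template:
                return addr
        return None

    host = ["tpl_host", "self_exam", "alloc", "tpl_copy", "copy_loop", "divide"]
    parasite = ["tpl_parasite", "find_copy", "jump"]   # no copy loop of its own
    soup = host + parasite

    print(borrow_copy_procedure(soup, ["tpl_copy"]))   # prints 3: the parasite jumps into the host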

Meanwhile, any host mutations that made it more difficult for parasites to usurp the replication abilities were quickly rewarded. One mutation in particular proved cunningly effective in "immunizing" potential hosts: extra instructions that, in effect, caused the organism to "hide" from the attacking parasite. Instead of the normal procedure of periodically posting its location in the computer memory after reproducing, an immunized host would forgo this step. Parasites depended on seeing this information in the CPU registers, and, when they failed to find it, they would forget their own size and location. Unable to find their host, they could not reproduce again, and the host would be liberated. However, to compensate for its failure to note its size and location in memory, the host had to undergo a self-examination process after every step in order to restore its own "self-concept." That particular function had a high energy cost (it increased the organism's size and required more CPU time), but the gain in fitness more than compensated. So strains of immunized hosts emerged and virtually wiped out the forty-five-instruction parasites.

This by no means meant the end of parasitism. Although those first invaders were gone, their progeny had mutated into organisms adapted to this new twist in the environment. This new species of parasite had the ability to examine itself, so it could "remember" the information that the host caused it to forget. Once the parasite recalled that information it could feast on the host's replication code with impunity. Adding this function increased the length of the parasite and cost it vital CPU time, but, again, the tradeoff was beneficial.

Evolutionary arms races were familiar turf for ecologists such as Tom Ray. In the natural biosphere, of course, they extended over evolutionary time, measured in thousands of years. But even a true believer such as Ray was astonished at how easily a digital terrain could generate this same competition. Tierra had developed identical phenomena within ten minutes! Just as remarkable was that his system had produced this situation, previously wedded to biology's domain, without any manipulation whatsoever.

"In my wildest dreams that was what I wanted," he said. "I didn't write the ancestor with the idea that it was going to produce all this."

Most satisfying to Ray was an effect clearly triggered by Tierra's open nature: on its own, the system had shifted the criteria for what constituted a fit organism. When the soup filled with organisms, the evolutionary landscape itself changed; the digital creatures were forced to seek novel responses to their altered circumstances. They did this by rewarding what previously would have been hopelessly ineffectual mutations. The door was opened to unprogrammed diversity.

"At the outset, selection favors efficiency towards the size bias we set up," explains Ray. "But as the system runs, mutants do odd things, and one of the odd things they do is discover other creatures, then exploit them. The parasites don't contain all the information they need to replicate, but they find that information in their environment, which now consists of other creatures. And it even turns out they alter each other's information, and in that way divert someone else's energy resources into the replication of their own genome. That's where the evolution gets interesting, because they're all still trying to make their code more efficient, but the bulk of evolution is coming from exploiting each other. The organisms have added a whole new realm to the fitness landscape, a new adaptation for passing on their genes, a specific mechanism not present in the ancestor. In this case, parasitism, or immunity to parasitism."

The emergence of diversity in Tierra's maiden voyage was no anomaly. Although each subsequent run differed in some respect, the major effects kept repeating. Within a few million instructions parasites would emerge, and an evolutionary arms race would ensue.

Ray conducted a variety of experiments with Tierra. As an alternative to inoculating the system with a single ancestor, he injected the soup with creatures evolved from previous runs. His gene bank soon grew to over twenty-nine thousand different genotypes of self-replicating organisms, of over three hundred size classes. Typically, he would isolate a certain host and a certain parasite and see the effects. Then he would sort and analyze the results with the aid of an accompanying program called Beagle, honoring the ship on which Darwin voyaged to the Galapagos. "This sort of thing should be very interesting to population geneticists," says Ray. "Never before has anybody been able to look at genetic change in a population right down to frequency of every genotype in every species of a community. I'm making a record of every birth and death. I can go back and figure out why one variation beat out another, look at its code and determine what gave it the advantage."

Using this method, Ray duplicated various biological phenomena observed in the field by ecologists. In one experiment, Ray gauged the effect of introducing a parasite organism into a previously pristine ecology; then, trying the opposite, he removed parasites from the soup. "Just as in natural ecological communities, the presence of a predator doubles the diversity," he says. "The predator [parasite] tends to suppress the dominant host competitor, and prevents it from competitively excluding the weaker competitors. So Tierra reflects real ecological communities in a very nice way."

Ray's other experiments indicated that genetic mutation itself is not necessarily the driving force behind evolution. In one experiment, he adjusted the parameters of the system by switching off the background noise and eliminating mutations from replication. He inoculated the soup with hosts and parasites, and diversity emerged as surely as before. He attributed this to the effect caused by the "sloppy replication" that occurred when parasites tampered with the host genomes. The host codes were sometimes broken, causing an effect much like crossover. Along with Hillis's findings and related work by Kauffman and Koza, this result was a further indicator that the evolution of organic complexity, and possibly sexual reproduction, might owe much to the emergence of parasites.

Like Hillis and Koza, Ray believed that digital evolution had the potential to become the engine of practical computer programming in the next century. Tierran organisms, like Hillis's Ramps and Koza's LISP creatures, were capable of brilliant feats of code crunching. As millions of instructions were executed, Tierran organisms optimized their size, managing to compact very complicated algorithms into instruction sets much smaller than those with which they began. Ray saw organisms clock a 5.75-fold increase in efficiency. The organisms performed this wizardry by using the extremely nonintuitive techniques that come naturally to artificial organisms.

One example illustrated how organisms discover programming tricks. Ray's creatures commonly had pieces of code, or templates, to mark where they began and where they ended. (These acted as a sort of membrane to isolate the creature from the environment.) But one species of creature hit on an idea that enabled it to evolve without using a template to mark its end. "These creatures," wrote Ray, "located the address of the template marking their beginning, and then the address of a template in the middle of their genome. These two addresses were then subtracted to calculate half of their size, and this value was multiplied by two ... to calculate their full size."
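
The arithmetic of the trick is simple enough to sketch with an invented genome layout (the template names here are placeholders, not Tierran instructions):

    def size_without_end_template(soup, start_template, middle_template):
        start = soup.index(start_template)
        middle = soup.index(middle_template)
        half = middle - start          # distance from start to midpoint
        return 2 * half                # double it to recover the full size

    # an invented 10-instruction creature: start template at 0, midpoint template at 5
    creature = ["TPL_START", "a", "b", "c", "d", "TPL_MID", "e", "f", "g", "h"]
    print(size_without_end_template(creature, "TPL_START", "TPL_MID"))   # prints 10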

Ray's organisms were capable of more complicated tricks. After one run of fifteen billion instructions, he examined a certain creature he named "72etq." (Ray named his organisms by the number of instructions in their genotype, followed by three letters, representing the order in which the creature appeared in his experiments. Thus the ancestor was called 80aaa, and 72etq represented the 3,315th different version of a creature seventy-two instructions long.) This particular organism executed a series of algorithms that performed a sophisticated optimization technique called "unrolling the loop." It allowed the creature to operate with a genome half its actual size (thirty-six instructions) by a complicated but highly compact series of instruction swaps and self-examinations. According to Ray, "The optimization technique is a very clever one invented by humans, yet it is implemented in a mixed-up but functional style that no human would use (unless very intoxicated)." Ray managed to examine carefully only a small percentage of the genomes of his more evolved creatures; it is logical to assume that others had devised equally impressive optimization schemes that may or may not have been worked out by humans. One could speculate that environments like Tierra might find utility as virtual laboratories for generating the algorithms that would drive the devilishly complex computer programs run on the supercomputers in the next century.
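
In conventional programming terms, unrolling the loop means copying several cells per pass so that fewer instructions are spent on loop control. A plain sketch of the technique, not a reconstruction of 72etq's actual instruction swaps, conveys the idea:

    def copy_rolled(src, dst, n):
        i = 0
        while i < n:              # one loop-control step per cell copied
            dst[i] = src[i]
            i += 1

    def copy_unrolled(src, dst, n):
        i = 0
        while i + 3 <= n:         # one loop-control step per three cells copied
            dst[i] = src[i]
            dst[i + 1] = src[i + 1]
            dst[i + 2] = src[i + 2]
            i += 3
        while i < n:              # copy any leftover cells one at a time
            dst[i] = src[i]
            i += 1

    src, dst = list(range(10)), [0] * 10
    copy_unrolled(src, dst, 10)
    print(dst == src)             # prints True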

Still, the most spectacular news from Tierra was its analogue to biology, particularly in the diversity that emerged when Ray allowed it to run for mammoth sessions. A series of eras unfolded. These appeared suddenly, after long stretches of stable behavior. (This was further confirmation of Hillis's discovery that punctuated equilibrium emerges spontaneously in computational evolution.) In each of these, genetic explosions erupted noiselessly, marked on the screen only by a profusion of different levels on the dynamic bar chart. (Later, one of Ray's students improved the display so that different organisms would be represented by colored rods.) Yet to an observer supplied with the knowledge that the tiny world had undergone a sort of evolutionary apocalypse, these shifts in light seemed accompanied by Wagnerian fanfares and blinding flashes of lightning. It was history on the grandest possible scale.

A typical experiment of this sort could begin with an inoculation of a single ancestor. Soon came the almost inevitable appearance of parasites and host adaptations to resist the parasites. For millions of instructions, Tierra maintained a pattern wherein two sets of organisms, descendants of organisms of around eighty instructions and parasites with around forty-five instructions, maintained their presence in the soup. Suddenly, a new sort of organism arrived and began to dominate. On examining the code, Ray discovered that these new mutants were hyper-parasites: although derived from the genomes of host organisms, they had developed an ability to divert the metabolism of the parasites in order to bolster their own replication function.

The hyper-parasites were remarkable creatures. They were the same length as the eighty-instruction ancestor, but subsequent evolutionary pressure had changed almost one-fourth of the genome and replaced the ancestor's instructions with others. Those changes greatly enhanced their fitness by allowing them not only to replicate but also to fatally attack their small competitors. This stunt was dispatched in a manner that would win accolades and envy from any skilled hacker: hyper-parasites managed to examine themselves constantly to see whether parasites were present. If a parasite was detected, the hyper-parasite executed a Pac-Man-style maneuver. Transmogrifying from victim to victimizer, it diverted the parasite's CPU time to itself. The assault was so devastating that its continued repetition drove the parasites to extinction.

From that point on, cleansed of simple parasites, Tierra went into another long period of relative stability. No longer burdened with competing parasites, the host organisms, almost all of which were now hyper-parasites, searched evolutionary space in an attempt to maintain the genetic integrity necessary to replicate, while consuming less energy. The method by which the hyper-parasites accomplished this recalled experiments by Robert Axelrod and others studying the evolution of cooperation. Groups of hyper-parasites worked symbiotically, sharing the code for replication. This new variation could not reproduce on its own but relied on similar organisms to provide the missing piece of the reproductive gene. Like pairs of cooperating participants in the iterated Prisoner's Dilemma, each organism realized a benefit from the symbiosis.

This Utopian scenario continued for millions of instructions. It was, however, doomed. Formalized cooperation had become yet another aspect of the environment to be exploited by an opportunistic mutation. In this case, the interloper was an organism that shrewdly placed itself between two cooperating hyper-parasites and intercepted the genetic information for replication as it passed from one to the other. It was as though a quarterback and a halfback, smug in the knowledge that no defensemen were nearby, had been practicing handoffs, and suddenly, inexplicably, a defensive back emerged from nowhere and spirited away the precious football. By commandeering the replication code in one well-positioned grab, this hyper-hyper-parasite, or "cheater," was able to reproduce and thrive with a body length of only twenty-seven instructions.

"When the hyper-parasites drove the parasites to extinction at around 550 million instructions, I thought that I was never going to see them again because the defense seemed ironclad," says Ray. "But the evolution of sociality made them vulnerable again, and gave the parasites a way back into the system."

As each run of Tierra unfolded, Ray and others attuned to the behavioral mosaic of ecology could recognize biological phenomena as they emerged. But, because Tierra was life of a different sort, a truly synthetic form of life, it may have been displaying behavior that was lifelike but characteristic mainly of an alternative form of life. Tom Ray admitted a problem in identifying these possible effects: "What we see is what we know," he wrote. "It is likely to take longer before we appreciate the unique properties of these new life forms."

Tierra's instant ability to yield the drama, and apparently the dynamics, of an evolutionary biosphere changed Ray's life. The same ecologists at Delaware who once refused him tenure now came to his office and spent hours staring at the bars on his computer screen. Ray won tenure. Others at Delaware, particularly a group of computer scientists, became committed enthusiasts, and soon Ray was at the forefront of a Newark-based a-life study group.

Still, Ray suffered the reluctance of those who had difficulty conceiving of lifelike phenomena arising from the bowels of a computer. When one official at the Air Force Office for Scientific Research (AFOSR) reviewed Ray's work with Tierra, he passed it around and found not only resistance to supporting the idea but also an edge of ridicule, a suspicion that Ray had perhaps overly relied on science fiction for his vision. Some at the AFOSR wondered whether some of the modest funding devoted to other experiments under the rubric of artificial life should not be reconsidered: Doyne Farmer had to reassure the funders that serious science was indeed the agenda of this new field.

As Ray continued his work, however, and began circulating his results among computer scientists and biologists, Tierra gained a level of respect unprecedented among a-life experiments. IBM, excited about the possibility of transferring the methodology of Ray's organisms to the principles of programming massively parallel computers, awarded him a $15,000 prize in their supercomputing competition. Ray's work won attention from science journals and the lay press. The Santa Fe Institute invited him to spend six months as a visiting fellow. Perhaps most impressive of all the reactions came from the ranks of biologists, most of whom were extremely wary about the possibilities of producing lifelike phenomena on computers. When Ray presented his results to a gathering of evolutionary biologists, he won the respect of key figures like John Maynard Smith. Ecologist Stephen P. Hubbell of Princeton, originally a skeptic, attended a seminar on the work and described it as "spectacular." Another noted biologist, Graham Bell of McGill University, described Ray's system as "the first logical demonstration of the validity of the Darwinian theory of evolution," and wrote a letter touting it:

This work has three important uses. First, it is a superb educational tool. Many people doubt that the theory of evolution is logically possible. ... Now, one can simply point to the output of Ray's programs; they are the ultimate demonstration of the logical coherence of evolution by selection. Secondly, it seems likely to provide a superior method for testing theoretical ideas in evolution, by providing more realistic general algorithms than have ever before been available. Thirdly, it may also represent a general advance in computation, since it makes it possible to evolve efficient algorithms for any purpose. ... I am writing to assure you that it... ranks among the most interesting developments in evolutionary theory in the last ten years.

Ray would cheerfully admit the limitations of his system; he noted that several magnitudes of increased computer power would be required to support a system to evolve more complex creatures, one that could support life forms with the equivalents of both DNA and RNA, for instance, or multicellular organisms. But in a sense Tierra had already accomplished one of Ray's prime goals: the beginnings of a shift in perception caused by a successful implementation of open-ended artificial evolution. An indication of this came in the August 27, 1991, edition of the New York Times, declaring that, on the heels of Tierran evolution, "A new round of debate has developed among scientists as to where the dividing line between life and non-life may lie." A debate long anticipated by the proponents of artificial life.

Tom Ray had an additional viewpoint on the ability of Tierra to evolve the workings of biology from a digital soup: "The conclusion I draw from it," he says, "is that virtual life is out there, waiting for us to create environments for it to evolve into."