This is an article, started in 2011, about the under-use of evolution as a technique in experimental science. I think directed evolution could be used much more than it currently is, for solving problems whose complexity is greater than we can readily understand.
The evolutionary technique is, in fact, already widely used, most extensively in biology, but to some extent in all the other areas mentioned below, including the attempt to evolve machine intelligence. Every day brings new examples of creative ways of using it, and the field is growing exponentially.
So why bother suggesting the use of an already well-used technique? Because I am suggesting far more radical uses. In biology, for instance, although directed evolution is widely used to increase yields or to derive molecules with specific targeted functions, no one (to my knowledge) is attempting to evolve a strain of bacteria to produce super-strong carbon nanotubes to serve as the material for a space elevator, or, for an even more useful example, to evolve an entire micro-ecology to produce new antibiotics.
Humans (and especially scientists) have a natural, ingrained desire to understand the mechanism behind things, to know how they work. This impulse has produced the remarkable success of modern science and brought about theoretical revolutions and immense practical benefits. However, there is another, altogether different avenue: random variation and selection, in repeated cycles. This is the mode employed by nature in evolution, and it has produced even more remarkable results (us, for instance). What I am suggesting is to let go of (or at least postpone) our natural desire to understand, and use the power of evolution to produce things which are beyond our current knowledge and intellectual grasp, things which we could never design. After the result is obtained, we can go backwards and reverse-engineer it to see how it works (or at least try to).
The under-use of the evolutionary process in experimental science.
It is a peculiar fact that, although the evolutionary paradigm is the central idea of modern biology, the use of the evolutionary process in experimental science is extremely limited, even in areas where it might produce better or quicker results than science based on understanding.
The tradition of modern science is firmly rooted in a method which almost axiomatically assumes that the way to engineer or construct any new device or process is first to understand it. From Galileo on, you do science by constructing a theory and testing it empirically, and, having thus shown its truth, by using your newfound understanding to manipulate nature for your purposes. From the steam engine to the transistor to DNA splicing, this is how it's done, and this method, of course, has shown itself to be almost unbelievably powerful. The difference between our world and Galileo's is a reflection of the success of this way of doing science.
However, there is another way of making things work, the one employed by nature itself, and it is at least as powerful (perhaps more so): the process of evolution.
I propose using directed evolution for the purpose of:
1) Creating artificial intelligence (or as near as we can get).
2) Medical uses, such as finding effective antibiotics and anti-cancer agents.
3) Finding physical materials with desired properties such as room temperature superconductivity.
4) Growing nanoscale devices with tailored organisms (e.g. efficient solar cells).
It must be understood that the evolutionary process is employed to some extent already in each and every area I have named above, and it is employed extensively in biology to make all manner of biologically active molecules and potential drugs. However, even in biology, it is almost always used to improve the efficiency or yields of some pre-engineered process; rarely is it used "from scratch", for example to breed a bacterium that would produce novel antibiotics.
What I am suggesting is much more ambitious: to use evolution to do the whole job.
The classic Darwinian evolutionary sequence that goes on in the biological world can be abstracted and used in areas that have no direct link with biology at all.
The process is:
Given: a complex system which has a set of properties determined by its detailed physical state.
Reproduction: The system can copy itself (as in living organisms) or be copied by external agencies (this is how a virus is reproduced).
Variation: Small random changes are made to the system. (In biology this is mutation: changes to the structure of DNA.)
Selection: The environment (or the experimenter) exerts a consistent pressure which serves as a filter: certain traits are selected, and others are eliminated. In biology this increases the prevalence of genes coding for "favorable" traits which give the organism a survival advantage.
Looping: The resulting system is the base on which the process is repeated. (In biology the selected organisms copy themselves with new variations indefinitely.)
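To make the abstraction concrete, here is a minimal sketch of this loop in Python; the fitness function and the mutation operator are toy placeholders, standing in for whatever system is actually being evolved.

```python
import random

def evolve(population, fitness, mutate, generations=200, survivors=10):
    """Generic variation-and-selection loop.

    population -- list of candidate systems (any representation)
    fitness    -- function scoring a candidate (higher is better)
    mutate     -- function returning a randomly varied copy of a candidate
    """
    for _ in range(generations):
        # Selection: keep only the best-scoring candidates.
        population.sort(key=fitness, reverse=True)
        parents = population[:survivors]
        # Reproduction with variation: refill the pool with mutated copies.
        population = parents + [mutate(random.choice(parents))
                                for _ in range(len(population) - len(parents))]
    return max(population, key=fitness)

# Toy example: evolve a 32-bit string toward all ones.
def bit_fitness(bits):
    return sum(bits)

def bit_mutate(bits):
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

start = [[random.randint(0, 1) for _ in range(32)] for _ in range(50)]
best = evolve(start, bit_fitness, bit_mutate)
print(sum(best), "of 32 bits correct")
```

The same loop applies whether the "candidate" is a bit string, a circuit wiring, a bacterial strain, or a chemical formulation; only the fitness test and the mutation operator change.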
In the biological realm, over time (quite a lot of it), this non-random selective pressure working on random variations has produced such miracles of organization, functionality, and complexity as cats, petunias, gnats, and ourselves.
This is what I meant when I said that the evolutionary process is as powerful as (or more powerful than) the current scientific method: although we can do many wonderful things with modern science, we can't make a gnat (or even get close).
The evolutionary process can be used in any area where the complexity of the problem is daunting, because the power of this process is that the complexity of the result is arbitrary. This is because nature does not have to understand a problem to solve it. It does so by doing without knowing.
We can do the same if we let go of our habit of insisting on understanding before making. We can always go back after we have a successful result and try to figure out how it works by reverse engineering, so even our understanding would be deepened, just not in the usual sequence.
Let's look in more detail at the areas mentioned above.
1) Creating artificial intelligence
It is not necessarily the case that we can ever achieve a machine which would be even remotely what we would legitimately call intelligent. That is a debate I will steer entirely clear of here.
However, our best chance of approaching the problem might be to do it the way it was done the first time, by evolution. The idea is to evolve computer software and hardware together, by competitively selecting for working combinations.
It is obvious that in the realm of computers, software (the set of instructions that runs a machine) can be easily changed, and also evolved. It turns out that the technology exists for evolving hardware too.
In the area of hardware: much interesting work is being done with the directed evolution of circuits under specifically designed selection pressure. The advent of the field programmable gate array (FPGA) makes such work possible. In this technology an external signal can change not only whether a circuit element is on or off (which is the way computer chips have always worked), but the path by which each gate or memory unit is connected to those around it. In other words, external signals can change the "wiring" on the chip.
When this wiring is subjected to a regimen of small random changes and the resultant circuits are selected for efficiency at a specific task, we have an evolutionary process. This can proceed more rapidly than in the biological realm because one can have hundreds or thousands of generations in one afternoon of lab time. The speed of electronics makes it an ideal platform for evolutionary experiments.
Dr. Adrian Thompson, at the University of Sussex, details some particularly wonderful examples: informatics.sussex.ac.uk/users/adrianth/ices96/paper
Even with only 100 elements of a simple chip in use, he was able to evolve a circuit, by applying steady selection pressure, for the ability to discriminate between a 1 kHz signal and a 10 kHz signal. The most important thing, from the point of view being espoused in this article, is that this small chip achieved this in a way that is not readily comprehensible to us and so could never have been designed by us. It shows the advantages of being freed from such designer-imposed restraints as having a uniform clock. It takes full advantage of the physical features of the silicon in ways not anticipated by current theory, but that just shows that current understanding is incomplete, and evolution is cleverer than we are.
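A sketch of one natural selection criterion for this kind of task: maximize the separation between the circuit's average responses to the two tones. Here `circuit_output` is a hypothetical stand-in for loading a candidate configuration onto the chip, driving its input with a tone, and averaging the output; the real measurement rig is not implemented.

```python
def tone_discrimination_fitness(circuit_output):
    """Score one candidate wiring by how differently it responds to the two
    test tones: a configuration that cannot tell them apart scores near zero.

    circuit_output(freq_hz) -- hypothetical stand-in for programming the chip
    with this candidate, driving it with a tone of the given frequency, and
    returning the averaged output voltage.
    """
    low = circuit_output(1_000)     # average output under the 1 kHz tone
    high = circuit_output(10_000)   # average output under the 10 kHz tone
    return abs(high - low)          # larger separation = better discriminator
```

Feeding this score into the generic selection loop sketched earlier, with mutation flipping a few configuration bits per generation, is essentially the whole experimental design.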
Evolving new software is more problematic. Any set of instructions that comprises software can be rewritten, and if random elements are introduced in the process, the resultant software can be selected for specific increases in measured functionality. This would constitute evolutionary pressure, and if kept up over many generations it could result in software which was better at certain complex tasks than any designed software. There is a specific constraint here which exists in software more than in analogous biological systems: since software is a set of instructions in a language where each "word" of instruction itself represents a specific complex process for doing an exact task, almost all random changes to software (say, swapping a given word of instruction for some random word) would be completely fatal. (Picture changing the instructions for assembling an automobile by substituting random words to see how dysfunctional this would be.) In biology, changing a random "word" of the instructions encoded in DNA most often has no discernible result for the whole organism; if it does produce noticeable differences, they are usually disadvantages, but they are seldom fatal. They are occasionally advantageous, and this serendipity is what fuels biological evolution.
But in the above example of the car assembly instructions, one sees that almost all random changes would be seriously detrimental, and one would have to wait billions of years before one random change was advantageous. The percentages are wildly different from those in biology.
The only kind of evolution I can see running on software would be if changes were limited in advance by the programmer to be random but confined to very well-defined areas. To be successful, I would think, only complete instructions or functional sets of instructions could be substituted for other instructions or sets. Your chances of success would only be greater than zero if this were done at the highest level of abstraction rather than the lowest, as is the case in biology.
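A minimal sketch of this constrained scheme, with a toy block library and a toy arithmetic target chosen purely for illustration: programs are sequences drawn from a fixed library of complete, valid operations, so random variation can only swap, insert, or delete whole blocks, never scramble their internals.

```python
import random

# A fixed library of complete, valid operations: the only units variation may touch.
BLOCKS = {
    "inc":    lambda x: x + 1,
    "dec":    lambda x: x - 1,
    "double": lambda x: x * 2,
    "halve":  lambda x: x // 2,
}

def run(program, x):
    for name in program:            # a program is just a sequence of block names
        x = BLOCKS[name](x)
    return x

def mutate(program):
    """Variation restricted to whole blocks: replace, insert, or delete one."""
    program, r = list(program), random.random()
    if r < 0.4 and program:
        program[random.randrange(len(program))] = random.choice(list(BLOCKS))
    elif r < 0.7:
        program.insert(random.randrange(len(program) + 1), random.choice(list(BLOCKS)))
    elif program:
        del program[random.randrange(len(program))]
    return program

# Toy selection target: a program mapping 5 to 23, with mild pressure to stay short.
def fitness(program):
    return -abs(run(program, 5) - 23) - 0.01 * len(program)

population = [[random.choice(list(BLOCKS))] for _ in range(30)]
for _ in range(300):
    population.sort(key=fitness, reverse=True)
    population = population[:5] + [mutate(random.choice(population[:5])) for _ in range(25)]
population.sort(key=fitness, reverse=True)
print(population[0], "-> 5 becomes", run(population[0], 5))
```

Because every mutation yields a syntactically valid program, the fatal-change problem described above largely disappears; the price is that the programmer has fixed the vocabulary in advance.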
If, however, you could get such a regimen running successfully, then you could get computers to evolve hardware and software at the same time, to come up with combinations no human could have imagined. Once a high level of complexity was achieved, computers might be run against each other in an analogy to the predator-prey relationship in biology, which results in rapid improvement in both.
In heading toward the goal of artificial intelligence, an example of the kind of task that might serve as a good criterion of selection would be object recognition. The ability to tell that a given object in different orientations, against a complex background, is in fact the same object is a test which current computer programs do poorly on. The percentage of correct choices is well defined and easily measurable, and would therefore serve as an excellent selection standard.
It is a good example of what humans (or other animals) do well, and computers can hardly do at all (even though we have very little idea as to how we do this).
Each software-hardware generation could be culled or survive on the basis of success in this task, success being defined not narrowly as speed, but as some weighted measure of speed and percentage of correct choices.
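A sketch of the kind of weighted selection score this implies; the weights are illustrative assumptions, and the accuracy and speed figures would come from actually running a candidate system on a labeled image set.

```python
def selection_score(accuracy, seconds_per_image,
                    accuracy_weight=0.9, speed_weight=0.1):
    """Weighted selection standard for the object-recognition task.

    accuracy          -- fraction of test images correctly identified (0.0 to 1.0)
    seconds_per_image -- average time a candidate takes per image
    Accuracy dominates; speed mostly breaks ties among comparable performers.
    The weights are assumptions, to be tuned by the experimenter.
    """
    speed_term = 1.0 / (1.0 + seconds_per_image)   # faster -> closer to 1.0
    return accuracy_weight * accuracy + speed_weight * speed_term
```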
If the program-hardware sets started to actually get "smarter," a feedback loop could be introduced in which the original program that made the changes in the hardware wiring, or the software code, could be replaced by a smarter version, which could make changes that would no longer be completely random. In fact, partially random and partially directed changes might be more efficient, an example of directed evolution having more possibilities than natural evolution.
2) Medical uses
To some extent the evolutionary process is already used in medicine and pharmacology, but its use could be greatly expanded. An example of current use is that when looking for new antibiotics, drug companies will scour the tropical rain forests, where the most diversity is found, for plants (and some animals) that show antibiotic properties, trusting that in the long course of biological evolution particularly potent molecules have been "discovered" by organisms to ward off infection. This is, in fact, how the vast majority of modern antibiotics were made. They were either present as such in existing organisms, or are slightly modified versions of molecules found there. Penicillin, the first effective antibiotic, was found by accident when it was noticed that bacteria did not grow in an area around the Penicillium mold.
Many drugs which are not antibiotics were discovered in nature too, aspirin being based on the folk-remedy use of willow bark for fever.
An underused approach, however, is running laboratory evolution to produce new drugs, rather than collecting what nature has already produced.
A direct evolutionary approach would be to evolve microorganisms to fight other organisms. In current practice, engineered viruses have been used with mixed success to import modified DNA sequences into human cells for the purpose of correcting genetically transmitted diseases. A truly evolutionary approach would be not to engineer viruses but to select them for useful traits.
Viruses could be evolved, for instance, to kill a specific strain of drug-resistant pathogenic bacteria without damaging any body cells, or even other bacteria.
The experimental setup for this might be relatively straightforward.
A likely virus would be picked by the experimenter, who would then subject samples containing many billions of viral particles to radiation or mutation-inducing reagents. Culture plates with the target organisms would then be lowered into the soup of mixed viruses, which would self-select for those able to grow in those particular bacteria. Viruses do not reproduce themselves but are copied by the reproductive machinery of host cells or bacteria; in this case the pathogenic bacteria would serve not merely as the selection agent but as the amplifying agent too, producing enough copies of the virus to burst their own cells.
The selected virus variants would be used as the starting organisms for the next generation, and within a fairly short time you would have an effective bacteria killer.
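A toy calculation, with assumed and purely illustrative numbers, shows why such a serial-passage scheme converges quickly: a variant that replicates better in the target bacteria, even if it starts as one particle in a billion, takes over the population within a modest number of passages.

```python
# Two virus variants competing over repeated passages through the target bacteria.
# All numbers are illustrative assumptions, not experimental data.
ordinary_burst = 50.0    # copies produced per infected cell, ordinary variant
improved_burst = 200.0   # copies per cell for the (initially very rare) better variant

ordinary, improved = 1e9, 1.0           # starting particle counts: one in a billion
for passage in range(1, 21):
    ordinary *= ordinary_burst           # each passage amplifies both variants...
    improved *= improved_burst           # ...but the better replicator gains ground
    total = ordinary + improved
    ordinary, improved = 1e9 * ordinary / total, 1e9 * improved / total   # dilute back
    print(f"passage {passage:2d}: improved-variant fraction = {improved / 1e9:.3e}")
```

With these assumed burst sizes the better variant is already the majority by around the fifteenth passage; the selective filter does the searching for you.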
Next, you would select all the successful working bacteria killers and evolve them so that they had no activity within human cells at all. Only the variants that were incapable of penetrating or being reproduced in a human cell would be selected in a negative filtering process.
(It might be better to reverse the order of these steps, or best of all, work them together if possible).
The final version of the virus could be injected into a patient with an otherwise untreatable strain of bacteria, and it would seek out and destroy them selectively. When the pathogen was gone, there would be no place else in the patient that the virus could be reproduced, and in short order it would be excreted.
Bacteria themselves could be enlisted as seek and destroy units against other bacteria. First, you would breed bacteria with no ability at all to attack human cells or produce toxins. This is not so farfetched as it may seem (think of yogurt or cheese). Then go through the steps of inducing mutations in large numbers and selecting only those that attacked the target bacteria.
Variations on this technique are, in fact, used commonly in biological research already, but most often as a refinement step in a process that has already been carefully engineered from theory. Very few research programs simply start from scratch with promising organisms and then mutate them at random for long periods of time without further engineering attempts.
There might be real problems in using tailored bacteria as agents in humans because the immune system would attack them, and producing bacteria that would be capable of fooling the immune system or defending against it might be truly dangerous. A mutation in such bacteria might result in a strain that was itself pathogenic, and that the immune system could not fight.
If it turned out that the problems with introducing tailored bacteria into the patient were insurmountable, then a less direct evolutionary approach might work: Evolve a strain of bacteria that produced strong antibiotics against the target strain of bacteria. This is simply running an artificial selection program on a process that goes on all the time in nature. Many bacteria, such as the ones involved in spoiling milk, establish themselves by producing antibiotics against their competitors.
Evolving whole Ecosystems
This is not currently done, because an entire ecosystem with all its interrelationships would be too complicated to have any hope of being understood; but that's exactly the point of this approach: you don't have to understand it, just use it. It could actually be a system so complicated that there would never be any real hope of understanding it, but it could still be bent to your purposes.
Doing this just goes against the grain for most scientists; even if they design an evolution-driven experiment to produce a desired result, they want the system to be simple enough so that they can "reverse engineer" it after something useful is produced. This might not be easy or even possible when using a complete ecosystem, but that's quite acceptable.
A critical area that could benefit from the immense power of the directed evolution technique is anti-cancer research.
Everything outlined above for the creation of antibiotics could be used for creating new anti-cancer agents.
The most effective approach might be to create viruses that would only attack and kill cancer cells. Since cancerous cells always have molecular differences from normal cells (and often differences between different kinds of cancer), and since some of these molecular markers are present on the outer surface of the cells, they are potential targets for viruses evolved to be triggered into activity by just those markers. These molecular markers, which are cancer-specific, constitute a (hopefully) fatal weakness.
The crux of molecular biology, immunology, and ordinary drug design is that molecules can be designed (or evolved) which will bind only to one other given molecule (or set of related molecules). This is an effect produced by the unique geometry and charge distribution of any given molecule, and it provides biochemistry with an unmatched degree of specificity.
Because of this remarkably fine discrimination, bacteria or viruses could be evolved to destroy cancer cells preferentially.
The selective pressure in the experiment would be set to allow the survival only of organisms which attacked cancer cells while leaving normal cells alone.
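One way that dual pressure could be expressed as a single culling score; this is only a sketch, and the kill fractions would come from an actual assay on cultured cancerous and normal cells.

```python
def culling_score(cancer_kill_fraction, normal_kill_fraction, penalty_weight=10.0):
    """Reward killing cancer cells, heavily penalize any harm to normal cells.

    Both arguments are fractions (0.0 to 1.0) measured in a culture assay.
    The large penalty weight (an assumed value) means a variant that harms
    normal cells is culled even if it is an excellent cancer killer.
    """
    return cancer_kill_fraction - penalty_weight * normal_kill_fraction
```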
An example of this approach that has already produced extremely promising results is the creation of an oncolytic (cancer-killing) virus called ColoAd1, created purely by selection (described in "Oncolytic Viruses: The Power of Directed Evolution"). It has been effective enough in preferentially killing cancer cells that it is starting the next stage of clinical trials. If it is successful in human trials it will stand as solid proof of principle, and an argument for providing more resources for directed-evolution experiments.
The main obstacle to this would be psychological: people are, as a whole, less than comfortable with the idea of being injected with a live bacterium or virus. (In fact there have been popular movies, such as I Am Legend, in which an attempt to create a cancer-fighting virus backfires and wipes out the human race. This is not good press.)
As in the case of antibiotics, extracting molecular anti-cancer agents from bacteria bred to produce them would get around the social difficulties and might still be the most effective way of producing such agents.
This would also get around the biological problems of introducing bacteria or viruses directly: since only the resultant oncolytic molecules would be injected, it circumvents problems with the immune system, and avoids the truly dangerous area of deliberately creating an immune-system-resistant bacterium or virus, something probably better not done at all.
It is now understood that a tumor containing millions or billions of cells is itself a kind of evolutionary experiment gone wrong: only those cells that hit on a series of linked mutations enabling them not only to reproduce without limit, but also to avoid pre-programmed cell death, and to provoke the patient's body to grow blood vessels to supply them, can survive to create a large tumor. It is, in that sense, highly evolved.
Using evolution to create an organism that would attack cancer cells only, or produce anti-cancer agents, would, in this sense, be fighting fire with fire.
The rapid reproductive rate of bacteria (several generations an hour being typical) makes them particularly attractive laboratories for producing novel molecules, either antibiotics or oncolytic agents; all microbiologists are aware that bacteria are among the most powerful chemical engineers in nature.
3) Evolving inorganic materials
The use of the evolutionary process in this area has been rare, although more is being tried currently. There is a journal, Combinatorial Chemistry, which tracks developments in this field.
There are many areas in materials science where the interaction of substances is so complex that predicting the properties of different mixtures from theory becomes daunting, with or without the aid of supercomputers.
An example of an area that might benefit from the introduction of evolutionary experimentation is research on superconductivity. Superconductors are substances which, due to some arcane and extremely complex quantum mechanical interactions, have the property of conducting electricity with zero resistance.
Superconductors make intense magnetic fields possible in devices like MRI machines for medical imaging. Current superconductors are highly limited by the fact that they must be cooled to cryogenic temperatures to operate. If not for this condition they could be used for things like magnetically levitated trains, which could revolutionize transport by making really high-speed rail travel possible. These applications make the search for room-temperature superconductors the brass ring of materials science, and laboratories around the world have mounted an intense program to find them. Progress has been made, but workable superconductors can still only operate at temperatures of a few hundred degrees Fahrenheit below zero. Theoretical models do not prohibit room-temperature superconductors, but none have been found.
Research guided by conventional methods is stalled.
Here is how you might go about evolving a room temperature superconductor:
1) Using the same kind of thin-film technology employed in current chip making, print millions of microscopic samples of an existing high-temperature superconductor on a suitable substrate. The superconducting material could be laid down by thin-layer vapor deposition, producing something that looked very much like a computer chip, with millions of small sample areas.
2) Using current working proportions as a starting point, vary the proportions of each component of the superconducting material slightly, across a full range. If each of three variables (component substances) were varied in 1% steps, this would produce one million variants. A better program might assume that the ideal mixture would be very close to the original working proportions and vary only across a 10% range, but in much finer increments, say 1 part in 10,000.
The underlying chip would be structured so as to have electrical leads embedded in each sample, and the chip would be cooled to operating temperature and each sample tested for conductivity.
Superconducting samples ("hits") have their addresses recorded; each two-dimensional address represents one exact proportion of components. The chip temperature is then raised in very small increments, say 1/100 of a degree C, until samples begin to cease superconducting, and it keeps being raised until only 100 or so samples are still superconductive. These "survivors" serve as the foundation proportions for the next generation of chips, and by definition they superconduct at a slightly higher temperature than the failed formulations.
The successful formulations serve as the basis for the next generation; all new samples are even finer variations of them.
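A sketch of this select-and-refine loop; `measure_critical_temperature` is only a stand-in for the real printing, cooling, and testing of a sample chip, and the survivor count, starting step, and step schedule are illustrative assumptions.

```python
import itertools

def neighbours(composition, step):
    """Fine variations of a three-component mixture around a working one.
    Each component moves by -step, 0, or +step; proportions are renormalized."""
    a, b, c = composition
    for da, db, dc in itertools.product((-step, 0.0, step), repeat=3):
        x, y, z = max(a + da, 0.0), max(b + db, 0.0), max(c + dc, 0.0)
        total = x + y + z
        if total > 0:
            yield (x / total, y / total, z / total)

def measure_critical_temperature(composition):
    """Stand-in for printing this composition on the chip, cooling it, and
    finding the temperature at which it stops superconducting."""
    raise NotImplementedError("replaced by the actual chip test in a real run")

def refine(start_composition, generations=20, step=0.01, survivors=100):
    population = [start_composition]
    for _ in range(generations):
        candidates = {c for comp in population for c in neighbours(comp, step)}
        ranked = sorted(candidates, key=measure_critical_temperature, reverse=True)
        population = ranked[:survivors]        # keep the highest-temperature "hits"
        step = max(step * 0.5, 1e-4)           # finer increments, down to 1 part in 10,000
    return population[0]
```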
Once the best proportions (highest operating temperature) for a three component mixture are established, the next step is to add a new substance which is selected by the experimenter as being a possible useful component.
Because of the speed of the chip printing and testing process, many billions of variations can be tried and quickly culled to select the variants that work at higher and higher temperatures.
In transistors, the materials' basic properties are changed significantly by adding minute quantities of other substances called dopants.
One could start with a list of promising dopants for superconductivity and then just try wild or even random substances. The ability to create and test millions of variations in a small amount of time means that an evolutionary approach can come up with combinations that no one would have designed from theory. If many billions of such variations are tried (not unrealistic in even a year's run of the experiment), you might have a reasonable chance of coming up with a mixture that worked stably at room temperature.
You could then try and figure out why a particular formulation worked.
An alternative method which is closer to the way biological evolution transpires:
Make a reaction chamber in which the component materials are continuously forming aggregates of atoms of nanoscale size. The conditions should require that the mixtures be constantly combining into small arrays of atoms using the components in various proportions and, just as importantly, in various crystal arrangements (the way snowflakes form in nature). One could use, for example, mixed cold particle sprays in a vacuum.
Have this happen in an electric or magnetic field in such a way that a filter can be set up and only superconductive clusters will be selected.
One proposal for this is to have the resultant spray of particles fall through an area with a magnetic field that increases rapidly and then plateaus. Electric currents would be produced in all conductive particles, but would rapidly die out from resistance in ordinary ones. The superconductive ones would have steady eddy currents, making them behave like small permanent magnets, and they could be filtered on this basis.
Inman Harvey (an evolutionary roboticist at the University of Sussex) has pointed out that many things besides composition could be varied, such as annealing regimes, gases in the reaction chamber, etc. This illustrates a critical point about the evolutionary technique: not only can you change more than one variable at a time (say three or four components of the composition at once), but you can change whole categories of variables at once: composition, temperature, the substrate the reactants are on, electromagnetic radiation impinging on the particles as they form, and (as said above) even the introduction of random substances (which Harvey calls "adding dirt to the reaction").
You can, in fact, change all these things together; in evolution, the messier the better. This inability to know why something worked, because too many variables were changed at once, is (to put it mildly) unacceptable in normal science.
Once certain clusters are selected, analyze them to see what their proportions and internal structure are; try to tune the reaction chamber to produce more of those kinds. Then run the selection again and keep raising the temperature incrementally.
A room-temperature superconductor could be produced which would never have been arrived at from theoretical understanding, which in such situations is almost always incomplete.
4) Growing nanoscale devices
Much has been made in the popular press of the idea of using nanoscale mechanical robots to enter the body and repair injured organs. In some versions, these nanobots would be self-reproducing, like bacteria. There should be considerable skepticism about this idea, since there are not even completely self-reproducing machines to be found at our scale; no untended industrial robot can build itself. Even the more modest idea of just using nanoscale machines to repair organs is a little impractical: just how would one design a machine with enough moving parts to do repair work, where the parts were on the nanoscale? What would the parts be made of, and how would they be created to fit together with small enough tolerances for complex operations? What memory would store the instructions? (This might be the easiest obstacle to surmount, as the space memory requires shrinks yearly.)
On the other hand, it might be much easier to reverse this idea and get biological organisms to grow nanoscale materials.
Actually this is already done all the time, as when trees grow wood that is partially burned to create activated charcoal, a three-dimensional sponge with nanoscale cavities giving it a huge surface area folded into a small volume. It is this that makes activated charcoal such a useful filter. Many such techniques are already in place in industry.
What I am pointing to is the possibility of evolving organisms to create useful nanoscale materials cheaply and in quantity.
An example of what might be a useful project with a reasonable probability of success would be to evolve bacteria to grow carbon nanotubes or graphene (single-atom-layer graphite). Both these materials have tremendous potential applications, e.g. nanotubes in ultra-strong cable, and graphene in electronics. However, nanotubes are currently difficult to grow in quantity, and graphene, while easy to produce in micro-quantities, is so expensive in macro-quantities that it may well be the most expensive raw material on earth.
Since bacteria have three-billion-odd years of experience in making carbon-chain molecules, why not "train" (evolve) them to create carbon nanotubes or graphene? This could be done by placing a soup of promising bacteria in huge vats and irradiating them to cause many mutations. A technique would be worked out to somehow select those that produced carbon nanotubes (e.g. a plate producing a suitable electric or magnetic field, which would interact with the peculiar electrical properties of carbon nanotubes in such a way as to make only the bacteria that produced them stick). Following the pattern, the experimenter would just keep creating hundreds of billions of mutations, waiting for one which produced even minuscule quantities of nanotubes. (The first stage of this kind of evolutionary experiment is merely to wait.) If a lucky hit happened and you got any nanotube production, the next step would be to multiply that strain prodigiously until you had enough trillions to serve as the base for the next stage, and then to raise the bar by selecting only those that produced marginally more nanotubes. Repeat until you have a bacterium that cranks out nanotubes the way brewer's yeast cranks out alcohol. Get a couple of cauldrons of these bacteria working full time and you have industrial-scale production of nanotubes for something like the price per pound of beer.
The objection may be raised that the chances of first getting a nanotube-producing bacterium are so small that you would have to wait forever, but thousands of trillions of organisms could be grown, and the mutation rate could be increased thousands of times above what it is in nature.
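A back-of-the-envelope version of this argument, with every number chosen purely for illustration: the expected number of useful mutants per round is just population size times mutation rate times the fraction of mutations that nudge toward the trait.

```python
population         = 1e15   # cells grown per round ("thousands of trillions")
mutations_per_cell = 1e-3   # induced mutation rate per cell, boosted far above the natural rate
useful_fraction    = 1e-10  # assumed fraction of mutations that nudge toward nanotube production

expected_hits_per_round = population * mutations_per_cell * useful_fraction
print(f"expected useful mutants per round: {expected_hits_per_round:.0f}")
# With these illustrative numbers, roughly 100 candidate mutants appear every round,
# so the limiting factor is detecting them, not waiting for them to arise.
```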
One extraordinarily useful thing that we might be able to evolve is solar cells. Current solar cells are about 10-20% efficient, and this makes them more expensive per kilowatt-hour (if you do not count health and environmental effects) than coal- or oil-burning power plants. But in the biological world, evolution has produced a process whose initial light-capturing steps are remarkably efficient at converting sunlight to chemical potential energy. This is no less than ordinary photosynthesis, of the kind every plant carries out.
Maybe we can breed algae (say) that produce electricity from light instead of chemical energy. If not, at least we might be able to breed bacteria that produce useful nanoscale wires as absorbers, which would make the production of solar cells more efficient and cheaper per watt.
Another possibility is to evolve an organism that can produce complete working solar (photovoltaic) cells. This is also not so far-fetched; the electric eel produces a substantial voltage across its body by converting chemical potential energy to electric potential. There is nothing a priori impossible about getting an organism to evolve a mechanism for the direct conversion of light to electricity.
Also, we may currently know enough genetics to head-start the process by engineering bacteria that can make a short nanotube, and then use evolution to breed better producers.
This brings up a critical point: there is no reason to be a purist about the evolutionary approach; it will never displace the traditional knowledge-based science program. In places where we have enough detailed genetic knowledge, we can use it to create the starting point for an evolutionary process.
This combination of traditional bio-engineering supplemented by evolutionary approaches may be the most powerful tool of all. Whenever we have some basic information to start with we should use it to cut a few million generations off the unaided evolutionary process.
In order to take advantage of the power of this tool, and implement it as often as is appropriate, what would be needed most is an attitude change in the scientific community. Scientists would have to become more comfortable with watching complex processes evolve on the basis of random change, and then working backward from a successful form to figure out how it works.
This attitude change having been accomplished, the power of a process that has yielded something no less complex than human consciousness itself could be brought to bear on those areas of experimental science where the complexity of a problem presents great obstacles.
Contact:
David Green, aetherambler@yahoo.com