
Hi ;)
2010/12/8 Mitar
Hi!
On Wed, Dec 8, 2010 at 3:39 PM, Alberto G. Corona
wrote: DNK? I think you mean DNA.
Sorry. In my native language it is DNK. ;-)
the genotype program that develops the phenotype is much more smooth and granular than a computer program. A change in a gene does not give you an extra bone. It can make your hand slightly longer, or shorter.
Because we are using very small programs. A change of one constant in the Linux kernel would also, in general, be a very small change. Maybe some esoteric device would no longer function, but that would be it. Of course, there would also be changes which would make the whole kernel dysfunctional.
Of course there are smooth zones in the fitness landscape of any code. What is necessary is to direct the process by avoiding absurd replacements (mutations that go straight into dead zones), and by rules for moving from one smooth area to another once the local maximum is not satisfactory. Or to detect them as early as possible (that is, rules again). That is what I mentioned before.
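The guided process described above can be sketched as a tiny hill-climber (the toy landscape and all names below are invented for illustration, not taken from the thread): mutations are restricted to small "non-absurd" steps, and when the local maximum found is not satisfactory, the climb simply restarts in another region of the landscape.

```haskell
-- Toy fitness landscape with two smooth hills (peaks at x=3 and x=40).
fitness :: Int -> Int
fitness x = max (10 - abs (x - 3)) (20 - abs (x - 40))

-- Only small, "non-absurd" mutations: step one unit left or right.
neighbours :: Int -> [Int]
neighbours x = [x - 1, x + 1]

-- Climb until no small mutation improves fitness (a local maximum).
climb :: Int -> Int
climb x =
  case filter (\y -> fitness y > fitness x) (neighbours x) of
    (y:_) -> climb y
    []    -> x

-- If a local maximum is not satisfactory, restart from other regions
-- and keep the best fitness found.
climbWithRestarts :: [Int] -> Int
climbWithRestarts starts = maximum (map (fitness . climb) starts)
```

Starting only at 0 gets stuck on the lower hill (fitness 10); adding a restart near the second hill finds the better peak (fitness 20).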
In fact there are metalevels of selection that discard abrupt changes. For example, when females ovulate there is strong selection in which thousands of candidate egg cells are tested and discarded. This is one of the reasons why anomalous mutations are scarce.
I doubt that. Aren't a female's eggs made while she is still a fetus? And don't they stop dividing later on?
This selection of candidate eggs happens each month, within each ovulation period. It is mentioned in this superb lecture series by Yale University, which is a very good introduction; I don't remember the exact chapter, sorry: http://www.youtube.com/watch?v=VjgHd6HKtvE

Moreover, the genetic code has evolved to evolve, for many reasons. Neither Haskell nor any conventional language has. One of the safety measures of the genetic code is protecting itself against undesired mutations. This is something vital for life. Cancer is one consequence, but other consequences are more deadly: if a mother spends nine months producing an unfit child, this is a dead end for the mother's genome line. For this reason, undesired mutations, especially in reproductive cells, are either repaired or discarded. This selection is stronger in females than in males (for investment reasons). The rate of mutation in living beings is equal to the theoretical optimum, for both prokaryotes and eukaryotes. This is another example of how finely DNA has been tuned for evolution: the molecular bonds are just strong enough to permit the right rate of mutation. But DNA, I'm sure, has many more secrets to learn and discover.

So, in conclusion, there are many details to learn and to implement before we have successful general genetic algorithms for genetic programming. But the research promises a lot of fun.

Hi!
On Wed, Dec 8, 2010 at 4:51 PM, Alberto G. Corona
Of course there are smooth zones in the fitness landscape of any code. What is necessary is to direct the process by avoiding absurd replacements (mutations that go straight into dead zones), and by rules for moving from one smooth area to another once the local maximum is not satisfactory. Or to detect them as early as possible (that is, rules again). That is what I mentioned before.
But this is not blind evolution as we know it. Of course, if you want to speed things up you can use techniques such as the ones you are mentioning. But these techniques use prior knowledge about this particular set of problems. Evolution in general does not care about this. It has all the time it needs.
Moreover, the genetic code has evolved to evolve.
I agree with that.
Neither Haskell nor any conventional language has.
True. We should evolve also the language itself, not just programs. Mitar

Mitar
Neither Haskell nor any conventional language has [evolved to evolve]
True.
Well - thinking about it, there's no fundamental difference between genetic algorithms - where you have a "genome" in the form of a set of parameters and genetic programming - where the "genome" is a program of some sort, typically Lisp code, since the syntax tree is so accessible and malleable. In either case, you have an interpreter that interprets the genome, the difference is mainly in the expressive power of the language. I haven't looked closely, but I suspect you might not want Turing-completeness in either case (Alberto?). But yes, by designing the language to evolve, we can get a head start compared to nature. -k -- If I haven't seen further, it is by standing in the footprints of giants
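The contrast Ketil describes can be made concrete with a sketch (the types here are illustrative, not from any real library): a genetic-algorithm genome is a flat parameter vector fed to a fixed formula, while a genetic-programming genome is an expression tree whose structure itself is subject to crossover and mutation. Note that the tree language below is deliberately not Turing-complete: evaluation always terminates.

```haskell
-- GA: the genome is a parameter vector; the "interpreter" plugs the
-- parameters into a fixed formula (here, a quadratic).
type ParamGenome = [Double]

evalParams :: ParamGenome -> Double -> Double
evalParams [a, b, c] x = a * x * x + b * x + c
evalParams _         _ = 0

-- GP: the genome is a small expression tree - the accessible,
-- malleable "syntax tree" that makes subtree crossover and mutation easy.
data Expr = Var | Const Double | Add Expr Expr | Mul Expr Expr

evalExpr :: Expr -> Double -> Double
evalExpr Var       x = x
evalExpr (Const c) _ = c
evalExpr (Add l r) x = evalExpr l x + evalExpr r x
evalExpr (Mul l r) x = evalExpr l x * evalExpr r x
```

In both cases an interpreter maps genome to behaviour; only the expressive power of the genome language differs.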

Hi Ketil,
2010/12/9 Ketil Malde
Mitar
writes: Neither Haskell nor any conventional language has [evolved to evolve]
True.
Well - thinking about it, there's no fundamental difference between genetic algorithms - where you have a "genome" in the form of a set of parameters and genetic programming - where the "genome" is a program of some sort, typically Lisp code, since the syntax tree is so accessible and malleable.
In either case, you have an interpreter that interprets the genome, the difference is mainly in the expressive power of the language. I haven't looked closely, but I suspect you might not want Turing-completeness in either case (Alberto?).
I'm not sure (answer at the bottom). There are some characteristics of the DNA code that exist precisely to evolve some parts of the code while keeping some others safe. There are sequences that are extraordinarily stable, for millions of years, across species, while others change very often. Some others are in the middle. This is not arbitrary. Let's say that the stable parts are the IF statements while the changing parts are the parameter values. Doing so, the species explores the safe part of the fitness landscape without wasting effort in experiments that have a 99% chance of failing. Females will evolve whatever mechanism avoids these experiments as soon as possible. However, risky experiments should be taken from time to time, to explore new zones of the fitness landscape.

Another characteristic of evolution is its multilevel nature. There is selection of genes, selection of gene-replication alternatives, selection of reproduction alternatives; prokaryotic, eukaryotic, multicellular (individual), and social levels of selection. Not only that: during ovulation there is a selection of good eggs, as I mentioned, and this selection is related to the stability of critical gene sequences. This test for stability is regulated by other supervisor genes that check, with different degrees of accuracy, for the absence of mutations in these protected sequences. This internal checking for consistency can be considered a level of selection, or auto-selection if you like. Selective checking has evolved as a metalevel. But you can alternatively consider this consistency check as a set of rules written into the genetic code by selection itself. Rules are a form of selection mechanism, after all. All of these "rules" have evolved because the genetic code executes over itself, using its own code as data: some genes activate others, some check for the absence of mutations, and so on. That is, some metalevel selection is carried out by the same code that is evolving.

Moreover, some rules are different for each genus or species, so it is not possible to extract (and abstract) certain checks out of the code that is checked. If evolution is a search for local maxima in a tree, the rules/metalevels are heuristic rules that guide the strategy of exploration.
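One way to sketch this stable-versus-changing idea in code (everything below is hypothetical, including the toy deterministic "noise" function used in place of a real random generator): give each locus its own mutation rate, so structural loci - the "IF statements" - almost never change, while parameter loci drift freely.

```haskell
-- A locus carries a value and its own per-locus mutation rate.
data Locus = Locus { value :: Double, rate :: Double }

-- Deterministic toy pseudo-randomness (one linear congruential step),
-- so the sketch needs no external random library.
noise :: Int -> Double
noise seed = fromIntegral ((seed * 1103515245 + 12345) `mod` 1000) / 1000

-- Mutate only when the draw falls under the locus's own rate.
mutate :: Int -> Locus -> Locus
mutate seed l
  | noise seed < rate l = l { value = value l + (noise (seed + 1) - 0.5) }
  | otherwise           = l

-- Structural loci get a tiny rate; parameter loci a large one.
genome :: [Locus]
genome = [Locus 1 0.01, Locus 2 0.5]
```

With rates like these, the "structural" locus survives almost all mutation attempts unchanged, which is the protected-sequence behaviour described above.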
But yes, by designing the language to evolve, we can get a head start compared to nature.
Yes.
What does what I said above mean for evolving programs? First, that a simple genetic algorithm operating on self-modifying code can generate any solution if we wait a few billion years. But we can help it go faster by studying the DNA code, learning its already evolved techniques, and using them to design languages, assign rates of mutation for each statement, discover rules, selection mechanisms, and species of programs (or design patterns) that act as templates for each problem, etc. "Species of programs" means that the seed of the genetic algorithm must not be Turing complete, I guess. It must be specific to each problem.
-k
-- If I haven't seen further, it is by standing in the footprints of giants
_______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

Hi!
On Thu, Dec 9, 2010 at 1:48 PM, Alberto G. Corona
assign rates of mutation for each statement,
This could be assigned by evolution itself. If "if" statements had a high probability of mutation, then the resulting programs would not survive. So those probabilities can be assigned by evolution itself and also be something which is passed on through generations (with, again, the possibility of mutations of those probabilities themselves). What would be interesting is to have an evolution algorithm which would allow such protections to evolve during its run. So some kind of protection which would lower the rate of mutation for some code part.
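This self-adaptation idea - the mutation rate being part of the genome, inherited and perturbed along with it, much as in evolution strategies - can be sketched as follows (all names are invented; the caller supplies the two noise values, since no random library is assumed):

```haskell
data Individual = Individual
  { gene    :: Double  -- the evolving trait
  , mutRate :: Double  -- evolvable mutation rate, inherited by offspring
  }

-- Offspring inherit both the gene and the rate; both are perturbed, so
-- selection on genes indirectly selects on mutation rates as well.
-- The noise values g and r are expected in [-1, 1].
offspring :: Double -> Double -> Individual -> Individual
offspring g r parent = Individual
  { gene    = gene parent + mutRate parent * g            -- step scaled by own rate
  , mutRate = max 0.001 (mutRate parent * (1 + 0.1 * r))  -- the rate mutates too
  }
```

A lineage whose rate drifts too high produces unfit offspring and is selected away; one whose rate drifts toward zero stops adapting - which is one way such "protections" could evolve during the run.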
Species of programs means that the seed of the genetic algorithm must not be Turing complete, I guess. It must be specific to each problem.
I do not see this as a necessity. Just that those specimens which used this power too much would probably not survive. (If they removed protection for some statements they had built before.) Mitar

In order to simulate nature, you need to have the mutation and selection process itself be part of the programs (and not the interpreter). How about you have a "world" consisting of some memory, bombard this world with "cosmic radiation", and add some "enzymatic activity" in the form of an interpreter that interprets a location of the world as a simple language (one that allows observation and modification of the world, as well as forking new threads of execution) - which is randomly started at random points. After some time, you might have threads that reproduce themselves, perhaps forming species, cooperation, pathogens, genders, and if you wait four billion years, civilizations... Life is really just core wars without the programmers :-) -k -- If I haven't seen further, it is by standing in the footprints of giants
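A very rough sketch of such a "world" (the one-instruction language below is invented purely for illustration): memory is a list of cells, cosmic rays overwrite a cell from outside, and a tiny interpreter run at a location executes opcode 1 as a crude copy step - the smallest seed of self-replication.

```haskell
type World = [Int]

-- Overwrite one memory cell (used both by "cosmic rays" and by programs).
setCell :: Int -> Int -> World -> World
setCell i v w = take i w ++ [v] ++ drop (i + 1) w

-- "Cosmic radiation": position and value come from some outside source.
cosmicRay :: Int -> Int -> World -> World
cosmicRay = setCell

-- One interpreter step at position p: opcode 1 copies the cell at p+1
-- to p+2 (a crude replication primitive); anything else halts.
step :: Int -> World -> World
step p w
  | p + 2 < length w && w !! p == 1 = setCell (p + 2) (w !! (p + 1)) w
  | otherwise                       = w
```

A real version would need forking threads and a richer instruction set, but even this shape shows how programs and mutations live in the same memory.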

Hi Ketil,
In order to simulate nature, you need to have the mutation and selection process itself be part of the programs (and not the interpreter). Hence the interpreter can itself be modified?
How about you have a "world" consisting of some memory, bombard this world with "cosmic radiation", and add some "enzymatic activity" in the form of an interpreter that interprets a location of the world as a simple language (that allows observation and modification of the world, as well as forking new threads of execution) - which is randomly started at random points. To be honest, I did not understand the "location part"...
Life is really just core wars without the programmers :-) I'll quote you if I have the opportunity.
- Michael

Michael Lesniak
Hence the interpreter can itself be modified?
Well - the interpreter in nature is chemistry. Living organisms are just chemistry programs. -k -- If I haven't seen further, it is by standing in the footprints of giants

Have you read Fontana & Buss, "What would be conserved if 'the tape were played twice'?", in PNAS? It's quite fun - they model chemical reactions as β-reduction in the lambda calculus and look at evolution.
Tom
On Thu, Dec 9, 2010 at 10:15 PM, Ketil Malde
Michael Lesniak
writes: Hence the interpreter can itself be modified?
Well - the interpreter in nature is chemistry. Living organisms are just chemistry programs.
-k -- If I haven't seen further, it is by standing in the footprints of giants

You can do all sorts of fun things with computers. Assuming that you
are interested in modeling really real life, how will you estimate
parameters (e.g. mutation rates) based on real data? How will you
quantify whether this is a good or a bad model? I think living in a
fact-free world is a bit pointless, but there are plenty of people who
got tenure that way... There's tons of this stuff in Artificial Life,
and a book with that title by Steven Levy which makes many grand
claims.
Tom
On Thu, Dec 9, 2010 at 2:09 PM, Ketil Malde
In order to simulate nature, you need to have the mutation and selection process itself be part of the programs (and not the interpreter).
How about you have a "world" consisting of some memory, bombard this world with "cosmic radiation", and add some "enzymatic activity" in the form of an interpreter that interprets a location of the world as a simple language (that allows observation and modification of the world, as well as forking new threads of execution) - which is randomly started at random points.
After some time, you might have threads that reproduce themselves, perhaps forming species, cooperation, pathogens, genders, and if you wait four billion years, civilizations...
Life is really just core wars without the programmers :-)
-k -- If I haven't seen further, it is by standing in the footprints of giants

The good thing about evolutionary simulations is that everything is theoretically
possible. The bad thing is that in practice it is very hard to achieve results.
Biota.org has links to some artificial life projects. Some of them are naive,
but some others may be interesting.
http://www.biota.org/
2010/12/9 Tom Nielsen
You can do all sorts of fun things with computers. Assuming that you are interested in modeling really real life, how will you estimate parameters (e.g. mutation rates) based on real data? How will you quantify whether this is a good or a bad model? I think living in a fact-free world is a bit pointless, but there are plenty of people who got tenure that way... There's tons of this stuff in Artificial Life, and a book with that title by Steven Levy which makes many grand claims.
Tom
On Thu, Dec 9, 2010 at 2:09 PM, Ketil Malde
wrote: In order to simulate nature, you need to have the mutation and selection process itself be part of the programs (and not the interpreter).
How about you have a "world" consisting of some memory, bombard this world with "cosmic radiation", and add some "enzymatic activity" in the form of an interpreter that interprets a location of the world as a simple language (that allows observation and modification of the world, as well as forking new threads of execution) - which is randomly started at random points.
After some time, you might have threads that reproduce themselves, perhaps forming species, cooperation, pathogens, genders, and if you wait four billion years, civilizations...
Life is really just core wars without the programmers :-)
-k -- If I haven't seen further, it is by standing in the footprints of giants
participants (5)
-
Alberto G. Corona
-
Ketil Malde
-
Michael Lesniak
-
Mitar
-
Tom Nielsen