 
  
  
  
  
 
-  GAs conduct a randomized, parallel, hill-climbing search for hypotheses
that optimize a predefined fitness function
-  GAs are based on an analogy to biological evolution: a diverse population
of competing hypotheses is maintained, and at each iteration the most fit
members of the population are selected, combined by crossover, and subjected
to random mutation (see the sketch after this list)
-  GAs show how learning can be seen as a special case of optimization: the
learning task is to find the optimal hypothesis, which suggests that other
optimization techniques, such as simulated annealing, can also be applied to
machine learning (a simulated-annealing sketch follows the GA sketch below)
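
The following is a minimal, illustrative sketch of the GA loop summarized
above, written in Python. It is not from the original notes: the bit-string
hypothesis representation, the toy "count the 1-bits" fitness function, and
all parameter values are arbitrary assumptions chosen for demonstration.

import random

HYP_LEN = 20          # length of each bit-string hypothesis (assumed)
POP_SIZE = 30         # number of competing hypotheses in the population
GENERATIONS = 50
CROSSOVER_RATE = 0.8
MUTATION_RATE = 0.02  # per-bit flip probability

def fitness(hyp):
    """Toy fitness: number of 1-bits (stands in for a task-specific measure)."""
    return sum(hyp)

def select(population):
    """Tournament selection: keep the fitter of two randomly drawn hypotheses."""
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(parent1, parent2):
    """Single-point crossover combining two parent hypotheses."""
    if random.random() > CROSSOVER_RATE:
        return parent1[:], parent2[:]
    point = random.randint(1, HYP_LEN - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def mutate(hyp):
    """Flip each bit with a small probability (random mutation)."""
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in hyp]

def genetic_algorithm():
    # Start from a diverse random population of hypotheses.
    population = [[random.randint(0, 1) for _ in range(HYP_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        next_gen = []
        while len(next_gen) < POP_SIZE:
            # Select fit parents, combine by crossover, apply mutation.
            child1, child2 = crossover(select(population), select(population))
            next_gen.extend([mutate(child1), mutate(child2)])
        population = next_gen[:POP_SIZE]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = genetic_algorithm()
    print("best hypothesis:", best, "fitness:", fitness(best))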
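
To illustrate the last point, that other optimization techniques can play the
same hypothesis-search role, here is a similarly minimal simulated-annealing
sketch over the same toy bit-string hypothesis space. The single-bit-flip
neighbourhood, the linear cooling schedule, and the parameters are assumptions
made for demonstration only.

import math
import random

def sa_fitness(hyp):
    """Same toy fitness as in the GA sketch: number of 1-bits."""
    return sum(hyp)

def simulated_annealing(hyp_len=20, steps=2000, start_temp=2.0):
    # Begin from a single random hypothesis rather than a population.
    current = [random.randint(0, 1) for _ in range(hyp_len)]
    for step in range(steps):
        temp = start_temp * (1 - step / steps) + 1e-6  # linear cooling
        # Propose a neighbouring hypothesis by flipping one random bit.
        candidate = current[:]
        i = random.randrange(hyp_len)
        candidate[i] = 1 - candidate[i]
        delta = sa_fitness(candidate) - sa_fitness(current)
        # Always accept improvements; accept worse moves with probability
        # exp(delta / temp), which shrinks as the temperature falls.
        if delta >= 0 or random.random() < math.exp(delta / temp):
            current = candidate
    return current

if __name__ == "__main__":
    best = simulated_annealing()
    print("best hypothesis:", best, "fitness:", sa_fitness(best))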
 
Patricia Riddle 
Fri May 15 13:00:36 NZST 1998