Find the value of x that maximizes f(x), to an accuracy of at least one part in a million. Use a population size of fifty and a fixed mutation rate.
So I randomly select a population of 50 binary strings of length 8 and decode them into base 10. I evaluate each string's fitness (i.e. f(x) for its decoded value) and discard the 25 strings with the lowest fitness. Crossover between random pairs of the surviving strings then produces 25 "child strings," restoring the population to 50. Finally, I apply the mutation rate to this new population, because without mutation the population could get stuck with, say, every string ending in a 0, and that pattern would come to dominate.
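To make the steps concrete, here is a minimal sketch of one way that loop could look. The fitness function and the mutation rate are placeholders (the assignment's actual f(x) and rate would go in their place), and the names are my own:

```python
import random

POP_SIZE = 50
STR_LEN = 8          # bits per string (see the question about choosing this)
MUT_RATE = 0.01      # assumed value; substitute the assignment's rate

def fitness(bits):
    """Decode a bit string to base 10 and score it."""
    x = int(bits, 2)
    return x  # placeholder fitness; replace with the assignment's f(x)

def crossover(a, b):
    """Single-point crossover between two parent strings."""
    point = random.randrange(1, STR_LEN)
    return a[:point] + b[point:]

def mutate(bits):
    """Flip each bit independently with probability MUT_RATE."""
    return "".join(b if random.random() > MUT_RATE else str(1 - int(b))
                   for b in bits)

# Random initial population of 50 binary strings of length 8.
population = ["".join(random.choice("01") for _ in range(STR_LEN))
              for _ in range(POP_SIZE)]

for generation in range(100):
    # Keep the fitter half, discarding the 25 lowest-fitness strings.
    survivors = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    # Breed 25 children from random pairs of survivors, then mutate them.
    children = [mutate(crossover(*random.sample(survivors, 2)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=fitness)
```

This is just the scheme described above (truncation selection, single-point crossover, per-bit mutation), not the only way to arrange a GA; in particular, fitness-proportionate selection is a common alternative to simply discarding the worst half.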
Is this generally correct? And how would you decide on the length of the strings?