
Using Genetic Algorithms for development.



I was thinking about how the new aero compares to FAR, and how rockets and planes don't wobble in the new atmosphere the way they did in old FAR. That got me thinking about how the whole thing was fine-tuned, and then I had an idea. A lot of these things are based on subjective opinions about balance, and may not turn out quite right. So what if, instead, you used a genetic algorithm (natural selection, but for designing things instead of for life) to tune them? For instance, variations on a piece of code or a cfg are run through a series of stress tests, and the best one (in the case of aero, perhaps the SAS configuration that produces the least wobble) gets to pass its "genes" on to the next generation of code or cfgs or what have you.

This is so abstract that it might fit better in the Space Lounge than the development forum.

And now I'm wondering about the idea of using a genetic algorithm to control a rocket in kOS or something.

Edited by GregroxMun

Well, AFAIK the hard thing about genetic algorithms is that you have to, you know, actually teach the neural network. So in the case of the whole aerodynamic model, I guess you'd have to have reference designs (and a lot of them) and know precisely how they should behave at different speeds/AoA, so you could evaluate the success rate of the network. And creating a "how well does it fly" metric would be a problem.


> Well, afaik the hard thing about genetic algorithms is that you have to, you know, actually teach the neural network. [...]

I never said I had an idea of how exactly to use it, I just gave a basic example. It'd be up to Squad to figure anything else out.


> So what if instead, you had a genetic algorithm (natural selection, but for designing things instead of being life) to tune things. [...] And now I'm wondering about the idea of using a genetic algorithm to control a rocket in kOS or something.

What are your design variables (i.e. what are you optimizing) and what is your objective function (what are you optimizing FOR)?

Are you optimizing aerodynamic physics (what) so that planes fly as easily as possible (for)? If that's the case, then I'd say just revert back to pre-1.0 aero...

Are you optimizing aerodynamic physics so that good designs fly well and bad designs fly badly?

It's definitely possible, if you define what your design variables, design constraints, and objective function are. You just need code that randomly assembles planes, tries to fly them, and then evaluates each one.

Or are you trying to optimize autopiloting/control? I recommend looking here. If you look at the pseudocode at the bottom, this can EASILY be implemented in kOS.
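That "randomly assemble, test, evaluate" loop is easy to sketch. Everything below is a hypothetical stand-in (the design variables, the objective function, and the scoring are invented for illustration; a real version would have to drive KSP itself):

```python
import random

def random_design():
    # Design variables (invented examples): wing area, tail size, CoM offset.
    return {"wing_area": random.uniform(1, 20),
            "tail_size": random.uniform(0.1, 5),
            "com_offset": random.uniform(-1, 1)}

def objective(design):
    # Objective function (invented): reward a balanced craft, i.e. a small
    # CoM offset and a tail size near some "good" value. 0 is a perfect score.
    return -(abs(design["com_offset"]) + abs(design["tail_size"] - 1.0))

# Assemble a batch of random designs and keep the best-scoring one.
best = max((random_design() for _ in range(1000)), key=objective)
```

This is just random search; a GA adds recombination and selection on top of the same skeleton.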


> I never said I had an idea of how exactly to use it, I just gave a basic example. [...]

Then what is the point of mentioning it? It's a very incomplete idea, and I have a feeling something akin to a phoropter, or a binary search, would prove better: "Which is better, one, two, or no difference?"

(Of course, we're dealing with a ton of variables, which means you'd use a "mastermind" algorithm: you keep changing multiple variables but, because of the way they're changed, determine the outcome in fewer turns than testing each one separately.)

Edited by Fel

Basically, run a test of each generation? Taking the best values, and adding in mutations?

Like:

1.) Test a whole bunch of code

2.) Take the best instances

3.) Add a mutation to each one

4.) Repeat

Something like that?
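Those four steps are the classic "keep the best, mutate, repeat" loop. A minimal sketch in Python, with an invented stand-in fitness function (nothing here is from KSP):

```python
import random

GENOME_LEN = 8   # tunable values per candidate (invented)
POP_SIZE = 20
KEEP = 5         # survivors per generation

def fitness(genome):
    # Stand-in: pretend the ideal tuning is all zeros, so less total
    # deviation = better (0 is a perfect score).
    return -sum(abs(g) for g in genome)

def mutate(genome, scale=0.1):
    # Nudge one randomly chosen value a little.
    g = list(genome)
    i = random.randrange(len(g))
    g[i] += random.uniform(-scale, scale)
    return g

# 1.) test a whole bunch of candidates
population = [[random.uniform(-1, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(100):
    # 2.) take the best instances
    population.sort(key=fitness, reverse=True)
    survivors = population[:KEEP]
    # 3.) add a mutation (refill with mutated copies of survivors)
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - KEEP)]
    # 4.) repeat

best = max(population, key=fitness)
```

As katateochi points out further down the thread, this keep-only-the-best (elitist) scheme tends to lose diversity quickly; it's the simplest version, not the recommended one.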


I think this would be a rather complex problem to solve with a GA. Evaluation of a fitness function is going to be time-consuming, as you'll have to run real-time trials in KSP. GAs are also lazy little **** and won't necessarily find the optimal solution in a complex search space.

I think you'd be better off with some form of back-propagation algorithm which could adjust a setting in real time as you flew a craft. You'd need to characterize what the "error" was (i.e. the amount of wobble) and then have the algorithm adjust settings to attempt to reduce that error.
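What's described here, adjusting a setting in real time to shrink a measured error, can be sketched even without gradients. Below is a toy "adjust and check" loop; `measure_wobble` and the single `gain` variable are made-up stand-ins for a real flight test, not anything from KSP:

```python
import random

def measure_wobble(gain):
    # Stand-in for a real flight trial: pretend wobble is a smooth function
    # of one gain, minimized at gain = 2.5 (an arbitrary invented optimum).
    return (gain - 2.5) ** 2

gain = 0.0
step = 0.5
error = measure_wobble(gain)
for _ in range(200):
    # Try a small adjustment in a random direction.
    trial = gain + random.choice([-1, 1]) * step
    trial_error = measure_wobble(trial)
    if trial_error < error:
        gain, error = trial, trial_error   # improvement: keep the change
    else:
        step *= 0.95                       # no luck: try finer adjustments
```

True back-propagation needs a differentiable model; this gradient-free version only needs the error measurement itself, which fits the "fly it and watch the wobble" setting better.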

It's an interesting idea, but kinda tricky.

But re your evolved autopilot idea. I've been wondering about that too and also about evolving a rocket. But both those suffer from having very time consuming fitness functions (a whole launch per evaluation). So I've shelved those ideas for now.


> Basically, run a test of each generation? Taking the best values, and adding in mutations? [...]

Yes, pretty much exactly that.


> Basically, run a test of each generation? Taking the best values, and adding in mutations? [...]

> Yes, pretty much exactly that.

You don't want to take just the best instances; that's what's called elitist selection, and it is generally doomed. You also don't want to add a mutation to each one; mutations are the vector for adding totally novel genes to a population, but for the most part they are very detrimental. You want a very, very low mutation rate.

1) define population of (say 30) random genomes

2) "recombination" - select two genomes at random and combine them to form a new genome (can be the typical 50/50 split of genomes from each, a recombination rate of 0.5, but doesn't have to be)

3) "mutation" - on very rare occasions, say about 0.02% of the time, add a mutation to a gene in the new genome.

4) "contest" - select a 3rd random genome from the population and evaluate its performance and the performance of the new genome. If the new genome wins, it replaces that member in the population; otherwise, throw it away.

5) return to step 2

Breeding can happen between any population members, even really poorly scoring ones; that is essential for maintaining genetic diversity (elitist selection loses diversity very quickly, so it's highly prone to getting stuck in a genetic blind alley).

You only need to run 2 fitness evaluations per "generation" (which helps speed things up).

Recombination is what drives a population towards genetic convergence (it fixes characteristics in a population); mutation adds noise and acts against that. There is no rule about optimal rates for either; it depends on a number of things: length of the genome, complexity of the search space, etc. If recombination rates are too high, the population very quickly becomes clones of each other and all diversity is lost; if mutation rates are too high, good traits get lost (and mathematically you end up no more effective than a random search algorithm).
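The five steps above (random population, recombination, rare mutation, contest, repeat) can be sketched directly. The bit-string genome and the stand-in fitness function below are invented for illustration; a real run would fly the craft to score it:

```python
import random

POP_SIZE = 30
GENOME_LEN = 16
RECOMBINATION_RATE = 0.5   # 50/50 chance of each gene coming from parent a
MUTATION_RATE = 0.0002     # ~0.02% per new genome, as suggested above

def fitness(genome):
    # Stand-in: count the 1-bits; a real run would fly the craft instead.
    return sum(genome)

# 1) population of random genomes
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(5000):
    # 2) recombination: build a child gene-by-gene from two random parents
    a, b = random.sample(population, 2)
    child = [ga if random.random() < RECOMBINATION_RATE else gb
             for ga, gb in zip(a, b)]
    # 3) mutation: very rarely, flip one gene in the new genome
    if random.random() < MUTATION_RATE:
        child[random.randrange(GENOME_LEN)] ^= 1
    # 4) contest: challenge a 3rd random member; only 2 evaluations per round
    i = random.randrange(POP_SIZE)
    if fitness(child) >= fitness(population[i]):
        population[i] = child
    # 5) repeat

best = max(population, key=fitness)
```

Note there is no "keep the best" step: the child only has to beat one randomly chosen member, and poor scorers stay breedable, which is exactly what preserves diversity.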

Anyway, sorry for going all GA nerd on you, just thought you'd be interested!


You want the best instances. That's how natural selection works: the best fit for a given environment is the most evolutionarily successful. Like the finches in the Galapagos: the ones whose beaks were unsuited to the new environment died off, and the ones with better-suited beaks, a variation within the original population, survived.

Adding a mutation to each one would add variance, which keeps the instances from all being the same. This is key to evolution.

Recombination of genes itself promotes variance.


Rather than evolve a system, which I must admit sounds very cool, couldn't you just have a variable amount of control authority, and reduce the degrees of flap deflection/SAS torque the closer you are to (for example) prograde, with a fudge factor for rotational speed and acceleration?

Meaning: if you are pointing prograde and not drifting, there is no control authority; but if you are drifting, you have "some", based on acceleration combined with drift speed (if you're accelerating away, increase authority until you stop accelerating away, then add a bit more to start heading back toward the desired direction). The further you are from prograde, and the faster you are moving or accelerating away from it, the more control authority you have.

The most control authority would be when you are a long way from prograde and drifting away from it (with acceleration) on the navball.

Just my thoughts, I'm a lazy programmer...

Surely this is the way they already do it?
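For what it's worth, the scheme described above, authority scaled by how far off prograde you are and how fast you're drifting away, is essentially proportional-derivative control. A toy sketch (the gains, timestep, and the simple simulation are all made up):

```python
# Toy proportional-derivative controller in the spirit of the post above:
# authority grows with distance from prograde (the error) and with the rate
# at which you are drifting away (the error's derivative).

KP, KD = 0.8, 2.0   # proportional and derivative gains (arbitrary values)

def control_authority(error_deg, drift_rate_deg_s):
    # Dead-on prograde and not drifting -> zero authority;
    # far off and drifting away fast -> large authority.
    return KP * error_deg + KD * drift_rate_deg_s

# Simple simulation: start 30 degrees off prograde, drifting further away.
error, rate = 30.0, 2.0
dt = 0.1
for _ in range(500):
    torque = control_authority(error, rate)
    rate -= torque * dt    # the commanded torque pushes the drift back
    error += rate * dt
```

After the loop, both the pointing error and the drift rate have been damped toward zero, which matches the "no authority once you're settled on prograde" behaviour described.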

Edited by John FX

> You want the best instances. That's how natural selection works, the best for a given environment is the most evolutionarily successful. [...]

That isn't how evolution works.

Evolution is simply "creature X is more successful at creating viable offspring over a period of time than creature Y." There is no "best for the given environment", simply procreation. So let's say a particular species of bird targeted other species of birds and wiped them out; all that means is that the bird-killing species becomes the "best survivor" in the environment and earns the right to bear the next generation of offspring. Does it mean the birds that were killed off were inferior, or did not have traits the genocidal birds would have benefited from? No. All it means is that those birds were killed off by the actions of other birds.

katateochi is very right here; mutations aren't how evolution works, GENES are. People have some number of genes (I'm not going to bother looking it up) that aren't even active! Occasionally two people with the same gene mate, it just happens to work for their offspring, and we get what you call "a mutation."

Mutations have another name, one you're probably more familiar with: cancer. Only occasionally do we get mutations where we need them, also called testicular or ovarian cancer. OCCASIONALLY we get a defect that isn't malignant and we get something different that doesn't cause problems (it doesn't need to be beneficial by itself, it just needs to create a new gene that doesn't affect fertility).

These algorithms aren't something you run and 30 seconds later get a solution; they're very big, very time-consuming, and your population sizes generally get fairly large. You don't want too much initial variance, because that only makes it take longer to reach a solution, and you don't want to just "prune" your populations, because you'll be destroying genes that may prove valuable when combined in a particular way. And the "genes" aren't going to be the values you want; the genes only affect the evolutionary model, which is NOT a straightforward "just push it into aero and test". Those values can be calculated from the genes, but the genes cannot be calculated from those values.

