N-body physics in KSP2

244 members have voted

  1. Will n-body physics be implemented in KSP2?

    • Yes
      39
    • Yes, as a hard mode setting
      72
    • No
      109
    • Don't care, just want more explosions
      24


Recommended Posts

4 minutes ago, Arugela said:

What if you thin out the calc when gravity is low or zero, and keep the full calcs for the curves? And still generalize them a bit. You are talking about a flight plan, not the actual flight.

Every system has at least one star, so "low" gravity does not exist for these calculations. And if you thin out the steps and make the paths inaccurate, you will not be able to plan flights anymore: the path might show an intercept with Duna, but in flight you will not even come near it.


I mean thin out the paths between planets, basically. Save the heavier calc for around bodies. Once you are out in deep space, isn't the prediction relatively simple?

Edited by Arugela

5 minutes ago, Arugela said:

I mean thin out the paths between planets, basically. Save the heavier calc for around bodies. Once you are out in deep space, isn't the prediction relatively simple?

Actually, no - it's the opposite. When you're close to a single large gravity source, the prediction is generally fairly close to a 2-body system. When you're well away from any other large masses is when minor variations start to have more of an effect.


Was there a real-world system for the Moon or other long-distance missions they could mimic? Like checking the path every so often. It could be a part of the game for long distances.

Edited by Arugela

10 minutes ago, mcwaffles2003 said:

Care to put those numbers into perspective, or do you just like saying big numbers without reference?

 

Did you know the game file is approximately 34,000,000,000 bits? That's such a big number that no computer could probably ever hold it.

Simple math (though I had a mistake in it earlier):

Unity's default physics rate is 50 steps per second, i.e. one step every 20 milliseconds.

1 year = 31,536,000,000 ms
/ 20 ms = 1,576,800,000 steps

Even if you only calculate vessels and keep the gravitating bodies on rails, you need 1,576,800,000 recalculations of the positions of those objects.

This simple loop takes 4.5 seconds on my machine (AMD Ryzen 7 1700X):

long tester = 0;
for (int i = 0; i < 1_576_800_000; i++)
{
  tester += i;
}

 


This thread is running in circles; answers to those arguments were provided before.

You do not have to calculate each step like that. There are ways of lowering the number of steps without losing precision. Principia uses integrators that are far more complex than a simple Euler and can compute trajectories far into the future, fast.

N-body is perfectly doable. That does not mean it is a good idea.
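As a sketch of why the integrator matters (illustrative Python, not Principia's actual code): a plain explicit Euler step pumps energy into a circular two-body orbit and spirals outward, while a semi-implicit (symplectic) Euler step at the same step size keeps the orbital radius bounded.

```python
import math

def simulate(steps, dt, symplectic):
    """Integrate a circular orbit around a unit mass (GM = 1, radius 1)."""
    x, y = 1.0, 0.0
    vx, vy = 0.0, 1.0  # circular-orbit speed for r = 1, GM = 1
    for _ in range(steps):
        r3 = (x * x + y * y) ** 1.5
        ax, ay = -x / r3, -y / r3
        if symplectic:
            # Semi-implicit Euler: update velocity first, position uses it.
            vx += ax * dt
            vy += ay * dt
            x += vx * dt
            y += vy * dt
        else:
            # Explicit Euler: position uses the stale velocity.
            x += vx * dt
            y += vy * dt
            vx += ax * dt
            vy += ay * dt
    return math.hypot(x, y)  # stays 1.0 for a perfect integrator

r_euler = simulate(10_000, 0.01, symplectic=False)  # roughly 16 orbits
r_sympl = simulate(10_000, 0.01, symplectic=True)
print(r_euler, r_sympl)  # explicit Euler drifts well above 1; symplectic stays near 1
```

Both methods cost one force evaluation per step; only the update order differs, which is why structure-preserving integrators are so attractive for long orbital propagation.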


1 hour ago, runner78 said:

Simple math (though I had a mistake in it earlier):

Unity's default physics rate is 50 steps per second, i.e. one step every 20 milliseconds.

1 year = 31,536,000,000 ms
/ 20 ms = 1,576,800,000 steps

Even if you only calculate vessels and keep the gravitating bodies on rails, you need 1,576,800,000 recalculations of the positions of those objects.

This simple loop takes 4.5 seconds on my machine (AMD Ryzen 7 1700X):

long tester = 0;
for (int i = 0; i < 1_576_800_000; i++)
{
  tester += i;
}

 

If you did something like what I said and kept an average for multiple timescales, could you simplify the calculation accurately? Keep 1-second, 1-minute, 1-hour, 1-day, 1-week, 1-month, and 1-year averages stored, actively and separately. Then calculate based on the stored times to get an up-to-date path expectation with fewer calculations. Maybe slow down the calc by another factor of 20.

Edited by Arugela

Just now, Arugela said:

If you did something like what I said and kept an average for multiple timescales, could you simplify the calculation accurately? Keep 1-second, 1-minute, 1-hour, 1-day, 1-week, 1-month, and 1-year averages stored. Then calculate based on the stored times to get an up-to-date path expectation with fewer calculations. Maybe slow down the calc by another factor of 20.

That's not how the math works...


Can't you sample more accurately but less often at the 1-second level, and more often at the longer scales, and update it? That would allow auto-updates on predicted cross-references between the given dates. Maybe even store one value per year of the planet, since that's a full revolution. Then add in multiple previous averages, up to so many revolutions of previous years, to the point of needed accuracy.

How often would you have to keep the 1-second timer if it helped auto-generate the other data? Would it drop the calculation times? The game might need to calculate n-body predictions at the start of a fresh game, though. Or fast-forward to collect the data. The 1-second value could also be an averaged period, each adding in and feeding or updating the next value up. Then you only have to collect the 1-second average and the rest auto-fill. Then, after the week and month calcs, you can cross-reference 52 weeks vs 12 months in longer calcs vs the 1-year actual value. So, three calcs, all compared to previous values. That is only 31,449,600 times the data value in storage in seconds. That is small if you can use it as a stored value. It's the biggest one. Each value could be a check for that distance represented in time or something. It could allow zoom logic based on the smaller values within it. And it could be done in chunks and split up to shorten the calc and parallelise it.

Edited by Arugela

3 minutes ago, Arugela said:

Can't you sample more accurately but less often at the 1-second level, and more often at the longer scales, and update it? That would allow auto-updates on predicted cross-references between the given dates. Maybe even store one value per year of the planet, since that's a full revolution. Then add in multiple previous averages, up to so many revolutions of previous years, to the point of needed accuracy.

Still not how the math works. Every state of the simulation is based on the previous state. Decrease the frequency and you increase the errors, which compound over time. It's the nature of the problem. If you can find a solution to it that doesn't do this, there are some NASA scientists who would be really interested because it would save them any amount of supercomputer time.
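The compounding can be shown with a toy example (illustrative Python, not KSP code): explicit Euler is a first-order method, so its global error scales with the step size. Take steps ten times as large and the final answer comes out roughly ten times as wrong.

```python
import math

def euler_exp(dt, t_end=1.0):
    """Integrate dy/dt = y from y(0) = 1 with explicit Euler."""
    y = 1.0
    for _ in range(round(t_end / dt)):
        y += y * dt
    return y

exact = math.e  # true value of y(1)
err_fine = abs(euler_exp(0.001) - exact)
err_coarse = abs(euler_exp(0.01) - exact)
print(err_coarse / err_fine)  # close to 10: the error tracks the step size
```

Each step's small truncation error feeds into the next state, which is exactly the "errors compound over time" behavior described above.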


You can still be picky below the 1-second scale if needed, to increase accuracy. Take samples of the live state every so many seconds to readjust it. 1/10 or 1/5 of the time. Just enough to reduce the calc. If you can split it into multiple threads to save time and do simultaneous calcs with a different formula... the pathing could be calculated over time and take a minute or two to reach an accurate average, potentially, given set times.

Or add 1/60th of a second, or another stored smaller piece of data, and then still do the above with anything smaller than the smallest stored/averaged data.

And more data stored over previous iterations could help account for change, for better scope. Part of the point is to make part of it an HDD/RAM storage value and keep only the part needed as a live calc, to get the time down. Each part in a separate thread, in parallel.

You can also store the averages of the averages in a value to speed up calcs. Do some predictive future averages, then compare and keep an accuracy value.

1 second to 1 year, stored accurately, is 30 MB x value per value. 1/60th of a second to 1 year, accurately averaged, is 1.757 GB x value. Between those, you could store a decent amount of data in RAM. Maybe a single value kept up over time. Then you can keep calculating only each 1/60th of a second accurately, stored in single smaller 30 MB values. Then lots of those stored to do other parts of the calc. 60 values of 1/60th each could be easier to deal with, with lots of averages. You can store almost 60 instances of that value over time, for a minute, in the 1.757 GB. Not sure which values you need. But how much more do you need to make a separate continuous value? Store greater values in a single value, to reference with the others for over-time accuracy based on recent and even predicted events. And if you store each reference in single values, you can drop that significantly. Basically, segmentation with cross-checks. Each segment can fill in with an amount of cross-checks to help fill in if needed.

Edited by Arugela

7 minutes ago, sarbian said:

This thread is running in circles; answers to those arguments were provided before.

You do not have to calculate each step like that. There are ways of lowering the number of steps without losing precision. Principia uses integrators that are far more complex than a simple Euler and can compute trajectories far into the future, fast.

N-body is perfectly doable. That does not mean it is a good idea.

Maybe with a simplified system with gravity objects on rails, but in a full-scale, physically correct n-body simulation: fewer steps == less precision. Simple cause and effect: each moment, all body positions change, and with them the gravitational forces.

If there were a formula for a quick and accurate n-body calculation that works on a normal consumer PC at fast time warp, you should call Stockholm.


There are lots of tricky things you can do to help the math; however, at the end of the day, a full N-body force calculation with direct summation is O(n²) per step and every step depends on the last, while patched conics can do *most* of it in O(n). If the developers wanted to spend the time and effort, they could probably get reasonable speed out of N-body, for a reasonable number of ships, by working hard and taking a lot of shortcuts...

Or they could use patched conics and spend that time and effort on other things.
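The O(n²) cost comes from the pairwise force sum: with direct summation, every body pulls on every other body each step. A minimal sketch in Python (illustrative only, with G = 1 and made-up unit masses):

```python
def accelerations(positions, masses, G=1.0):
    """Direct-summation gravity: every pair interacts, so cost is O(n^2)."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue  # a body does not pull on itself
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx)
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

# Two unit masses one unit apart: each is pulled toward the other with |a| = 1.
a = accelerations([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]], [1.0, 1.0])
print(a)  # [[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]]
```

Tree methods like Barnes-Hut cut this to O(n log n) by lumping distant bodies together, which is one of the "shortcuts" a determined developer could take.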


1 hour ago, Arugela said:

simultaneous calcs with a different formula

predictive future averages

1 second value accurately is 30mb per value

a separate continuous values

store greater values in a single value

You can't just throw some mathy-sounding words at the problem and expect a solution. We're talking about systems of differential equations here - it's pretty high-level mathematics.


2 hours ago, runner78 said:

Maybe with a simplified system with gravity objects on rails, but in a full-scale, physically correct n-body simulation: fewer steps == less precision. Simple cause and effect: each moment, all body positions change, and with them the gravitational forces.

If there were a formula for a quick and accurate n-body calculation that works on a normal consumer PC at fast time warp, you should call Stockholm.

There are ways to actually increase the step without losing precision. With Runge–Kutta you can use an adaptive step: estimate the error for the next step and use a smaller step if needed. And there are other, more advanced methods.

And again, it has been implemented and works fine. I do not know how versed you are with Principia, but the math behind it is far more advanced than what is usually thrown around in this forum. Feel free to have a look at the docs and the linked papers, and then come back to explain to us that the only way to do this math is with a fixed-step Euler...

Don't think that NASA/ESA/whoever use a simple integration method with a small step...
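A toy version of the adaptive-step idea (illustrative Python, not Principia's integrator, and using Euler instead of a real Runge–Kutta pair for brevity): take one full step and two half steps, use their difference as the error estimate, and shrink or grow the step accordingly.

```python
import math

def adaptive_euler(f, y0, t_end, tol=1e-6):
    """Toy adaptive stepper: compare one Euler step of size dt against two
    steps of dt/2, treat the difference as the local error estimate, and
    shrink the step when it exceeds tol (grow it when comfortably under)."""
    t, y, dt = 0.0, y0, 0.1
    steps = 0
    while t_end - t > 1e-12:
        dt = min(dt, t_end - t)
        full = y + dt * f(t, y)                           # one step of size dt
        half = y + (dt / 2) * f(t, y)
        two_half = half + (dt / 2) * f(t + dt / 2, half)  # two steps of dt/2
        err = abs(two_half - full)                        # local error estimate
        if err > tol:
            dt /= 2          # too inaccurate here: retry with a smaller step
            continue
        t += dt
        y = two_half
        steps += 1
        if err < tol / 4:
            dt *= 2          # smooth region: a bigger step will still pass
    return y, steps

# dy/dt = y from y(0) = 1 to t = 1; the exact answer is e ≈ 2.71828.
y, steps = adaptive_euler(lambda t, y: y, 1.0, 1.0)
print(y, steps)
```

Real adaptive schemes (e.g. embedded Runge–Kutta pairs like Dormand–Prince) do the same thing with higher-order formulas, which is how long time spans get covered without a fixed tiny step.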


2 minutes ago, burn boi said:

For n-body, am I right in assuming that the N stands for Newtonian?

N is "the number of" bodies.

So 2-body is what we do now; N-body is (generally) every major mass.

The problem is, while the 2-body problem is easily solvable in every case, N-body has exact solutions only in special cases, and as N gets bigger, those cases get rarer and less likely.


Why not do n-body and add more things to make the basic Kerbal system work as it does under n-body? Basically, shoehorn it! How about an Oort cloud, or black holes, or things not in the first game that are made purposely to allow it to work well? Then the new systems can be more n-body-appropriate in a traditional way.

Edited by Arugela

7 minutes ago, sarbian said:

Don't think that NASA/ESA/whoever use a simple integration method with a small step...

Space agencies also have access to supercomputers. Most of us don't, and we aren't running KSP on them anyway.


What if they did odd things, like dark matter on the edge of the system, even if it's not correct dark matter? Or make the other systems, or an unreachable galaxy, pull on it from afar. There might be lots of ways to make it happen. Find a convenient answer and then make up weird stuff. It might work for the basic Kerbal system and its goofiness.


2 minutes ago, sturmhauke said:

Space agencies also have access to supercomputers. Most of us don't, and we aren't running KSP on them anyway.

You sure about that? I don't see NASA using IBM's Sequoia.

Also, I don't think any of us have a supercomputer, unless you mean a very good PC.

Edited by burn boi
