# How much does TWR in an atmosphere matter? Is TWR < 2 or TWR > 2 better?

## Recommended Posts

Howdy folks. This should be fun...

The premise:

In KSP, many people have independently verified that while in the thick soupy lower atmosphere, the optimal ascent velocity is terminal velocity.

Additionally, this optimal ascent velocity in the lower atmosphere can be followed with a TWR of 2.

The optimal ascent profile for any particular vehicle depends on the details of each stage.

Let's assume the above three statements are true.

Given the premise, here are two questions:

1) Which is better in the lower atmosphere: a) TWR <= 2, b) start each stage with TWR < 2 and end with TWR >= 2, or c) TWR >= 2?

The reason for this question is that I think that many people know approximately what is best, but don't know by how much.

Similarly I think some people are wrong about what is best, but really are not too far from correct from a practical standpoint.

Thus the second question: how much does your TWR profile really matter?

If you disagree with the premise, that's a different discussion. Both discussions are welcome, but please indicate clearly if you are arguing against the premise rather than the questions.

- - - Updated - - -

Hmm. I tried to reply to my own thread, to keep the question and my answer in a different post, but the forum combined them into a single post...

The following are my opinions. I'm open to being corrected (via data, theory, or pointers to other threads). I've done some testing to support it. I may post images later. I would be happy to see other people's tests first, so it's not just me filling the entire first page of this thread.

1) From best to worst:

a) Terminal velocity ascent is best. A theoretical perfect KSP rocket would follow terminal velocity at least until the gravity turn.

b) TWR > 2, but thrust limited to terminal velocity is second best. It is not cost effective, but it is fairly delta-v effective.

c) TWR which keeps you near terminal velocity is third best. Split over and under terminal velocity based on changing TWR or ease of control or staging.

d) TWR > 2, and velocity above terminal velocity. (Even up to 20% to 30% above terminal velocity. Maybe higher. More testing required.)

e) TWR too low to reach terminal velocity

f) TWR much less than 2. (1.2-1.5?)

Now here's the kicker: What's the difference between

a) and b): < 10 m/s delta-v spent to orbit

a) and c): < 15 m/s

a) and d): < 25 m/s

a) and e): < 50 m/s

a) and f): could be very large

These come from modifying a set of rockets to maintain certain TWRs. I don't have enough data at the moment to fully support this.

Going above terminal velocity is not that bad for your delta-v budget because it reduces your gravity drag.

The total drag goes up because atmospheric drag goes up faster than gravity drag goes down, but the total doesn't go up that quickly.

The faster you get into your gravity turn, the less total gravity drag you pay.

Additionally, for many engines, the Isp is worse at lower altitude. The faster you get out of the soup, the better that engine performs.

2) How much does the TWR profile matter for your final delta-v? For TWRs from 1.8 to 2.5, limiting yourself to terminal velocity or not, the delta-v difference from optimal ascent should be at most a few percent.

For many people, I posit that that delta-v difference is insignificant. This is a gameplay style question more than a delta-v question: What is more important to you: optimizing your ascent delta-v or cost or part count or beauty or awesomeness or (lack of) time spent on design or science points gained? The delta-v difference is so small that only people who care about efficiency need to think about it at all.

(Once you can perform a half-way decent gravity turn.)

Edited by Yasmy
##### Share on other sites

Most of my rockets actually lift off with a TWR of 1.2-1.5, and it seems to work just fine for getting to orbit.

I usually burn about 4.7km/s deltaV to get into orbit with that.

Overall I haven't noticed that much of a difference in costs with different TWR profiles. It should be noted that I always stay under terminal velocity (I let MechJeb do that for me).

##### Share on other sites

Well, it seems to me it is highly dependent upon the engine.

For a better TWR engine, having a higher TWR isn't so bad.

"b) TWR > 2, but thrust limited to terminal velocity is second best. It is not cost effective, but it is fairly delta-v effective."

In this case, we'll assume it starts at a TWR of 2, and then by the end of the stage is higher than 2. In this case, you throttle down as your stage burns. The result is that you lift more mass in engines than you need, paying a dV penalty. If these are LV-Ns, the mass is significant. If this is a 48-7s then you haven't lifted that much unneeded mass.

Asparagus staging reduces this penalty by giving more staging increments, so that unneeded engines aren't lifted as far.

"c) TWR which keeps you near terminal velocity is third best. Split over and under terminal velocity based on changing TWR or ease of control or staging."

I think this is best for convenience, given that this is a game: playing time is limited, rocket size is not.

"The total drag goes up because atmospheric drag goes up faster than gravity drag goes down, but the total doesn't go up that quickly."

Air drag increases with the square of velocity. Going 2x terminal velocity only halves gravity drag, but it quadruples air drag.

As far as air/gravity drag is concerned, it's better to go slightly under terminal velocity than slightly over.

We could easily write an equation to show this, gravity drag being K1*(1/v) and air drag being K2*v^2

(assuming a 90 degree ascent)
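Those two functional forms can be sketched numerically (a minimal sketch: K1 and K2 are arbitrary, chosen here so each loss rate equals 1 at terminal velocity, matching the assumption that gravity drag equals air drag at terminal velocity):

```python
# Instantaneous loss rates for a vertical ascent, per the model in this post:
#   gravity-drag rate ~ K1 / v, air-drag rate ~ K2 * v^2.
# Normalize so both equal 1.0 at terminal velocity (x = v / v_terminal = 1).

def total_drag(x):
    """Total instantaneous loss rate at velocity x = v / v_terminal."""
    gravity = 1.0 / x  # K1/v, with K1 chosen so this is 1 at x = 1
    air = x ** 2       # K2*v^2, with K2 chosen so this is 1 at x = 1
    return gravity + air

print(total_drag(1.0))  # 2.0 at terminal velocity
print(total_drag(2.0))  # 4.5: air drag quadruples while gravity drag only halves
print(total_drag(1.1))  # ~2.119, i.e. roughly 6% above the baseline of 2
```

The asymmetry of the two terms is what makes overspeeding costlier than the gravity-drag savings alone would suggest.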

But, as you point out, "The faster you get out of the soup, the better that engine performs" - although this isn't a consideration for the aerospike - I don't think this is valid. You can have your engine performing at a given Isp for 30 seconds, or 45 seconds, depending on your throttle... but the 30 seconds isn't really better if you are burning more fuel in those 30 seconds than you would during the 45 seconds. You still go from 0 to X meters in altitude with your engine at Y Isp. Halving the time at that altitude, but doubling the fuel used per unit time, doesn't save any fuel (yes, it saves gravity drag, but it then loses to air resistance). I think the speed for optimal delta-v is best.

What it could affect is the altitude at which you start your gravity turn... you'd want to spend as little delta-v as possible in the lower atmosphere, spending most of it to get higher, not faster horizontally - but given the KSP aerodynamic model, by the time you start your gravity turn your engines are all nearly at vacuum Isp anyway. You spend your dV to get higher to reduce air drag; the improved Isp is insignificant in comparison.

Another point: especially as you get higher, you need a TWR better than 2 to maintain terminal velocity, because terminal velocity increases rapidly with altitude.

A TWR of 2 can maintain a fixed terminal velocity, but if you need to gain 1,000 m/s in 10 seconds to keep up with an increasing terminal velocity, a TWR of 2 will not cut it... so you want something a bit better than 2, so you can not only maintain terminal velocity but also accelerate to reach ever higher speeds as terminal velocity climbs faster.

At low altitudes you may only need to accelerate at 1 m/s^2 to keep up with terminal velocity (basically 0.1 g); at higher altitudes it's more like 100 m/s^2 (i.e. 10 g).

Of course, packing enough engines on your upper stage to get a 10:1 TWR also means lifting a lot more mass... which is less optimal.

But... once you start your gravity turn, you don't actually want to follow terminal velocity. You want gravity drag to equal air resistance.

Gravity drag is not K1*(1/v), but rather sin(theta)*K1*(1/v) - as your velocity vector approaches horizontal, gravity drag gets lower (not even accounting for gravity itself getting weaker as you climb, which is more apparent in KSP than in real life).

Once you start your turn, you want to start lagging behind terminal velocity (and the fact that at certain altitudes terminal velocity is higher than escape velocity should make this apparent).

So... there are a lot of variables, and the answer would change depending on what engine you are using.

At least... that's how I figure it.

Edited by KerikBalm
##### Share on other sites

The answer really depends on staging and what you are trying to achieve with the rocket.

If you use asparagus staging and want to maximize the payload fraction, the traditional wisdom is that TWR should be roughly symmetric around 2 during each stage. Ideally TWR should start at 1.6 to 1.7 with each pair of boosters, but 1.5 and even 1.4 are acceptable.

Recently I've been building rockets with two boosters and a two-stage core. My goal has been to maximize the payload for a given set of engines, instead of trying to maximize the payload fraction. A typical rocket might have TWR ranging from 1.2 to 3 with the boosters, somewhere between 1 and 1.5 with just the lower stage, and around 1 with the upper stage. I've found that it's almost never a good idea to throttle down with rockets like that, even if it's flying well beyond terminal velocity. The climb rate starts falling as soon as the rocket drops the boosters, and keeps falling until orbital velocity reaches around 2000 m/s, so the climb rate at booster separation decides whether the rocket reaches orbit or not.

##### Share on other sites

Not sure if this helps you, but I usually use a lot of solid fuel boosters paired with normal engines in the middle that control the thrust towards the optimal speed I want in a given atmosphere (say Kerbin, for example's sake). I usually end up with a lifting TWR of 1.8 (at least! going up to 2.2 even), including the SFBs and the engines in the middle. The reason is that once I dispose of the SFBs, I want my engines to still have a considerable TWR, so I have some leeway.

While terminal velocity is optimal, I still want control over the thrust. It maybe sounds stupid from an efficiency standpoint, but since I do all my launches and orbits manually, and I have lots of different designs and payloads I put on top of my heavy lifter, I want manual control. And when you fly manually, it seems to me it's easier to optimize your fuel usage by starting your tilt at the right moment, for instance, than by squeezing every little bit of fuel out by going beyond the rules of thumb.

For instance, if I know I won't use all the delta-v, I just thrust up fully to save some time; when efficiency matters, I obviously throttle back and stay within the optimal velocity (using the rule of thumb most people do) to maximize delta-v (mostly for interplanetary travel with returns). I've never needed to tweak this further; my designs so far have had enough delta-v for whatever I wanted to do.

So the short answer would be: I usually over-engineer beyond what's optimal, to allow for mistakes and increased control.

Edited by Lancezh
##### Share on other sites

Generally I use the following rules for building my rockets:

A ground TWR near 1.75-2. Limit to terminal velocity under 8500m. That's about where I start my turn. When I use MechJeb, my turn shape is set to about 35 degrees. If I don't use MechJeb, I would do my gravity turn over the course of about 10 seconds to just below the 45 degree line and hold that.

I tend to build my core stage so that it burns out after about 3500 m/s of dV - at about 40,000-50,000 m. Above this, your upper stage can have a lower TWR (even as low as 0.9): you're still carrying so much velocity that overcoming the small amount of atmospheric drag and gravity losses is trivial. You should be screaming fast at this point. My core stages burn out around 1200-1700 m/s in orbital velocity. I'd need to pull up my MechJeb stats to see what my actual ground speed is at this point.

I drop the core stage before my final burn so that maneuvering to circularize is easy, as it's still a fairly large mass, even if it's a completely empty tank.

Unless I'm burning straight to a higher orbit (250km) I typically get in around 4400 dV. I rarely break 4500. I don't go for the most efficient as I always tend to overbuild and have excess dV in the hundreds in my launch stages.

Having a launch TWR above 2 is basically a waste for me. Above that and you're building your apo so quickly that you're still at a low altitude when you reach it, and then you're burning to compensate for drag losses during the coast.

##### Share on other sites

Well, it seems to me it is highly dependent upon the engine.

[...]

"The total drag goes up because atmospheric drag goes up faster than gravity drag goes down, but the total doesn't go up that quickly."

Air drag increases with the square of velocity. Going 2x terminal velocity only halves gravity drag, but it quadruples air drag.

As far as air/gravity drag is concerned, it's better to go slightly under terminal velocity than slightly over.

We could easily write an equation to show this, gravity drag being K1*(1/v) and air drag being K2*v^2

(assuming a 90 degree ascent)

[...]

First, I mostly agree with you. Though I was hoping people would try to quantify statements like "highly dependent".

Second. My question was explicitly about the lower atmosphere. Let's just say I completely agree with the upper atmosphere stuff.

Third, your argument about the functional forms of the drag is relevant but incomplete for the optimization problem.

Variations around the global minimum must be quadratic.

That is, for some small velocity deviation away from the ideal velocity profile, the delta-v difference from ideal is symmetric.

Going a little slower should be just as bad as going a little faster. Where this breaks down is when your initial TWR is low enough that it takes an excessively long time to get your velocity up near terminal velocity.

I don't doubt that 2x terminal velocity is highly sub-optimal. I really intended to discuss say, 10% below terminal velocity to 10% above terminal velocity, just to make up some numbers.

I've flown rockets with minimum TWR of 2.2, max 3+ at full throttle into orbit for less than 4400 m/s. (4396 m/s)

I'm betting that that is a lot more efficient than most people would expect.

The point is not that this is what people should do, but that it's not a bad thing to do. The difference from optimal is small.

What I'm hoping to see is that people will demonstrate that being off by 10% of ideal velocity in either direction can result in less than a percent or two difference in delta-v to orbit.

I see people calling doom and gloom on losses to air resistance, without realizing that those excess losses, once offset by the reduced gravity drag, can be quite small for non-trivial deviations from ideal.

Additionally, the gravity losses from starting with a low TWR (< 1.6) can completely swamp the excess air drag of a too-high TWR (2.2+).

Obviously, a lot depends on the individual rocket. But for most rockets I fly, I find deviations from ideal have a real, but very small effect.

Edited by Yasmy
##### Share on other sites

Well, for 10% faster: in that case it's 1.1^2 = 21% more air resistance, while your gravity drag is reduced to 9/10ths...

At terminal velocity, gravity drag = air drag, and we can thus say the total drag has increased by 9/10 * 1.21 = 1.089... almost 9% more drag.

Of course, the other question is: what is your total dV lost? If it were an airless body, gravity and air drag losses would be minuscule - you thrust up for a few seconds (so... let's say 30 m/s of gravity drag), then point horizontal and go full throttle, racing to the horizon. Then, to make it equal, you'd do a Hohmann transfer to a 75 km orbit...

You need ~4,500 m/s to get to LKO... at that point your orbital velocity is about 2,300 m/s, so you spend about 2,200 m/s getting to that altitude and fighting the various forms of drag.

For simplicity, I'll assume gravity is constant: (1/2)mv^2 = mgh, m cancels out, so v^2 = 2 * 9.8 * 75000 = 1,470,000 -> v ≈ 1212 m/s.

So raising your apoapsis to 75 km altitude requires approximately 1,200 m/s of velocity (granted, gravity falls off, and you'd do a gravity turn, not shoot up to 75 km with a 1,200 m/s vertical burn and then burn horizontally for another 2,300 m/s).

Anyway, as a very rough number, you'd spend 3,500 m/s getting to a 75 km orbit if Kerbin had no atmosphere, but with an atmosphere and an optimal ascent you spend about 4,500 m/s, so let's say you lose roughly 1,000 m/s to gravity drag and air resistance.

Since 10% faster than terminal velocity should result in 8.9% more total dV losses, I'd take 1,000 m/s and multiply it by 0.089...

I'd estimate going 10% over terminal velocity costs you about 80-90 m/s.
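The constant-gravity estimate above is easy to check numerically (a rough sketch using the same simplifying assumptions: uniform 9.8 m/s^2 gravity, mass cancelling out, no drag during the vertical burn):

```python
import math

g = 9.8        # surface gravity, m/s^2 (assumed constant, as in the post)
h = 75_000.0   # target apoapsis altitude, m

# From (1/2) v^2 = g h (mass cancels):
v_vertical = math.sqrt(2 * g * h)
print(round(v_vertical))  # 1212 m/s, matching the estimate above

# Rough loss estimate: ~1000 m/s lost to gravity + air drag on an
# optimal ascent; flying 10% over terminal velocity adds ~8.9% to that.
extra_loss = 1000 * 0.089
print(round(extra_loss))  # 89 m/s, i.e. the "80-90 m/s" figure
```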

##### Share on other sites

In general, I agree with Yasmy's first post. It seems the penalty for going above or below terminal velocity a bit isn't that bad. I would guess that many designs tend to suffer more losses from bendy joints and excess mass.

To me, the first post really says that if you are limited on time/effort, you are better off optimizing other areas of your ship than to worry excessively about how far from perfection your ascent speed and TWR is. It doesn't seem to me that Yasmy is arguing that speed/TWR doesn't matter, just pointing out that it isn't as big of a factor as one might initially believe.

##### Share on other sites

Minimum ΔV != max payload. Sure, you need a higher mass ratio, but even with lower-Isp engines you're only looking at ~4.3 km/s to get to orbit. The really blatant example is SSTOs, where throwing more fuel at the design may increase your ΔV requirement from 4400 m/s to 4600 m/s, but increases your payload by 30-70%. You might even see a higher payload fraction, as those heavy engines are now a smaller share of the craft.

##### Share on other sites

Well, for 10% faster: in that case it's 1.1^2 = 21% more air resistance, while your gravity drag is reduced to 9/10ths...

At terminal velocity, gravity drag = air drag, and we can thus say the total drag has increased by 9/10 * 1.21 = 1.089... almost 9% more drag.

Of course, the other question is: what is your total dV lost? If it were an airless body, gravity and air drag losses would be minuscule - you thrust up for a few seconds (so... let's say 30 m/s of gravity drag), then point horizontal and go full throttle, racing to the horizon. Then, to make it equal, you'd do a Hohmann transfer to a 75 km orbit...

You need ~4,500 m/s to get to LKO... at that point your orbital velocity is about 2,300 m/s, so you spend about 2,200 m/s getting to that altitude and fighting the various forms of drag.

For simplicity, I'll assume gravity is constant: (1/2)mv^2 = mgh, m cancels out, so v^2 = 2 * 9.8 * 75000 = 1,470,000 -> v ≈ 1212 m/s.

So raising your apoapsis to 75 km altitude requires approximately 1,200 m/s of velocity (granted, gravity falls off, and you'd do a gravity turn, not shoot up to 75 km with a 1,200 m/s vertical burn and then burn horizontally for another 2,300 m/s).

Anyway, as a very rough number, you'd spend 3,500 m/s getting to a 75 km orbit if Kerbin had no atmosphere, but with an atmosphere and an optimal ascent you spend about 4,500 m/s, so let's say you lose roughly 1,000 m/s to gravity drag and air resistance.

Since 10% faster than terminal velocity should result in 8.9% more total dV losses, I'd take 1,000 m/s and multiply it by 0.089...

I'd estimate going 10% over terminal velocity costs you about 80-90 m/s.

This is a nice analysis, and it makes a testable prediction. I think I'll build that rocket tonight and find out.

I'm not convinced that the effects are multiplicative though. I would expect something more like (1.1^2 + 1/1.1)/(1 + 1) = 2.12/2 for 6% extra drag.

Additionally, I don't think the reasoning is correct around the global minimum:

If you go 10% below terminal velocity, air resistance is reduced by a factor of 0.9^2 and gravity drag increased by 10/9ths, for either a multiplicative savings of 10% or an additive savings of 1 - (0.9^2 + 10/9)/2 = 4%.

Then either way, your argument suggests it is better to go below terminal velocity, because doing so reduces the instantaneous total drag.

##### Share on other sites

Slight bump, but interesting thread that could use some additional clarification. Remember that engines aren't free, in mass or cost. In reality engine cost is much much larger than fuel cost, and terminal velocity is very high for rocket-shaped objects. Mass and cost don't technically matter in KSP yet unless you enjoy optimizing things, but that's a separate issue.

In KSP, what are you optimizing for? If it's fuel per unit altitude for a fixed design, then you want to get as close to terminal velocity as you can for a vertical ascent (low atmosphere). But your design isn't fixed, is it? If your TWR exceeds 2.2 or so (TWR of exactly 2 only maintains terminal velocity if the atmospheric density remains constant - since terminal velocity goes up as you ascend, you actually need a TWR of 2.1-2.3 to keep up with the acceleration), then you're throttling down to stay at terminal velocity, which means you're carrying engine mass that you aren't using, and burning fuel to lift that useless engine mass. You could've saved engine mass and used less total fuel by having fewer engines to start with, even if that drops your TWR a bit lower than required to maintain terminal velocity.
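The parenthetical above (needing a TWR a bit over 2 to keep up with a growing terminal velocity) can be sketched under simple assumptions - the constants here are illustrative, not taken from KSP's actual drag code. With an exponential atmosphere of scale height H and drag proportional to density times v^2, terminal velocity grows as exp(h/2H), so riding it requires thrust covering gravity (1 g), drag (1 g at terminal velocity), plus the acceleration dv_t/dt = v_t^2/(2H):

```python
g = 9.81      # m/s^2, roughly Kerbin surface gravity
H = 5000.0    # assumed atmospheric scale height, m (illustrative)

def twr_to_ride_terminal(v_t):
    """TWR needed to hold an exponentially growing terminal velocity v_t.

    At terminal velocity drag equals weight, so thrust must cover
    gravity (1 g) + drag (1 g) + the acceleration v_t^2 / (2H).
    """
    return 2.0 + v_t ** 2 / (2 * g * H)

for v_t in (100, 200, 300, 500):
    print(v_t, round(twr_to_ride_terminal(v_t), 2))
# Near the pad (v_t ~ 100 m/s) a TWR of ~2.1 suffices; once terminal
# velocity reaches several hundred m/s the requirement balloons.
```

This matches the qualitative claim: slightly above 2 low down, prohibitive higher up.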

Search the challenges section for the payload fraction challenge, and see what the designs there look like. They have lots of asparagus stages so TWR stays roughly constant, and start out each stage a fair bit lower than TWR of 2.

Edited by tavert
##### Share on other sites

Keeping terminal velocity is delta-v minimization. The optimum might be ~1.5 g on the main engines (by the time of booster drop it'll be near 2 g) + 1.5-2 g on the boosters to get to terminal velocity. Alternatively, you can have 2.5-3 g at liftoff and reduce throttle when you reach terminal velocity. This minimizes fuel for a given payload and engines (but might require heavier engines per given payload).

Maximal payload per engine is with a TWR of 1.1-1.3, but it hugely wastes fuel in the initial ascent (you do have extra fuel, but it also means extra tank mass). Actually, an SSTO rocket with an initial TWR of ~1.2 has enough delta-v to land on Tylo. These designs also often benefit hugely from adding some small boosters for takeoff.

In the real world it's always a balance between engine and fuel cost per payload, but given that engines are more expensive, it's usually an initial TWR of 1.5 or lower.

Going over terminal velocity is just a waste with no real benefits, therefore it's often useful to reduce throttle if you go too fast in the lower atmosphere. Note that both Saturn V and N-1 had central engine cutoff on the first stage when they reached terminal velocity.

##### Share on other sites

Keeping terminal velocity is delta-v minimization. The optimum might be ~1.5 g on the main engines (by the time of booster drop it'll be near 2 g) + 1.5-2 g on the boosters to get to terminal velocity. Alternatively, you can have 2.5-3 g at liftoff and reduce throttle when you reach terminal velocity. This minimizes fuel for a given payload and engines (but might require heavier engines per given payload).

Maximal payload per engine is with a TWR of 1.1-1.3, but it hugely wastes fuel in the initial ascent (you do have extra fuel, but it also means extra tank mass). Actually, an SSTO rocket with an initial TWR of ~1.2 has enough delta-v to land on Tylo. These designs also often benefit hugely from adding some small boosters for takeoff.

In the real world it's always a balance between engine and fuel cost per payload, but given that engines are more expensive, it's usually an initial TWR of 1.5 or lower.

Going over terminal velocity is just a waste with no real benefits, therefore it's often useful to reduce throttle if you go too fast in the lower atmosphere. Note that both Saturn V and N-1 had central engine cutoff on the first stage when they reached terminal velocity.

I thought that CECO on the SatV was to reduce G-load on the crew and superstructure of the rocket? And just going off of experience, a liftoff TWR of ~1.2-1.4 is usually what you want, and I try to have stages not go over 3G. Any more than that and you fight against the atmo too much, any less and you fight gravity too much.

##### Share on other sites

This is a nice analysis, and it makes a testable prediction. I think I'll build that rocket tonight and find out.

I'm not convinced that the effects are multiplicative though. I would expect something more like (1.1^2 + 1/1.1)/(1 + 1) = 2.12/2 for 6% extra drag.

You are right... if at terminal velocity gravity drag = air resistance, let's just call the quantity for each form of drag "D"; total drag is thus 2D.

So at 10% faster than terminal we should have 1.21D + (1/1.1)D = (1.21 + 0.909...)D = 2.119D, and 2.119/2 = 1.0595... so yes, I agree, 6% more... or about 60 m/s...

Interestingly, it would seem then that going 10% under results in (0.9^2)D + (10/9)D = 1.9211D... which would be more efficient... O.o ?

So then total drag would be D(x^2 + 1/x)... and the derivative of that is 2x - x^-2.

The minimum would occur when that equals zero, so 2x = 1/x^2 -> 2 = 1/x^3 -> x^3 = 1/2, i.e. x = cube root of 1/2 (about 0.79).

So... I must be calculating something wrong if my equation concludes that the optimal ascent velocity is about 80% of terminal velocity... Oberth effect or something?

Maybe I'm minimizing dV losses, when it's actually better to use the dV more effectively (i.e. the Oberth effect - not all dV changes are equal) and accept somewhat higher dV losses to drag.
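The minimization above can be checked numerically (a quick sketch using the same normalized variable x = v / v_terminal):

```python
# Minimize the instantaneous total-drag rate f(x) = x^2 + 1/x
# by a simple grid scan over x = v / v_terminal.

def f(x):
    return x ** 2 + 1.0 / x

xs = [i / 10000 for i in range(1000, 20001)]  # scan x in [0.1, 2.0]
best = min(xs, key=f)

print(round(best, 4))            # 0.7937, i.e. the cube root of 1/2
print(round(0.5 ** (1 / 3), 4))  # analytic minimum, for comparison
```

So the algebra is right: the instantaneous rate really is minimized near 80% of terminal velocity. (As pointed out further down the thread, minimizing the instantaneous rate is not the same as minimizing total losses over the ascent.)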

Edited by KerikBalm
##### Share on other sites

Most of my rockets actually lift off with a TWR of 1.2-1.5, and it seems to work just fine for getting to orbit.

A launch TWR of ~1.7 allows for moving at terminal velocity from about 4 km to about 11 km, having to throttle back by only 15-20% to prevent exceeding terminal velocity.

I usually burn about 4.7km/s deltaV to get into orbit with that.

I get there with ~4400 m/s (though that's also due to other aspects of the ascent profile).

The amount of TWR needed to maintain terminal velocity above ~12 km is prohibitive.

optimal ascent, you spend about 4,500

The bare minimum is 4300 and a bit:

Launch Efficiency Exercise

Calculating the theoretical minimum is rather complicated:

Edited by rkman
##### Share on other sites

Thanks tavert and Alchemist, and everyone. (Edit: and rkman, who slipped into the thread while I was typing.)

I was focusing on the effect of deviations from terminal velocity on delta-v to orbit. It doesn't make a huge difference in delta-v to orbit: +/- 20% of terminal velocity amounts to less than +2% delta-v to orbit. Deviations from ideal just don't hurt that much in delta-v.

But I completely agree with both of you that having that kind of thrust (always > 2):

1) is not delta-v optimal

2) is nowhere near payload-fraction optimal

3) is not cost effective (more fuel is cheaper and simpler than more and bigger engines)

Of course, a super-asparagused-up rocket may be delta-v and payload-fraction optimal, but not cost or complexity optimal.

Personally, I flip-flop between crazy-asparagus mode and a simple no-nonsense two-stage rocket with a reasonable TWR (and maybe some boosters for launch).

Edited by Yasmy
##### Share on other sites

You are right... if at terminal velocity gravity drag = air resistance, let's just call the quantity for each form of drag "D"; total drag is thus 2D.

So at 10% faster than terminal we should have 1.21D + (1/1.1)D = (1.21 + 0.909...)D = 2.119D, and 2.119/2 = 1.0595... so yes, I agree, 6% more... or about 60 m/s...

Interestingly, it would seem then that going 10% under results in (0.9^2)D + (10/9)D = 1.9211D... which would be more efficient... O.o ?

So then total drag would be D(x^2 + 1/x)... and the derivative of that is 2x - x^-2.

The minimum would occur when that equals zero, so 2x = 1/x^2 -> 2 = 1/x^3 -> x^3 = 1/2, i.e. x = cube root of 1/2 (about 0.79).

So... I must be calculating something wrong if my equation concludes that the optimal ascent velocity is about 80% of terminal velocity... Oberth effect or something?

Maybe I'm minimizing dV losses, when it's actually better to use the dV more effectively (i.e. the Oberth effect - not all dV changes are equal) and accept somewhat higher dV losses to drag.

I think the conclusion is that minimizing instantaneous drag, or cumulative drag losses before the gravity turn, is not the same as minimizing total delta-v to orbit - not that your math is wrong.

##### Share on other sites

I thought that CECO on the SatV was to reduce G-load on the crew and superstructure of the rocket? And just going off of experience, a liftoff TWR of ~1.2-1.4 is usually what you want, and I try to have stages not go over 3G. Any more than that and you fight against the atmo too much, any less and you fight gravity too much.

I watched SpaceX's recent ISS resupply mission 3. I seem to recall it throttled back just before MECO and SECO, presumably for the same reason. I might have to rewatch to confirm.

##### Share on other sites

What I calculated:

v - speed

a - acceleration

k*v^2 - drag

Total thrust: F = mg + ma + k*v^2

Potential energy gain: dWp/dt = mgv

Kinetic energy gain: dWk/dt = d(mv^2/2)/dt = mva

Now let's calculate energy gain per delta-v per mass (which is energy gain per thrust):

efficiency = (dW/dt)/F = (mgv + mva)/(mg + ma + k*v^2)

To find the maximum, rearrange:

1/efficiency = (mg + ma + k*v^2)/(mgv + mva) = 1/v + kv/(mg + ma)

and take the derivative:

d(1/efficiency)/dv = -1/v^2 + k/(mg + ma) = 0

k*v^2 = mg + ma

i.e. drag = mg + ma

while terminal velocity is described by drag = mg.
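This optimum can be verified numerically (a minimal sketch with made-up values for m, a, and k; only the location of the maximum matters):

```python
import math

m, g = 1000.0, 9.81  # mass (kg) and gravity; illustrative values
a = 5.0              # desired upward acceleration, m/s^2 (made up)
k = 0.5              # drag coefficient in drag = k * v^2 (made up)

def efficiency(v):
    """Energy gain per unit thrust: (mgv + mva) / (mg + ma + k v^2)."""
    return (m * g * v + m * v * a) / (m * g + m * a + k * v ** 2)

vs = [i / 100 for i in range(1, 50001)]  # scan v in (0, 500] m/s
v_best = max(vs, key=efficiency)

# The derivation predicts the optimum where k v^2 = m(g + a):
v_pred = math.sqrt(m * (g + a) / k)
print(round(v_best, 1), round(v_pred, 1))  # both ~172.1 m/s
```

The grid scan lands on the same speed the derivative condition predicts, i.e. drag = mg + ma at the optimum.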

While terminal velocity is almost constant, the delta-v optimum is to hold terminal velocity, so TWR is 2. When terminal velocity starts growing noticeably, the optimum is even slightly above it. When it starts growing faster than you can accelerate, it's time to start the gravity turn.

But again, if you have insufficient TWR and go below the optimal velocity, you waste more delta-v to gravity; but if that's because you're carrying more fuel, you usually gain more delta-v than you waste. If you go faster, you waste delta-v to drag and get nothing for it (though a slight overspend might be justified by using a lighter and cheaper engine without throttle control).

Edited by Alchemist
##### Share on other sites

I propose that "better" should be more clearly defined. I see people advocating "best" launch TWRs that I am sure cost more dV to LKO than a higher launch TWR would.

With a launch TWR of 1.4 you can't even reach terminal velocity, meaning gravity loss is needlessly high.

If fuel/delta-v optimization is not the goal, then what is?

##### Share on other sites

You are right... if at terminal velocity gravity drag = air resistance, let's just call the quantity for each form of drag "D"; total drag is thus 2D.

So at 10% faster than terminal we should have 1.21D + (1/1.1)D = (1.21 + 0.909...)D = 2.119D, and 2.119/2 = 1.0595... so yes, I agree, 6% more... or about 60 m/s...

Interestingly, it would seem then that going 10% under results in (0.9^2)D + (10/9)D = 1.9211D... which would be more efficient... O.o ?

So then total drag would be D(x^2 + 1/x)... and the derivative of that is 2x - x^-2.

The minimum would occur when that equals zero, so 2x = 1/x^2 -> 2 = 1/x^3 -> x^3 = 1/2, i.e. x = cube root of 1/2 (about 0.79).

So... I must be calculating something wrong if my equation concludes that the optimal ascent velocity is about 80% of terminal velocity... Oberth effect or something?

Maybe I'm minimizing dV losses, when it's actually better to use the dV more effectively (i.e. the Oberth effect - not all dV changes are equal) and accept somewhat higher dV losses to drag.

You are dividing the gravity losses by your speed factor because they're inversely proportional to the time of ascent. But you should divide the drag losses by the time factor as well! The equation then becomes D·x²/x + D/x = D·x + D/x, which obviously gives x = 1.

That's the same reason why maximum distance and maximum flight time for an aircraft are achieved at different speeds.
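The two loss models above are easy to check numerically. A minimal sketch (assuming, as in the posts above, that drag loss scales as x² and gravity loss is constant per unit time, with x = v/vT and D normalized to 1): dividing both terms by the climb-rate factor x gives the per-altitude loss x + 1/x, minimized at x = 1, while the mixed expression x² + 1/x bottoms out near 0.79.

```python
# x = v / v_terminal; the common loss scale D is normalized to 1.

def mixed_loss(x):
    # Flawed model from the post above: drag taken per unit of altitude,
    # gravity per unit of time -- inconsistent units, minimum near 0.79.
    return x**2 + 1/x

def per_altitude_loss(x):
    # Corrected model: both drag (x^2 / x) and gravity (1 / x)
    # divided by the climb rate, i.e. loss per unit of altitude gained.
    return x + 1/x

def argmin(f, lo=0.5, hi=1.5, steps=10000):
    # Brute-force grid search for the minimizing x.
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(xs, key=f)

print(round(argmin(mixed_loss), 2))         # about 0.79 = (1/2)**(1/3)
print(round(argmin(per_altitude_loss), 2))  # 1.0: terminal velocity is optimal
```

This is why the apparent "80% of terminal velocity" optimum vanishes once both loss terms are expressed in the same units.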

##### Share on other sites

I propose that "better" should be more clearly defined. I see people advocating 'best' launch TWRs that i am sure cost more delta-v to LKO than a higher launch TWR would.

With a launch TWR of 1.4 you can't even reach terminal velocity, meaning gravity losses are needlessly high.

If fuel/delta-v optimization is not the goal, then what is?

Yes. The title of this thread was perhaps a bit trollish. I defined "better" TWR strictly from a delta-v perspective.

That's because I see lots of people saying you waste tons of delta-v with high TWR.

But that is just not true. Decreased gravity drag costs partially offset increased atmospheric drag costs, resulting in a very small delta-v penalty.

You do lose efficiency in terms of payload fraction. You need a heavier lifter with more engines to reach higher TWR.

In the real world, what matters is dollars, or whatever your national or nationalistic currency is. (Budget + pride + technology payoff + safety.)

As others have pointed out above, optimizing for economic cost, for payload fraction, or for overall complexity can be more important than minimizing delta-v.

In KSP? Well, I like to optimize for fun.

(Though I like to talk about theory about as much as I like playing.)

Edited by Yasmy
##### Share on other sites

Delta-v is an abstract quantity that's good for estimating the capabilities of a rocket. Unfortunately, when there are external forces such as gravity or drag involved, the delta-v figures of two different rockets are not directly comparable. Non-atmospheric landings are the typical example, where the delta-v usage depends mostly on TWR. If you want to minimize delta-v usage for a given landing, it's better to use inefficient engines. The lower Isp the engines have, the faster the ship burns fuel, and the faster the TWR rises, leading to smaller gravity losses. You can definitely see the effect in Tylo landings, and probably also in the upper atmosphere part of a Kerbin ascent.

Fuel usage is a concrete quantity that's mostly meaningless for launchers. Fuel is extremely cheap on the launchpad, both in KSP and in the real world. (I remember reading somewhere that fuel costs are only 0.3% of the total launch costs for Falcon 9. For the price of one Space Shuttle Main Engine, you could buy enough fuel for 200-300 Falcon 9 launches.)

Payload fraction is also mostly meaningless for the same reason, as the majority of launch mass is fuel.

##### Share on other sites

I'm going to kick this dead horse one more time. I asked how much it matters if you don't follow terminal velocity.

Alchemist provided a definition of efficiency, energy gained per unit of thrust, which is maximized at terminal velocity.

Let's normalize it to be 1 at terminal velocity, vT:

e = 2(v/vT) / (1 + (v/vT)²)

Suppose you are moving near terminal velocity at v = vT + δv, where δv can be positive or negative. Series expanding the efficiency around terminal velocity:

e = 1 - (1/2)(δv/vT)² + O((δv/vT)³)

Small deviations from perfect efficiency have to be quadratic or higher; otherwise terminal velocity wouldn't be a maximum.

If you are 10% above or below terminal velocity, your efficiency is only off by (0.1)²/2 = 0.5%.

I.e., it'll cost you less than 25 m/s extra delta-v to orbit at 10% above or below terminal velocity.
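A quick numeric sanity check of that quadratic approximation (a sketch using the simpler normalized form e = 2u/(1 + u²) with u = v/vT, before the TWR correction below):

```python
def efficiency(u):
    # Normalized ascent efficiency; maximized (e = 1) at terminal velocity (u = 1).
    # u = v / v_terminal
    return 2 * u / (1 + u**2)

def approx(u):
    # Quadratic expansion around u = 1: e ~ 1 - (u - 1)^2 / 2
    return 1 - (u - 1)**2 / 2

for u in (0.9, 1.0, 1.1):
    print(u, round(efficiency(u), 4), round(approx(u), 4))
# At u = 1.1 the exact value is ~0.9955 vs the approximation's 0.9950,
# i.e. a roughly half-percent efficiency loss either way.
```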

Of course this only applies in the atmosphere, so total losses to atmospheric inefficiency are even smaller.

(Actually, it may be slightly better still, since this is energy gained per thrust, not delta-v gained per thrust. Have to think about it...)

Correction: This should have been e = 2(v/vT) / (1 + (v/vT)²/(TWR - 1)).

Edited by Yasmy
