# The Relativistic Rocket: moving from theory to application


This thread follows a line of conversation that started a couple of years ago, over which a consensus has gradually formed in this group: every means we currently have for space travel is basically unsuitable for interstellar travel, except, theoretically, generation ships (under the assumption that, with a source of fusion power and perfect recycling, humans could manage to survive in some large volume for a long period of time).

This being unsatisfactory for many folks here, we are left with two basic miracle power systems: the black hole drive and the antimatter drive. (Warp drives are excluded because of the empirical absence of any known material needed to make such a drive.)

Theory: if you can completely convert your fuel to energy, then you have the perfect energy supply. That is actually not quite true, as we will see, but it is very close.

For probes this seems like a really great thing, but how fast can you actually go?

This page tells you all you need to know. I will add a table (PL = payload, EM = energy mass, RM = reaction mass):

| PL | EM | RM | c (flyby) | c (start-stop) | Efficiency |
|---:|---:|---:|----------:|---------------:|------------|
| 1  | 1  | —  | 0.6       | 0.42           | Idealized (see 1) |
| 1  | 1  | —  | 0.42      | 0.25           | Cosine losses |
| 1  | 1  | 1  | 0.692     | 0.43           | Idealized (see 2) |
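The idealized flyby figure in the first row can be reproduced from the standard relativistic relation for a pure photon rocket: for mass ratio R = m_initial/m_final, the final speed is v/c = (R² − 1)/(R² + 1). A minimal sketch (the function name is mine, for illustration):

```python
def photon_rocket_flyby(payload, energy_mass):
    """Final speed, as a fraction of c, for an ideal photon rocket
    that converts all of its energy mass into a collimated photon beam.
    v/c = (R^2 - 1) / (R^2 + 1), with R = (PL + EM) / PL."""
    R = (payload + energy_mass) / payload
    return (R**2 - 1) / (R**2 + 1)

print(photon_rocket_flyby(1, 1))  # PL = 1, EM = 1 -> 0.6 c, matching row 1
```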

The critical formulas are the Isp formulations for mass-ejecting setups.

The variable n is the fraction of the fuel mass that is converted into energy. The Isp here is Isp_v (exhaust velocity), not Isp_g; Isp_g has no relevance for any equation dealing with relativistic rockets.

You can derive the fraction of c by simply removing c from the right side of the equation; however, you may have to correct this for the efficiency of the photon lensing.
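One standard version of such an Isp_v formula, under the assumption that a fraction n of the fuel mass is converted to energy and all of that energy ends up as kinetic energy of the remaining (1 − n) of the mass, is v_e/c = √(n(2 − n)); dropping c gives the fraction of c directly, as described above. A sketch under that assumption:

```python
import math

def exhaust_speed(n):
    """Effective exhaust speed v_e/c when a fraction n of the fuel mass
    is converted to energy that is carried entirely by the remaining
    (1 - n) of the mass.  The exhaust then has total energy 1 and rest
    mass (1 - n), so gamma = 1/(1 - n) and
    v/c = sqrt(1 - 1/gamma^2) = sqrt(n * (2 - n))."""
    return math.sqrt(n * (2.0 - n))

print(exhaust_speed(0.007))  # ~0.12 c: roughly hydrogen-to-helium fusion
print(exhaust_speed(1.0))    # 1.0 c: complete conversion (photon exhaust)
```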

For photonic rockets (row 1 in the table): https://en.wikipedia.org/wiki/Photon_rocket

If the rocket has dead weight, that is, power units or antimatter storage containers that are not part of the reaction mass but are simply discarded at the end of the trip, it has to be included as part of the payload, which ultimately lowers the effective Isp.
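The effect of dead weight can be illustrated with the same ideal photon-rocket relation: anything carried to the end of the trip joins the dry mass, shrinking the mass ratio. A sketch (function and parameter names are mine, for illustration):

```python
def flyby_speed(payload, energy_mass, dead_weight=0.0):
    """Ideal photon-rocket flyby speed (fraction of c).  Dead weight
    that is carried to the end of the trip effectively joins the
    payload, shrinking the mass ratio R and hence the final speed."""
    dry = payload + dead_weight
    R = (dry + energy_mass) / dry
    return (R**2 - 1) / (R**2 + 1)

print(flyby_speed(1, 1))         # 0.6  c with no dead weight
print(flyby_speed(1, 1, 0.5))    # ~0.47 c with half a payload of dead weight
```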

From the final row (idealized, 0.692c) the velocities only go down: by the time the reaction mass is 100, the maximum flyby speed is 0.14c and the start-stop speed is 0.09c.

The basic problems with the idealized values are as follows.

1. Photon drives, whether antimatter driven or black hole driven.
- Unfocusable hν. Both of these devices release photons that are difficult to focus using reflectors. The logic here is that a reflector has a non-penetrating surface that forces photons to bounce back into space. High-energy photons, however, penetrate just about everything, and they tend to do quite unpredictable things once they penetrate. Because of this, if you beam high-energy photons at a plate, the best you are going to get is around 0.707 of the ideal Ve in the −y direction. The second row of the table accounts for this scattering caused by high-energy photons. In practice it is actually worse than that.
- Damaging hν, radiation, or radiation products. Part of the operation of an interstellar trip is keeping the black hole or antimatter stable; the problem is that both can be quite damaging. In the case of antimatter you need containment, but once it undergoes annihilation, a necessary part of its behavior, it produces other particles and may annihilate with parts of the vessel, creating radioactivity. This means shielding is required and damage may accumulate over time, requiring redundant spacecraft systems. Black hole drives have a similar problem: at the end of their life, the frequency of the photons they release and the power output both increase extremely rapidly. As a consequence, a black hole at end-of-life would have to be released, and the ship would need a means of propulsion to take it far enough away from the black hole to survive its final moments. One way to avoid the end-of-life scenario is to carry life-extending mass on the ship and feed the black hole; this would increase the payload, and most of the energy would be returned, but not all, due to cosine losses.
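The 0.707 collimation factor above can be put in context with a simple cone model (my illustration, not the derivation behind the table): if photons leave a surface uniformly in solid angle within a cone of half-angle θ_max, the axial momentum fraction is the mean of cos θ over the cone, which works out to (1 + cos θ_max)/2.

```python
import math

def axial_fraction(theta_max_deg):
    """Mean axial momentum fraction for emission spread uniformly
    (per unit solid angle) over a cone of half-angle theta_max:
    <cos theta> = (1 + cos theta_max) / 2."""
    return (1.0 + math.cos(math.radians(theta_max_deg))) / 2.0

print(axial_fraction(90))  # 0.5: a full hemisphere, the worst diffuse case
print(axial_fraction(0))   # 1.0: a perfectly collimated beam
```

On this model, an effective factor of 0.707 corresponds to a cone of half-angle of roughly 65°, i.e. a fairly poorly collimated exhaust.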

2. Ablation drives: antimatter driven, plus shielding for the black hole drive. The sail used for the antimatter drive is coated with a fissionable material, such as U-238, which then fissions into products such as palladium-111. The antiproton is about 1 u and the composite palladium is 222 u, meaning that 239 − 222 = 17 u go unaccounted for; given that the recoil velocities are 13,900,000 m/s, this is not explained by the kinetic energy gained. These other products are radioactive materials, and the energies and damage they induce go unaccounted for.

Once again the tethered payload is directly in the path of the ejected mass, and even though it has a small footprint, over the life of the sail kilograms of material, some of it incredibly radioactive (including neutrons), are being ejected at the antimatter containment field and payload. This has two effects: first, by absorbing the ejecta, the Isp of the ship is lowered; second, shielding is needed to protect the antimatter containment system. The protective shielding becomes part of the PL weight because it is neither an accelerant nor an energy source.

The other problem with ablation is that the models assume a focused pitting ablation, in which the ejecta created by the antimatter dig a deep well and are then ejected straight backwards (and straight backwards means straight into the containment unit and PL). Thus, even if such deep pits could be achieved, they are unwanted, while shallow pits result in cosine losses. Consequently one does not expect ablation to have perfect efficiency, and we are probably looking at an Isp_v on the order of 0.707 times ideal. As one can see, with even some losses considered, the ablation drive with minimal sail mass is already below 0.4c, even for a flyby, and the stated design is well below 0.1c. Caveat emptor.
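The mass bookkeeping above can be checked directly with the round numbers from the text (masses in atomic mass units, u):

```python
# Mass bookkeeping for the ablation reaction described above, in u.
# Round-number values as quoted in the text.
m_antiproton = 1     # annihilating antiproton
m_U238 = 238         # uranium-238 in the sail coating
m_Pd111 = 111        # one palladium-111 fission fragment

mass_in = m_antiproton + m_U238   # 239 u
mass_out = 2 * m_Pd111            # 222 u "composite palladium"
print(mass_in - mass_out)         # 17 u unaccounted for
```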

3. Starting and stopping. The start-stop column in the table shows the effect of requiring a stop at the destination. This is ultimately the goal of robotic or crewed interstellar missions. A flyby may give basic information ("hmm, that might be a habitable planet"); a start-stop robotic mission could stop, investigate, and even seed the planet (for example with cyanobacteria) to kick-start it through the long early phases of evolution. The problem with start-stop in these sublight drive systems is that, now that we have sensitive equipment or living systems on board, we have to protect them from ionizing radiation and antimatter. There are also limitations: for example, you really don't want to accelerate humans at 2+ g for a few months, then 0 g for three years, then 2+ g again at the end of the trip. Ideally you want constant acceleration followed by constant deceleration, which also provides a source of artificial gravity.
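For the constant accelerate-then-decelerate profile, the standard hyperbolic-motion results give the trip times directly: per half-trip, d/2 = (c²/a)(cosh(aτ/c) − 1) and t = (c/a) sinh(aτ/c). A sketch in units of light-years and years, with c = 1 and 1 g ≈ 1.03 ly/yr², applied to the Alpha Centauri distance of about 4.37 ly (the function name and packaging are mine):

```python
import math

def brachistochrone_times(distance_ly, accel=1.032):
    """Earth time and ship (proper) time, in years, for a trip that
    accelerates at `accel` (in ly/yr^2; 1 g ~ 1.032) to the midpoint
    and decelerates over the second half.  Per half:
      d/2 = (1/a)(cosh(a*tau) - 1),  t = (1/a) sinh(a*tau),  c = 1."""
    x = 1.0 + accel * (distance_ly / 2.0)   # cosh(a*tau) at the midpoint
    tau_half = math.acosh(x) / accel        # ship time, first half
    t_half = math.sqrt(x**2 - 1.0) / accel  # Earth time, first half
    return 2.0 * t_half, 2.0 * tau_half

earth_t, ship_t = brachistochrone_times(4.37)
print(round(earth_t, 1), round(ship_t, 1))  # ~6.0 yr Earth, ~3.6 yr ship
```

This is the best case the drive sections above are being measured against: even a perfect 1 g start-stop system takes about six Earth years to Alpha Centauri.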

4. Finally, as previously discussed and not needing a rehash here, there are the energy requirements of creating either the antimatter or the black holes in the first place.

I offer this thread as a gauge: if you see a website advertising that it can get to Alpha Centauri at 0.4-0.9c in 8 to 20 years, beware; the devil is in the details. The page may say "potential", but these potentials assume ideality when only some of the theoretical restrictions have been considered. Both antimatter drives and black hole drives have considerable mass devoted to operation and shielding, and both have losses that cannot be pinned down until tested in real-world situations.

The next question that comes up is: why can't we just reduce the payload and go faster? The answer is that reducing payloads, or even assigning payloads (as in the table above) before the theoretical restrictions have been applied, only kicks the can down the road. Once the rocket is built to function, payload starts rising as a consequence of antimatter containment (or black hole shielding plus end-of-life feeding or separation), lensing systems, and the structural mass for all of the aforementioned. Ideally these could be a small fraction of the energy mass and ejection mass, but more than likely they will be a high percentage of the payload mass. In other words, the payload has to have features that lend themselves to manipulating and stabilizing high-energy systems.

Other high-energy systems include nuclear reactors, high-output power plants, and high-temperature chemical conversion plants. Current experience with all of these is that their structures are usually massive, the fuel is often a small proportion of the mass, and the higher the energy output, the more massive they become; low-output nuclear power sources (such as RTGs) can be less massive. Examples of poorly contained but higher-output energy generation systems include post-detonation nuclear weapons. Somewhere between these two types of systems is where modern technology stands.

We can think of ablators, for example, as low-energy cosmic ray generators, something delicate systems should avoid in space if at all possible. We can think of antimatter annihilation and end-of-life black holes as monstrous X-ray/gamma-ray machines, and we can think of failed antimatter containment and end-of-life black hole drives as very massive nuclear weapons. So before we can go, the system has to be safely contained, and before we can go efficiently, the output has to be managed. Increasing energy means increased containment and management, and eventually the reward is not worth the risk and cost.
