Everything posted by SomeGuy123

  1. I've asked one of the flight control staff this myself. It's just not budgeted like that. One interesting thing, I thought, is that when they decide which brand of toothpaste, etc., to fly, cost still matters. It shouldn't - when everything costs slightly less than its weight in gold to launch, you might as well get the most expensive brand - but the money to buy toothpaste comes from a different pool than the money for launches, and you have to justify every cost on paper.
  2. I guess? I feel like the evidence we do have is pretty overwhelming that Orion would work; it just might not work as well as the optimistic estimates suggest. (a) Nukes do detonate. (b) If you build a thick enough pusher plate with good enough shocks, and you set the nuke off far enough away, it will give you some push without blowing up your ship. I don't see any possibility that Orion doesn't work. I'd say it's about as theoretical as the SpaceX Falcon Heavy. It just might suck, with performance that is a fraction of what it is calculated to be.
  3. This creates a major problem. The hotter the nukes you use (higher yield) and the faster you set them off, the more hot plasma impinges on your pusher plate. At a certain point, the excess heat from this (if we're talking about a sustained burn) limits how hard you can accelerate. This radiation-limited point could be a fairly low acceleration. I've done some rough estimates for directed fusion drives - where you have a lot more control over the reaction and a lot less wasted gas - and once you start talking millions of seconds of Isp, your thrust becomes minuscule unless you can reject unrealistic amounts of waste heat. Orions do have the advantage of using fusion devices we already have rather than theoretical ones that might never work, and they work amazingly well in the atmosphere because the air is free propellant mass, creating a gigantic shockwave to carry your ship like a cork in a bottle.
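A back-of-the-envelope way to see that squeeze (a minimal Python sketch; the 1 GW jet-power figure is an assumption picked purely for illustration, not a number from the post):

    # At a fixed jet power P, ideal thrust is F = 2*P/v_e, with v_e = Isp * g0,
    # so pushing Isp into the millions of seconds collapses thrust unless you
    # can generate - and reject the waste heat from - enormous power levels.
    G0 = 9.81  # m/s^2

    def thrust_newtons(jet_power_watts, isp_seconds):
        """Ideal thrust for a given jet power and specific impulse."""
        ve = isp_seconds * G0  # exhaust velocity, m/s
        return 2.0 * jet_power_watts / ve

    power = 1e9  # assumed: 1 GW of jet power, capped by waste-heat rejection
    for isp in (1_000, 100_000, 1_000_000, 10_000_000):
        print(f"Isp {isp:>10,} s -> thrust {thrust_newtons(power, isp):12,.1f} N")

At an assumed 1 GW, an Isp of a million seconds gives only about 200 N of thrust, which is the "thrust becomes minuscule" point above.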
  4. Monoprop thrusters would have a higher TWR. The reason is that the exhaust is cooler (a lower-energy reaction), so flat out you can dump more fuel through the engine faster. More efficient engines have lower TWR. Of course, you'd never want to do this - propellant mass isn't free - so nobody builds them this way.
  5. The hole in your thinking is this. Von Neumann machines seem virtually certain to work. We're examples of them, we see them everywhere, and we can plan them on paper even though we lack the prerequisite technologies in 2016 (nanoscale manufactories, sentient computers). You basically have to believe in magic to conclude that they won't work. (Sure, until we actually have them running you can cast doubt, but those doubts aren't very credible.) So once we have such technology, some of us will have different ideas on how to use it. All it takes is one faction deciding to let them loose, perhaps riding along as brain uploads, and the rest is just the inevitable result of unstoppable exponential growth combined with natural selection. Warfare to "stop" them is silly, fantasy thinking. You can't; all you can do is build your own and try to claim some stars before they grab them all. Any plausible starship is so insanely vulnerable and fragile that it's a "defenders are gods" kind of military scenario - anyone who has full control of a star system will have it forever. So by this theory, the fact that we don't see shadows from Dyson swarms at every star in view implies that there aren't any intelligent beings with science at all, anywhere, within the part of the universe we can see. (It took us under 300 years to go from Newton's scientific method to now, and we're probably less than a century from having outright sentient von Neumann machines. That seems like a reasonable estimate to me now that AI research is actually getting solid, respectable results and we are actually spending money to scan human brains at finer levels of detail so we can borrow the patterns they use.)
  6. I'm not concluding anything, just pointing out that if all the islands you can see look barren and deserted, and you don't actually have boats yet - though as a feat of epic accomplishment you managed to just barely dip your toes in the water and not die instantly, and your ancestors actually managed to make a tiny float and retrieve a sample from a piece of debris nearby - you might wonder what's going on. Maybe the boat-building project is a dead end.
  7. I agree with you. To me, since I can sketch on paper a ship the size of a chunk of a planet (so you can't argue that it doesn't have enough biosphere to be closed-loop), stick antimatter rockets on the back, and I know you could cram in enough factories to build the ship again, well, that's a von Neumann machine. A stupendously slow and inefficient one, but exponential growth is still exponential. So maybe interstellar travel at anything approaching a reasonable speed (0.1% of c or something) is impossible. Or, umm, something really unusual is going on. None of those explanations make any sense. What does make sense should be right up your alley. Evolution is a rat race in which the competing organisms ascend to local maxima. As long as the environment is unchanged, any organism that tries to leave the peak of a local maximum gets out-competed. So if Earth had had no mass extinctions and no land - all water - logically it would still be nothing but various forms of photosynthetic bacteria. Maybe simple multicellular organisms, maybe not. Anything that attempts anything new (that is, anything mutations make different) is suboptimal and gets eaten over time. So it took about 3 billion years, roughly a quarter the age of the universe, to reach this point. And maybe the cosmic lottery had to be arranged just right or you end up with nothing but a sea of cyanobacteria. That makes sense because it explains the data and it reflects the reality that evolution stalls out as an algorithm. What do you think?
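To put rough numbers on "exponential growth is still exponential" (a toy calculation with assumed figures: a 1,000-year doubling time and roughly 1e11 star systems in the galaxy):

    import math

    doubling_time_years = 1_000   # assumed: a very slow replicator doubling once per millennium
    target_systems = 1e11         # assumed: rough count of star systems in the galaxy

    doublings = math.ceil(math.log2(target_systems))
    print(f"{doublings} doublings -> about {doublings * doubling_time_years:,} years of replication")
    # ~37 doublings, i.e. on the order of 40,000 years of pure replication time.
    # Even adding travel at 0.1% of c across a ~100,000 light-year galaxy
    # (~100 million years) keeps the total tiny next to the ~3 billion years
    # the post cites for evolution.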
  8. Yeah, if you need a 100,000-ton tank to hold 1 kilogram of metallic hydrogen, you're not going to space today... Metastable metallic hydrogen could exist, but it just seems super unlikely that it will be storable at pressures you can hold in a lightweight tank... Also, state transitions? Remember the search for monopropellants in Ignition!? Everything that is any good as a monopropellant can, by definition, set itself off while it's still in the storage tank (the stored energy that makes it a good propellant is exactly what makes it unstable)... I think metallic hydrogen as rocket fuel is just a way to siphon off grant money for some chemists who want to study it, even though it is probably useless as a practical substance.
  9. Calming the histrionics a little: my argument is twofold, and neither your pedigree nor your arguments has even addressed it. 1. The vacuum of space, where the available resources are a different mix of elements than the biosphere of Earth and there's a lot more radiation exposure, is a different environment from liquid water + 1 g + radiation shielding + a whole biosphere. If you're really the expert you claim to be, you know that any machine - replicating or not - has to be optimized for a specific environment. My statement is that making machines that spread across the universe - whether by waiting centuries to travel between stars or by some hypothetical instant teleportation that is (probably) impossible - implicitly means the machines spend almost all of their time in an environment that Earth life isn't optimized for (or even particularly functional in). The statement of yours I'm responding to is: "It's called life. It's absurd to think that robots are going to spread more rapidly, or consume more matter than living things, because a self-replicating robot is just another living thing." 2. The second point is that you know damn well, per your stated pedigree, that out of all possible arrangements of matter that form a "self-replicating machine the size of a eukaryotic cell" operating in the same environment that current eukaryotic cells operate in now, evolution did not find the optimal one. In fact, due to version lock-in, inflexibility, finite biosphere space, and so forth, we don't know whether the current setup is even close to the true theoretical maximum. You don't know, I don't know, nobody knows. So it's plausible that if you have a machine - an intelligent optimizer, a machine capable of accurately simulating physics to sufficient precision, a machine capable of searching the design space of all possible configurations of matter (yeah, you'd need some pretty clever algorithms to winnow this down) - you could beat life at its own game. Probability (the fact that evolution has rolled dice and ended up at an arbitrary set of common arrangements of self-replicating matter, and it cannot investigate other possibilities due to the way the competition works) dictates that a significantly more optimal machine is immensely more probable than not. So yes, you probably could, if you had the resources, make a "grey goo" robot that ate the whole biosphere because it is simply faster and more efficient at reproduction than existing systems. But that wasn't the topic of discussion...
  10. You can be the most cited researcher on the planet; you're still wrong. The laws of physics are what determine who is right, and they support me, not you. Period. I haven't retreated from anything: a structure that is sort of like something, but isn't that thing, is not that thing. A campfire isn't a blast furnace. Dr. Drexler is more or less a world authority on the subject, and there's a billion dollars in research funding directly supporting his ideas. Where's your billion dollars in funding? There's nothing to learn because you're so far wrong there is no information in your posts. That's the problem. This would be like a discussion where I'm talking about jet aircraft and you're still talking about the five classical elements and whether or not a balloon covered in animal skin will fly.
  11. As my Russian TA in cell biology pointed out, none of these processes is 100%/0% the way a properly designed system would be. Everything is floppy and statistical. You're wrong. I'm right. Argument over. I'm sorry, but you don't have any clue what you're talking about. Scale alone changes which physics dominates. You know this, right? A machine with more matter in it can be qualitatively different. Physics does care about scale. Nuclear chain reactions don't even function below a certain scale. In the computational world, a computer with insufficient complexity can't be sentient (it has inadequate memory to hold enough states). So when I talk about a hundred-ton machine and you talk about a cell, you cannot say that vaguely similar systems are the same. They aren't.
  12. I already gave one. Let me reiterate. The factory I'm talking about uses hard carbon or metal components, everything tightly integrated like a watch. Thousands and thousands of steps are integrated together and the thing is hyperfast. Pieces of the factory do fail, and digital watchdog systems detect this and order the failed components recycled. A floppy eukaryotic cell is nothing at all like this. At all. Again, it's like the difference between a mitochondrion in a bird and a jet engine. The parts are different, the principles are different, the energy storage mechanisms are different - the only thing in common is that both combust hydrocarbons for energy. The cell is floppy; it won't fail if a single atom gets out of place; it isn't cooled to near-cryogenic temperatures; it isn't made of metal and diamond; it doesn't have digital sensors and exact control logic (mRNA regulation is statistical); it isn't in hard vacuum; it can't be intelligently redesigned; it can't operate using most of the elements on the periodic table; and it won't have digital error correction between generations to reduce the probability of mutation to zero... I'm talking about machine-phase chemistry. You're talking about goop in a bag. They are about as alike as jet engines and mitochondria. And yes, just as jet engines guzzle hydrocarbon fuel by the ton, a nanomachinery factory would burn through energy and feedstock at incredibly voracious rates. It would burn far more energy to, say, make an ingot of steel than a steel forge uses for the same task (but the ingot would be perfect). Which doesn't matter - living cells don't have nuclear reactors or vast arrays of solar panels plugged into them via power cables...
Also, something else you might have missed: the minimum "replication subunit" of a nanomachinery factory is still a bunch of equipment. There's a plasma furnace to digest matter. A bunch of chemical reactors to refine it into atomically pure feedstocks. Probably multiple stages of filtering (all sorts of high-end equipment to do that - lasers, calutrons, etc.). There are still 3D printers - they make the bulk stuff like frames and tank walls and casings, anywhere atomic precision isn't needed. And then the core machine, the nanofactory, is hundreds of thousands of separate assembly lines that converge on one another. Some assembly stations in the factory can be state-adjusted - ON/OFF/A/B - which is how the factory can make multiple outputs (different final products have modular components added to, or missing from, sockets on them). The whole "nanofactory" isn't nano at all: once you include all the assembly lines and all the ports for input gases and coolant and power, plus the duplicate backup assembly lines, the thing is desktop-printer sized or larger. Any smaller and it's not functional. Then there are robots as well, and a control computer. This whole machine weighs over a hundred tons. Remove any major part and it will not replicate itself. It is too complex for nature to design via evolution, at least not in the time the sun has left. It doesn't compete with living cells in the same environment, and the "nanomachinery" isn't separate free-running cells; everything must be welded down, connected to power leads, and heat-sinked... I think maybe you had in mind some sci-fi nanobot the size of a eukaryotic cell roaming around free. I will agree with you there - the environment at that level doesn't have enough energy to do things this way. The "nanobot" would never get enough energy to copy itself at any decent rate, since it has to synthesize diamond, etc. Also, it would just jam and fail, since it's outside the vacuum chamber it was designed to operate in. So you'd have to make it more flexible to survive in the living biosphere on Earth, and after you do all that you end up with something that is marginally better than existing life at best.
  13. That's like saying that because birds have wings and muscles, they are just like jet engines and airfoils.
  14. You're totally missing the point. Yes, these numbers are baked into the cost. But the reason wind is cheaper is that it's cheaper to build and fix the turbines than the same capacity in nuclear. And the reason that is true is all the delays and inspections that slow down nuclear reactor construction and repair. If we were immune to radiation like the super mutants in Fallout, nuclear would beat everything else.
  15. Because we don't know that; maybe our understanding of space-time is wrong or incomplete. There could be "tricks" we have not discovered as a species. But we do know the basic trick of "self-replication given resources, where those resources are mainly commonly available elements".
  16. And those costs are also very high for nuclear. You get more power density, but you have the problem that radiation is dangerous and parts of the plant are inaccessible except by robots. A mistake will be costly, so expensive and elaborate procedures must be followed to do everything. If you get "Joe's cut-rate windmill repair" to fix your windmill, the worst case is a blade falls off and some damage is done. Maybe a small number of people are killed if the whole tower falls over or something. Point is, it's much cheaper, and you can safely take shortcuts because the risk to public health and safety is so small.
  17. It doesn't store much data because in actual living multicellular organisms, essentially the same DNA is shared by all the cells (with certain minor edits during differentiation). So there's a lot of total data, but only a few gigabytes at most of unique information.
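For scale, a quick back-of-the-envelope on that figure (assuming the roughly 3 billion base-pair human genome at 2 bits per base; round numbers only):

    base_pairs = 3.2e9        # approximate haploid human genome size (assumed round figure)
    bits = base_pairs * 2     # four possible bases -> 2 bits each
    gigabytes = bits / 8 / 1e9
    print(f"~{gigabytes:.2f} GB of raw sequence")  # ~0.8 GB, before any compression

That's under a gigabyte of raw sequence, comfortably inside the "few gigabytes" bound.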
  18. You're neglecting vast swathes of details. Vast swathes. Evolutionary algorithms are slow. The robots could be - and probably would be - actually sentient and capable of redesigning themselves. DNA doesn't store very much data; you could pack far more contingencies, "adaptability" code segments, and blueprints into a robot that has on the order of terabytes or petabytes of onboard memory instead of a few gigabytes... Details matter, not just theory. That's where I think we really disagree. I have a pretty clear mental model of what I think a von Neumann machine would actually look like. (TL;DR: it's a sentient robot spacecraft with redundant modular systems, and some of those systems are machines that can digest chunks of asteroid, convert them to pure feedstocks of various gases, and feed those gases into nanoassembly systems able to make the parts for any component of the spacecraft. The spacecraft's brain would be loaded with the algorithms - the fundamental thought processes and tools and ideas of the engineers who designed it. At some level of tech we'll be able to have something like an encyclopedia that doesn't just store facts and procedures; it stores snippets of logic ripped from the actual brains of people capable of using those facts and procedures skillfully at an expert level. With such an encyclopedia and the right hardware, you'd be able to literally do anything already known.) Nature has nothing comparable at all.
  19. Ah. Fair enough. I just didn't see how, if the Moon was once a big molten globule of "Earth stuff", it wouldn't end up with roughly the same amount of copper, distributed somewhere, as a proportion of its mass, that the Earth has. Ditto everything else except light elements that would leak to space.
  20. Back to the topic at hand. Let me restate your argument: it's absurd to think that airplanes will fly faster than birds or have more range, because an airplane is just another flying thing built by living beings. Similarly, a robot that can self-replicate is going to use nuclear energy, or vast swathes of solar fabric less than a millimeter thick, for power. It would use high-temperature radiators (in vacuum) or powerful fans to pump cooling air through its internal radiators (in atmosphere). It would "digest" its food by converting it to plasma first and separating it into individual elements. It would be smart enough to understand its own internal processes and capable of redesigning them if needed. How, exactly, do you think such a robot would consume matter at the same rate as living beings? Edit: I thought about it, and I may have figured out your reasoning error. I thought it was due to ignorance - that you genuinely thought evolution was sure to find the best possible arrangement of matter for any self-replicator. But actually what you're thinking is that if we had to make flying machines that work the same way as birds and were made of the same materials (bone, feathers, muscles), our "ornithopters" would probably be inferior in overall performance and maneuverability to this day. Ditto if we had to make machines made of amino acids that must work in salty water and have to self-replicate with cell-sized subunits - if we can beat nature's solution, it won't be by very much. And you've made an argument from authority. I'm telling you that if a robot is a bunch of mechanical parts and circuit boards and wiring, well, you can make most of that stuff with CNC machines and fully automated assembly lines. We don't have self-replication yet, but we could. But since nobody has quite succeeded yet, and some of you bioscience guys like to go on about just how complex current life is (much of which complexity must exist to survive the hostile, competitive environment of Earth's biosphere...), you like to feel superior in thinking self-replicating robots are far away and will be no better than life is.
  21. What you're missing - because you didn't read Nanosystems - is that Drexler is talking about machinery that is tight. As you must know, given your stated background, living systems rely heavily on diffusion and random processes. A true factory has a direct supply chain and active transport of every reactant and product. It would run like a Gatling gun fed by a belt instead of relying on concentration gradients to eventually bring the reacting species together. That's the main way you get a performance boost of orders of magnitude. That's what I'm talking about. A cytoplasm full of components sharing a common pool - or even a sub-pool like an organelle - is not as quick as this. The second property is stiffness, which is a function of temperature and the materials used. A hypothetical nanomachinery system would use parts that are stiff - meaning they flex only between useful conformations. Of all the states of a biological molecule - something you'd know about if you'd done any protein folding or NMR studies, which I have - only some are useful, and the state transitions are semi-random.
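A rough sense of the diffusion-versus-conveyor point (a sketch using assumed order-of-magnitude values: a protein-in-cytoplasm diffusion coefficient of ~10 um^2/s, a 10-micron transport distance, and a 1 mm/s "belt" speed):

    # Characteristic 1-D diffusion time is t ~ x^2 / (2*D).
    D = 1e-11   # m^2/s, assumed diffusion coefficient for a protein in cytoplasm
    x = 10e-6   # m, assumed transport distance (about one cell diameter)

    t_diffusion = x**2 / (2 * D)
    t_directed = x / 1e-3          # assumed belt-style transport at 1 mm/s
    print(f"diffusion: ~{t_diffusion:.1f} s")        # ~5 s
    print(f"directed:  ~{t_directed * 1e3:.0f} ms")  # ~10 ms

With these assumed numbers the directed "belt" is a few hundred times faster over the same distance, which is the kind of gap the post is gesturing at; the real comparison depends heavily on the molecule and the machinery.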
  22. Actually, it's neither. One constraint of evolution in a confined environment like Earth is a nasty form of version lock-in. TL;DR: everything alive on Earth is only one branch of many possibilities, and it can never leave that branch. The reason is that the codon space is full, and migrating an existing genome to a new codon space is too complex a task for any mechanism to evolve to accomplish. So everything on Earth is stuck on the same basic set of amino acids and has been for several billion years... A lot has been accomplished, but it's nothing like what K^2 thinks is possible.
  23. You can build a machine with the same basic layout - two sets of spinning blades connected by a common shaft, one pumping liquid while gas drives the other. This is probably what you mean. But the governing design constraints would be radically different - they depend on the gas temperature and pressure, the liquid's properties, etc. So the engineer who designed it would totally change how the impeller was shaped on one side and how the gas turbine was designed on the other. It would vaguely resemble a car turbo in the same way a jet engine vaguely resembles a car turbo - some comparable parts, but basically a totally different machine.
  24. What's your background? You're wasting both our time posting about things you don't understand. Factory-style reproduction can't evolve because of how evolution works as a process. I'm not going to try to explain it to you - go take a course in it. I gave you a big hint when I mentioned dead zones; you should have already remembered if you had the background. You don't have a clue how evolution actually works, because you either didn't take evolutionary biology or you didn't have a good course in it. Evolution is a mathematical algorithm and it can't just do anything, especially when it's constrained by the needs of physical replication the way life on Earth is. It has limitations. This in no way supports the arguments of creationist nutjobs, but it does mean that life can't be and do everything that is possible within the design space.
  25. No, but it helps to isolate your mistakes and not to toy with things that contaminate your living areas without taking appropriate precautions. I think if humans ever need to do nuclear power on a large scale again, they should build the reactors in remote areas and run trains to get the workers to work (kind of like the tram at the start of Half-Life...). Expect from day one that some of the reactor cores will fail, and design them to handle meltdowns gracefully. Put reinforced concrete that can take the heat underneath the core, with channels to send the corium into separate holding areas so no one portion is a critical mass. Make the whole reactor partially underground so it's easier to bury. Put distance between the cores so you can avoid the area where one has failed and continue to operate the rest. And if this is more expensive than solar + wind + natural-gas peaker plants, well, maybe we should just not use nuclear power at all.