Everything posted by PB666
-
They call it the opaque phase or era for a reason. What is believed to have happened is that the energy poured into the late inflation phase generated immense amounts of hν radiation and highly exotic matter and antimatter. Eventually this spread out enough that normal plasma was stable, and finally to the point that normal hydrogen and helium were stable, and the universe went dark. Radiation is scattered by certain forms of dense matter but not others. The type of dense gas in the early universe scatters much of the higher-energy radiation, making it opaque to our view. It is difficult to see the first stars because the blue light of these stars is scattered by the dense gas that surrounds them, but there are areas of space that allowed some of the earliest galaxies to be seen. Thus there were two opaque phases: the initial one and the post-star-formation phase. http://www.scientificamerican.com/article/light-from-universes-first-stars-spotted-in-hubble-photos/
-
http://www.sci-news.com/space/science-aida-mission-asteroid-didymos-didymoon-03307.html
-
First off, the green plant analogy is a red herring. This has nothing to do with mundane mid-spectrum qualities of light. That's why they create models. The models do not have to be absolutely perfect, but they do have to be better than previous explanations. Granted, the concept of a second is a subjective one; in theory we should use Planck time. But if we made a solar spectrograph on a log scale and shrank the x-axis labels to invisible size, you could not really tell the difference between a Planck-time-based plot and a second-based one, as long as the unit was frequency; only the grid lines would differ in position. So the explanation given to you suffices. The units we use are not going to affect the relationships: Planck time, Planck's constant, c, etc. will represent the same quantities when corrected for units. So your song and dance here has only attempted to obfuscate the problem. When we get down to fundamentals, humans do not have mastery of things we know exist, like creating the forces within a supernova. And like I said, the variances in our here and now are tiny, so you can't get around these essentials without some major change or addition to the physics.
-
Very much depends on the species and on how much _you_ have smoked. - - - Updated - - - Before the moderator jumps in and kills this thread I have to ask: why do the unicorns have to be pink? Doesn't that discriminate against white and sky-blue unicorns? Another one of those posts where I inadvertently forget the smiley face.
-
All wrong, it's lit with a 1937 mother-of-pearl-inlaid solid platinum cigarette lighter held by the newest guy in the 'make them go to space today' rotation. I hear the health insurance is pretty good, but they dock your wages if you lose the lighter. Just look for the shiny bald-headed guy with no body hair. Should I add a smiley face? Heh, nah, they will eventually figure it out.
-
It could come back as soil loosened and collected on our return craft; the nematodes, after all, survived the last space shuttle disaster, including the re-entry part. Bacterial spores might have an easier global re-entry. - - - Updated - - - But most of our asteroids that went to Mars were from the early bombardment. Can we name an asteroid that hit Earth since complex life developed that had sufficient energy for an ejecta path to Mars? Just surviving the drag of our atmosphere has a pretty big effect on dV, then there's getting past the Moon and several passes of Earth before finally being steered into Mars. The Mars fragments appear to have reached Earth, but they may be billions of years old, from a time when life was not as well developed.
-
It might help us if we were very far from where we are at, but in the here and now most of known physics varies extremely little (the only thing I can think of is gravity, which varies in the 7th to 8th decimal place). FTL is based on concepts whose practical application rests on speculation about certain particles (exotic in normal space-time); one of those particles supposedly travels only above the speed of light, so we beings that travel only below the speed of light would have difficulty interacting with it, if it existed. Of course, if we were traveling above the speed of light we could interact with the fabled particle, but then again, if we could do that to begin with we would not need the particle. Let's recapitulate:
- Gravity varies only by the tiniest of amounts, 6 magnitudes greater than the other constants, but still small. We might have to travel hundreds of millions of light years from Earth to see larger differences, if they exist.
- The RF resonator does produce thrust, but not efficiently and in the tiniest amounts, and there still could be a non-exotic explanation.
- The differences being seen in mass accelerators are not so much deviations as new, unexplained territory; they are now working in the tera-electron-volt range, and the accelerations required to deal with those particles are far more than humans can withstand.
- Dark energy exists, but we don't have a handle on it, and it appears not to be a local phenomenon; it appears to exist in space where nothing else exists, and that excludes matter from manipulating it.
Basically all the variances of known physics are weak and on the fringe; how do we exploit these things? It's hard to fathom.
-
And as a matter of point, the dark energy that is supposed to permeate the universe is either dependent on near-complete vacuum or on a cosmological constant that we cannot measure locally. That means at least one of the fine-structure constants is probably variable; how variable, and under what circumstances, remains unknown. The gravitational constant is the most likely of all of these to be a variant; even in the lab it is hard to fix, while the other constants, like Planck's constant, c, and Boltzmann's constant, are rather unchanging. The problem is that during the transition from pre-inflation to inflation, quantum gravity supposedly transitions to relativistic gravity; the first we have never been able to detect, and the second may be resolved to 7 digits (I think securely 6). With regard to the first source of variation, it is very difficult here on Earth to remove all matter, all electromagnetism, and all sources of static fields; thus you have papers arguing that enough space has been emptied to make a valid reading, but without knowing the interactive nature of dark energy, we don't know what constrains it. When the universe was full of homogeneously dispersed matter, inflation stopped (as far as we know); then, around 7 billion years ago, as much of the matter clustered into dark-matter strands and galaxies, the universe began to inflate again. Einstein's concept of a cosmological constant was sort of invalidated when Hubble showed that every 'point' in the visible universe was expanding away from every other point, but now that we see this expansion accelerating, an inconstant cosmological 'constant' is gaining favor over the infiltration of energy into the persistent quantum vacuum of deep space. We need better telescopes. The JWST will provide important insights.
-
How do you know that we wouldn't bring the lifeform back to Earth and have it destroy complex life here?
-
There is some evolution going on in the thinking about this. The one comment made is that Einsteinian laws are not limiting at the inflationary boundary. IOW, relativity cannot be used to prove itself in all instances: even if it is consistent in the observable universe, there is no guarantee it is consistent everywhere; if you were on the edge, your conclusion would be different. The concept that the universe is flat on the observable large scale has been held since the CMBR, meaning that Euclidean geometry, averaging out the effects of local gravitational wells, is flat. But there could be reasons why that is not always true; for one, we could be closer to the center of inflation than much of the universe. The CMBR could be giving a false impression, warped by a differential dark-energy effect, or large-scale structure of the universe may exist and be more evident in other observations. This is listed under sci-fi; it's a testimony to the fact that the universe is expanding, and apparently the actual universe is much larger than what can be ascertained from the observable, which by itself is very large. To arrive at these hideous distances and still be flat, the instantaneous differential velocities of the extreme points of the universe are simply beyond our scope. Barring an infinite improbability drive (or warp drive), there is simply no way to achieve the speed or distance required to thoroughly test Einstein's prediction. There is another issue, of course: space-time during inflation. During inflation, all space grows from adjacent space at the same rate, matter is immaterial, and energy is so high that electromagnetism is not present; space and time are not resolvable in the same manner as at present. The assumption, however, is that inflation stopped. There is a problem with this: dark energy. Did inflation stop everywhere, and how did it stop? Certainly inflation slowed down, or the opaque epoch would never have been entered and exited.
But the universe is not homogeneous; there are places where we can see further back in time, galaxies millions of years older than the expected oldest visible galaxies. In one scenario, inflation is like a wave that keeps traveling outward, with a fantastic comoving frame traveling faster and faster away from us; in this scenario, instantaneous predictions of size are only useful at the very beginning of the universe and are virtually useless now, because the concept of velocity at the fastest-inflating areas is not a calculable quantity. To these frames our age is unfathomable; their opaque phase has not been entered and will, from our perspective, never be exited. The laws of gravity that bind objects in our observable universe would not affect these new objects. So between this extreme and a finite universe lie all variations, including multiverses of expanding bubbles. The reason these various speculations can contend is one simple observable fact: the universe is larger than our observable universe. How do we know this to be true? If it were not true, then the CMBR would begin to fade in one part of the sky, weakening over time. Our perspective on the universe is akin to that of a housekeeping ant living in the middle of an ant mound, never to see what goes on outside the darkness of its own personal creation; it thinks its world is complete, until it tries to contemplate its world's creation, and then its horizons fade into oblivion.
-
What would be the easiest way to OBLITERATE THE ENTIRE PLANET?
PB666 replied to Kerbface's topic in Science & Spaceflight
You cannot destroy the Sun by throwing nails into it. Deep-sea-vent life will survive for eons even if you block the Sun's light, and complex life would spread all over the surface not long after the block was removed. OK: redirect Ceres from its current orbit into a polar orbit about the Sun, then shrink its apoapsis to just above Mars with a periapsis that intersects Earth, say, 1000 years in the future; the impact zone should be mid-Pacific. Although Ceres is a fraction of Earth's size, there is sufficient kinetic energy in the differential orbits to heat everything on Earth by a few thousand degrees. Oceans would boil, nothing could survive, the Moon would be immediately thrown from orbit about the Earth, and if timed properly the Moon would later collide with Earth. Even so, some life spores would survive on ejecta and fall back to Earth, starting evolution all over again. Do you folks sit around all day long thinking up morbid destructive ideas? -
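As a rough sanity check on the "heat everything by a few thousand degrees" claim, here is a back-of-envelope sketch. Ceres' mass and the ocean's mass are standard figures; the 40 km/s closing speed is purely an assumption for an orbit engineered to hit hard, and the heat budget only counts boiling the oceans:

```python
# Back-of-envelope: kinetic energy of a Ceres impact vs. the energy
# needed to boil Earth's oceans. All inputs approximate; the impact
# speed is an assumption, not a computed orbital solution.
M_CERES = 9.4e20        # kg, approximate mass of Ceres
V_IMPACT = 4.0e4        # m/s, assumed closing speed (retrograde-ish orbit)
M_OCEANS = 1.4e21       # kg, approximate mass of Earth's oceans
C_WATER = 4186.0        # J/(kg*K), specific heat of water
L_VAP = 2.26e6          # J/kg, latent heat of vaporization

ke = 0.5 * M_CERES * V_IMPACT**2                  # impact kinetic energy
boil_oceans = M_OCEANS * (C_WATER * 85 + L_VAP)   # heat ~15 C water to steam
print(ke / boil_oceans)  # roughly 200x more than enough
```

Even with generous rounding, the impact energy exceeds the ocean-boiling budget by a couple orders of magnitude, which is consistent with the post's claim.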
Curiosity has a limited amount of travel distance left in its wheels; if they do anything, it will probably be after all other targets have been reached. I suspect that will give them enough travel time on the surface to purge all terran microbes from the rover.
-
http://phys.org/news/2015-10-crucial-hurdle-quantum.html?quarkcolor=mauve This article, from down under, is kind of hypey.
-
http://www.nytimes.com/2015/10/06/science/mars-catharine-conley-nasa-planetary-protection-officer.html?_r=2
-
What is the most dangerous chemical that you know about
PB666 replied to Ethanadams's topic in Science & Spaceflight
I figured I would tag this on to this thread, since water is considered highly dangerous to some people. I suppose you could consider water dangerous:
:Frozen and shot as a bullet
:In the storm surge of a hurricane (not factoring in that it was the wind that blew the storm surge)
:In a tsunami (not factoring in that it was a subduction-fault rupture that caused the tsunami, or a major volcanic event, like the Santorini explosion)
:In a swimming pool, for someone who doesn't know how to swim
We can also say air is deadly:
:During a volcanic eruption
:As the blast-wave front in a bombing
:When you fall through it and go splat because you don't have a parachute (despite the fact that vacuum is more deadly)
:When you breathe too much of it, as in nitrogen embolism or oxygen toxicity
Of course, going by that logic we can also say a vacuum is more deadly than oxygen or water:
:Hard vacuum kills within seconds
:Falling through a vacuum has no terminal velocity
:Blast waves propagate further and maintain deadly contact velocities over greater distances
So now that we have gotten rid of this foolishness, reductio ad absurdum style (less of a violation of common sense than in most cases): if we look at risks and dangers, we see that there are many interconnected risks. Education is by far the best deterrent to risk of any origin (for example, the most educated people tend to have the least problem with diabetes). In risk management there is a term, situational awareness: for example, if you are working around a cobalt irradiator, it's a good idea to know that it can be lethal, and it is best to take note of any unusual behavior. If you have a beach house in a typhoon zone, it's best to check the weather report more often and follow the guidance carefully. If you live in a war zone, then have a complex risk-aversion strategy that may involve kin selection.
Education also helps discern killers by their nature (a hand grenade, etc.) from things that are lethal only in rare situations (like water, that beam above your head in your house, or an incoming asteroid). For example, if you asked an extremely educated person what chemical is statistically most likely to kill you as a result of all its direct and indirect effects (due to overconsumption, for example), that substance would be dextrose. IOW, through a process of statistically mediated risk aversion we can see, by a process of exclusion, which things will increase or decrease life expectancy with exposure above an optimum, multiplied by the likelihood of exposure. Botox is not a likely risk, but cigarettes are, given the risk of physical addiction; sugar is also a risk, etc. -
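The "severity multiplied by likelihood of exposure" idea above can be sketched as a toy ranking. Every number below is made up purely for illustration; the point is only that a mundane, high-exposure substance can outrank an exotic poison:

```python
# Toy "expected risk = severity x likelihood" ranking.
# All severity/likelihood values are invented for illustration only.
hazards = {
    "botulinum toxin": {"severity": 1.0, "likelihood": 1e-9},   # deadly, rare
    "cigarettes":      {"severity": 0.3, "likelihood": 0.2},    # addictive
    "excess dextrose": {"severity": 0.1, "likelihood": 0.9},    # everywhere
}

def expected_risk(h):
    """Crude expected-harm score: severity times likelihood of exposure."""
    return h["severity"] * h["likelihood"]

ranked = sorted(hazards, key=lambda k: expected_risk(hazards[k]), reverse=True)
print(ranked)  # everyday exposures outrank the exotic poison
```

With these invented inputs, the high-exposure sugar tops the list and the microgram-lethal toxin lands last, which is the post's point about statistically mediated risk.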
It needs a couple of telescoping panels and a guidance system; as far as I know, no one has precisely measured the thrust vectors on the device, so the thruster itself needs to be gimballed. IOW, it needs weight. It needs an antenna; Kerbals are a bad choice of bling. When it reaches lunar orbit I would recommend Frank's 'I did it my way' song; also, it activates a KSP patch that adds a massless drive to the stock engine parts. . . . . . :^) OTOH, has no one asked about why this RF frequency? 1. For a given-sized device, do we know the optimal frequency? 2. If we know the optimal frequency, can we not shrink the device and use higher hν? 3. If we can shrink the device, we can make it smaller: doubling the frequency means 1/8th the size, which means one eighth the weight. If you can shrink the weight, you can have a device pointing in each of 6 directions, plus 4 for attitude control. Or you could have one large device and 5 smaller devices for attitude control.
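The 1/8th-size claim in point 3 is just cube-law scaling: if the cavity's linear dimensions track the wavelength, then volume (and mass, at a fixed wall-thickness fraction) scales as wavelength cubed. A minimal sketch; the 2.45 GHz reference frequency is only an assumption for illustration:

```python
# Cube-law scaling sketch for a resonant-cavity device (illustrative only).
# Assumes linear dimensions scale with wavelength, so volume and mass
# scale as wavelength cubed.
C = 299_792_458.0  # speed of light, m/s

def wavelength(freq_hz):
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

def relative_mass(freq_hz, ref_freq_hz):
    """Mass of a cavity at freq_hz relative to one at ref_freq_hz."""
    return (wavelength(freq_hz) / wavelength(ref_freq_hz)) ** 3

# Doubling the frequency halves the wavelength: (1/2)^3 = 1/8 the mass.
print(relative_mass(2 * 2.45e9, 2.45e9))  # 0.125
```

This only captures geometry, of course; whether thrust scales favorably with frequency is exactly the open question the post raises.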
-
There is a technical aspect that was an important consideration. They were relying on the electric output of an onboard nuclear reactor to produce liquid oxygen and hydrogen; both stages required liquid O2. The storage capacity for the gases would require a third vessel, and the production time would have been horrific at 1960s electrolysis capability. To make this rocket feasible they would need a vessel much larger than a carrier, with several nuclear facilities and a large storage capacity; in addition, placing these liquid fuels under water would generate a large icing problem that would also have to be dealt with. If the USN needs nuclear-powered vessels to improve performance, they are going to get the best, but the per-watt cost of power is not as good as other fuels on land, and when we add the sheer size of the rocket and the inefficiency of electrolysis at sea, it's a genuinely bad idea. On land there is always excess capacity at night, so the capital costs of power don't need to be considered, and hydrogen can be made from methane. Oxygen can be made from sulfate but can also be generated by phototrophs really easily. The design solves one problem, the gravity of massive objects versus buoyancy, but creates several other problems that need to be dealt with. These could be solved by: building a storage barge made of steel; drafting the use of several carriers for a few days/weeks (a risk issue for their other intended uses); and specifically designed heaters over pipe insulation to prevent icing. In addition, they claim to make the rocket out of steel, but steel stages have a higher density in air and make recovery more difficult. I think the concept tries to solve too many problems at once.
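To put rough numbers on the electrolysis-time point: a sketch where the hydrogen load, electrolysis energy cost, and reactor electrical output are all illustrative assumptions, not figures from the actual proposal:

```python
# Illustrative electrolysis budget for a sea-launched heavy rocket.
# All three inputs are assumptions for the sake of the sketch.
LH2_TONNES = 400.0      # assumed liquid-hydrogen load, tonnes
KWH_PER_KG_H2 = 50.0    # typical real-world electrolysis energy cost, kWh/kg
REACTOR_MW_E = 100.0    # assumed electrical output of one naval reactor, MWe

energy_kwh = LH2_TONNES * 1000 * KWH_PER_KG_H2   # total electricity needed
hours = energy_kwh / (REACTOR_MW_E * 1000)       # hours at full reactor output
print(f"{energy_kwh:.3g} kWh, about {hours / 24:.0f} days on one reactor")
```

Even with a modern, generous efficiency figure, one reactor runs flat out for over a week per load, before any liquefaction losses; 1960s-era electrolysis would stretch this much further, which is the post's point.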
-
What is the most dangerous chemical that you know about
PB666 replied to Ethanadams's topic in Science & Spaceflight
Without breaking any nondisclosure agreements: organic hazards can be broken into two categories, chem-safety and biosafety hazards. Per unit, these supersede any other hazard known; you can look them up on the wiki. For chem-safety hazards the criterion is known as LD50 per kg or gram; there are a few in the microgram-per-kilogram-of-body-weight range or lower, one of which comes from a bean and another from a relatively common deep-soil bacterium. In the biological-safety category are things like Ebola, anthrax, Spanish flu, etc. I have never worked with these biological safety hazards. -
lol. I was hoping you would go there! Is there any space at any time where that is not true? If we go to the very longest wavelengths, hν that comes from the very edges of the visible universe is so low-energy that it's not possible to detect it, and it has extremely high penetrance. I mean, one hypothetical explanation is that the hν resonance is interacting with other low-energy waves, magnifying simple EM momentum by redirecting these low-frequency waves when they approach from certain directions. This is what I mean by uncertainty: for at least one of the devices, it does not appear possible that it could be producing a reaction mass at ST. If it were equilibrated in a vacuum, then that could be extended to ST-vac. This is one of the reasons that doing the calcs now is futile; there is no sense wasting good theory on bad data. Aside from the space experiment, they should place a whole field of graphene-circuited probes inside the frustum, using fractal geometry to survey the EM field intensities over many wavelengths. The problem is that the longest wavelengths would not interact with probes of the shortest curvature. The only way to deal with this is to vary external frequencies across a wide frequency range, or block them across all ranges. In terms of the crystal experiment, yes, but it also creates other artefacts; for example, transitions in the crystal can also alter the observed c. In how many directions did the polarization shift in the example, and did they break the shift down by distance from beam center and look for increases in variance? I will DL the paper next week. I'm off.
-
The problem is that it would appear energy is conserved; only the polarization appears to shift. - - - Updated - - - Doesn't matter who has the Ph.D.; what matters is who was right! My point in those posts many, many topics back (maybe 1000?) is that we should not be so quick to shut the door on new ideas. Let me put it more carefully.
Camp one: quantum fluctuations occur all the time, but they only have meaning when other known fields interact with them.
Camp two: quantum fluctuations occur all the time, but they only have meaning when known fields and potentially unknown fields interact with them. (Unknown fields may not actually exist, but if they did, this could explain the RF resonator.)
Camp three, not in the argument: if space gets empty enough, then quantum fluctuations may have the ultimate meaning, initiating the next new universe.
There is, in light of camp three, a hair's breadth of difference between camp one and camp two. But more to the point, any person with a historical perspective on science realizes that strongly held theories get overturned, and even really good theories get tweaked. Newton could not fully withstand the test of time, neither could Einstein, and there are some particle physicists out there who are not entirely convinced that the standard model works the way we think. So even if I were convinced that camp one is correct, I would not lay all my eggs in that basket. So if you read this page, the first thing you see is "In theoretical physics, Feynman diagrams . . . . . " IOW these interactions are not observed. Nor do these interactions exclude other interactions. However, certain limitations apply, listed below the diagram, such as: fermions interact via bosonic fields (wavy line). There is a general position about spatial proximity, which of course has flexibility according to uncertainty. This equation basically argues that any transition in space can be explained by the diagram or a superposition of several of these diagrams.
The diagrams are drawn according to the Feynman rules, which depend upon the interaction Lagrangian. For the QED interaction Lagrangian L_int = -g ψ̄ γ^μ ψ A_μ, describing the interaction of a fermionic field ψ with a bosonic gauge field A_μ, the Feynman rules can be formulated in coordinate space as follows:
- Each integration coordinate x_j is represented by a point (sometimes called a vertex);
- A bosonic propagator is represented by a wiggly line connecting two points;
- A fermionic propagator is represented by a solid line connecting two points;
- A bosonic field A_μ(x_i) is represented by a wiggly line attached to the point x_i;
- A fermionic field ψ(x_i) is represented by a solid line attached to the point x_i with an arrow toward the point;
- A fermionic field ψ̄(x_i) is represented by a solid line attached to the point x_i with an arrow away from the point.
However, if you read down the page you will find that there can be substantive random or spontaneous contributions: randomly pick the real and imaginary parts of each Fourier mode at wavenumber k to be a Gaussian random variable with the appropriate variance. This generates a field configuration at random, and the Fourier transform gives its real-space form. For real scalar fields, the algorithm must generate only one of each pair φ(k), φ(−k), and make the second the complex conjugate of the first. You can think of me as the inverse of K2. Something required in order to keep a fixated universe a little bit unpredictable. ^better summary. Argue on.
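For reference, the QED interaction Lagrangian that the quoted rules refer to is conventionally written (following the same convention as the quoted page) as:

```latex
\mathcal{L}_{\mathrm{int}} = -g\,\bar{\psi}\,\gamma^{\mu}\,\psi\,A_{\mu}
```

where ψ is the fermionic field, ψ̄ its Dirac adjoint, A_μ the bosonic gauge field (the photon), γ^μ the Dirac matrices, and g the coupling constant.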
-
http://phys.org/news/2015-10-team-sampled-electric-field-vacuum-fluctuations.html
-
Well, if you only need running shoes, try this: https://en.m.wikipedia.org/wiki/SL-1 (scroll down to 1961). I don't know why we bring the term toxicity into the equation anyway. A firing rocket engine is extremely toxic; it is also a radiation hazard as well as a trauma hazard. Trinitrotoluene is also toxic, though I think exploding it would probably circumvent its toxicity.
-
C, C++, C# Programming - what is the sense in this
PB666 replied to PB666's topic in Science & Spaceflight
-
Sucrose can be made up to 2M, which is around 684 grams in approximately 500 ml of water. The energy content per gram is about 1/4th that of oil, and with the dilution it would be about 1/7th. Those are the bad points. The good point is that more of the fuel energy goes into expansion than into heat because of the high water content; the problem is that in a rocket engine the gas accelerations are tremendous. To get those large accelerations you also need tremendous dV/dt; T = f(v²), where v is the velocity of the gas relative to the comoving coordinate space in the gas. Therefore, as the system is heated, the ability of the gas to accelerate increases. An excellent example of this in action is the speed of sound. As the temperature increases, so does the speed of sound. Suppose a projectile with a flat leading edge reaches the speed of sound: the molecules of air cannot move fast enough to get around the leading edge, so they begin to stack up on the object until the density phase transition forms a fusiform shape (taking turbulence into account), such that the air molecules that are not inside the high-density phase can accelerate around that phase. On the edge, as the molecules that have stopped slide off and accelerate, they expand the phase transition's maximum width, creating a much larger dynamic pressure on the leading edge and the sides that immediately follow. So basically, inside a nozzle, gas may accelerate to roughly 3,560 m/s within the space of, say, 0.5 meter. With v = a·t and d = ½·a·t², we get 0.5 = ½·3560·t, so t ≈ 1/3560 s, and a = v/t = 3560² ≈ 12,700,000 m/s², or roughly 1.3 million g on our little gas molecules. That's a hell of a lot higher than the forces on air at the nose of a rocket, which means we are going to need a much higher temperature.
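The constant-acceleration estimate above can be sketched as follows. This is idealized: real nozzle flow is not uniformly accelerated, and the 3,560 m/s figure is just a representative chemical-rocket exhaust velocity:

```python
# Constant-acceleration sketch of gas accelerating through a nozzle.
# Idealization: assumes uniform acceleration from rest over the nozzle length.
G0 = 9.80665  # standard gravity, m/s^2

def nozzle_accel(v_exit, length):
    """Time and acceleration for gas reaching v_exit (m/s) over `length` (m),
    assuming constant acceleration from rest: length = v_exit * t / 2."""
    t = 2 * length / v_exit
    a = v_exit / t            # equivalently v_exit**2 / (2 * length)
    return t, a

t, a = nozzle_accel(3560.0, 0.5)
print(t, a, a / G0)  # ~2.8e-4 s, ~1.27e7 m/s^2, ~1.3 million g
```

The key relation is a = v²/(2d): halving the nozzle length or doubling the exhaust velocity drives the required acceleration up quadratically in v.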
Could we guess at what that might be? e and T are functions of v², so if we are generating 1/7th the energy, we might guess that v would be √(1/7) ≈ 0.38, so roughly one third. In other words, the thrust is going to be cut to about one third; the exhaust gas will be substantially cooler and less energy wasted, but only because you burnt far less energy and lost a lot of power. It's probably not so bad, because you could take more syrup and less oxygen, but you will still get less than half the thrust per mass. The bottom line for rockets is not so much efficiency as thrust per mass consumed; the thrust you get determines the ability to efficiently carry a load up. Of course, if you have to use an engine half the mass of the fuel, that is going to substantially reduce the efficiency. Note that you can make 3M sucrose; for example, this is how syrup is made, and its viscosity is in the tens of thousands. You would have to heat it and keep it heated to keep it from crystallizing in your fuel lines. Corn syrup is made of dextrose, which interconverts to its aldehyde form, which is much more reactive and can react with metal oxides, fuel lines, etc. So it's not as stable as sucrose, and at higher temperatures more of it will be in the less stable aldehyde form. At room temperature, corn syrup is about 1500 times more viscous than water, so some of the weight would have to be fuel-line heaters that kept the viscosity down. Another problem is that because syrup would not be a gas at the rocket inlet, some would burn before reaching the combustion chamber and would leave residue on the sides of the inlet, potentially reducing flow to the engine, and it might even gum up the side of the chamber close to the outlet (though the temperature of the gas should ionize this).