Everything posted by PB666
-
In career mode, two-thirds of the science you will need to gather quickly (in Kerbal elapsed time) is on the Mun.
-
Energetically massive neutrino inferred from IceCube polar detector
PB666 replied to PB666's topic in Science & Spaceflight
One cubic kilometer is a billion cubic meters. A cubic meter of ice has a mass of around 1,000 kilograms, and a human weighs, let's say, a global average of 50 kilograms, so the detector volume holds about 20,000,000,000 human-body equivalents. So unless this is a really rare event, it could happen within about 3 years; but they argue the neutrino itself never reached their detector, only the effects of a secondary muon, so it might happen once every 100 or 1,000 years. The other assumption is that it would interact in the visceral core of the body, which it probably would not. -
http://www.bbc.com/news/science-environment-33787562 This neutrino had orders of magnitude more energy than the largest collision produced in the LHC. Yep, that's right, it's simply a neutrino of the muon variety. The total energy could only be inferred to be between 5,000 and 10,000 TeV; the LHC is now taking data at 13 TeV, roughly 600-fold less energy. The particle is moving so rapidly that it has barely aged a moment since it was produced. These events occur at a rate of about one per year per cubic kilometer of ice, and they appear to come from well outside our galaxy, although where exactly is unclear. The detector has also found low-energy neutrinos produced by radioactive decay within the Earth. Can you imagine if such a neutrino interacted inside a human body?
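The back-of-envelope numbers above can be checked directly; all inputs are the post's own rounded assumptions, not measured values.

```python
# Rough check of the post's arithmetic (rounded assumptions, not measurements).
KM3_IN_M3 = 1_000_000_000          # one cubic kilometer = 1e9 cubic meters
ICE_DENSITY_KG_M3 = 1000           # ice approximated as ~1000 kg per cubic meter
HUMAN_MASS_KG = 50                 # assumed global-average body mass

detector_mass_kg = KM3_IN_M3 * ICE_DENSITY_KG_M3
human_volumes = detector_mass_kg / HUMAN_MASS_KG
print(f"human-body equivalents in 1 km^3 of ice: {human_volumes:.0e}")

# Energy comparison: inferred neutrino energy vs. an LHC collision.
NEUTRINO_TEV_LOW, NEUTRINO_TEV_HIGH = 5000, 10000
LHC_TEV = 13
print(f"ratio: {NEUTRINO_TEV_LOW / LHC_TEV:.0f}x to {NEUTRINO_TEV_HIGH / LHC_TEV:.0f}x")
```

The ratio comes out between roughly 385x and 770x, so the post's "~600-fold" is the midpoint of the inferred energy range.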
-
The case for self sufficient colonies in space
PB666 replied to DBowman's topic in Science & Spaceflight
BTW, you really don't want to be procreating and bearing children in space; the foreseeable and unforeseeable consequences argue that any 'colony' would be made up of people in the middle of their child-rearing years or older, who have already had their children on Earth. -
The case for self sufficient colonies in space
PB666 replied to DBowman's topic in Science & Spaceflight
What this means is that it is possible with a large investment, or conversely stated, the risk of failure is inversely proportional to the amount of up-front work put in to make it succeed. The larger the volume and mass of the original colony, the more likely it is to succeed. Possible and easy are two different things: a colony of 500 on a Martian moon would require an initial investment that would need the support (as in a tax scheme) of the entire planet for a couple of generations to set up. -
http://www.mnn.com/earth-matters/space/stories/astronomers-discover-humongous-structure-one-ninth-size-observable-universe What are the possibilities here? The first and most obvious is that maybe these objects were at the center of expansion; the problem is that at 7 billion light-years away they are too close. Nor could they be at the edge of the universe, since they would then form a partial sphere covering a large part of the sky. Basically, neither of these can be true if the universe is 13 billion years old.

The conflict here is that, based on our view of the universe (perfect Euclidean flatness and uniform density of matter), the universe inflated for the tiniest fraction of a second before matter existed and expansion took place; as a consequence, by the time the universe cooled sufficiently, space was expanding outward so rapidly that its momentum prevented matter from clumping non-uniformly. Of course, there are now large structural webs in the universe, which some have described as the gravitational dance between matter and dark matter, but the scale of these structures is relatively small and extends between the modest galactic structures that we see. Nothing so big as billions of light-years across.

Seeing as I started this thread, I get to throw in my conjecture. These objects are closer to the original center of inflation, but they are still relatively close to us. Since they were closer to the center, their rate of inflation was markedly slower, and because of this these objects are considerably older (they would have formed dense stars sooner, then second- and third-generation stars) and represent larger condensations of clusters. GRBs, however, represent the condensation of matter, and beyond them so much matter may have already condensed that GRBs are far less frequent, and thus we don't see them. So I would predict that these objects lie on an isoquant, but because of expansion we can only see part of it; the rest is travelling away too fast for us to see.
-
Or we could just wait until they find the dense wreckage and are capable of doing a real investigation.
-
Oxygen content in water/ water breathing.
PB666 replied to magnemoe's topic in Science & Spaceflight
Not a stupid brain, just a small one; some fish are pretty clever, groupers for instance. Their metabolism is slower, they do not need to waste energy cooling down (such as by sweating), and freshwater fish simply dump ammonia into the water rather than processing it into something less toxic. They also breathe through their skin. If you look at the meat of most fish (except the deep-ocean billfish, tuna and the like), it is white meat; this is what they call fast-twitch muscle. It tends to work only for brief periods and conserves energy by not having a lot of mitochondria slow-burning fat and glucose. -
Q. Uncertainty testified in court, smiling, my work is complete.
-
The RTG is not efficient, and it gets less so as the temperature differential decreases; it also has to get rid of waste heat. It would be cheaper to carry 50 gallons of water, insulate the hell out of the thing, place the RTG well away from the ship, and use a drip of water to create steam and cool the ship down to whatever the boiling point of water is at Venus pressure. You could run the exhaust steam past the RTGs to remove heat as it passed, and it would still not be very efficient. RTGs work great in the dead cold of space; at a very high ambient temperature one has to wonder how long a thermocouple would last. Even if the thermodynamics of the model worked, consider that the surface temperature of Venus is 462 °C, which is not only above the vaporization point of the oil needed to lubricate an engine, but at the decomposition temperature of the oil molecules themselves.
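A quick sketch of why the shrinking differential matters: even the ideal Carnot limit drops sharply as the cold side warms toward Venus ambient, and real thermocouples achieve only a small fraction of that limit. The hot-side and radiator temperatures below are illustrative assumptions, not figures for any real RTG.

```python
# Carnot efficiency = 1 - T_cold / T_hot (temperatures in kelvin) is the
# ideal upper bound; real thermoelectric converters reach only a few percent.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

T_HOT = 1273.0             # assumed hot-junction temperature, K
T_VENUS = 735.0            # Venus surface ambient, K (~462 degrees C)
T_SPACE_RADIATOR = 400.0   # assumed radiator temperature in deep space, K

print(f"Venus cold side: {carnot_efficiency(T_HOT, T_VENUS):.1%}")
print(f"space cold side: {carnot_efficiency(T_HOT, T_SPACE_RADIATOR):.1%}")
```

With the same hot side, the ideal limit falls from roughly two-thirds to about 42% just from the hotter cold sink, before any real-world losses.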
-
Sounds like you are trying to create a KSP-forumgate here.
-
That's correct, and they have the 747-8 coming out that will be an even more different aircraft. The 747SP, for instance, is often equipped with very different engines and has a much shorter body. There are also alphabetical suffixes that apply. In general, however, the zz designates a common package that a company will buy, since even within the y-variants they may purchase engines from a different manufacturer, etc. But we are talking about the 777. I forget which one, but one engine manufacturer equipped the longest-range model with an engine capable of generating over 100,000 pounds of thrust, which was at the time the largest engine ever made for a commercial aircraft, so the types of variants still apply. Since a given company has planes that fly from runways of certain lengths, or may have to avoid certain terrain around airports, etc., it may opt for a different engine package than the next company, or it may even have multiple engine packages for the same plane. And so I posited a potential caveat below: conspiracy or not, it could be careless book-keeping. The problem with the base theory is whether this is a part that would likely be swapped. The assumption is that the serial number would be conserved from the point of last contact to the point it ends up on the island. I believe this has been checked, but hypothetically, if the part were seriously damaged, you would have to go by structural indicators such as the materials used in the part during a given time frame, and the manufacturer. I don't think anyone is saying that, and even I say it's extremely unlikely, but you could come up with a scenario.
For example, suppose you had two aircraft, both just inspected and certified fit to fly. You remove the component and swap in one from another aircraft that does the same job but was made a year earlier with a different supplier for certain parts; then you remove the serial numbers and stamp the aircraft-specific serial number onto the part from the plane that was not hijacked, banking on the fact that no one would notice the minute structural differences. Then, I added, the part is kicked from the craft mid-flight, say with low-intensity charges and a hydraulic-line shutoff. Theoretically, if such a plane were to lose a lot of weight, it could gain altitude and fly at a lower IAS for another hour or so, ending up well further south than the last ping, where no one would be looking for it. If we really wanted to get into conspiracy, the luggage would not be loaded on the craft; instead an additional fuel tank would be added, with a somewhat higher takeoff weight and speed. Or they may simply have filled the normally partly-full tanks with more fuel for that particular flight. Then the part shows up on the beach, but the numbers don't match and the plane's wreckage is never found. This is something an anarchist might choose as a form of terrorism, trying to undermine trust in the investigative authorities, regulators and the government. This is what you call an 'if and if and if and if then' argument, and it does not have good statistical value; but the problem, I must point out, is that in investigations of past events, similarly complicated scenarios were devised that did not involve equipment sabotage. For example, the Concorde crash was blamed on debris from a Continental flight that was thrown up and damaged the Concorde's wing. IOW, any single scenario one could come up with is highly unlikely to be true because of all the complexities. Consider the mid-flight explosion of a 747, a very safe aircraft, caused by an electrical short in the fuel tank.
Because of all the potential complexities that might be true, there is a relatively good chance that one of thousands or millions of scenarios is the actual explanation, and that the current generalized explanation, that a rogue pilot flew the plane until it ran out of fuel, crashing within a rectangular search box in the Indian Ocean, might not be true. Let me make it clear: if someone raises the issue that the investigators won't sign off that this is the plane's part because they have some other explanation in mind, then yes, with enough caveats you can create such a scenario, but neither the premise nor the supposition is very likely. I assume they are nearly impossible. - - - Updated - - - Just a reminder: the aircraft flown as AF 447 was involved in a taxiing tail-strike incident months before it disappeared. I assume it had nothing to do with the accident. But a previous ground-strike incident on a Japanese 747 was found to be causal in a mid-air loss of control that eventually ended in a mountainside crash.
-
You can model dark matter the same way they model electron density for molecular orbitals. There was a paper written recently showing that dark matter doesn't exactly avoid matter; it is simply indifferent to its existence except via gravitational forces. As a consequence it tends to concentrate around gravitational clusters as around point masses, and thus an interstellar density profile can be developed. Note that ordinary matter stratifies itself in objects (solid dense core, less dense mantle and crust, atmosphere, interplanetary space); dark matter just flies right through all of this as if it did not exist, very fast, and then slows down and lingers in interstellar space. A good analogy is an sp3 orbital of a C-H bond, where the electron spends a lot of time (in a classical model, not the proper quantum-mechanical one, in which the position of the electron is not defined until we try to observe it) lingering between C and H and a small amount of time near the nucleus and on the far side of the atom. In that picture we could tabulate, for 1,000 bonds, the electrons at distances of 0-0.1 Å, 0.1-0.2 Å from the nucleus, and so on; but given quantum mechanics it is quite unnecessary to do that, since we never need to know the exact position of the electron, just its probabilities. In the same way, since dark matter is indifferent to matter, and since it appears to be composed of isolated dark-matter particles rather than clumps, you really do not need to know positions, just probabilities. Think about it this way: if dark matter is, say (just saying), half the mass of a galaxy, but exists not only where the stars are but also where they aren't, such as at the extremes and well above and below the galactic plane, then its average density in a spiral arm relative to ordinary matter is much lower. Then we factor in the density between stars versus within the Sun's heliosphere.
And then you reach the realization that the amount of dark matter in a stellar system is a tiny fraction of the dark matter in the galaxy.
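To put a rough number on that realization, here is a sketch assuming the commonly quoted local dark-matter density of about 0.01 solar masses per cubic parsec and a hypothetical 100 AU "stellar system" radius; both inputs are round-number assumptions, not values from the paper the post mentions.

```python
import math

# Illustrative estimate of the dark matter enclosed in a heliosphere-scale
# sphere, assuming a uniform local density (a round-number assumption).
LOCAL_DM_DENSITY = 0.01          # solar masses per cubic parsec (assumed)
AU_PER_PARSEC = 206265.0

r_pc = 100.0 / AU_PER_PARSEC     # a 100 AU radius expressed in parsecs
volume_pc3 = (4.0 / 3.0) * math.pi * r_pc ** 3
dm_mass_solar = LOCAL_DM_DENSITY * volume_pc3
print(f"dark matter within 100 AU: ~{dm_mass_solar:.1e} solar masses")
```

The result is on the order of 1e-12 solar masses: utterly negligible next to the Sun, which is the post's point.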
-
BTW, you know MechJeb is the soul of a past Jeb, some kind of accident. So Jeb and MJ can't coexist unless it's a clone of Jeb.
-
MJ blacks out during an eclipse; Jeb wins. Or wait: 'Open the cargo bay doors, MJ.' 'I can't do that, Jeb.' [spacey music murmuring in the background]
-
For Boeing, the 7x7 designates the basic form, sort of like sizeX in KSP, but not exactly, because a couple of aircraft share the same cross-sectional profile. Adding to this are the base variations such as 7x7-y00 (e.g. 737-500), which deal with both upgrades and size variations (for example, Boeing has stopped making certain early 737 variants and replaced them with later variants of approximately the same footprint, but with significant part remodelling and modernization between the two similarly footprinted versions). Then finally there is the customer identity 7x7-yzz, and in each aircraft's cockpit the exact designation is posted. So basically, for each base variation x there is a fuselage-length and wingspan variant y, and as the line ages, various minor part revisions zz are made. Both Boeing and the airline keep a manifest of all these parts, because over the life of the aircraft some parts will be replaced. If the flaperon matches the specification of the parts assembled for type 7x7-yzz on a given date, and/or matches the manifest, then it is extremely unlikely that the part remnant came from another source. There is one remote and disturbing possibility: if the plane lost the flaperon in flight, it could probably fly for hours without it, compensating by raising lift on the rest of that wing, reducing lift on the other wing, and slightly shifting the rudder position. The other possibility is that someone on the ground, before takeoff, intentionally swapped the part with one from a different but similar 777 in order to complicate the investigation. Both are very unlikely.
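The 7x7-yzz pattern described above can be sketched as a toy parser. The split into base model, series digit, and two-character customer code is illustrative; real designations also carry suffixes like ER that this sketch ignores.

```python
import re

# Toy parser for the 7x7-yzz naming pattern described in the post.
# Customer codes may be alphanumeric (e.g. the "H6" in 777-2H6).
def parse_boeing(designation: str) -> dict:
    m = re.fullmatch(r"7(\d)7-(\d)([0-9A-Z]{2})", designation)
    if not m:
        raise ValueError(f"not a 7x7-yzz designation: {designation!r}")
    x, y, zz = m.groups()
    return {"base": f"7{x}7", "series": y, "customer_code": zz}

print(parse_boeing("737-524"))
print(parse_boeing("777-2H6"))
```

The point of splitting it this way is the one the post makes: two airframes with the same base and series can still differ by customer package, which is exactly what the zz field records.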
-
For Questions That Don't Merit Their Own Thread
PB666 replied to Skyler4856's topic in Science & Spaceflight
^^^^ 5 - - - Updated - - - I read an article about 10 years ago, maybe more, about an RTG in which two radioactive materials were embedded in a rubbery plastic; when rolled into contact in the generator core they exchange particles and react with each other, producing heat and feeding the thermocouple. I imagine that if the craft still had fuel to maneuver, the rubber would be cut and the spent material would drift off into space. This was an idea for fueling deep-space spacecraft with lifespans of 50 to 200 years. I think the Voyager craft have another decade left in their RTGs before they no longer have enough power for the antenna and go silent. - - - Updated - - - Actually, denser will work better, but not because of neutron decay. Denser works better because if your thermocouple is at the center of the mass, the absolute heat gradient increases from the interior to the exterior, so a bigger mass means hotter at the center. But if you look at real RTGs, they modulate the heat: the thermocouple runs at a certain position in the core and heat is radiated through radial fins. The design is not spherical but cylindrical, which means the goal is not more volts but more amps, which means wider or more thermocouples. -
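The Voyager estimate above can be sketched from the Pu-238 half-life (about 87.7 years) alone; real usable power falls faster than this, because the thermocouples themselves also degrade over time.

```python
# Power remaining from radioactive decay alone: P(t) = P0 * 0.5 ** (t / T_half).
HALF_LIFE_YEARS = 87.7   # Pu-238 half-life

def rtg_power_fraction(years: float) -> float:
    """Fraction of launch-time thermal output left after the given years."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

# Voyager launched in 1977; fraction of original thermal output remaining:
for year in (2015, 2025):
    print(f"{year}: {rtg_power_fraction(year - 1977):.0%} of launch power")
```

Decay alone leaves roughly 70% of launch power after four decades; the electrical margin is what runs out first, which is why "another decade" is plausible.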
Airbus presents concept for Mach 4 passenger plane
PB666 replied to Frank_G's topic in Science & Spaceflight
They are probably only giving sketch details at the moment to keep people from pseudo-copying the design. The nacelles don't follow the area rule for supersonic flight: the total cross-sectional area distribution of a near-sonic or supersonic craft should be fusiform when graphed. Mach 4.5 is fine at 27,000 meters, but how do they keep the airframe cool? I suspect some of the engines are retractable into the fuselage. -
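The "fusiform" rule referred to here is the transonic area rule; the classic minimum-wave-drag reference shape is the Sears-Haack body, whose normalized cross-sectional area distribution can be sketched as follows (illustrative only, unrelated to the actual Airbus design).

```python
# Sears-Haack body: cross-sectional area along normalized length x in [0, 1]
# is S(x) = S_max * (4 * x * (1 - x)) ** 1.5, the minimum-wave-drag profile.
def sears_haack_area(x: float, s_max: float = 1.0) -> float:
    return s_max * (4.0 * x * (1.0 - x)) ** 1.5

# Smooth rise to S_max at mid-body and back to zero at the tail:
samples = [round(sears_haack_area(i / 10), 3) for i in range(11)]
print(samples)
```

The area-rule complaint in the post is that nacelles bolted on without compensating fuselage waisting put bumps in this curve, and each bump costs wave drag.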
I have the Steam game. I stopped playing it and bought the Amazon version because it could not be played in offline mode (well, it could, it just took an hour to load). I did not like Steam running in the background, and most annoying was that when the game crashed, a Steam pop-up advertisement would show up. The game is much more stable (fewer not-responding issues) since I removed Steam from the equation. So . . . . . . . . .
-
I don't have any mods on my current install except MechJeb, and the same thing is happening. For example, it will look like the kerbal is about to stop cartwheeling, and all of a sudden it will bounce off at a very odd angle and five times the velocity into the air. This reminds me of the launch-pad physics before 0.25, where a vessel of a certain size simply starts hopping higher and higher until it destroys itself. I have a ton of EVA issues; this is not the only one. Another is that when the kerbal steps out of the pod, the view locks so that you cannot zoom in or out or rotate it; then you have to hit "]" or "[" or go to the space center, because "V" won't work. Another problem is with docking, which is apparently a moot subject, so I won't mention it. If the kerbal grabs a handhold, it can be thrown off again at 3 or 4 meters per second. In addition, if two ships touch each other only slightly, one ship will end up being thrown away or into a crazy spin with many times more rotational velocity than was originally apparent. Same with the kerbals: if they bounce off the ship the right way, the ship develops a crazy fast spin. I've actually used this to restore power to a vessel that had a bad panel/sun angle. IOW, there are a bunch of these issues in the stock game.
-
Quantum Mechanics. This will completely blow your mind away
PB666 replied to Thunder_86's topic in Science & Spaceflight
I see many instances in these videos where widely held hypotheses are presented as dogmatic theories. The one truth is that the quantum-mechanical universe really revolves around probabilities, and this is the truth that trumps all other truths, because it means that what we define as reality can always be described as a set of confidence ranges, from the smallest scales of the universe to the largest. As one video portrays, Δx·Δp_x ≥ h/4π basically says that particles behave like waves until they arrive, at which point they become like particles; even the largest particles have a frequency.

Simple math becomes immensely more complicated when using probabilities. As the quantum-computing segment notes, N qubits encode 2^N states. If you are familiar with the binomial probability distribution, the basis for tests like the Fisher exact test, that test becomes more math-intensive as N increases, and a quantum computer would greatly speed up the calculation. People don't realize that most statistics are not perfect; chi-squared, for instance, is an approximation. The Fisher exact test works well for comparing groups, but the problem is how to compare an expected value to an observed distribution: in that case group 2 in the FET is infinite, and it cannot be done with the FET, though with a quantum computer you could do it. Even when you have a perfect tool, like the binomial probability distribution, applying it to complicated real-world circumstances can be quite computationally intensive. An example is Student's t-test: this test again gives an approximate probability. In the standard form it assumes there is no difference in variance between groups, and in the Welch form it may overcompensate for the effect of variance (I normalize to fix such problems); but we often don't know whether there is differential variance, and the F-test for variance is even more unreliable than the t-test itself.

But there is a way to calculate a p-value exactly: an infinite repetition of Monte Carlo runs (in the limit, as the number of Monte Carlo repetitions grows, every possible outcome is reflected at its exact probability of being greater or less than other outcomes), with the results concatenated. The problem is that with large groups of numbers, the number of possible outcomes is on the order of 2^f(n) (where f(n) depends on the type of analysis being done). A quantum computer would not need to do that; it could in principle evaluate the probability of all possible outcomes at once, and all you would have to do is select the accumulated probability from the series that corresponds to a given observation.

So the thing is that a "state", for example the state of glucose in a solution, is something we see in the lab as a consequence of the law of mass action, but that is only true when N is so great that the relative deviation becomes tiny. As the volume of a solution decreases, at some point that stops being true (and this holds for just about any system with two or more alternative states; an example would be Floridian ballots in the 2000 presidential election). And it is in these circumstances that quantum computing could define the probability that something is true or false (credible or not credible) much more precisely than a classical computer. The answer is not yes or no, 1, 2 or 3; the answer becomes a credible range of outcomes, outside of which we would say that glucose is not behaving as expected in this solution, without necessarily knowing why.

The problem with quantum mechanics is the above: in a system where more than one event (or type of event) can occur in a given timeframe (let's say, for simplicity's sake, a time greater than the Planck time, in some otherwise empty unit of space), the quantum system needs to handle all the probabilities at once (and where in the universe is this not true?), and yet we are still discovering fields that add to the complexity of the system. This is why, IMO, the full characterization of quantum mechanics cannot be done at present; it may be 99.9% correct, but not precisely 100%. Even if no clear example contradicting quantum mechanics can be found, if we don't know all the inputs, we cannot be certain of what comes out. Certainly, as we accumulate observations over time, the results settle into the expected (exhibit: the discussion of the EM drive and momentum), but when you get down to widely spread quantum events, things may still remain unclear. For most of scientific history we have measured only the effect of accumulated events; even when we can measure singular events, such as in the LHC, we accumulate the results of those events and compute a statistic. If we had a perfect understanding of quantum mechanics we could have perfect observations, and one atomic collision might suffice to say a Higgs particle was detected (albeit this is a self-contradictory argument given Heisenberg uncertainty). Evolution itself, a product of quantum effects visible at the macroscale, has basically evolved to optimize the benefit it gets from quantum fluctuations; as a consequence, living things control their DNA damage and repair so that mutations occur at a manageable level, creating raw material for future evolution while tolerating the more abundant negative effects. The point is that there is a considerable range of phenomena with potent quantum effects that we really don't understand. These videos make it sound like we know. Remember one thing: the universe, to us, ends at the CMBR; that's it, that is the finite end. We cannot see before it, or beyond it to other parts of the universe that expanded away from us in time.

The quantum entanglement issue is another: the videos make it seem as if a quantum state can be pushed to a certain outcome, but is that true, or are they simply sorting the outcomes that present one result and ignoring (or allowing to pass by) the outcomes that don't? -
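The Monte Carlo point above can be sketched with a toy case where the exact answer is known: the tail probability of a binomial outcome. The simulated p-value converges on the exact one as the number of repetitions grows; the specific numbers below are arbitrary.

```python
import random
from math import comb

# Toy p-value: probability of K or more heads in N fair coin flips.
N, K = 100, 60

# Exact tail probability from the binomial distribution.
exact = sum(comb(N, i) for i in range(K, N + 1)) / 2 ** N

# Monte Carlo estimate of the same tail probability.
random.seed(1)                  # fixed seed so the sketch is repeatable
reps = 50_000
hits = sum(
    sum(random.random() < 0.5 for _ in range(N)) >= K
    for _ in range(reps)
)
estimate = hits / reps
print(f"exact p = {exact:.5f}, Monte Carlo p = {estimate:.5f}")
```

With 50,000 repetitions the estimate lands within a few parts in a thousand of the exact value, which is the post's "limit of infinite repetitions" made concrete.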
The graphics are nice, but nicer (and far less bandwidth) would have been a mean altitude and a standard deviation. MS Excel functions: "=AVERAGE(value1, value2, ..., valueN)" or "=AVERAGE(range)", and "=STDEV(value1, value2, ..., valueN)" or "=STDEV(range)". If you are lazy, count 20 roids: take the lowest altitude and the highest, add them and divide by two for the mean. The stddev is roughly (highest minus the median)/2.
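The lazy estimate can be checked against the real statistics; the 20 altitudes below are a made-up sample, not actual game data.

```python
import statistics

# Made-up sample of 20 asteroid altitudes (km) to compare the quick estimate
# against the true mean and standard deviation.
altitudes = [4210, 3980, 4450, 4120, 4330, 3890, 4270, 4060, 4510, 4150,
             4380, 4020, 4240, 4400, 3950, 4190, 4310, 4080, 4470, 4100]

mean = statistics.mean(altitudes)
stdev = statistics.stdev(altitudes)
midrange = (min(altitudes) + max(altitudes)) / 2   # the post's quick "mean"
rough_stdev = (max(altitudes) - midrange) / 2      # the post's quick stddev
print(f"mean {mean:.1f} vs midrange {midrange:.1f}")
print(f"stdev {stdev:.1f} vs rough {rough_stdev:.1f}")
```

For a roughly symmetric sample the midrange tracks the mean well, and the (max minus midpoint)/2 rule, i.e. range/4, gives a ballpark standard deviation.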
-
Why would you do that to us? Ignorance is bliss, lol. Fortunately cfgs are easy to mod.