Everything posted by PB666
-
Where are their controls, and by what absolute measure do they measure these distances? In reality, even our measurement of the Sun's gravity and mass is biased by an interpretation of G and each body's gravitational parameter, which may vary in cases by up to 1 part in 10^6 and may not be constant over the lifetime of the universe (in fact it probably is not), because gravity is the warping caused by an amorphous collective quantum space-time, and space and time did not exist at the beginning of the universe and probably did not resolve until the quantum singularity had 'fractured' in various and sundry ways. Gravity as we know it probably evolved and is still evolving (e.g. potentially under an influence of dark energy). The other name for dark matter is dark gravity; just to make the point, we have not isolated any particle that qualifies as dark matter. So yes, I disagree that the word precision should be used outside a special context. This is exactly the kind of scenario in science where, 50 years on, after gathering more information about quantum mechanics and more theory and experimentation concerning quantum space-time, they come back and say, whoops, we have to recondition those old measurements with these new equations for estimating distances in space, particularly distances above and below the galactic plane.
- 12 replies · Tagged with: astronomy, astrometry (and 1 more)
-
To point out: the RF does not leave the cavity, but what it might be doing is establishing electron orbitals that exceed the dimensions of the cavity and circulate around the thruster. The electrons then interact with matter outside the cavity via virtual particles (more classically realized as electrostatic repulsion between electron orbitals). There is no theoretical limit on the size of electron orbitals; they can be as big as the universe, and the lowest-energy orbitals in graphene sheets span the area of the sheet; for any 4n+2 pi-bonded material there is at least one orbital that includes all the pi bonds. This means they might be effective at great distances. However, theory allows for a lot of things that are not observed, and I don't think long-range interactions will be the case; more likely a range drop-off in the meter range (probably already diminishing) that is completely lost in the 10-100 meter range. The problem here, on both sides, is that you are allowing assumptions and preconceptions to lead the data, which is never good in science. The process is to test and retest under a variety of more and more stringent conditions until you have a set of limits that can be tweaked, prodded, altered, etc., until one of these conditions demonstrates the physical nature of its bias.
-
4 (+) Hold the glutinous fungus-fermented dairy products.
-
-9 (+) Only through failure does one truly learn.
-
Culture exists in all social animals; the fact that we do animal archaeology (pack-rat middens, stone-age cultural studies in chimps) is evidence. Human-like culture means the following:
1. Intense manipulation of the environment (megafauna extinction events coinciding with human occupation), debated but widely regarded as true in some form. (50,000 years ago)
2. A remnant of abstract thinking (such as Blombos Cave in Africa). (90,000 to 200,000 years ago)
3. Composite tools. (100,000 to 250,000 years ago, probably earlier)
4. Complex language. (200,000 to 1,000,000 years ago)

Intense manipulation is generally thought to be the result of collective behavior (sedentary lifestyles, agriculture, advanced tribalism, domestication of both edible plants and pack animals). For example, the LBK (Linearbandkeramik) is a combination of Triticeae agriculture (either as food or as a fodder crop), T1 taurine cattle, a certain style of building (the semi-subterranean longhouse), linear band ceramics, and certain religious practices. This type of manipulation converted areas of European forest (namely evergreen) over time into non-evergreen mixed grasslands and farmland. If we had been standing in the Europe of that era, having just been plopped there by visitors, we would remark on the well-developed pine and cypress forests; those forests altered the soil, favoring their own existence after the last ice age. Humans altered that balance. We can say that by 8,000 years ago humans had fully reached this capability. What is really different for modern humans versus all other humans is that we can do this over so many ecological backdrops: from the coldest places on earth to the hottest, from the wettest to the driest, to the highest. In terms of terrestrial exploitation, we and our domesticates live at the extremes of the complex eukaryotes. Humans include in their diet more different types of complex animals and plants than any other species on earth, and we extract them both locally and at great distances, under hostile circumstances (e.g. king crab harvesting).

Abstract thinking is not uniformly agreed to be possessed by all humans, but it is common in every extant human group, and it is generally agreed upon as the seed for the apical progression of art and technology in culture. It is also involved in advanced trading systems, and so it allows the progression of complex societies. It is probably as old as the AoA pre-expansion in Africa (50,000-100,000 years ago), but likely much older.

Composite tools: for millions of years hominids existed using primarily simple tools, hand-axes, scrapers, clubs, and the simple spear. Within the last 50,000 years or so came the progression to the atlatl, the adze, and the bow and arrow. In addition, complex homebuilding allowed people to move from areas of the world with amicable night-time temperatures to ones where night temperatures were well below human tolerances. This had fully occurred by 30,000 years ago; the dugout is at least in the 50,000-year range, and beyond-line-of-sight travel is over 30,000 years old. Complex tools could evolve through mimicry of plants or animals in the environment, but abstract thinking extends the boundaries; imagine, for example, what a pyramid might look like without math and geometry.

Language: again, language evolved, but sufficient human language is old. Other animals can converse; humans probably have the best cognitive facility for doing so, and that facility looks to have evolved on the order of 200,000 to a million years ago.
When we talk about sentience in terms of exobiology, we are talking about the potential for communicating with us; this generally would mean another species that has abstract thinking, composite tools, and language as prerequisite features. Over and above this, achieving the level of technology needed to be recognized at anything but very close quarters would require something like intense manipulation (metallurgy, ore mining, deforestation for charcoal, etc.). By realizing this we set the bar for culture much higher than just any animal culture: it has to be a culture capable of fairly rapid technological progression. Otherwise we would probably not recognize them (either we pass them over, or we trample them trying to terraform their planet).
-
Not so, you cannot map a billion stars precisely (unless you redefine precision). You can only precisely map the current positions of objects whose velocity is similar to our sun's, given space-time. You can map their position as it was (distance in light years) years ago, but without the velocity and acceleration vectors you cannot know where they are now, and even if you had those, that 'now' is irrelevant because no single spatio-temporal reference frame exists in our visible universe. All stars in the Milky Way galaxy are moving relative to one another, and all are accelerating relative to one another (even though they are typically in non-inertial reference frames, those reference frames are themselves in motion).

You might be able to discern parallax and astronomical position over several years, but on the scale of local clusters that precision declines greatly as distance increases. Let's say those distances are accurate to 1/1,000,000. Use the toy model y = (x/4)^3, i.e. one star within about 4 ly (roughly the distance to the nearest star) and uniform density beyond: when y = 1, x = 4 ly, and when y = 1,000,000,000, x/4 = 1000, so x = 4000 ly. Note there is no absolute way of knowing how inaccurate your distance measurement is. At an error rate of 0.0001% the distance error at 4000 ly is 1/250 of a light year, or about 3.8 x 10^13 meters. If the fractional error is 1/10,000 it is 0.4 of a light year, and if it is 1/1000 (which I strongly suspect is the high end of their accuracy) it is 4 ly. If we then consider a probable range, it means the stars at the outer boundary of that distance carry errors of roughly 0.4 to 4 ly at minimum. And the star field is not flat, and there is a selection against red stars at a distance, so the reality is that the error on the furthest stars of the billion is more like 100 ly.

Ask yourself this question: how is it possible to know with precision the current position of a star thousands of light years from the Sun whose velocity relative to our sun is different and constantly changing? So imagine you had a spaceship and today, right now, you could go at 0.1c to a star at a distance of 5000 ly based on this precision (50,000 years of travel). Precision is this: if I have a high-powered rifle and I mount it in a device where the barrel is firmly fixed, and I repeatedly clean and fire the weapon, the bullets hit the same target every time (barring hurricanes, changes in gravity, and highly unexpected geological events). Its track per unit distance is precisely (key word) the same; it never changes. But all my targets for a high-powered rifle are alike in that they coexist in a definable space-time relative to the rifle's position. Over long distances in space, space-time reference frames are not easily defined, because of things like dead stars, black dwarfs, black holes, and dark matter. So as you travel to your d = 5000 ly star, you would miss the target by 10-100 ly (that would be 100-1000 more years of travel) because you got the velocity and arc of motion wrong. Over a period of 50,000 years, stars in our local cluster travel 3 or 4 light years and we know their positions 'precisely', but a star whose velocity and acceleration aren't known precisely would not be where we expected it to be. First, its original position changes by the time its light reaches Earth, and then its position changes again while we travel. It takes more than a few years of study by an orbital telescope to accurately measure distances and relative velocities, and there are anomalies between the stars that alter our measurements (warping in space-time).
Let's not feed their hype machine. It takes years of measuring stars and their motion to discern the variations in velocity that reflect their space-time, and the variation of space-time between the object and the sun over time.
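For what it's worth, here is a back-of-envelope sketch of the toy numbers above (assuming uniform stellar density with one star per 4 ly sphere and nominal fractional distance errors; an illustration only, not Gaia's actual error model):

```python
# Toy model from above: y = (x/4)^3 stars within radius x (ly), assuming
# uniform stellar density. Not Gaia's real error model, just an illustration.
import math

LY_IN_M = 9.461e15  # meters per light year

def radius_for_star_count(y, nearest_ly=4.0):
    """Radius (ly) of a sphere holding y stars at uniform density."""
    return nearest_ly * y ** (1.0 / 3.0)

r = radius_for_star_count(1e9)          # ~4000 ly for a billion stars
for frac_err in (1e-6, 1e-4, 1e-3):     # assumed fractional distance errors
    err_ly = r * frac_err
    print(f"radius {r:.0f} ly, fractional error {frac_err:g}: "
          f"{err_ly:.3g} ly = {err_ly * LY_IN_M:.2g} m")
```

With a billion stars the enclosed radius comes out near 4000 ly, and the three assumed error rates give position errors of about 0.004, 0.4, and 4 light years respectively.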
- 12 replies · Tagged with: astronomy, astrometry (and 1 more)
-
3 (+)
-
3 (+)
-
An exceptionally foolish occupation
-
5 (+)
-
9 (+)
-
7 (+)
-
You can always @Dman979
-
We went through this last round; right or wrong, you have to wait for someone else to respond, then you can post.
-
6. In the case of the current number being incorrect due to intentional or unintentional misinformation/misinterpretation, a player must notify others of the error, and will have to find the last correct number and quote that post (for easier accuracy checking) and the game continues as normal from the last correct number. You appear to have posted a number, realized your error, and then appended a new number.
-
You can't post a number twice in the same post, sorry: "You cannot perform any operation again until another person has successfully* perform an operation to change the current number." Frybert's 5 is followed by my 6 (+)
-
Nope, they cannot expect Earth to be a good place. The best places to go are planets whose parent star is just coming into the main sequence and that have undeveloped life: less risk of culture-killing diseases. Since these bugs can live deep in the earth and at elevated temperatures, it would be difficult to eliminate them. If you have the ability to go interstellar, then I think you certainly want the extraplanetary resources; a system with a lot of comets and asteroids will do (assuming you have fusion). Occasionally, as an insurance colony, you would terraform a planet around an orange star early in its main-sequence stage, but mostly as a safety. The reason they would attack Earth is the risk we create, and they would want to learn our technology and explore our biota first, robotically of course.
-
4 (+) If you spot a potential error, just mention the last good post and post a number to move on. Looks like non-contemporary duplications are OK, so just mention the one you are following and post.
-
3 (+)
-
Banned for defacing a canine.
-
1 (+)
-
A detached object mission for New Frontiers 4
PB666 replied to _Augustus_'s topic in Science & Spaceflight
Lots of dead fuel rods sit at nuclear reactors around the country from which plutonium can be extracted. You would not need 3 times the power, you would need 9 times the power, so you would need 9 MMRTGs, or 9 times the antenna; that weight gain is the deal breaker. I suspect that most of these distal Kuiper belt objects are pretty much the same thing, liquefied and frozen gases covering an unknown interior. There is no real utility in studying them unless you intend to exploit them, and I can hardly see the justification for that.

IMO, in the very deep outer solar system a low-gravity body is a much better choice, because you do not need to invest dV to land or get off. At Pluto's orbit it takes about 4000 m/s of dV to enter a circular orbit about the sun (a quick vis-viva check is sketched after this post). If you had a great (unrealistic) ion drive system with fusion power and something like 60,000 m/s of dV, you could invest the 8000 needed to break orbit and the 10,000s needed to decelerate at Earth, but that's pushing it; really it's on the boundary of what is theoretically possible with today's technologies.

A much, much, much better and more practical idea is to find a short-period comet with a periapsis beyond Jupiter's orbit, or a Trojan object, and attempt a landing on it. Very difficult, because of the horizontal dV required to gain orbit and the vertical dV required to stop the descent (or take forever getting there). Land on the object, collect samples, and return to, say, Mars to await a fly-by mission that could be piggybacked to Earth. With the Jovian Trojans there is a much more practical benefit: the objects are just beyond the conditional sublimation line, which means volatiles are relatively stable for long periods, and there is a mixture of dust balls and comets, providing diversity. If these are not sufficient, then missions could be planned to the Saturnian Trojans. The advantages:
1. Light sufficient to do passive spectroscopy.
2. A means of mapping landing sites and studying rotations.
3. Many objects in relatively close proximity and in the same gravity well, meaning not too much dV is required to select the best asteroids for landing.
4. Both rocky objects (covered with ice) and slush-balls are expected.
5. Close enough for newer solar panels to work, close enough for antennas to work.

There is no reason at this point, other than vanity, to go chasing planetoids into interstellar space; we already have Voyagers on that mission.
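The 4000 m/s figure above can be sanity-checked with the vis-viva equation. A minimal sketch, assuming a two-body Hohmann transfer from Earth's orbit (1 AU) out to Pluto's roughly 39.5 AU; the numbers are illustrative, not mission-design values:

```python
# Rough vis-viva check of the ~4000 m/s circularization figure (assumed
# two-body Hohmann transfer, Earth 1 AU -> ~39.5 AU; illustrative only).
import math

MU_SUN = 1.327e20   # solar gravitational parameter, m^3/s^2
AU = 1.496e11       # meters per astronomical unit

def visviva(r, a):
    """Orbital speed (m/s) at radius r on a solar orbit with semi-major axis a."""
    return math.sqrt(MU_SUN * (2.0 / r - 1.0 / a))

r1, r2 = 1.0 * AU, 39.5 * AU          # Earth's orbit -> Pluto's distance
a_transfer = (r1 + r2) / 2.0          # semi-major axis of the Hohmann ellipse

v_arrive = visviva(r2, a_transfer)    # speed at aphelion of the transfer orbit
v_circ = visviva(r2, r2)              # circular-orbit speed at Pluto's distance

print(f"arrive {v_arrive:.0f} m/s, circular {v_circ:.0f} m/s, "
      f"burn ~{v_circ - v_arrive:.0f} m/s")
```

This prints a circularization burn of roughly 3700 m/s, in line with the "about 4000" quoted above.
-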
Asteroids are natural selectors. And by the way, Kimura set down in the fifties or so that just about all the evolution we see is the result of variable selection, which kind of replaces natural selection. Over time selection varies: today asteroids, tomorrow humans, and later on a red-giant sun. If you look at the most evolved genome humans have modified, it's not genetically modified corn or soybeans; it's actually 8,000-year-old bread wheat, whose genome consists of a fiddley wheat, a spelt, and a goat grass, with large chunks removed. It's not clear how humans managed this, but it appears that the A and B genomes (wheat and spelt) formed first, and the goat grass was added later. The various contaminants were common in the fields at the time, and luck would just have it that way. But on the question of cultivation, humans are not the first; leaf-cutter ants cultivate. What we do is evolution, whatever tag you want to place on it; "artificial selection" just divides variable selection along an anthropocentric view of evolution. The word artificial implies intentional, but some of what is claimed to be artificial was not intentional. The dingo, for instance, was not intended to be a free-roaming wild dog, nor was it necessarily meant to introgress with European breeds, but alas, that's how some of the Australian dog breeds came about.
-
With the basic assumption that we survive, I suspect that in the attempt to solve questions about where the universe comes from, humans will accidentally create a new universe and end the current one; oddly, we will have answered the question but will be unable to tell anyone.
-
How good are you at statistics? I'm going to say something here that is very theoretical but has very practical implications. At the subatomic level, how we understand electrons, protons, ions, etc. is determined much more by quantum mechanics than we biochemists like to account for. According to the Copenhagen interpretation, the lack of space-time continuity at the quantum level resolves itself at the level of observation, which is suitable for a chemical interpretation. This sounds very theoretical, but on the scale of the cell, processes can be practically inferred from the laws of mass action; this basically means that quantum uncertainty ultimately results in a kind of distribution of outcomes over measurable time, a statistical outcome.

So let's say we took a bacterium, found it to be composed of 100,000 processes, and created a stoichiometric equation, just as you did in chemistry. We would not get the same result every time, because in chemistry you frequently don't list the Nth possibilities, which are random; quantum events would create outcomes from some chemistry that would be unexpected unless you ran each equation 100 times. So the next question is how much of a bacterium to model (in terms of space-time, the scale of which is down in the 10^-40 range). You'd have to have enough to represent the continuum of hundreds of atomic and molecular processes somewhere between 10 and 1000 times. You would not have to represent the quantum events themselves, just their statistical outcomes.

Consider water all by itself. Water is not just water; what we think of as water is almost never in a pure state. You could not take water at a given moment, rotate the molecules in space, and get exact replication in anything near quantum space-time. Aside from the bond wobbling that goes on, the stretching and shrinking, the electrons are moving further from and closer to the molecule, and sometimes electrons from adjacent molecules swap in a process where hydrogen is handed off or returned (which is the reason water has a pH of 7). Now throw in ions such as sodium, potassium, etc., and this becomes a complex mess, and you haven't even started on the biochemistry. Biopolymers use water to force hydrophobic parts of their structure to interact, while at the same time exchanging protons with water in very complicated ways (histidine being an example). Ions in water flow into and out of pockets on biopolymers; they chelate. This is the basic chemistry of life, which is pretty much to say that energy flow within biological systems is conducted by polymorphic aqueous states at some point in its pathways. There are exceptions such as fat mobilization and lipoprotein complexes. But this one super-system, in and of itself, given quantum mechanics, is a nightmare to model.
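To make the "distribution of outcomes" point concrete, here is a minimal Gillespie-style stochastic simulation of a single hypothetical reversible reaction A <-> B (the rate constants and molecule counts are made up for illustration; real cellular chemistry involves thousands of coupled reactions):

```python
# Minimal Gillespie-style stochastic simulation of A <-> B.
# Hypothetical rate constants and molecule counts, for illustration only:
# molecular-scale randomness yields a distribution of outcomes, not one answer.
import random

def gillespie(a=100, b=0, kf=1.0, kr=0.5, t_end=5.0, seed=None):
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        rates = (kf * a, kr * b)           # propensities of A->B and B->A
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)        # time until the next reaction event
        if rng.random() < rates[0] / total:
            a, b = a - 1, b + 1            # A -> B fires
        else:
            a, b = a + 1, b - 1            # B -> A fires
    return a, b

# 100 replicate "cells": identical start, different final compositions.
finals = [gillespie(seed=i)[0] for i in range(100)]
print(min(finals), sum(finals) / len(finals), max(finals))
```

Run it and the 100 replicate "cells" end with different compositions scattered around the mass-action equilibrium (about 33 A molecules here); that spread is the statistical outcome described above.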