Everything posted by cubinator
-
From the photo of Alcor and HD 116798 (the third star next to Mizar and Alcor), I seem to be getting an image scale of 9.895343242e-5 deg/pixel, based on the positions of the stars on the date I imaged them. That gives me a Moon radius of 0.28583 degrees, which is still too large. Interestingly, the Mars scale predicted by this is very close to Stellarium's answer, and to the eyepiece scale that would be predicted by an apparent FOV of 43 degrees. So I'm still not sure why the Moon seems to be too big. I may have made a mistake in my calculations, or picked points that were not actually on the edge of the Moon, so I am going to write a script to do the circle radius calculation automatically from given points, and try it with multiple sets of points and on multiple images. I'm also going to check whether this really is larger than the Moon can ever possibly appear in the sky from Earth.
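For the automatic circle fit, a least-squares (Kasa) fit along these lines should work; this is a minimal MATLAB sketch with made-up limb coordinates standing in for real picked points, and the plate scale from above:

```matlab
% Least-squares (Kasa) circle fit to points picked on the Moon's limb.
% The pixel coordinates below are placeholders, not real measurements.
x = [512; 1800; 2950; 2400; 900];    % limb x-coordinates (px)
y = [400; 150; 900; 2300; 2100];     % limb y-coordinates (px)

% Solve 2*xc*x + 2*yc*y + c = x.^2 + y.^2 in the least-squares sense,
% where c = r^2 - xc^2 - yc^2.
A = [2*x, 2*y, ones(size(x))];
b = x.^2 + y.^2;
p = A \ b;
xc = p(1);  yc = p(2);
r_px = sqrt(p(3) + xc^2 + yc^2);

% Convert to degrees with the plate scale from the Alcor / HD 116798 photo.
scale = 9.895343242e-5;              % deg per pixel
fprintf('Moon radius: %.1f px = %.5f deg\n', r_px, r_px * scale);
```

Fitting many limb points at once should also make any single badly picked point matter less.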
-
What would a LF2 (Liquid Fluorine) exhaust plume look like?
cubinator replied to KeaKaka's topic in Science & Spaceflight
I think the vibration would shatter normal glass. -
I measured the eyepiece apparent FOV using the laser and got 65 degrees. This is worsening my problem by causing the estimate for the Moon's size to be even bigger. I'm not sure what's going wrong in my math:

Telescope focal length = 2030 mm
Eyepiece focal length = 12 mm
Telescope magnification = 2030/12 = 169.17
Eyepiece apparent FOV = 65 deg
Telescope FOV = 65/169.17 = 0.384 deg
FOV radius in image = 1271.5 px
Moon radius in image = 2888.5 px
Moon size/FOV size ratio = 2888.5/1271.5 = 2.27
Moon diameter = 2.27 * 0.384 = 0.873 deg

I think that imaging Mizar and Albireo will give me a better unit for angular distance than trying to figure out the FOV itself from unknown design parameters. I should be able to do that tonight or tomorrow.
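Here is the same arithmetic in MATLAB, plus the check run in reverse: what apparent FOV the eyepiece would need for the Moon to come out at roughly 0.57 degrees, about the largest it can ever appear from Earth (it comes out near 42-43 degrees):

```matlab
% The numbers from this post, plus the reverse check: what apparent FOV
% would make the Moon come out at ~0.57 deg (roughly its maximum diameter)?
f_scope = 2030;                      % telescope focal length (mm)
f_eye   = 12;                        % eyepiece focal length (mm)
afov    = 65;                        % eyepiece apparent FOV (deg), laser-measured

mag   = f_scope / f_eye;             % ~169.17x
fov   = afov / mag;                  % ~0.384 deg true field
ratio = 2888.5 / 1271.5;             % Moon radius / field radius in the image

moon_diam = ratio * fov;             % ~0.873 deg -- too big
afov_back = 0.57 / ratio * mag;      % apparent FOV implied by a 0.57 deg Moon
fprintf('Moon diameter: %.3f deg, implied apparent FOV: %.1f deg\n', ...
        moon_diam, afov_back);
```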
-
lol nope, mine is a Schmidt-Cassegrain, so no lens. Looks like this covered in frost: I'm pretty sure I can just set it sideways on my desk or something, with a piece of paper some measured distance away. Then I can draw marks where the laser hits and quickly measure the angle. I can double check it against Mizar/Alcor or such one of these nights too.
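Assuming the angle comes from two marks on the paper a known distance apart, at a known distance from the eyepiece, the conversion is simple trigonometry (the numbers below are placeholders that happen to give about 65 degrees):

```matlab
% Apparent FOV from the laser-and-paper setup: two marks on the paper at the
% edges of the projected field, a measured distance from the eyepiece.
% Both numbers are placeholders.
d_paper = 500;                       % eyepiece-to-paper distance (mm)
w_marks = 637;                       % separation between the edge marks (mm)

afov = 2 * atand((w_marks / 2) / d_paper);
fprintf('Apparent FOV: %.1f deg\n', afov);
```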
-
I tried measuring the size of the Moon using a photo, but the result was slightly larger than I think should be possible. I had to look up the apparent field of view of my eyepiece in order to calculate this, and since my eyepiece is very old it might not be the same as its modern counterparts advertised online...I may be able to calibrate my measurements by taking a photo of some stars with known distances, like Mizar and Alcor, using the same setup.
-
For the same parameter sweep as above, I've calculated the total deviation from the observed situation. The graph is a simple 2D line plot, although it does jump around a lot because it is sweeping over so many variables. The horizontal axis is simply an index number for the location within the 5-layer loop, and the vertical axis is the sum across all 8 observations of the difference between one Moon radius and the distance between the Moon and Mars in the simulation at the observation times. A point on this graph where the error is zero represents a combination of Moon-Mars trajectories that exactly recreates the observation times that were measured in real life (in that the observation times correspond with a time where Mars and the Moon are exactly one Moon radius apart). As you can see on the right, all of the values checked during this run had at least some error, and you can see on the left that this set is far from comprehensive. Now that I know the code works, I can run it for longer across more trajectories, to see if I can find the 'sweet spot' where the error is closest to zero.
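The error metric itself is simple; a minimal sketch, with placeholder values standing in for the simulation output:

```matlab
% Total deviation for one candidate Moon/Mars trajectory pair: at each of the
% 8 recorded ingress/egress times, the simulated Moon-Mars separation should
% equal one Moon radius. sep_sim and r_moon are placeholder values.
sep_sim = [0.260 0.262 0.258 0.261 0.259 0.263 0.257 0.260];  % deg, from the sim
r_moon  = 0.259;                                              % deg, Moon radius

err_total = sum(abs(sep_sim - r_moon));   % zero only for a perfect match
fprintf('Total deviation: %.4f deg\n', err_total);
```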
-
I'm now able to sweep through possible trajectories for Moon and Mars together. Mars' distance from the Moon during the MN occultation is known, but its velocity and orientation are not. So the position of Mars is determined partially by the position assigned to the Moon and partially by a swept parameter. Then its direction of motion is swept as well to determine its position at other times. This plot shows a range of Moon trajectories (the long lines) and a range of their associated possible Mars trajectories (the short lines; each set of trajectories converges at the position Mars occupies at the midpoint of the MN occultation, and extends left back in time to the start of the CA occultation and right forward to the end of the UK occultation). Mars' exact orientation relative to the Moon is not known (although I could potentially establish it with a really accurate map of the Moon's surface latitudes and longitudes), so I sweep over possible orientations 0-180 degrees relative to the Moon's center and the ecliptic plane (a rough sketch of this sweep is at the end of this post).

Here is the same figure, zoomed out to show the variety of initial Moon positions calculated. Each Moon position has a set of Mars positions associated with it. You can think of this plot as what you'd see if you were at the center of Earth, looking out toward the Moon and Mars against the starry background. It's not exactly what was observed by us, since we are in different spots where the Moon appears in slightly different positions against the stars - for instance, the UK is further north, above the ecliptic plane where this plot is, and so the Moon lines would appear shifted downward because of parallax with the Moon. You can imagine the 'big lines' as being in front of the 'small lines'.

The next phase is to make this simulation check for the "target condition", which has to do with the location of the Moon and Mars at the observation times. For a simulation that matches the real positions of the Moon and Mars, the angular distance between them at each recorded time should be exactly one Moon radius for their respective observers, since these are ingress and egress times. Any other simulation will "miss" the times by some substantial amount. Once I do that, I should be able to check combinations of Moon/Mars trajectories for similarity with the real thing.
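Here is that rough sketch: for one candidate Moon position, generate the family of Mars tracks by sweeping the orientation angle from 0 to 180 degrees (every number below is a placeholder, not a value from the real run):

```matlab
% One family of Mars tracks for a single candidate Moon position, sweeping the
% orientation angle 0-180 deg from the Moon's center relative to the ecliptic.
% Every number here is a placeholder, not a value from the real run.
moon_pos = [0; 0];                   % Moon center on the sky plane (deg)
d_mars   = 0.20;                     % Mars-Moon separation at MN mid-occultation (deg)
v_mars   = [0.006; 0.0005];          % Mars angular velocity on the sky (deg/min)
t        = -200:10:200;              % minutes relative to mid-occultation

figure; hold on
for theta = 0:15:180                                       % swept orientation (deg)
    p0   = moon_pos + d_mars * [cosd(theta); sind(theta)]; % Mars at mid-occultation
    traj = repmat(p0, 1, numel(t)) + v_mars * t;           % short Mars track
    plot(traj(1,:), traj(2,:))
end
axis equal
xlabel('ecliptic longitude offset (deg)'); ylabel('ecliptic latitude offset (deg)')
```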
-
I've started creating different scenarios to sweep through. Here's a plot of a range of possible trajectories for the Moon: This plot represents a 10x10 grid of possible 'initial' positions (at 0 UTC on Dec. 8th), and for each position the path of the Moon is plotted through space for 100 possible 'inclination' angles between -5 and 5 degrees. As you can see, this plot is not comprehensive, but I can simply add more resolution to the arrays in order to eventually get there. I created this simple one so it wouldn't take too long to compute and so I could clearly see whether my loops were functioning. Looks like they are working as expected!
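For reference, the loop structure is roughly the following (the ranges, the time grid, and the ~0.55 deg/hr mean motion are illustrative stand-ins, not the exact values in my script):

```matlab
% Loop structure of the sweep: a 10x10 grid of initial Moon positions at
% 0 UTC on Dec. 8, and 100 candidate 'inclination' angles for each.
% The ranges and the ~0.55 deg/hr mean motion are illustrative placeholders.
lon0 = linspace(-1, 1, 10);          % initial ecliptic longitude offsets (deg)
lat0 = linspace(-1, 1, 10);          % initial ecliptic latitude offsets (deg)
inc  = linspace(-5, 5, 100);         % candidate inclination angles (deg)
t    = 0:0.25:24;                    % hours after 0 UTC
rate = 0.55;                         % Moon's mean motion across the sky (deg/hr)

figure; hold on
for i = 1:numel(lon0)
    for j = 1:numel(lat0)
        for k = 1:numel(inc)
            lon = lon0(i) + rate * cosd(inc(k)) * t;   % track across the sky
            lat = lat0(j) + rate * sind(inc(k)) * t;
            plot(lon, lat)
        end
    end
end
```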
-
The Moon's angular size is probably different enough from when I last measured it that it'll cause quite a change in the timings, like lasting consistently longer or shorter than expected. Rather than try to find that difference out from the times, I think I'll use my known telescope parameters and the photo to calculate a new angular size for the Moon. That's pretty much how we did it in the Moon parallax experiment, and it should get me more accurate values for the following four quantities:
- Moon angular size
- Mars angular size
- Moon distance (given Kepler's laws, this might help characterize the Moon's orbit further)
- Mars/Moon angular rate
I just have to double check which eyepiece I was using, so I'm not guessing. I'm probably going to wait around for the Moon to show up so I can do that.
-
What funny/interesting thing happened in your life today?
cubinator replied to Ultimate Steve's topic in The Lounge
"BREAKING: NEW PICTURES FROM MARS ARE COOL" -
It does. Placing the Moon in the wrong position by a few hours causes Mars' estimated angular velocity to be different. What's worse, though, is that changing the Moon's direction by one degree causes Mars' velocity to change by 5%, where I need it to be accurate within around 0.1% to get anywhere close to detecting parallax. So I need to know not only exactly where the Moon is, but where it's going.

I *think* I can do this by sweeping over only three variables: the Moon's initial position, the Moon's initial direction of motion, and Mars' exact position during the MN occultation (which is actually more of an orientation constraint). That's good, because finding a minimum in four- or five-dimensional data sounds hard. Everything else seems to be dependent on those, the most egregious being Mars' velocity, which has both a magnitude and a direction dependent on its own position, the Moon's velocity, and MN's velocity. I could calculate the velocity first and then the position, but I'm guessing it wouldn't be easier.

Once I get all that from those three variables plus the observer location information, I should be able to calculate the Moon-Mars angular distance at each of the times we recorded. Since these are all ingress or egress times, that distance should be the same as one Moon radius (a sketch of this check is at the end of this post). If it's not, then that means the combination of Moon position, Moon velocity, and Mars position is wrong. Repeat with another combination until I find one that matches.

My expectation is that it will never match exactly, but will be off by about one Mars diameter in the UK. This is the parallax, and a couple of SOH-CAH-TOAs later the solar system will be my oyster. Hopefully. And THIS is a pretty straightforward, best case scenario! It probably has logic holes and flaws in it that I'm not seeing yet, but whatever. It's been my project for the day, and I will see where it takes me tomorrow!
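The core test at each recorded time might look something like this in MATLAB (the direction vectors and Moon radius are placeholders standing in for whatever the simulation produces):

```matlab
% The basic check: the Moon-Mars angular separation seen by one observer at
% one recorded ingress/egress time, compared against one Moon radius.
% The direction vectors and radius are placeholders for the simulation output.
obs_to_moon = [0.9995; 0.0300; 0.0100];   % observer -> Moon direction
obs_to_mars = [0.9995; 0.0255; 0.0110];   % observer -> Mars direction
r_moon_deg  = 0.259;                      % Moon angular radius (deg)

u = obs_to_moon / norm(obs_to_moon);
v = obs_to_mars / norm(obs_to_mars);
sep_deg = acosd(dot(u, v));               % angular separation (deg)

miss = sep_deg - r_moon_deg;              % ~0 for the right trajectory combo
fprintf('Separation %.4f deg, miss %.4f deg\n', sep_deg, miss);
```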
-
Kerbal Instrument Panel: In-Desk Apollo Themed Hardware Controller
cubinator replied to richfiles's topic in KSP Fan Works
That's really exciting!
-
What funny/interesting thing happened in your life today?
cubinator replied to Ultimate Steve's topic in The Lounge
Finally, we can see an object on Mars in a picture and say "It's not a rock!" -
After placing the Moon at an arbitrary location near the "Full" position and calculating its motion in spherical coordinates as viewed from MN, my code suggests that the angular velocity against the background stars is indeed slowed by MN's motion. The angular velocity of the Moon around Earth's center is 2.6617e-06 rad/s, and the angular velocity around MN during the occultation is around 2.0309e-06 rad/s. From this I should be able to reach an approximate value for Mars' angular rate across the stars, which should hopefully be about the same for everyone since Mars is so far away.
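A rough sanity check of those two numbers, with the latitude and the alignment of the observer's velocity with the Moon's motion as assumed placeholders, so it only lands in the right neighborhood:

```matlab
% Rough check: geocentric Moon rate vs. the apparent rate for an observer
% carried along by Earth's rotation. Latitude and the alignment factor are
% assumed placeholders, so this only lands in the right neighborhood.
T_moon  = 27.32 * 86400;             % Moon's sidereal period (s)
d_moon  = 384400e3;                  % mean Earth-Moon distance (m)
omega_g = 2*pi / T_moon;             % geocentric rate, ~2.66e-6 rad/s

v_moon  = omega_g * d_moon;          % Moon's orbital speed, ~1.02 km/s
lat     = 45;                        % observer latitude (deg), placeholder
v_obs   = 465 * cosd(lat);           % observer speed from Earth's spin (m/s)
align   = 0.8;                       % assumed fraction of v_obs along the Moon's motion

omega_t = (v_moon - align * v_obs) / d_moon;   % topocentric rate near midnight
fprintf('Geocentric: %.3e rad/s, topocentric: %.3e rad/s\n', omega_g, omega_t);
```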
-
I think this is wrong. A few words on my hiatus: This is an extremely difficult problem to think about. You have to consider the position of Earth, Moon, and Mars at the same time. You have to consider the position of three different observers on the Earth at different times. Earth is rotating. The Moon is moving. Mars is moving. Everything is at a weird angle. And when I say a weird angle, I actually mean three different weird angles lumped together. A couple of the angles are changing constantly. The rest change non-constantly. Precise positions and velocities turn out to be very important, and they are damn hard to infer based on loosely related data.

I am missing some collateral measurements that would have turned out to be very helpful, because I didn't know I would need them at the time. When I planned this observation, I thought I could measure Mars and the Moon against a background star - a nice common coordinate that would sort out all the insane irregularities between observing locations that I never fathomed. But the star was no match for the Moon's blinding glare, and the three of us - myself, @K^2, and @Starshot - were left measuring against a meaningless void, an infinite ruler with no tick marks. How am I to calculate the distance to Mars in this void? There are so many angles and distances and timings I can get wrong. If my equation for the prediction of the transit times drifts by mere seconds, the final distance estimate could double or quadruple!

Also I was busy graduating college and stuff. And writing an orbital mechanics simulator for a donut planet. Anyway. I think this problem can still be solved. But it's going to take more work than I first thought. Maybe a numerical simulation. That's not very "do it using only simple technology" but it's still "do it yourself". I've been trying to come up with solutions on and off ever since the occultation. Here are my thoughts on what might be a workable course of action for getting a measurement of the distance to Mars.

First, why I think I don't know Mars' angular rate yet. I came up with these 'circles' and center distances to figure out the depth of occultation in each location based on the duration. I used a constant Moon angular velocity between the three locations to solve for Mars' angular velocity. To be sure, I went to Stellarium to check that the sectors at the different locations were about what I expected. And I found something completely different. The occultation was longest in MN and shortest in the UK. We measured this, we know this for a fact. But in the UK, Stellarium showed Mars traveling right across the middle of the Moon, just about the longest possible sector! How could a sector so much longer than MN's produce the shortest duration occultation of the three?!

The answer, I think, is that the inertial velocity of the observers caused by the rotation of Earth is a significant fraction (up to 40% depending on latitude and time of day) of the Moon's orbital velocity, and causes the Moon's angular rate across the background sky to differ from one location to another. During moonrise (CA) and moonset (UK), the Moon's angular rate will be high because YOU don't have much velocity in the same direction as the Moon. But during the middle of the night (MN), when the Moon is high in the sky, the spinning Earth gives you a huge amount of velocity in the same direction as the Moon's orbit.
This slows the Moon's motion across the stars quite a bit, and I think this is what's responsible for the... difference in the differences in time.

I can get the angular velocity of Mars if I know the MN-specific absolute angular velocity of the Moon, since I know the occultation depth and duration. I can get the MN-specific absolute angular velocity of the Moon if I know the position of the Moon at ingress and egress. The trouble is that I don't know exactly where the Moon is. There are a couple of ways to go about this. One way would have been to measure the azimuth and altitude of the Moon by hand while I was out there measuring the occultation. I had no idea I would end up needing that, so I didn't do it. Now I'm stuck only sort-of knowing where the Moon is. I could look up what the Az/Alt was at the time using Stellarium, but that would be tacking on more "taken at face value" numbers to the problem. I'm going to see if I can do without for now. I can assign an arbitrary location for the Moon that's "close" to where it actually was, and calculate the Mars angular rate based on that theoretical location. It was a full Moon, so the Moon must have been at least fairly close to the Sun-Earth line, and I figure the difference of a few degrees will probably (hopefully) not change the estimate of Mars' angular rate too much. This is the method I'm probably going to try next.

It's extremely important for me to get an accurate estimate of Mars' angular rate because it is a significant factor in the timing of the occultation in different locations. If I predict Mars to be in the wrong place by one diameter after three hours, the timing prediction will probably be off by around 40 seconds. Since the timing difference due to parallax is probably not more than 40 seconds (corresponding to a distance between observers of 1 Mars diameter), that extra difference from the motion could cause the distance estimate to double.

After I estimate Mars' angular rate, I will ditch the simplistic guess of the Moon's location and start sweeping every possible location and velocity of the Moon to find occultation timings that are close to matching what was observed. I hope that I will be able to figure out where Mars is supposed to be in each case - that will be a problem for another day. My hope is that in the end, I will find a Moon trajectory that *almost* matches the observation, and the reason it is not exact is that Mars is assumed in the simulation to be infinitely far away. The only remaining difference between this simulation and reality should be the parallax, and the distance to Mars will be revealed.

As you can see, this is extremely complicated and takes a lot of steps that I don't fully understand. I will hopefully make progress on it and eventually solve it. Maybe I will try looking up the Az/Alt of the Moon and using that. I'd still probably have to mess with the Moon's direction of motion anyway, though. In the end, it is not enough for me to know how far away the Moon is. I need to know where it is. I think I have enough constraints to figure that out: the timings of the occultation in the three locations, and the depth of occultation in MN.

This is a very interesting problem to me, and I'd be very proud of myself if I figured it out. I'm still coming back to it for a few reasons. The distance and size estimates would be really cool and impressive on their own, but I think they also can enable some deeper characterization of the solar system that I've dreamed of doing for a long time.
If I know Mars' distance and angular rate, I can get the relative velocity between Earth and Mars (a quick placeholder sketch of that step is at the end of this post). From that and the length of the year, I might be able to get the size of Earth's and Mars' orbits around the Sun, and the Sun's size. The size of Earth's orbit is the key to a lot of interesting information that can be gained from observations of the other planets. I will take my time, and I will let you know if I come up with anything! Booting up MATLAB...again....again!
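For that first step, the relation is just distance times apparent angular rate, which gives the transverse part of the Earth-Mars relative velocity (both numbers here are placeholders, not measurements):

```matlab
% Transverse Earth-Mars relative velocity from distance and apparent angular
% rate. Both numbers are placeholders, not measured values.
d_mars     = 8.2e10;                 % Earth-Mars distance (m)
omega_mars = deg2rad(0.4) / 86400;   % apparent rate against the stars (rad/s)

v_rel = d_mars * omega_mars;         % transverse relative speed (m/s)
fprintf('Transverse relative speed: %.1f km/s\n', v_rel / 1000);
```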
-
What funny/interesting thing happened in your life today?
cubinator replied to Ultimate Steve's topic in The Lounge
I got a PLL skip and solved a 3x3 in 16 seconds one-handed. -
The flying debris definitely looked a lot smaller than in the orbital test.
-
AI Will Never Be Sentient... Cyborgs On The Other Hand...
cubinator replied to Spacescifi's topic in Science & Spaceflight
Brains definitely use something similar to weighted sums of signals, and there is spatial organization of signals which seems at least somewhat similar to the vector math that AI does. For instance, in the cricket nervous system first sounds are sorted by frequency (as in, there are literally a bunch of neurons in a row that each fire according to a certain frequency like a xylophone) and then the combination of frequencies (along with temporal information stored in a slightly more complex way) goes up to part of the brain where it gets physically sorted in space in a little lobe, where each important combination of sounds causes a corresponding neuron to send a signal to a different part of the brain or body - thus a sound coming from a nearby bat might create a signal in the cricket's body that stops it from singing.

Another important trick that factors into this is the presence of logic gates in the brain. You often have one neuron that needs to have two input neurons firing simultaneously in order to start its own signal - this is an AND gate, and it's very common. You can also have inhibitory signals, where an input neuron firing can cause the receiving neuron to go silent, so you can have a NOT gate, and of course OR gates are quite possible. Through these gates and the 'weighted' response of individual neurons and connections, more complicated signals can be interpreted and produce complex actions.
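As a toy illustration of that weighted-sum-plus-threshold picture (a cartoon in MATLAB, not a model of real neurons):

```matlab
% Toy version of the weighted-sum picture: a 'neuron' that fires when its
% weighted inputs cross a threshold acts as an AND, OR, or NOT gate depending
% on the weights. A cartoon, not a model of real neurons.
fire = @(w, x, thresh) double(sum(w .* x) >= thresh);

AND = @(a, b) fire([1 1], [a b], 2);   % needs both inputs active
OR  = @(a, b) fire([1 1], [a b], 1);   % needs either input active
NOT = @(a)    fire(-1, a, 0);          % an inhibitory input silences it

fprintf('AND(1,1)=%d  OR(1,0)=%d  NOT(1)=%d  NOT(0)=%d\n', ...
        AND(1,1), OR(1,0), NOT(1), NOT(0));
```
-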
What I note is the Dragon sitting on Falcon Heavy! I agree. It would still need some effort and maybe a test flight or two. I think taking stepping stones like this in deep space is an excellent way to prepare for Mars, visiting NEOs (and maybe taking a swing past Venus at some point? )
-
AI Will Never Be Sentient... Cyborgs On The Other Hand...
cubinator replied to Spacescifi's topic in Science & Spaceflight
I think we already do know how to give AIs agency. All you have to do is make it respond to input continuously, like being able to watch the world through a camera, and maybe additionally recursively so it can respond to its own response to give an internal monologue, conscience, etc. It's pretty easy to string different models together to accomplish more complex tasks. I think the big barrier to this right now is that really good AI models aren't quite fast enough to react to stuff in real-time on a local computer yet. They are a little too slow to take in all that sensory information. But that's changing astonishingly quickly, and some people are already experimenting in a few cases with letting AIs run loose in Minecraft or on the internet. I think people are still figuring out how to interact with these new AIs in the first place, and we are still at an early stage of learning what we can even do with them. That's why people are focused on experimenting with the more controlled "type a prompt, get a reply" sort of interaction as opposed to much more complicated, versatile, and nuanced systems that are possible a few steps down the line. -
AI Will Never Be Sentient... Cyborgs On The Other Hand...
cubinator replied to Spacescifi's topic in Science & Spaceflight
How well would the average human handle being put in the pilot's seat of an airliner and told to land it? How about a dolphin, rat, or dog? Each of us has plenty of unfamiliar situations that we're not "programmed" for and can't handle. I don't think this is a very solid definition of free will.

I think the thing that AI doesn't have right now, which defines whether people will perceive it as "sentient" or not, is agency. They sit patiently until someone tells them to do something like make an image or write a response to a prompt. They don't decide to get up and do something on their own yet. But they can very easily be made to. Pretty soon we will start seeing AI 'brains' connected to free-moving robots with sensors and locomotive parts. These robots will be able to see, hear, and feel the world and react to it "according to their programming". This programming will produce varying 'personalities' between robots much like how varying chemical compositions and distributions give personality and preferences to such things as ciliates, tardigrades, ants...and humans. The robots will have the capability of being averse to crossing a busy street, or turning to face a person approaching them and trying to figure out what the person wants from them. You will be able to give it a grocery list and have it go to the store to buy stuff for you.

They will have the ability to process feedback AND the ability to continuously observe and engage with the world. This will make it easier for them to start 'thinking' on their own. For instance, after it returns from the grocery store, you will be able to ask it if it saw anything interesting on its way, and it will be able to answer you based on what it considers 'interesting' to itself or you. If its programming is simply given a longer leash, so to say, you could even ask it if there's anyplace it wants to go, and it could answer, or maybe even go there itself.

I think it's true that most AI systems aren't 'sentient' in any way currently. But there are a lot of situations that blur the line already. I think an AI that has wishes and goals of its own and the ability to freely pursue them is completely possible today, and that sounds a lot like a living creature to me, even if it's not as smart as a human. Whether it's technically 'sentient' or 'alive' is beyond what I know, and maybe no one can really answer that. What I do know is that whatever is going on in the human brain runs on just 20 watts, fits in a little over a liter, and it's made of molecules and logic gates and a bunch of salt ions running back and forth across membranes. If we are made of just that, then I think a machine can absolutely be sentient, because what are we if not big, convoluted machines?

Brain-machine interfaces are a whole other world which can be mediated by AI. Neural interfaces don't have very high resolution right now, and we can only connect to small parts of the brain at a time, but AI will be able to act as an 'interpreter' between brain signals and computer signals, and even back to a different person's brain signals. I think the technological bottleneck in telepathy is the devices that need to go in our heads without causing trouble, not so much the understanding of the strange brain signals, because the AI will already be able to figure that out. -
It won't be. It gets clicks and ad views.
-
The railgun shipping those metals to lunar orbit will be pretty obvious, whichever country first builds it.