Everything posted by K^2

  1. Intel market share is significant, but it's not dominant. And yeah, developers most certainly do take this into consideration. What sort of a user pool are you looking at? I'm talking 2M+ surveys.
  2. Steam hardware survey is by far the most comprehensive report, dwarfing the statistical significance of just about any other analysis. And for what it's worth, hardware surveys and crash reports I've seen from our titles are consistent with these figures. I don't know if you are operating on outdated, biased, or simply statistically insignificant data, but nVidia does currently have greater market share among gamers, and game developers most certainly do end up catering to nVidia more.
  3. That's brilliant. I was thinking that there might be a way to do an estimate based on similar assumption, but it felt like a lot of work. Glad to see that someone has already done the work.
  4. Are you familiar with the game development/update process? For starters, nVidia hardware is more common among developers. CUDA is no small factor in that: while games don't use CUDA, a lot of the tools used in making games do. So any nVidia-related bugs tend to be caught really early on. Then it goes on to QA. nVidia is also more prevalent there, partly because it's easier to have the same hardware across the entire studio, and partly because it reflects market share. Finally, after the game goes out to users, nVidia generates more bug reports, simply because there are more nVidia cards out there. So any problems with nVidia hardware get fixed faster. Saying that the nVidia vs AMD experience is identical simply because the hardware is very similar is very naive. Yes, nVidia and AMD both have contracts with devs. There are devs out there that target AMD first. The majority of devs, however, don't specifically target either, and still end up with better nVidia support. The actual difference is marginal. Most gamers will never be aware of the quality change switching from one to another. But there is most certainly a difference, and it's silly not to go with the hardware that gives you even that much better a chance of a smooth experience.
  5. Squeezing a laser beam has nothing to do with negative energy. The reason squeezing has applications in laser cooling has to do with entropy, not energy. The energy density of any laser beam is always strictly positive. This is very easy to demonstrate using the fact that a photon's 4-momentum is a null vector. The consequence is that a beam with negative energy density cannot propagate freely.
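
     Spelling that step out (one way to run the argument, using only the photon's energy-momentum relation):

     ```latex
     p^\mu p_\mu = 0 \;\Rightarrow\; E^2 - |\vec{p}\,|^2 c^2 = 0 \;\Rightarrow\; E = |\vec{p}\,|\,c > 0
     ```
     Every photon in the beam carries strictly positive energy (its momentum is nonzero), so the beam's energy density, being a sum of such contributions, cannot be negative.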
  6. That's because you effectively need a DX11-compliant card to use CUDA efficiently. There were drivers to make it work with older cards, but there really isn't a point now. All CUDA software targets CUDA 3.0+, and yeah, that's the 600 series and up. There were also some bad generations and bad years for nVidia, when ATI stuff was way ahead. But if the question is which card the OP should buy, odds are he'll be marginally happier with a modern nVidia card.
  7. Ah, I see. Date line runs through the middle of a time zone. I did not anticipate that. Thanks for pointing it out.
  8. NVidia tends to be better supported. They release more driver patches to fix issues in AAA games, and a lot of devs optimize for nVidia.
  9. You just ended up with 25 time zones. But yes, there is an international date line in the Pacific which gives you +24h when you cross it going East.
  10. Assuming all else stays the same, radiated energy is proportional to T^4. If E0 = kT^4 and E1 = k(T - 1K)^4, then E1 = E0 * (1 - 4K/T) to first order. At mean T = 288K, 4K/T = 1.4%. So you'd need to reflect 1.4% of incoming energy to bring equilibrium temperature down by 1K or 1°C.
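
     Spelling out the first-order step (binomial expansion, dropping higher powers of the small ratio):

     ```latex
     E_1 = k\,(T - \Delta T)^4 = k\,T^4\left(1 - \frac{\Delta T}{T}\right)^4 \approx E_0\left(1 - \frac{4\,\Delta T}{T}\right),
     \qquad \frac{4 \times 1\,\mathrm{K}}{288\,\mathrm{K}} \approx 0.014
     ```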
  11. Yeah. You need to numerically integrate the trajectory. There isn't a formula for it. A very simple script can be written for it, though. If you want to be able to do this yourself, I would recommend learning how to use Octave.
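
     A minimal sketch of what such a script can look like, here in Python with a fixed-step integrator. The gravitational parameter and radius are rough Kerbin-like values assumed purely for illustration; the same structure carries over directly to Octave:

     ```python
     import math

     # Fixed-step (semi-implicit Euler) integration of a ballistic trajectory
     # around a point-mass planet.  GM and R below are rough Kerbin-like
     # values, assumed here only to make the example self-contained.
     GM = 3.53e12      # gravitational parameter, m^3/s^2 (assumed)
     R = 600_000.0     # planet radius, m (assumed)

     def integrate(pos, vel, dt=0.1, t_max=3600.0):
         """Return a list of (t, (x, y)) samples until impact or t_max."""
         t, samples = 0.0, [(0.0, pos)]
         while t < t_max:
             x, y = pos
             r = math.hypot(x, y)
             if r < R:                      # stop once we hit the surface
                 break
             ax, ay = -GM * x / r**3, -GM * y / r**3
             vel = (vel[0] + ax * dt, vel[1] + ay * dt)
             pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
             t += dt
             samples.append((t, pos))
         return samples

     # Example: start 80 km up, moving sideways at roughly orbital speed.
     path = integrate(pos=(R + 80_000.0, 0.0), vel=(0.0, 2280.0))
     print(len(path), "steps, final position:", path[-1][1])
     ```
     Adding drag or thrust is just extra terms on the acceleration line, which is where most of the interesting physics would go.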
  12. There is, indeed, a horizon around an observer falling into a black hole, and it does enlarge as you fall in, apparently. (See: Schwarzschild Bubble.) I'm not sure about the other claims, but it's an interesting idea. I'll have to run some numbers.
  13. Take a perfectly spherical planet. The field is stronger as you get closer. This causes a tidal "stretching" force in the radial direction. What's interesting is that it also "squishes" things in the two other directions. These sorts of things can actually be measured.
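
     In the Newtonian limit this has a simple back-of-envelope form. For a small displacement from a reference point at distance r from the planet's center, the differential (tidal) acceleration is roughly:

     ```latex
     \delta a_{\text{radial}} \approx +\frac{2GM}{r^3}\,\delta r \quad \text{(stretching)},
     \qquad
     \delta a_{\text{transverse}} \approx -\frac{GM}{r^3}\,\delta x \quad \text{(squishing)}
     ```
     The radial piece is twice as strong as each transverse piece and points outward, which is exactly the stretch-one-way, squish-the-other-two pattern described above.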
  14. That's not precisely right, either. The Equivalence Principle states that you cannot distinguish between gravity and acceleration locally. Even around a spherical body, the gravitational field is uneven. You can measure gradients across all 3 directions.
  15. The fine-grained entropy is actually a conserved quantity. And the coarse-grained entropy can locally satisfy the second law in a static universe. That does, however, contradict the expansion. Given the expansion, yeah you are absolutely correct. But it is intriguing how little would have to be different for our universe to be a static one.
  16. Excel won't cut it for this one. There are too many parameters to search on. I might be able to write a custom program that does a search by trying a whole bunch of possible start dates. I assume we can choose an arbitrary starting time in the KSP world? And the challenge is to visit the SOI of each planet, right? The Eve - Duna - Jool series of fly-bys should be fairly straightforward. The challenge would be in including Moho, Dres, and Eeloo in the search.

     GoSlash, I'm not sure if this is a traveling salesman problem. There are a couple of permutations possible along the way, but your starting options are greatly limited. The delta-V reduction dictates that your first fly-by should be Kerbin. That will only give you enough boost to reach Duna or Eve. And going to Duna first makes visiting Moho and Eve very complicated. So Kerbin - Eve - Moho - Duna pretty much have to be visited in this order. And you'll need a Jool fly-by to reach Eeloo. So your options are visiting Dres before or after Jool, which might not even be an option depending on where Eeloo is. What does pick up the NP complexity here is where to budget your burns between fly-bys. But I don't think that is even solvable.

     I would go with a "zero" delta-V plan. That is, assume that any delta-V change required to reach your next planet is due to the fly-by. The only fuel usage is then due to corrections you have to make to hit the right part of your next target's SOI. The fuel requirement for the mission should then be the Kerbin escape budget. Perhaps even just Trans-Munar, if you want to include one more fly-by. With the "zero" delta-V plan, the algorithm would simply choose the next good Eve fly-by date and see if it can make the necessary fly-bys. Then pick the next date, try again, etc. (see the sketch below). If it finds a solution, it will consist simply of the series of periapsides you need to hit on each fly-by.
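
     The sketch mentioned above, in Python. The two helpers are hypothetical stand-ins for the real ephemeris and patched-conic machinery (they would need an actual trajectory solver behind them); only the search structure is the point here:

     ```python
     KERBIN_DAY = 6 * 3600  # seconds; KSP uses a 6-hour day

     # Fly-by orderings consistent with the constraints above: Eve - Moho - Duna
     # first, Jool before Eeloo, and Dres slotted in before or after Jool.
     ORDERINGS = [
         ["Eve", "Moho", "Duna", "Dres", "Jool", "Eeloo"],
         ["Eve", "Moho", "Duna", "Jool", "Dres", "Eeloo"],
         ["Eve", "Moho", "Duna", "Jool", "Eeloo", "Dres"],
     ]

     def next_eve_window(t):
         """Hypothetical helper: next plausible Kerbin -> Eve departure after t."""
         raise NotImplementedError("needs real ephemeris / transfer-window data")

     def flyby_chain(t, order):
         """Hypothetical helper: list of fly-by periapsides if the bodies in
         `order` can be chained with fly-bys alone (the "zero" delta-V plan),
         otherwise None."""
         raise NotImplementedError("needs a patched-conic / Lambert solver")

     def grand_tour_search(t_start=0.0, t_end=2000 * KERBIN_DAY):
         """Brute-force scan over departure dates until some ordering closes."""
         t = t_start
         while t < t_end:
             t = next_eve_window(t)             # candidate departure date
             for order in ORDERINGS:
                 plan = flyby_chain(t, order)
                 if plan is not None:
                     return t, order, plan      # date + periapsides = a solution
             t += KERBIN_DAY                    # no luck, try the next window
         return None
     ```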
  17. The whole point is that they are observable. Not only that, but if we assume extra dimensions exist, we can predict what these interactions are going to be. We have done that. These interactions would result in certain observables. We have looked for these observables. We have excluded them as possible with countless experiments. Ergo, extra dimensions don't exist, by contraposition. The alternative is that field theory is completely wrong, but gives predictions correct to 12 decimal places by pure chance. There are some versions of extra dimensions we have not completely excluded. But these can be rolled under an umbrella of holographic interpretations. All of the known ones are impossible, but that's not even the main point. Holographic interpretations exhibit duality with gauge symmetries. In fact, they must, because gauge theory works. So if they exist, they simply perform the function of an external degree of freedom given by the U/SU symmetries, contributing nothing new to physics.
  18. It does. Carbon dating has to be calibrated against known dates to be accurate. Trees and artifacts with known age are a frequent source of calibration material. As it's been pointed out, radiocarbon dating isn't going to help you on geological time scales. However, if all you are trying to prove is that there are objects much, much older than 6,000 years, radiocarbon dating is sufficient. To actually get the 4B+ value for the age of Earth, you need to use other techniques. And, of course, the age of the known universe comes from cosmology, so that has no correlation with any geological/archaeological dating techniques.
  19. Pressure is equal to the weight of the air above you (per unit area). As pressure drops, so does density. As density decreases, the amount of air you leave below as you go up decreases as well. So pressure drops slower with altitude. If you ignore temperature variations, in fact, pressure drops exponentially. This is the model that KSP uses, with P = P0 exp(-h/H), where h is altitude above a reference point, P0 is pressure at that reference point, and H is the scale height. On a real planet with a real atmosphere, temperature variations are significant, so the shape isn't quite exponential, but it can be reasonably approximated as such for some ranges of altitude.
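
     As a quick worked example of that formula (the 1 atm reference pressure and 5 km scale height below are assumed, Kerbin-like values in the spirit of the old KSP atmosphere model):

     ```python
     import math

     P0 = 1.0     # pressure at the reference point, atm (assumed value)
     H = 5000.0   # scale height, m (assumed Kerbin-like value)

     def pressure(h):
         """P = P0 * exp(-h / H): pressure at altitude h metres above the reference."""
         return P0 * math.exp(-h / H)

     for h in (0, 5_000, 10_000, 20_000, 40_000):
         print(f"{h:>6} m : {pressure(h):.4f} atm")
     ```
     Every additional scale height of altitude knocks the pressure down by another factor of e, which is why the absolute drop is fast near the surface and slow higher up.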
  20. Essentially. We're still searching for dark energy, because we want to measure it directly and hopefully map its abundance in the universe, or at least our neighborhood, but we aren't trying to define it. What it does is what it is. By the way, the answer to the gravity question is that mass bends space-time because the stress-energy tensor is the conserved charge of local Poincare symmetry, whose gauge field defines curvature. All of this has to do with the fact that the Poincare group is one of the local symmetries of the Lagrangian. (It is a global symmetry of the QCD Lagrangian when gravity is not taken into account.) So the current goal post is "Why does the Lagrangian obey local symmetry rules, and why these particular symmetries?" There are also some very serious questions about quantizing the relevant fields when gravity is involved, but that's basically the topic of quantum gravity, so it's something we at least have a handle on. That's all just to point out that the goal posts aren't just moving, but moving fast enough for the general public, and even specialists from other fields, to typically be a few decades behind.
  21. ZetaX, he got you there. Two lines don't have to intersect in 3D; in fact, any two random ones are guaranteed not to. But if you choose two lines that do intersect, they still intersect at a point in any number of dimensions. For the third time in this thread, that's what they've been trying to do with the bulk in String Theory. This assumption makes predictions that don't check out. And we don't need to invent explanations for dark energy. We already know what it is. We simply don't know whether it has other properties, or whether we'll be able to actually detect it directly.
  22. They do. For some special cases. Because it makes computations a hell of a lot easier. When you want to consider gravity just in the neighborhood of a black hole, or when you are working out a design of a theoretical warp drive, you can use embedded manifolds to simplify a lot of the math, both computationally and conceptually. It's a great tool. But that's kind of like reducing an orbital mechanics problem to 2D. It works in a simple case. It does not solve general problems. That's why I've mentioned superstrings and the bulk. We can't detect these things directly, but we can ask what would be different if these dimensions existed in some way that's topologically relevant. We get some variation of String Theory. Well, String Theory makes predictions. We've been testing these predictions for over two decades on larger and larger accelerators. By now, String Theory has a standing of, "An interesting idea which shed some light on particle physics, but has no direct practical applications." We simply don't live in a universe where String Theory applies. There are still some people who are trying to work out some variation that might work, but essentially, if you need proof that there are no higher dimensions, the accelerator facilities at RHIC and LHC have provided it.
  23. Depends on the resolution you want to get. The ratio of the "pixel" real size on the image you get to the distance to the object is the same as the ratio of light wavelength (about 0.5 microns) to the diameter of the objective lens or mirror. So if you want to be able to see 1mm from 1km away, you need a telescope with a 0.5m mirror: 0.5m / 0.5 microns = 1km / 1mm. You can do the math for all of the objects you are interested in. There are astronomical interferometers that can give you better resolution, because for them, it's the distance between telescopes in the array that gives you resolving power rather than the mirror diameter of any one telescope. But without optical processing, these will not have the same versatility as radio arrays.
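
     The same ratio, wrapped up so you can plug in your own targets (0.5 microns is the same rough visible-light wavelength as above):

     ```python
     WAVELENGTH = 0.5e-6  # metres, rough visible-light value

     def required_aperture(feature_size_m, distance_m, wavelength_m=WAVELENGTH):
         """Diffraction limit: feature / distance = wavelength / diameter,
         so the needed objective diameter is wavelength * distance / feature."""
         return wavelength_m * distance_m / feature_size_m

     print(required_aperture(1e-3, 1e3))    # the example above: 1 mm at 1 km -> 0.5 m
     print(required_aperture(1.0, 3.84e8))  # 1 m detail on the Moon -> about 190 m
     ```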