Everything posted by K^2
-
how do u think the ksp 2 hardware requirement will be?
K^2 replied to quangdinh's topic in Prelaunch KSP2 Discussion
The numbers look great. I just don't know how much difference it will make for KSP1. That game's really badly optimized for multiple cores. It basically runs as fast as the fastest thread you can give it, so I don't think 12th gen will make a lot of difference. Though, one thing I have to say is that because the E cores can take on any of the background tasks, you should see full turbo speeds on your P cores while running KSP1, so you'll really be giving it the fastest core you possibly can. That's on top of the 12900K being a little faster thread-for-thread than the 10700K to begin with. So I'm sure you'll see an improvement. It just won't be as much of an improvement as it can make for other games. Hopefully, including KSP2.
-
how do u think the ksp 2 hardware requirement will be?
K^2 replied to quangdinh's topic in Prelaunch KSP2 Discussion
To be fair, I can see a world where it has a problem running KSP2 under Windows 10. Gen 9 consoles run an 8/0/16 (P/E/T) core config, and the 12600K is a 6/4/16. A game optimized for a gen 9 console will have its workload spread across 14 performance threads on the 7 cores that are usually available to the game. So on the 12600K, that means at least two threads ending up on efficiency cores, which can royally screw up frame timing if they aren't scheduled properly. And a scheduler that can properly distribute work between P and E cores is a Windows 11 feature. So some games that are well-optimized and still push the PS5 processor to the limit can struggle on the 12600K under Windows 10 simply due to scheduling issues. This might get addressed with a future Windows 10 patch, or maybe 12xxx owners will just have to accept the awful UI of Windows 11.

That said, KSP2 is a Unity title, so I highly doubt the core load is going to be anything remotely that even. So long as your main and physics threads end up on P cores, which they really should even with W10, I think you'll be fine even if some game threads end up on E cores. In which case W10 or W11 shouldn't make a difference, and the 12600K will do just fine either way. So it's probably a moot point anyways. And yeah, if the 12600K can't run KSP2 well, then neither can the consoles. So we really ought to hope.

We've seen examples of some techniques that are generally VRAM-hungry. This is why I specifically call out volumetrics, like clouds and such, and procedural vegetation. Vegetation placement, for example, looks a lot like the technique used by Horizon Zero Dawn, in which case it's GPU placement based on multiple additional density maps that have to be in VRAM. And any sort of volume rendering technique takes up a lot of memory. And this is on top of the planet surfaces, which will require LoDs for absolutely everything. And I wonder if Intercept will end up using virtual textures to optimize that, which will eat up even more VRAM. I haven't seen anything to absolutely confirm the latter, mind, but it would be a good idea, and it is something recent versions of Unity support. What might allow KSP2 to run on a lot less VRAM is if some of these features can be disabled or allowed to run at much lower resolution. E.g., the shadow map cascades look way higher resolution in KSP2 than KSP, but that's also something you can, at least in theory, tune to the capabilities of a particular graphics card, and you'll simply have slightly worse-looking shadows.

I honestly don't think that's happening anymore. They'd have to do a full back-port, potentially cutting a lot of features. By late 2022, I'm not entirely sure it will be worth the effort. Yes, I know there will still be a lot of PS4 and XB1 players out there, but given that KSP2 isn't going to sell as many copies as some more popular titles, I don't think it will be enough to offset the costs of back-porting. I fully expect KSP2 to ship as a gen 9 title. Even XB Series S support is a big question.
-
how do u think the ksp 2 hardware requirement will be?
K^2 replied to quangdinh's topic in Prelaunch KSP2 Discussion
Core i5 has been around for over a decade, so there are a lot of CPUs in that category. A 2011 Core i5-2500 will certainly not run KSP2. On the other hand, the latest entry in that series, the Core i5-12600K, outperforms the CPU in the PS5, so I would hope it will have no trouble running KSP2. In between, there is some gray area. The situation in graphics is similar, except that 1GB is not a lot of VRAM. I actually suspect KSP2 will want more than 2GB due to planet textures, atmospherics, and procedural vegetation. But it will also almost certainly want DirectX 12 features, so a lot of older GPUs are going to be out just on that.

A good starting point is to look at gen 9 consoles. The PS5 has a CPU very similar to the Ryzen 7 3700X and a GPU somewhere between the Radeon RX 6600 XT and 6700 XT. It also has 16GB of RAM, but that's shared between CPU and GPU. So a system with 8GB of system RAM and 8GB of VRAM on, say, an RX 6600 XT should be very close to what you get on a PS5. I wouldn't recommend this as an actual build for this game, but it's a good place to start the conversation.

Of course, KSP2 is likely to be much more forgiving of graphics than CPU, so you can definitely get away with an older graphics card. It's hard to say where the cutoff is going to be, but if we go with the DX12 feature set, you are looking at the GeForce 600 series or Radeon HD 7000 series as the hard cutoff. But these came with little VRAM and are rather slow by modern standards. Depending on how much you can tune down the settings, the lowest I can imagine it going is a GeForce GTX 960 or Radeon R9 380, both of these at 4GB of VRAM.

On the CPU side, I think the developers will be coming close to the wire on console specs, with the PS5 being the bottleneck. In that case, you don't want to go far below the PS5's capabilities. That would mean the aforementioned Ryzen 7 3700X or an Intel Core i7-10700K. There is a bit more room to breathe with newer chips. As I said in the opening paragraph, Intel's 12600K should do just fine, and so will the Ryzen 5 5600X. Both of these are at a lower price point now than the older alternatives mentioned. That said, these are still mid-to-high-end CPUs, so I do hope the game runs on lower specs, or it might leave a lot of people unhappy.
-
Note the sin/cos flip on the X/Y coordinates and the fact that 0° heading points along Y because of it. This allows the system to be right-handed with Z-up while heading still runs clockwise. Effectively, the reflection across the X=Y plane compensates for the reflection across the XZ plane that you get from heading running clockwise, which introduces the sign change for the Y component. The net result of these two reflections is a 90° rotation, so that a heading of 0° ends up along Y instead of X. Which, I mean, Y might as well be North, right? So this is a bit unconventional in terms of notation, but perfectly serviceable.
-
x = cos(pitch) · sin(heading)
y = cos(pitch) · cos(heading)
z = sin(pitch)

Also, keep in mind that this assumes that you are using the local XYZ coordinates with respect to which pitch and heading are given. It means that your Y axis points North, X points East, and Z points to Zenith. This is fine if these are the coordinates you need, but it might have to be adjusted for a different convention. If your coordinates are relative to the SoI, however, with Z or Y always pointing along the planetary axis, then you'll have to adjust the transformation based on where you are located. It's significantly more math to transform from local pitch and heading to SoI XYZ coordinates, so @Cannon if that's the conversion you need, please reply or mention me, and I'll give you the steps. It's just too much to type out if it's not what you're looking for.
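In case it helps, here's the whole transform as a quick sketch (Python; the function name and degrees-in convention are mine):

import math

# Local frame as described above: right-handed, Z-up,
# Y = North, X = East, heading measured clockwise from North.
def to_local_xyz(pitch_deg, heading_deg):
    p = math.radians(pitch_deg)
    h = math.radians(heading_deg)
    x = math.cos(p) * math.sin(h)   # East component
    y = math.cos(p) * math.cos(h)   # North component (heading 0 -> +Y)
    z = math.sin(p)                 # Zenith component
    return (x, y, z)

print(to_local_xyz(0, 0))    # ~(0, 1, 0): level, due North
print(to_local_xyz(0, 90))   # ~(1, 0, 0): level, due East
print(to_local_xyz(90, 0))   # ~(0, 0, 1): straight up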
-
Good news? Recent KSP2 job postings: Community Manager positions...
K^2 replied to TLTay's topic in Prelaunch KSP2 Discussion
Unfortunately, this doesn't refine the timing beyond the "some time in '22" that we already know, but it is a good indication that T2 isn't foreseeing any further delays.
-
Everything that becomes a cloud of debris above the orbit of the ISS will eventually cross the path of the ISS as the debris orbits degrade. Orbital mechanics doesn't take bribes. This will put the ISS in the hazard area within a few years. The debris cloud should disperse by then, making the odds of impact low, but there is absolutely zero way to predict or avoid this danger now.
-
Russian Launch and Mission Thread
K^2 replied to tater's topic in Science & Spaceflight
If I present my comments in the language they are streaming from me right now, I'll get banned from the forum. *deep breath* This was among the most irresponsible things done in space in recent history, maybe in all the history of space flight. Keep in mind that the general vicinity of the debris cloud from a 500 km impact is guaranteed to cross paths with the ISS eventually as their orbits degrade. The odds of an actual impact are still on the low side, and since the larger debris are tracked, the ISS will likely be moved to avoid the highest-risk areas, but the odds that the ISS crew will have to deal with puncture holes in the hull in the next few years have gone up by orders of magnitude from this event alone.
-
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
Gravity is a phenomenon caused by curvature. If you stand inside a rotating room, it seems like there is a force pushing you away from the center. In fact, if you describe the dynamics from the perspective of a coordinate system attached to the room, you actually have to add this term as an external force to make the math work. Likewise, if you are in an accelerating car, a similar kind of force pushes you into the car's seat. These are inertial forces, also known as fictitious forces. A bit of a terrible name, since they technically work the same way as absolutely any force, and therefore aren't any less real, but we're stuck with this historical terminology. Gravity is such an inertial force.

The thing that makes gravity special is that no choice of coordinate system makes it go away. If you describe a rotating room from a non-rotating coordinate system, the centrifugal force goes away. Instead, you have the wall pressing into you to keep you going in circles. Likewise, a car's seat has to press into you to keep you accelerating along with the car. You can't make gravity disappear from your equations of motion in the same way. You can choose a coordinate system where gravity isn't experienced locally - like if you are free-falling, you don't experience gravity. But the fact that the ground is rushing to meet you at an accelerated rate suggests that gravity is still there, plotting your demise on the global scale. This is because of the space-time curvature. In flat space-time, which, technically, only exists in school physics problems, we can choose an inertial coordinate system that makes all fictitious forces go away - no gravity. In curved space-time, in general, that is not possible, and we call the force that accounts for this curvature "gravity".

The source of curvature is the stress-energy tensor. That is a generalization of energy and momentum densities to an arbitrary coordinate system. People often casually simplify it to just saying that energy causes gravity, but it's really energy, momentum, and how they flow that all together influence the curvature. In practice, though, unless you're dealing with something very exotic, like a neutron star or a black hole, or something very large, like clusters of galaxies, you really can reduce it to just how much energy there is.

And if you want to get super technical, it's not even that the stress-energy is directly causing the curvature, but rather that the stress-energy is the conserved current of the Poincare symmetry, which is the external symmetry of the Lagrangian, and is therefore the source of the gauge field, which happens to be related to the metric tensor, which determines the differential curvature. But that's, like, some years of lectures in a sentence, so I hope the previous paragraphs provide you with some intuitive sense for gravity and curvature.
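For reference, the relation in that third paragraph - stress-energy sourcing curvature - is exactly what the Einstein field equations state:

G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}

The left side (the Einstein tensor) is built from the curvature of space-time, and the right side is the stress-energy tensor.
-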
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
I don't think it's an ethics debate. Even if you were to consider life that could evolve as an ethics question, there is basically no chance of that happening in either ecosystem. Both are only going to become more inhospitable as time goes on if left to their natural course. And treating simple life from the perspective of ethics is silly. Your immune system commits genocide against simple organisms on a daily basis. So the question is purely utilitarian. Is there something to be gained from studying these organisms in their natural environment? Or can we collect samples, catalogue them, and lose little to nothing from complete replacement? I think by the time we are actually in a position to terraform either environment, it'll be decidedly the latter.
-
Solution to the minmus paradox
K^2 replied to Newgame space program's topic in Prelaunch KSP2 Discussion
If it's very clean naturally formed ice, yes. Though, icy bodies tend to have lower albedos than that, as their surfaces tend not to be that pristine. It's fine for an estimate, though.

You are playing very loose with heat conduction there. It takes time for a linear gradient to establish. As materials start to warm up, the gradient starts out a lot steeper, meaning the surface temperature is going to be higher, and the skin layer is going to be very close to the peak. The boundary condition here is nasty, so Mathematica told me to get lost, but I'm curious now, so I'll set up a quick simulation tomorrow. Looking at the situation with real bodies at ~1 AU, though, I don't expect it to be even close to 270K. But I owe you some numbers.

Yeah, that's kind of the bigger problem here. The evaporation rates for ice I was able to find vary between about 1mm/hour and 10mm/hour in vacuum, but yeah, it's significant. And while a temperature drop drastically reduces the rates, you have to go decidedly cryogenic if you want the ice to last for anything like a geological time scale.

Between liquid water and the atmosphere, Earth is pretty good at distributing heat. Water can absorb a lot of the day/night variation, and atmospheric circulation can actually help move heat from the equator to the poles, so that it can be radiated away from a larger area. So under the right conditions, the temperature on Earth can be prevented from rising much above average, and the average can be well below freezing if the planet is covered with ice. The second part is the aforementioned evaporation. If ice evaporates into an atmosphere, and your entire planet is frozen, that ice is going to get re-deposited somewhere else. Some amount of water will always be in the atmosphere as moisture, but it's not a significant amount compared to the ice on the surface. On the other hand, if you have no atmosphere, the vapor is going to be blown away by the solar wind. So anything that evaporates is effectively gone for good. Yes, some amount of re-deposition is still going to happen, but you are going to be losing most of the ice to the vacuum of space unless it's really, really cold and the evaporation rate is negligible.
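For anyone who wants to play with the conduction question in the meantime, here's the kind of quick simulation I have in mind - a minimal 1D explicit finite-difference sketch. All material numbers are rough values for water ice, and the constant flux stands in for the daytime peak; a proper run would modulate it with the day/night cycle:

import numpy as np

sigma = 5.67e-8        # Stefan-Boltzmann constant, W/m^2/K^4
S     = 1361.0         # stellar flux at ~1 AU, W/m^2
A     = 0.5            # assumed albedo
k     = 2.2            # thermal conductivity of ice, W/m/K
rho   = 917.0          # density of ice, kg/m^3
cp    = 2100.0         # specific heat of ice, J/kg/K
alpha = k / (rho * cp) # thermal diffusivity, m^2/s

N, L = 200, 2.0            # grid cells, slab depth in meters
dz = L / N
dt = 0.4 * dz**2 / alpha   # stable explicit time step
T = np.full(N, 150.0)      # initial temperature, K

for step in range(200_000):    # roughly 80 days of simulated time
    # diffusion in the interior of the slab
    T[1:-1] += alpha * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # surface cell: absorbed sunlight, thermal emission, conduction from below
    balance = (1 - A) * S - sigma * T[0]**4 + k * (T[1] - T[0]) / dz
    T[0] += dt * balance / (rho * cp * dz)
    T[-1] = T[-2]              # insulated bottom boundary

print(f"surface: {T[0]:.0f} K, 10 cm down: {T[10]:.0f} K, bottom: {T[-1]:.0f} K")
-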
Solution to the minmus paradox
K^2 replied to Newgame space program's topic in Prelaunch KSP2 Discussion
But then you'll have to explain gravity, which is even harder. And before anyone suggests putting a chunk of neutron star or a tiny black hole in the center to generate the gravity, that would put too much stress on the shell and cause it to collapse. So unless Ringworld Engineers have built Minmus out of scrith just to mess with Kerbals, I don't think that's going to work.
-
Solution to the minmus paradox
K^2 replied to Newgame space program's topic in Prelaunch KSP2 Discussion
That's not how it works. At the relevant time scales, more than a few meters of rock might as well be perfect insulation, so for the purposes of thermal analysis the surface can be treated as an infinite flat plane in equilibrium with the stellar flux and the cosmic background. While the average surface temperature without greenhouse gases and with a high albedo can be below freezing, without an atmosphere you also have very high fluctuations between day and night, and the day high is going to be about 70% above the average equilibrium, which is way, way beyond freezing. Yes, near the poles the situation is different. The stellar flux incidence angle is very low, and it's possible to have regions of near-permanent shade. But Minmus has "ice" lakes in equatorial regions. That's flat out impossible by a margin that's not even close.
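For anyone who wants to plug in numbers, the zeroth-order balance behind this argument (my notation: stellar flux S, albedo A, Stefan-Boltzmann constant \sigma) is the subsolar equilibrium of a flat plane facing the star, alongside the familiar whole-body average that spreads the same flux over four times the area:

T_{ss} = \left( \frac{(1-A)\,S}{\sigma} \right)^{1/4}, \qquad T_{eq} = \left( \frac{(1-A)\,S}{4\,\sigma} \right)^{1/4}
-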
Solution to the minmus paradox
K^2 replied to Newgame space program's topic in Prelaunch KSP2 Discussion
It's not about the atmosphere. Given its proximity to Kerbol, Minmus is too hot for ice. Even if it's cooled by sublimation of said ice, the ice would disappear far too fast. And it cannot be replenished by external sources, as that would generate additional heat. In short, icy Minmus just isn't possible. But I do like the glassy Minmus solution.
-
Kerbal Space Program 2: Episode 4 - Celestial Architecting
K^2 replied to CoolRanchAJ's topic in Prelaunch KSP2 Discussion
So that's Charr, Gurdamma, and Glummo confirmed for the Debdeb system, I guess, with Donk and Merbel being two of the moons there as well. I hope that Ovin is in another, yet-unnamed star system; otherwise, Debdeb is getting all the rings.
-
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
The underlying problem is not transportation, but the die shortage. There are only so many factories in the world capable of processing silicon wafers into chips of various quality. The process is energy, resource, labor, and equipment intensive. That means that coming up short on even one of these can cause problems. COVID has acted as a catalyst for a perfect storm of shortages across the sector, and now the demand has backed up.

The problem is that absolutely everyone with a need for a high-end chip is competing over the same manufacturing capacity. PC and console components (RAM, CPUs, GPUs, SSDs...), cell phones, car computers, components for servers, routers, and gateways, and a bunch of other tech you might not even think of day-to-day. Some of the lower-end stuff can be shifted to older processes, but a lot of it can't, simply because it was designed with more modern components in mind.

The recent releases of new generations of consoles, GPUs, and CPUs kind of matched up to bring the problem to the attention of the consumer, but just because demand for these died down a bit and supply had a chance to catch up a little doesn't mean the problem went away. On the contrary, we are expecting the situation with some of the existing manufacturing to get worse, as some areas are now also impacted by drought and labor shortages, which potentially make it difficult to replace some of the equipment used on production lines. There are new facilities being built, and supply will eventually catch up. But some people expect another year or two of shortages and outrageous component prices.
-
I would prefer each SoI to have its own coordinate system, and the attitude indicator to just change on SoI transition, the same way the altitude indicator does now.
-
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
Yes, a little. The actual semiconductor physics of it is complex - I've spent a year on a condensed matter track before switching to particle theory, and most of it is still bizarre to me - but in terms of practical consequence, what you have control over is the clock frequency, the voltage, and the temperature of the die.

The thing you actually care about for performance is the clock frequency. The number of instructions performed by the CPU, outside any time wasted waiting on memory or whatever, is a multiple of the CPU clock frequency. (You can also overclock memory and the GPU, each with their own limitations and caveats.) The problem is that there is a limit to how fast transistor states can switch, so if you push the clock speed too high, you end up with errors. You can compensate for that by increasing the applied voltage - kind of the equivalent of throwing switches harder to get them to move to their new positions faster - but there's a limit to that before you risk physical damage to the circuits. (Again, not unlike physical switches.) Finally, cooling the die helps reduce thermal bounce, which both allows you to go a bit higher on the voltage and helps a given voltage settle the transistors faster. So if you can cool the die more, you technically can get away with even higher clock speeds, which is why you see all the records set with liquid nitrogen setups. That really only helps you squeeze out the last drops of performance, though, so a good cooling setup is adequate for most uses.

But where it comes back to power consumption is that both increasing the voltage and increasing the clock speed increase the amount of power consumed by the chip. And 100% of the power consumed becomes heat, so even if you aren't trying to drop the temperature down, overclocking usually requires a good heat management setup just to prevent overheating.

And laptops are disadvantaged on both of these. You have a limited amount of energy available to power the CPU - certainly on batteries, but even when plugged into a wall, that portable power supply can only do so much - and you are limited on the thermals as well. A laptop just doesn't have room to fit a good cooling setup. Because of that, laptop CPUs are usually designed to be more power-efficient and less performant, but they do usually come with multiple power modes. You need very little CPU power to browse the internet, and being able to throttle down is important for conserving battery power. What usually happens is that the clock speeds will go down, the voltage might adjust down as well, and some cores might become disabled to make the batteries last as long as possible, only kicking into high gear when you're playing a game or something.

Even with a laptop, the manufacturers will usually stay on the safe side of keeping the thing reliable, so you can usually squeeze a bit more performance out of the system, especially if you're plugged into the wall. Whether it's easy to configure will depend on the motherboard and CPU used. I have been able to overclock some laptops a little bit to make games run better, but there is really not a lot of room to work with before you start running into heat throttling. That is, the CPU detecting that it's overheating and starting to drop the clock speeds. (If it continues, it may shut down entirely.) So if you want to tinker with performance, you really ought to be doing that on a custom-built desktop actually designed to take the punishment.
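The usual first-order rule of thumb for why both knobs cost power (standard CMOS dynamic power, nothing specific to any one chip):

P_{dyn} \approx C \, V^2 \, f

Power grows linearly with the clock frequency f but quadratically with the core voltage V (C is the effective switched capacitance), which is why voltage bumps heat the chip up so much faster than frequency bumps do.
-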
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
Oh, man, layers upon layers. Optimization is a huge topic. Very broadly speaking, if we limit it to games, you are looking at one of the following categories.

1. Algorithm improvements. Sometimes you can just rearrange how operations are performed and make the code faster. This can literally be reducing the number of instructions that have to be executed, reducing time lost to memory latency, or reducing time spent interacting with the OS or hardware. Usually, you don't see big wins in something like this after the game is released, but every once in a while there's a stupid mistake that somebody missed that makes a huge difference. (There's a toy example of this category after the list.)

2. Threading optimizations. Sort of related to the above, but when you are working with multiple threads running in parallel, you are sometimes losing time on threads having to wait for each other. So by simply rearranging when operations are performed, you can sometimes get huge performance wins. Again, usually you get that out of the way before the game is released, but sometimes improvements like that can come in after release. A particular case is if the code was originally optimized for a very specific core count (*cough*consoles*cough*) but later re-optimized to cover a broader range of possible CPUs.

3. Removing unnecessary code. Things like writing logs can really slow performance down, and sometimes that's accidentally left in the final game. Finding and removing that stuff helps, and it's more common than you'd think.

4. Engine/Library/Driver improvements. Especially if you're using a 3rd party engine like Unreal or Unity, just because you're done working on the game doesn't mean they're done improving the engine. Sometimes it makes sense to switch to a new version of an engine, and sometimes it runs a lot better. (Also, sometimes worse, but that's what you get with relying on 3rd party software.) Likewise, an update to something like a graphics driver might fix something your game has been relying on, in which case it's a welcome surprise of better performance. It's rare, but it happens.

5. Hardware improvements. Just because your hardware didn't change doesn't mean the code wasn't updated to make better use of hardware features you already have. This could be done with an explicit change to the game code or be picked up as part of the engine or library updates as in the previous section. In either case, you end up with your hardware better utilized, giving you better performance.

6. Code optimization. If the computer executed the code exactly the way it's written by the programmer, things would run at least ten times slower than they do. With a modern compiler, code is first converted into some sort of internal representation, with the compiler removing anything that's found to be unnecessary and simplifying some loops and function calls. Then the representation is converted into machine code for the particular target architecture, and the compiler removes redundancies, shifts things around to make better use of registers, and may even rearrange the order of instructions to make them fit better into the pipeline. When the CPU executes instructions, it will also convert them into micro-code and potentially re-arrange them to improve execution. Now, the programmers have very little control over any of that, if any. But updates to the compiler and associated libraries can result in better code simply from recompiling the project. Likewise, the way your CPU converts instructions into microcode is subject to firmware updates.

Some of the optimizations also have to be enabled, and again, you'd be surprised how often games ship with some of the code unoptimized. Obviously, if it was a global disable, somebody would notice, but a few unoptimized functions in a core loop can really slow the game down. There are tools that let you examine what's going on with the code. We can look at time spent on specific function calls, how busy individual cores on the CPU are, when various calls to the OS and hardware are made, how much time has been spent waiting for other resources, and so on. But it's still a lot to take in all at once, so learning how to improve your own code and how to look for problems in code handled by someone else is a huge part of being a games programmer.
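Since category 1 is the easiest to show in a few lines, here's a toy illustration (not from any real codebase - the names are made up): the same visibility query done as a linear scan per call, then with a set built once. Same result, wildly different instruction count once the list gets long.

import random

visible_ids = random.sample(range(1_000_000), 10_000)

def is_visible_slow(entity_id):
    return entity_id in visible_ids      # O(n) scan on every call

visible_set = set(visible_ids)           # built once, e.g., per frame

def is_visible_fast(entity_id):
    return entity_id in visible_set      # O(1) hash lookup

assert is_visible_slow(visible_ids[0]) == is_visible_fast(visible_ids[0])
-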
You can start losing surprisingly heavy elements if you dip close enough to the star early enough in the formation, through a combination of heat and gravity. I'll grant you that a planet in that situation is on its last legs before being swallowed outright, and it would take a miraculously well-timed boost from something else falling into the star, but out of all the stars in all the galaxies...

Unless that something else got ejected or absorbed into the primary. Again, we have no idea what sort of dynamics that system could have been going through before settling down. A merging binary star system, for example, can produce an absolutely wild distribution of planets and compositions, yet will otherwise look tame and normal a few billion years after the stars merged. I wouldn't expect to encounter these kinds of systems very often, but if we give the developers the fiat of picking an interesting system to exemplify in the game, none of this goes outside the plausible.

I don't disagree. And it'd be nice to get a nod to that somewhere in the game, pointing out how bizarre it is for a planet to have these properties. I mean, obviously that's why the developers want it in the game. I just don't think it's a bad reason to put a planet like that in the game, so long as it merely stretches the limits of what's physically possible for a real world - after the 10x scale correction, of course. If it was just absolutely impossible, I would be against it too. Merely very, very unlikely is acceptable if it's in the name of good gameplay and an interesting place to explore. I mean, if we start getting technical about it, Laythe is already a good example of a world that's exceptionally unlikely to happen. But it's a fun moon to have in the game, and I'm glad it's there.
-
There are several processes that can strip a planet of lighter elements and prevent it from accumulating helium. The two most likely are spending a lot of time very close to the primary and then migrating out to its current location, potentially by redirecting another planet into the star in the process, or a close encounter with a gas giant early in the formation. Both of these can result in a planet that's basically the core of a gas giant with a thin crust of rocky material on top and an insubstantial atmosphere for that heavy of a planet. Since a larger fraction of the planet's mass is in its iron core, the average density is also a lot higher. There was a bit of a discussion of an exoplanet candidate with similar apparent characteristics a while back. Though perhaps still not quite as dense as Ovin, it shows that such events do happen in star system formation.
-
You have to land on it without crashing first. At 4G, unless the atmosphere is as thick as Eve's, parachutes aren't going to do much good. And I strongly expect it to have a very weak atmosphere based on the images. That planet is going to be a graveyard of broken ships.
-
For Questions That Don't Merit Their Own Thread
K^2 replied to Skyler4856's topic in Science & Spaceflight
Are you just trying to build a chain? Or do you need at least one tower visible from every point on Earth? Even in the latter case, you can come up with a space-filling pattern that's more efficient. Think a branching, fractal-like structure. Simply filling all space with a grid of towers is not the optimal solution if you're trying to build fewer towers.
-
Wasn't there something about the Rask/Rusk situation being reworked because there were problems with n-body? It might be just patched conics throughout the game.
-
Oh, absolutely. What I'm talking about is stuff that's been built directly into the game executable. An actual example from a recent game: we had to generate convex hulls for some geo for a feature involving physics particles for FX. It was a handful of objects with low vertex counts, so it was cheap enough to do the build on demand. But because the library generating the convex hulls was under GPL (or a similar license), the shared code containing that library would not get compiled in for the release version. Which meant we had to move the generation of collision hulls to the builder, and now we had to track the dependencies to make sure that the collision hulls are built for all the right objects and not for all the objects, and then platform-specific dependencies became a factor, and it was all in all a couple of weeks of work just to get the build process for these sorted, whereas the initial proof-of-concept on-demand loading was done in an afternoon.

And yeah, in principle, the entire builder could be released as an open-source project with all of its libraries, separately from the game. (Edit: Or, as you point out, still as a proprietary tool, but with the open-source libraries that contain the actual GPL code delivered separately.) That's not going to happen for that particular game, because that's a whole other can of worms that the studio doesn't want to deal with, but it's not because of any legal restrictions.

My point, however, wasn't about situations where tools are intentionally released by the game developers to be used by modding communities. It was entirely about the situations when tools get left in because they weren't worth the hassle to remove. In the above example, if we could ship the game with that hull generator, we would have. There just was no reason to pull it out of the game other than licensing. And there are plenty of games out there with dev features left in either disabled or simply unused. And when you are trying to ship a game that is easy to mod, sometimes all you have to do is absolutely nothing. If you already have build tools integrated into your executable, you just ship it as is.
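For a sense of how small the on-demand version of that hull generation is, here's a sketch of the same idea using scipy (illustrative only - the actual GPL library in the story is unnamed, and the data here is random):

import numpy as np
from scipy.spatial import ConvexHull

# A "low vertex count" object, as in the anecdote above.
points = np.random.rand(32, 3)

# One call gets you the hull; its vertices can feed a physics collision shape.
hull = ConvexHull(points)
collision_verts = points[hull.vertices]
print(f"{len(collision_verts)} hull vertices from {len(points)} input points")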