
I'm worried about the possible system requirements of KSP2



9 minutes ago, GoldForest said:

Also, PS4 and XB1 have been dropped.

Oh, good. But yeah, dev PCs would typically have SSDs regardless, and now people are shifting to M.2 drives on dev machines simply to speed up builds and reloading, even if consoles aren't a factor. The PS4/XB1 dev kit that sat on your desk was usually the place where you'd get to see how the game runs with an older HDD. With these out of the picture, you really don't see mechanical HDD performance outside of QA tests, and that sort of thing tends to fall between the cracks.

Obviously, I can't speak specifically to KSP2 development. They might be a lot better about testing on older drives, but M.2 or some future equivalent as min-spec is probably something we should all be preparing for in PC gaming in general.

2 hours ago, mcwaffles2003 said:

Does a decision really need to be made on whether textures are streamed from the HD vs stored in RAM? Can a game not be made to recognize the available RAM and utilize that as a buffer while also streaming? I'd figure that kind of strategy would be a best of both worlds thing.

Can? Yes. But it's actually a pain in the butt to build and maintain. Any time you have two different code paths depending on spec, one of them will almost never get exercised on dev machines. (We've typically had 64GB+ of RAM for builds, etc.) And when that happens, the unused path invariably breaks.
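For illustration, here's a minimal sketch of the hybrid being proposed: assets stream from disk on a miss and stay resident in RAM up to a budget, with least-recently-used eviction. Everything here (the names, the budget) is made up for the sketch, not taken from any real engine:

```csharp
using System.Collections.Generic;
using System.IO;

// Hypothetical RAM-as-buffer layer over streaming: cache decoded assets
// up to a byte budget (e.g., derived from detected system RAM), evicting
// the least recently used entry when the budget would be exceeded.
class StreamingCache
{
    private readonly long budgetBytes;
    private long usedBytes;
    private readonly Dictionary<string, byte[]> resident = new Dictionary<string, byte[]>();
    private readonly LinkedList<string> lru = new LinkedList<string>();

    public StreamingCache(long budgetBytes) { this.budgetBytes = budgetBytes; }

    public byte[] Load(string path)
    {
        if (resident.TryGetValue(path, out byte[] data))
        {
            lru.Remove(path);   // cache hit: served from RAM, no disk I/O
            lru.AddFirst(path);
            return data;
        }
        data = File.ReadAllBytes(path); // cache miss: stream from disk
        while (usedBytes + data.Length > budgetBytes && lru.Count > 0)
        {
            string victim = lru.Last.Value; // evict least recently used
            usedBytes -= resident[victim].Length;
            resident.Remove(victim);
            lru.RemoveLast();
        }
        resident[path] = data;
        lru.AddFirst(path);
        usedBytes += data.Length;
        return data;
    }
}
```

Which is exactly the problem: on a 64GB dev box the eviction branch essentially never runs, and that's the path that rots.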

A good example is a major optimization we had on a certain MMO, where on the server, map assets could be shared between several instances. Each instance is its own process, but since RAM was a bigger limiting factor than CPU, a single server could run 4-5 dungeon instances so long as it shared "static" assets. Problem is, memory sharing had to be explicitly enabled (I don't know why this wasn't the default), so we'd CONSTANTLY get bugs where someone finally ran the code with shared memory enabled and everything crashed.
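The usual mechanism for this is OS-level shared memory. A toy sketch of the idea, using a named memory-mapped file in C# (the names and sizes are hypothetical, not from any actual MMO codebase, and named maps like this are Windows-specific):

```csharp
using System.IO.MemoryMappedFiles;
using System.Text;

class SharedAssets
{
    static void Main()
    {
        // The first dungeon instance on the machine creates the named
        // mapping and pays the RAM cost once; every later instance opens
        // the same mapping instead of duplicating the data.
        using (var mmf = MemoryMappedFile.CreateOrOpen("mmo_static_assets", 64 * 1024 * 1024))
        using (var view = mmf.CreateViewAccessor())
        {
            byte[] payload = Encoding.UTF8.GetBytes("level geometry + navmesh blob");
            view.WriteArray(0, payload, 0, payload.Length); // written once
            // All instances can then read the blob through their own view
            // without it counting against their private working set.
        }
    }
}
```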


56 minutes ago, mcwaffles2003 said:

What games struggle on a 3070? Are you playing with RTX on?

By 'struggle' I mean anything not over 80fps. The whole purpose of this build was to get above 60fps and have really smooth performance on a bigger screen with a pixel density that did not annoy me. Squad, Tarkov, and MSFS come to mind - but I'm not playing shooters atm, and yes, I like the goodies active. Tbh, though, in most everything I have RTX off b/c it eats performance quite a bit.


20 minutes ago, K^2 said:

Oh, good. But yeah, dev PCs would typically have SSDs regardless, and now people are shifting to M.2 drives on dev machines simply to speed up builds and reloading, even if consoles aren't a factor. The PS4/XB1 dev kit that sat on your desk was usually the place where you'd get to see how the game runs with an older HDD. With these out of the picture, you really don't see mechanical HDD performance outside of QA tests, and that sort of thing tends to fall between the cracks.

Obviously, I can't speak specifically to KSP2 development. They might be a lot better about testing on older drives, but M.2 or some future equivalent as min-spec is probably something we should all be preparing for in PC gaming in general.

I doubt we'll see HDDs phased out of development, purely because no one goes full SSD on a build. People still use HDDs for bulk storage. You can get an 8 or even 10 TB hard drive for the price of a 1 to 2 TB SSD. M.2 drives are nice, but if you're not using PCIe NVMe, the performance difference between M.2 SATA and SATA 6 Gb/s is negligible at best, and that's what most people today are probably using: either M.2 SATA or 2.5" SATA SSDs. NVMe M.2 drives are still quite new, so I don't see a LOT of people having them. Even then, a PCIe M.2 drive can make your GPU's x16 slot run at x8 speed. Yes, it's been shown that x16 vs x8 isn't that big of a performance hit, but some people would still skip the PCIe M.2 slot to keep the frames that x16 provides.


36 minutes ago, GoldForest said:

AND older machines'

They actually used the phrase 'mid-tier'. I think if folks are hoping to keep playing KSP2 on a 2010 laptop... they're gonna need a different box.

On 12/29/2020 at 10:07 AM, mcwaffles2003 said:

Current gen Ryzen processors are killing Intel single-thread performance scores, and the 5600X seems to be the best bang/$

Check this out:

Apparently some really good deals on last gen stuff 'flooding the market'


10 minutes ago, GoldForest said:

I doubt we'll see HDDs phased out of development, purely because no one goes full SSD on a build.

That's what people said about 3D gfx cards back in the late 90s, and then the devs jumped on them, and you just couldn't play any games without having one. Yeah, most games still had software rendering (for a while), but they ran terribly and looked horrible. So everyone sighed and bought graphics cards. At some point, if devs only bother to implement a certain subset of hardware properly, there isn't much you can do but roll with it. If asset streaming in games starts relying heavily on NVMe M.2 speeds, and that's the only configuration that gets properly tested throughout development, pointing out that a lot of people still have older hardware isn't going to help.


1 hour ago, K^2 said:

That's what people said about 3D gfx cards back in the late 90s, and then the devs jumped on them, and you just couldn't play any games without having one. Yeah, most games still had software rendering (for a while), but they ran terribly and looked horrible. So everyone sighed and bought graphics cards. At some point, if devs only bother to implement a certain subset of hardware properly, there isn't much you can do but roll with it. If asset streaming in games starts relying heavily on NVMe M.2 speeds, and that's the only configuration that gets properly tested throughout development, pointing out that a lot of people still have older hardware isn't going to help.

Sorry, I meant HDDs won't be phased out anytime soon. Everyone relies on HDDs, even today. I will admit they're being used less and less in the consumer market and more in the enterprise market, but consumers still use them.

NVMe M.2s won't be mainstream for another few years. Like I said, they're still fairly new, and they won't be the standard for a while. If Intercept Games is testing KSP 2 on anything, it's HDDs and/or SATA SSDs.

And tbf, I feel storage drives are not too much of a factor when it comes to testing. Not today, at least. Sure, they might test load times to make sure they're within expected results, but really, people use a whole host of storage options, some of which the devs won't think of. Something like booting the game off a USB 2.0 thumb drive. Ridiculous, yes, but I have heard of people storing their game libraries on thumb drives. Then you have portable HDDs and SSDs that use USB 2.0/3.0 interfaces. You have PCIe SSDs, PCIe M.2s, SATA SSDs, SATA M.2s, USB thumb sticks, USB external drives (both HDD and SSD), etc. There's no way to account for all of them in development in a realistic manner, so really, an HDD is the best way to test performance: a middle ground between SSDs and USB drives. At least imo.

HDDs today have far better performance than HDDs of the 90s or even 2000s. 


4 minutes ago, GoldForest said:

If Intercept Games is testing KSP 2 on anything, it's HDDs and/or SATA SSDs.

In QA, yes, but that matters nowhere near as much as what the devs are actually going to be working on. This is about the processes within an actual studio. It's about how we make games in the real world. In a perfect world, there are constant tests on the entire spectrum of target hardware, both human and automated, and any issues are promptly recorded, reported, and scheduled for both the engineering and content teams to rectify.

In reality, I have not seen this happen in a single studio I've worked at, nor have I heard of this perfect-world scenario from any co-workers. And I've worked on everything from start-ups to decades-old gaming giants, having shipped games with budgets well into 9 figures. The reality of what we do in the industry is that the game is tested on the hardware that sits on the developers' desks. This is why having everyone work with a console dev kit is a huge deal, even if you do expect the game to ship on PC first. I don't know if Intercept has a PS5 and/or XBSX on every dev's desk, but regardless, since the PS4/XB1 are out of the picture, the slowest drive the developers will have to deal with is going to be in their work desktop. And we really want to have the fastest SSDs available. Dev studios started switching to NVMe M.2 drives on dev machines basically as soon as they became reasonably priced at 512GB+. The reason is that we sometimes have to reload fifty times in a day. If the game takes a minute longer to load from a slower SSD, that's an hour of your work day wasted. A $1000 M.2 drive and a compatible mobo are still cheaper than the engineering time you'd lose in a year to even a fast SATA SSD.

Is this an ideal situation from the perspective of reaching your markets? No, not at all. But if we were simply targeting the hardware people have, we'd still be making PS2 games. The reality of the gaming business is that it's not just that flashy tech sells. There is a vicious feedback loop where the engineers have to work on much better hardware than min spec, and because of that, the games end up running like garbage on min spec most of the time. No matter how much QA complains, there are usually higher-priority problems, like performance hitches or outright crashes, and these have to be solved before we can work on optimizing performance for the min spec. So with every release, the bar for what's acceptable gets pushed further out, even if there was no hype from the gamers themselves to buy flashy new hardware. This cycle is a big part of why gaming hardware becomes obsolete, and right now is the time for mechanical drives to die.

In truth, even the SATA SSD should have killed the mechanical HDD, but the part in all of this that might not be obvious is the relationship between game developers and console companies. If we want to ship on MS and Sony systems, we have to follow their license. And the license says that we have to support all of the "current gen" hardware. The exact SKUs change over time, but until recently the list included every version of the PS4 and XB1, up through the Pro and the X. Those SKUs still shipped with mechanical HDDs, meaning we still had to support them. With gen 9 finally here, this requirement is dead. Game developers are no longer required to support HDDs. At all. There's going to be push-back from marketing, but it's going to be overwhelmed fast.

By the way, if you're wondering, it's the same reason why ray tracing is gaining so much traction. The visual quality is not that much better, if at all, compared to a good light-probe implementation. But the light probes can take hours to build, and you have to rebuild them any time you change level geometry. I can wait for that to go through the builder, or I can just disable the probes and run with ray tracing, which is instant. So any game that has RT support is getting more love and polish on RT than on probes, and this is only going to get worse. Yeah, it's expensive having all of these extra RT cores, but it saves us a ton of money making the game, so we're going to keep pushing this tech until everyone agrees that RT is just a standard requirement now and you have to have an RT-capable graphics card to play games. Even cheaply-made asset flips.

So yeah, this is the direction things are going. Of course, we're still all going to use HDDs for bulk storage for years to come. But games will have to be installed on something a lot faster. And we're likely going to see a push to get better DMA hardware on mobos as well in the coming years, to get some of the same on-the-fly decompression features that we see on PS5 and XBSX.

 

Sorry about the giant wall of text and a bit of a detour into game development in general. Hopefully, this isn't too off topic.


@K^2 I don't think it's off topic at all. I mean, the thread is about KSP 2 requirements, which means hardware.

Anyway, to reply to your post:

I can't really argue with you there. NVMe is the way of the future, and devs might have to support it for consoles, but PCs still use HDDs, and devs have to take that into account as well. KSP 2 is a PC game first; consoles, while not an afterthought, are definitely a secondary concern. Intercept stated as much. With PC being the main focus (or possibly their only focus atm), they would have to keep HDDs in mind. Most 'mid-tier' gaming machines are budget-oriented systems that go for a very small SSD for the OS and a large HDD for game storage. They want to improve performance on 'mid-tier' gaming machines, which kind of necessitates a small focus on hard drive performance.

Of course, this can be handled in optimization, which it most certainly will be, but I doubt we'll see an NVMe SSD requirement on the game's Steam page.


I'd hate to learn that KSP2, when it's done, loads faster from my PS5's NVMe than from my PC's NVMe. It has to be optimized for such storage from the start. But then, PS5 games that load in a few seconds can also run from PS4 HDDs, albeit with much slower loading. So both can be done.


1 hour ago, The Aziz said:

I'd hate to learn that KSP2, when it's done, loads faster from my PS5's NVMe than from my PC's NVMe. It has to be optimized for such storage from the start. But then, PS5 games that load in a few seconds can also run from PS4 HDDs, albeit with much slower loading. So both can be done.

Well, they might just do that. PS5 and XBSX both have DMA. Of course, that matters little because... well, the CPU and GPU are practically one and the same... so... yeah...

AMD needs a Ryzen 5000 (or select 3000) CPU and a Radeon 6000 or higher GPU for this to work. Of course, the CPU and GPU still need to talk, but their "Smart Access Memory" increases performance by letting the CPU address the GPU's entire memory over PCIe, instead of one small 256 MB window at a time.

NVIDIA has DMA tech, but it looks like it might be just for their server clients atm. No GeForce implementation that I can find.

So, unless you're rocking the latest AMD CPUs and GPUs, you're SOL on the PC platform. 


22 hours ago, K^2 said:

We're honestly moving away from the assumption that primary storage is going to be a mechanical HDD. Most modern games are developed with consoles as the primary platform, as they tend to be the cause of performance bottlenecks, and gen 9 consoles are absolute monsters as far as streaming data from SSD to RAM goes. On top of that, most studios will have very fast SSDs on dev machines to speed up workflows, so with gen 9 development, even if somebody is testing their builds on PC, there is no longer a point in the development cycle where you're going to be streaming from a slow HDD. There's a bit of a time lag until these kinds of studio changes start impacting the actual games, but we should be just about entering a generation of games that are developed entirely with an "Oops, we forgot to test the streaming performance with HDD before going into beta. Oh well," kind of attitude. Because of that, I fully expect M.2 to start showing up as part of the system requirements on new games.

Will this impact KSP2? I don't know. The PS4 and XB1 are still listed as targets, but with the release having been delayed as much as it was, and with these consoles having such major CPU bottlenecks, I am not fully confident that Intercept has been doing a lot of tests on these. My honest expectation is that we'll probably see at least the planets shifted to streaming. There's so much more data for the terrain in KSP2 compared to KSP that I'm having a hard time believing the team would opt to keep it all in memory. On top of that, they seem to be using a lot of techniques common to open-world games, which rely heavily on streaming. So planets should be covered. But that does still leave ships with custom parts, and those might still have to be resident in RAM, so part mods might still have just as much RAM impact as they do today. More, actually, since there will be more material textures.

Also, unlike the CPU or GPU, this only adds loading time, and you can combat it with more RAM. I have 64 GB for other reasons, and I moved Elder Scrolls Online, an MMO in the Elder Scrolls universe, from an HDD to an M.2 SSD and did not notice any real difference. It's an MMO, so you have to load lots of stuff while traveling around, including outfits for other players, but as the game is 90 GB, I could comfortably hold half of it in memory, and the other half is probably NPC dialogue. Now, on PS4 with an HDD, loading screens are an issue in that game.

As for terrain, you use layers, which games have been using for years. Unlike MS Flight Simulator, KSP 2 doesn't have to make a location look like London or Madrid. You have the color, biome, and height maps, and then you use an RNG with a fixed seed to add rocks and minor stuff like ground texture based on biome and elevation. I've used this myself to add roughness to a too-plain texture: just add some random noise.
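Something like this, roughly; a toy sketch of the seeded-scatter idea (the biome IDs and thresholds are made up). The point is that the same seed always reproduces the same placements, so nothing has to be stored on disk:

```csharp
using System;

class TerrainScatter
{
    // Deterministic scatter: the same seed always yields the same rocks,
    // so placements are regenerated on load instead of being stored.
    public static void PlaceRocks(int seed, float[,] height, int[,] biome)
    {
        var rng = new Random(seed);
        for (int x = 0; x < height.GetLength(0); x++)
        {
            for (int y = 0; y < height.GetLength(1); y++)
            {
                // Made-up rule: biome 2 ("highlands") above a height
                // threshold gets a rock with 5% probability.
                if (biome[x, y] == 2 && height[x, y] > 0.4f && rng.NextDouble() < 0.05)
                    Console.WriteLine($"rock at ({x},{y}), elevation {height[x, y]:F2}");
            }
        }
    }
}
```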


16 hours ago, K^2 said:

Can? Yes. But it's actually a pain in the butt to build and maintain. Any time you have two different code paths depending on spec, one of them will almost never get exercised on dev machines. (We've typically had 64GB+ of RAM for builds, etc.) And when that happens, the unused path invariably breaks.

A good example is a major optimization we had on a certain MMO, where on the server, map assets could be shared between several instances. Each instance is its own process, but since RAM was a bigger limiting factor than CPU, a single server could run 4-5 dungeon instances so long as it shared "static" assets. Problem is, memory sharing had to be explicitly enabled (I don't know why this wasn't the default), so we'd CONSTANTLY get bugs where someone finally ran the code with shared memory enabled and everything crashed.

I don't get it; servers in an MMO don't care about graphics at all, outside of catching people trying to cheat by passing through obstacles. Yes, you need to handle NPCs, including enemy creatures, but WoW has servers with thousands of players. For something like Stadia your comment makes sense, but not for a pure game server.
 


21 minutes ago, magnemoe said:

As for terrain, you use layers, which games have been using for years. Unlike MS Flight Simulator, KSP 2 doesn't have to make a location look like London or Madrid. You have the color, biome, and height maps, and then you use an RNG with a fixed seed to add rocks and minor stuff like ground texture based on biome and elevation. I've used this myself to add roughness to a too-plain texture: just add some random noise.

It's a little bit more complex than that, because you build geo out of textures, and how you handle boundaries between LoD tiles gets a little involved. There are also non-trivialities about how textures are mapped, especially once you get into virtual texturing. As for procedural placement, the technique used in KSP2 is some sort of evolution of Horizon Zero Dawn's approach. I've linked a couple of videos in a spoiler below if you're interested in a deeper dive.

Spoiler: (embedded video links)

13 minutes ago, magnemoe said:

I don't get it; servers in an MMO don't care about graphics at all, outside of catching people trying to cheat by passing through obstacles.

Collisions run on the server. Not only to stop players from cheating, but also because mob AI and movement runs on the server, and it can also be relevant to some power effects. So the entirety of the level geometry, nav beacons and meshes, various event scripts, FSMs for the AI... There are a lot of game assets the server needs access to. Obviously, without textures, animation, and audio, you get to cut a lot of the memory footprint compared to the client, but it's still a non-trivial amount of data. If you duplicate it across the multiple server processes running on the same physical server, it eats significantly into RAM you can use for game state, meaning you can run fewer game instances on the same physical server. When memory sharing lets you run 5 instances of a dungeon instead of 4 on the same machine for the same $$$, you spend the effort to implement memory sharing.

There's also a bonus that spinning up new instances becomes a little faster. But that honestly wasn't the economic driving force behind maintaining this feature.


It is unlikely that the hard drive will limit the performance of the game. The game has about 500 parts, as far as I remember, and the planets are covered with the same textures. The textures aren't all 4K, and the models are not too complex compared to CP2077. All of this will easily load into RAM. But calculating the interactions of a pile of parts, or drawing the graphics, is much harder.


KSP1 has less and it still takes a moment to load all of that into memory, even on an NVMe drive. And with mods... Devs have been saying how you could go make a coffee during long burns, but it's the same story with starting the game. Yes, the save opens immediately, but I'd rather have the game load only what it needs at the moment and not wait too long at any point. I mean, come on, KSP1 lists 8GB RAM as the minimum and, sure enough, it takes most of it. KSP2 will need at least double that, but why would it load a high-res surface texture of a planet I haven't seen yet? Just stream it when needed.

 

Also, I appreciated the random HZD moment; I'm pretty sure I've heard/read something about it very recently, because they use the same method in their latest game.


8 hours ago, The Aziz said:

KSP1 has less and it still takes a moment to load all of that into memory, even on an NVMe drive.

I think this is not loading from the hard drive to RAM, but unpacking the game assets. At startup, my drive shows traffic at 5-10 MB/s, while the 7700K processor sits at 50% load.


3 hours ago, Alexoff said:

I think this is not loading from the hard drive to RAM, but unpacking the game assets. At startup, my drive shows traffic at 5-10 MB/s, while the 7700K processor sits at 50% load.

And that can be greatly improved by packaging the assets differently. I really hope Intercept has done something about that.


19 hours ago, K^2 said:

And that can be greatly improved by packaging the assets differently. I really hope Intercept has done something about that.

Is it possible? I thought it was because of the game engine. In any case, a long initial load is a fairly common thing; only Windows loads quickly. If every switch between the hangar and the launch pad, and even between craft, became faster, that would be a great improvement to the game. We load the game once, but we switch back and forth many times, and each time it's pretty boring to wait. And open-world games that don't require constant map loading are everywhere now.


2 hours ago, Alexoff said:

Is it possible?

The answer to "Can this be done in Unity if it can be done in a custom engine?" is always yes. But, of course, the real question is "Is it possible with the time and resources?" And that's a much more relative question.

I have not analyzed how the KSP loading works in a while. I'm mostly going off the fact that the current loading is still CPU-bound, not IO-bound, so I don't expect much has changed from the early days. One thing that used to take up a lot of loading time was part config parsing. This is absolutely something you can fix in Unity if you're prepared to make some sacrifices. The standard C# way is to make sure that the classes holding your configs are public POD classes marked as serializable; you can then dump their contents to a file, or read them back. The sacrifice is having to build assets for every platform and every version from scratch, and if there's a major change from version to version, like a new field in one of your configs, you may end up with an offset change that results in a large patch size. Still, even if some patches have to contain the entire binary dump of the configs, it's not the worst thing in the world, and you're probably doing clean builds of your project for every platform and version anyway. At least, you really ought to be.
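As a minimal sketch of what I mean (the class and fields here are hypothetical, not KSP2's actual config layout): dump the POD fields in a fixed order and read them back with no text parsing. The fixed order is also exactly why a new field becomes an offset change and a big patch:

```csharp
using System.IO;

// Hypothetical POD config class. Fields are written and read in one
// fixed order, so loading is a straight copy with no parsing.
[System.Serializable]
public class PartConfig
{
    public string name;
    public float mass;
    public int crewCapacity;

    public void Write(BinaryWriter w)
    {
        w.Write(name);
        w.Write(mass);
        w.Write(crewCapacity);
        // Adding a new field here shifts every record's offsets,
        // which is the large-patch caveat mentioned above.
    }

    public static PartConfig Read(BinaryReader r) => new PartConfig
    {
        name = r.ReadString(),
        mass = r.ReadSingle(),
        crewCapacity = r.ReadInt32()
    };
}
```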

Similar techniques can be applied to loading things like geometry and textures. Unity allows you to lock the relevant buffers for read/write operations, meaning you can go as far as grabbing the mesh you have in GPU RAM, dumping it into a binary blob, and saving it to disk. (You can also compile these yourself, which is a better option, but that's a tangent.) Then you read just these binary blobs into memory when the game starts, and you build the mesh for the scene when it becomes relevant by doing exactly the same operation in reverse: construct a blank mesh object in Unity, lock its buffers, copy the relevant data from RAM, and unlock the buffers. I've done something similar to load 3rd party assets in a 3rd party format that Unity doesn't know how to parse, without needing a converter. It's not the most pleasant code and API to work with, but it's there, it works, and it is stable and performant if you write your loaders right. There might also be shortcuts in modern Unity that make this a little cleaner and require less engineering work.
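Roughly like this; a simplified sketch using Unity's high-level vertices/triangles accessors rather than the raw buffer locking I described, so it shows the idea rather than the fast path:

```csharp
using System.IO;
using UnityEngine;

public static class MeshBlob
{
    // Dump a mesh's buffers into a raw binary blob: loading it back
    // needs no parsing, just a straight copy.
    public static void Save(Mesh mesh, BinaryWriter w)
    {
        Vector3[] verts = mesh.vertices;
        int[] tris = mesh.triangles;
        w.Write(verts.Length);
        foreach (Vector3 v in verts) { w.Write(v.x); w.Write(v.y); w.Write(v.z); }
        w.Write(tris.Length);
        foreach (int i in tris) w.Write(i);
    }

    // The reverse: construct a blank Mesh and copy the blob straight in.
    public static Mesh Load(BinaryReader r)
    {
        var verts = new Vector3[r.ReadInt32()];
        for (int i = 0; i < verts.Length; i++)
            verts[i] = new Vector3(r.ReadSingle(), r.ReadSingle(), r.ReadSingle());
        var tris = new int[r.ReadInt32()];
        for (int i = 0; i < tris.Length; i++) tris[i] = r.ReadInt32();

        var mesh = new Mesh();
        mesh.vertices = verts;   // vertices must be assigned before triangles
        mesh.triangles = tris;
        mesh.RecalculateNormals();
        mesh.RecalculateBounds();
        return mesh;
    }
}
```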

Finally, there's the question of the time it takes to switch between different game screens, and I believe that's because they are actually different Unity scenes, and switching between scenes takes a while. If that's the reason, the standard solution is to not switch scenes. It shifts a lot of responsibility for managing resources from Unity onto the developer, but this is precisely how most open world games used to solve this problem. You don't rely on Unity; you do your own loading or streaming as necessary. Maybe recent versions of Unity provide a better way to do this, but at least the option is on the table either way.
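In modern Unity, one way to get most of this without fully hand-rolling resource management is additive scene loading: keep one persistent main scene and swap screen-specific scenes in and out around it. A minimal sketch (the scene names are made up):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// One persistent "main" scene stays loaded; screens like the VAB or
// flight view are loaded additively on top and unloaded when left, so
// switching never tears down shared resources.
public class ScreenSwitcher : MonoBehaviour
{
    // Usage (hypothetical scene names): StartCoroutine(SwitchTo("VAB", "Flight"));
    public IEnumerator SwitchTo(string next, string previous)
    {
        AsyncOperation load = SceneManager.LoadSceneAsync(next, LoadSceneMode.Additive);
        while (!load.isDone)
            yield return null; // keep rendering; no blocking load screen
        if (!string.IsNullOrEmpty(previous))
            yield return SceneManager.UnloadSceneAsync(previous);
    }
}
```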

How much of this can we reasonably expect in KSP2? I think serializable configs should have been on everyone's mind, because you use similar techniques for networking: you have to take the classes holding the ship states and turn them into something you can send across the wire. There are different ways of doing it, but some kind of serialization is going to be a part of it anyway, and you might as well recycle the tech for resource loading.

Looking at the planet tech, I can also say with certainty that Intercept has people who know how to make fast resource loading a thing. So the capability is definitely there, and we're just looking at whether they had the time to do it. My guess would be that delays on some other game systems and multiplayer will hit the gameplay and multiplayer engineers a lot harder, giving the engine tech people some slack to make additional improvements. So hopefully, this is one of the things that made the cut. If they can stream planets, they ought to be able to stream ships.

Finally, scene switching is probably the most painful part. It's not so bad if you designed the game that way from the start, but if Intercept already had the VAB/Hangar as its own scene, separate from the rest of the game, it might be too time-consuming to fix once the game became big enough to start causing loading delays. It's very hard for me to say if it's something they'd be willing to spend time optimizing given the development timeline.

And again, since I kind of expect the feature work during early access to fall on the gameplay and multiplayer engineers, I fully expect additional engine work to be targeted at performance, both runtime and loading. So what we get day one and what we get when the game gets its full release might be very different.


22 hours ago, K^2 said:

So what we get day one and what we get when the game gets its full release might be very different.

But the release version will be much larger than the early access version. KSP version 0.90, for example, absolutely flew on my old computer, but the modern version slows down on a new and much more powerful one.


1 hour ago, Alexoff said:

But the release version will be much larger than the early access version. KSP version 0.90, for example, absolutely flew on my old computer, but the modern version slows down on a new and much more powerful one.

We've had some major planet quality overhauls since 0.90, and the part count and fidelity exploded. These things should be close to final in KSP2's early access. The only part that's going to be big, new, and add a lot of additional resources will probably be colonies and space stations. I'm a bit concerned about these too. The part that makes me somewhat optimistic is that this shouldn't be a surprise for anyone, so hopefully it's something the team thought about a lot and has a plan for, to make sure there is no (significant) performance impact. Colonies can probably be folded into the terrain tech, for example. I've never had to make an engine support a full planet where players can build a base anywhere they want, but I have had to support an engine that handles dozens of square kilometers of complex terrain with various environments and a team of artists who might decide to add buildings anywhere they want. And in my experience, what works is treating terrain, buildings, and randomly placed objects (trees, bushes, rocks...) as parts of a whole at the high level, assigning them to LoD tiles, etc., and only then taking specific low-level optimizations where you can (terrain being a height map gives you some neat shortcuts, for example) once you have the high level taken care of. If that's the approach taken, I don't think colonies should present any overwhelming obstacles to keeping the load times down.
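To make the "everything goes into LoD tiles" point concrete, here's a toy sketch (the tile size and object IDs are made up): terrain chunks, colony modules, and scattered props all hash into the same grid, so the streaming system only ever has to reason about tiles, not object types:

```csharp
using System;
using System.Collections.Generic;

// Toy LoD-tile registry: every world object, whatever its type, is
// bucketed into a grid tile keyed by integer tile coordinates.
class LodGrid
{
    private const float TileSize = 256f; // made-up tile size in meters
    private readonly Dictionary<ValueTuple<int, int>, List<string>> tiles =
        new Dictionary<ValueTuple<int, int>, List<string>>();

    public void Add(string objectId, float worldX, float worldZ)
    {
        var key = ((int)Math.Floor(worldX / TileSize),
                   (int)Math.Floor(worldZ / TileSize));
        if (!tiles.TryGetValue(key, out List<string> list))
            tiles[key] = list = new List<string>();
        list.Add(objectId); // a colony hab, a tree, or a rock: same path
    }

    // Everything the streamer needs for a given tile, in one lookup.
    public IReadOnlyList<string> Contents(int tileX, int tileZ) =>
        tiles.TryGetValue((tileX, tileZ), out List<string> list)
            ? (IReadOnlyList<string>)list
            : Array.Empty<string>();
}
```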

Stations are a bit trickier. In a lot of ways they're like colonies, but you're much more likely to be zipping past one at 5km/s. And while that's obviously a problem with KSP stations already, those have always been small enough that you can gloss over it. (E.g., collision usually doesn't even work, as it's only checked once per frame in KSP.) With KSP2, stations can get much larger, so you can't just pretend they're not there if they haven't had time to load yet... So I don't really know what the plan is. One way to address it is to expand the LoD approach to space as well, with space divided into some form of sectors. SoIs are a good starting point, but you'd need to subdivide them further, and then start loading things in advance as you move from sector to sector... *shrug* Hopefully, people who have been thinking about it for a few years, rather than five minutes, have better solutions. :D

Ah, right, almost forgot: Interstellar. Yeah, there are additional challenges with that one as well. But it really comes down to all the same problems you have to deal with when you have one star system. If you manage to load your resources without a CPU overhead and your planets stream, well, you don't have anything to worry about. If you already have some loading hitches in Kerbol, they'll get roughly twice as bad when you add Deb Deb. Fingers crossed!


4 hours ago, MechBFP said:

Don’t forget that KSP 1 load times are tied to frame rates as well. A lot of people are unknowingly handicapping their load times as a result of that. 

I beg your pardon?

KSP 1 load times are tied to the frame rate!?

