JoeSchmuckatelli Posted March 16, 2023 (edited) 6 minutes ago, RocketRockington said: So overall the same, basically? If you're gonna keep going with this tabulation, it'd be easier for people to understand if you tracked the delta, but of course that's up to you. I don't know if I'll continue. There was one area where I did see a ~5 fps improvement: flying around the KSC in a spaceplane. That was noteworthy. But I was surprised to see my numbers were virtually identical. I guess if they ship another performance improvement I should do something like a spreadsheet. Edited March 16, 2023 by JoeSchmuckatelli
Streetwind Posted March 16, 2023 It's not unusual for different systems to see different changes in performance after a patch. If the patch focused on addressing a specific bottleneck - for example, memory usage* - then people who were hard limited by available memory would see a significant improvement, while those who already had memory to spare would see only minor differences. Meanwhile, if shader processing efficiency was increased, then the people hard limited by memory would see little to no change, whereas the others would get higher FPS. So how large any single system's improvement turns out to be always depends on a lot of factors. But averaged across the whole community, we will likely see a noticeable improvement. * Note, that was an example. I don't presume to know what areas the changes in this particular patch addressed.
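The effect Streetwind describes can be sketched as a toy bottleneck model (purely illustrative - the stage names and millisecond figures below are made up, not anything from the actual engine): frame time is roughly set by the slowest per-frame stage, so a patch that speeds up one stage only helps systems where that stage was the limit.

```python
# Toy bottleneck model (illustrative only - numbers are invented, not from KSP2).
# Frame rate is limited by the slowest per-frame stage, so a patch that
# speeds up one stage only helps systems where that stage was the limit.

def fps(cpu_ms, gpu_ms, memory_ms):
    """Approximate FPS as limited by the slowest per-frame stage."""
    return 1000.0 / max(cpu_ms, gpu_ms, memory_ms)

# Two hypothetical systems, before and after a patch that halves memory-stage time:
memory_limited = dict(cpu_ms=10, gpu_ms=12, memory_ms=40)
gpu_limited    = dict(cpu_ms=10, gpu_ms=25, memory_ms=8)

for name, system in [("memory-limited", memory_limited), ("gpu-limited", gpu_limited)]:
    before = fps(**system)
    patched = dict(system, memory_ms=system["memory_ms"] / 2)  # apply the patch
    after = fps(**patched)
    print(f"{name}: {before:.0f} -> {after:.0f} FPS")
```

The memory-limited system jumps from 25 to 50 FPS while the GPU-limited one stays at 40 FPS - the same patch, very different results.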
pandaman Posted March 26, 2023 My current specs are... Quad-core Xeon W3550 @ 3.07 GHz 24 GB RAM GTX 1650 1080p (27") 550 watt power supply It runs KSP2 rather better than I feared it might, but is naturally very, very sluggish on launches etc. RAM and CPU don't appear to struggle, but the (4 GB) GPU is maxed out, as one would expect. I want/plan to upgrade the whole thing to a 'recommended spec' machine at some point, but would rather wait a year or two... Is there a suitable, and worthwhile, GPU upgrade that I can simply plug in as a direct replacement now to tide me over for a couple of years?
Streetwind Posted March 26, 2023 (edited) @pandaman Depends entirely on your budget and PSU. An RTX 4070 Ti is suitable, and a worthwhile upgrade, and will last you a good couple of years. It's also very expensive and may not run in your PC, depending on the power supply. The GeForce 1650 runs just off the PCIe slot it's plugged into, requiring no additional power connectors. That means it puts zero requirements on your PSU. If you upgrade to almost anything else, you'll run into a situation where you'll need to connect additional power connectors. And if your PSU doesn't have them, the card won't run. 550W is a good size, and actually quite oversized for your system as-is; it should be able to power a stronger video card with no issues. Keyword being "should". There are $30 bargain-bin 550W PSUs that burn down if you ask more than 300W from them. There may also be ultra-old PSUs that don't come with the requisite connectors even if the quality is right. So please take a look at your PSU. If you can name the exact model, I can look it up - or you can simply tell us how many 6-pin/6+2-pin/8-pin power connectors you have available, if any. And then name a budget you're comfortable spending. Edited March 26, 2023 by Streetwind
pandaman Posted March 26, 2023 1 hour ago, Streetwind said: @pandaman Depends entirely on your budget and PSU. [snip] And then name a budget you're comfortable spending. Thanks, very much appreciated. My PSU is... Sea Sonic SS-550HT Active PFC F3. As for budget, it really depends on what's compatible and available at what prices. I don't want to spend an awful lot if I'm thinking of upgrading the lot in a year or two, but if a GPU upgrade is viable and keeps me going for longer then I can consider it. Probably looking at around £500 absolute max. A new PC at recommended specs is around £1700 minimum. An RTX 3080 is about half that (is it even compatible?), but if I'm spending that much on the GPU alone then why risk any compatibility issues anyway? The RTX 3050 is around £300, which looks interesting. Is my basic system good enough to warrant a reasonable GPU upgrade? Gut feeling is probably, but I'm no expert.
The Aziz Posted March 26, 2023 The 3050 is weaker than the 2060, which you can probably get cheaper. It's just about the lowest of the low in the RTX lineup. I'd honestly go for either a 2060 if you're going for a budget build (but my own is struggling whenever there's ground in sight, 15 to 20 fps at most, so eh - smooth elsewhere) or, if higher, a 3070; that's what I'm aiming for, once the price goes down that is. A 3080 could last longer, but then there's the price issue. I know nothing about cable requirements though. You probably need one for a newer card; if you have a spare cable that would fit into the GPU's power socket, you should be fine. Another thing worth reading up on is general GPU power requirements - Nvidia has it all on their website.
Streetwind Posted March 26, 2023 (edited) 1 hour ago, pandaman said: Is my basic system good enough to warrant a reasonable GPU upgrade? Gut feeling is probably, but I'm no expert. System memory is great, 24 GB will easily satisfy KSP2's needs. The CPU is... honestly, ouch. That's a Nehalem-based product from 2009. It's from before Intel even started using the "Core i" moniker for its mainstream CPUs, meaning it is a solid thirteen generations outdated at this point. You are not merely below the minimum spec with this, you are deep in "I'm amazed the game actually launched" territory. You're lucky the devs aren't targeting any advanced CPU instruction sets (yet). They could well do that at some point, when they start optimizing in earnest. There are no CPU upgrades for that platform that are worth it anymore. Play with this system as long as you are happy with it (or at least, can bear it), and then do a completely new build. So yeah, video card. That's the thing you can swap. The platform is going to be a bottleneck, so you don't need to buy anything fancy. The important part is 8 GB of VRAM. KSP2 wants tons of VRAM, and 8 GB is what I'd call the minimum for 1080p. The Nvidia 3050 offers this, and is about twice the performance of your current card. There's also the Radeon 6600 XT or 6650 XT (depending on which happens to be cheaper at the time), which would ordinarily be my recommendation, since they often cost only barely more than a 3050 yet yield 40%-50% more FPS. However, I'm going to tell you to take neither. Both of them have a half-bandwidth PCIe interface to save costs. This isn't an issue on modern mainboards, which run PCIe 4.0 or 5.0. Your mainboard, however, is running PCIe 2.0. Even at full bandwidth that'll slightly throttle data throughput to modern GPUs. Halve it, and a VRAM-loving title like KSP2 is not going to be happy. Instead, look for a used RTX 2060 Super 8GB.
Make sure it's the "Super" model, and not the regular 2060. That one only has 6 GB, which is not enough for KSP2 at 1080p (trust me, I have a 6 GB card, it does not suffice). Where I live, these cards can be found used for significantly less money than a 3050 costs new, despite being around 33% faster. Your system is going to be limiting this video card, preventing it from going as fast as it could... but you still won't get better price/performance out of any other 8 GB card, I wager. At least if your area's used prices are anywhere near what they are over here. I could pick up a used 2060 Super for 170€ from a reputable-looking private seller right now, where a brand-new 3050 starts at 270€. A steal, if you ask me. When you do buy a 2060 Super, look at pictures of the card to find out if it has a single 8-pin PCIe connector. If necessary, message the seller and ask. Some third-party OC models shipped with an 8-pin plus a 6-pin, and your PSU won't support that, so you need a model with the default single connector. As it is, your PSU only comes with two 6-pin connectors, but given this is Seasonic - a very good brand - I would be surprised if they didn't include a 2x 6-pin to 1x 8-pin adapter in the box. I hope you still have that box lying around after nearly fourteen years... you'll need it. If you don't, you'll need to buy such an adapter online. Edited March 26, 2023 by Streetwind
LoSBoL Posted March 26, 2023 1 hour ago, pandaman said: Thanks, very much appreciated. [snip] around £300, which looks interesting. Is my basic system good enough to warrant a reasonable GPU upgrade? Gut feeling is probably, but I'm no expert. I think that with your basic system it would be quite a gamble upgrading the GPU; a Xeon W3550 is 14 years old, and that is most definitely holding performance back. The £300 would get you a mainboard, CPU and 32 GB of DDR4 well within the recommended specs, and you can buy a GPU later.
pandaman Posted March 26, 2023 (edited) Thanks for your help and advice everyone. I think my best plan is a 'new build' before too long, and to just plod along with what I have until I do it. The game runs 'ok' in space, but the 'novelty' of 3 FPS when launching things or flying around Kerbin is wearing a bit thin now. Edit... Took the plunge and got a 'recommended specs' machine, knowing it will last me a good few years. Holy cow, the difference is HUGE. I'd got accustomed to KSP1 chugging quite a bit too, so playing in actual 'real time' is taking some getting used to. Edited March 29, 2023 by pandaman
magnemoe Posted March 26, 2023 On 2/27/2023 at 11:50 AM, LoSBoL said: At this point in time, 3 days after release, there is no telling what hardware really is needed to play the game. Sure, you can get yourself a high-performance gaming rig if you have money to spare, but whether the game needs it a couple of months from now is a question nobody can answer. So if you've got a computer that runs other games just fine, my advice would be: keep your money in your pocket right now until there is more insight. Did you already opt to upgrade, and KSP2 is not the only game you're going for? Go ahead, buy yourself a performant gaming rig. You got yourself a laptop with an integrated GPU? That will probably not become playable ever; don't just buy yourself a new gaming laptop - if you've got room to spare, consider a PC in the future, and just stream the game to the laptop you already have. Agreed. More and more games will now drop the previous console generation, or at best come to it later, as Hogwarts Legacy will do. It makes perfect sense, as the coders can work on optimizing and cutting or scaling down while the artists craft new areas and quests for DLC. This might lower the PC requirements for the game, but the PC already has very flexible settings.
magnemoe Posted March 26, 2023 4 hours ago, Streetwind said: @pandaman Depends entirely on your budget and PSU. [snip] And then name a budget you're comfortable spending. I got a 4070 Ti; it was nice for a day, then I shut the system down to strap and redirect some cables that were hitting fans. The system died on powering up: the 750 W Corsair PSU died and killed the motherboard. I assume it's because I started using another of the PSU's 12 V GPU lines, as the 4070 needed three while the old 980 Ti only needed two, and the PSU is probably 4+ years old. I also had the option to acquire a better CPU, but then the cooling failed - or rather, the Corsair water cooler reported internal data that was a lie, so I had to underclock to run. So I got a new water cooler from ASUS which is designed to work with the motherboard, and returned the MB and cooler as they were still under warranty. But why is the CPU package 10 degrees hotter than the CPU?
Streetwind Posted March 26, 2023 (edited) 38 minutes ago, magnemoe said: I got a 4070 Ti; it was nice for a day, then I shut the system down to strap and redirect some cables that were hitting fans. The system died on powering up: the 750 W Corsair PSU died and killed the motherboard. Sounds like the classic issue with the new 12+4-pin 12VHPWR connector that Nvidia started using with this generation. The way it's built, it's really hard to tell if it's properly seated and locked in. And if it isn't, and you just happen to bend the cable the wrong way, some of the pins no longer make proper contact, all the power tries to go across the few remaining contacts, which are not rated for this and melt down, shorting the card and/or PSU. You're not the only one; it happened all over the globe with the launch of the 4000-series cards. Nvidia has tried brushing it off as user error, but behind the scenes there's been a quiet redesign of the ATX v3.0 spec for this connector that makes it much easier to plug in and seat properly... And all this just because Nvidia wanted to avoid having to mount three to four 8-pin connectors on their absurdly unreasonable flagship. (I'm not kidding, that's literally the cited reason why Nvidia worked with the ATX committee to introduce the new connector: it allows for higher power ratings without consuming an undue amount of PCB space.) 38 minutes ago, magnemoe said: But why is the CPU package 10 degrees hotter than the CPU? Different sensors are mounted in different parts of the CPU and measure different kinds of temperatures. Don't worry about it, so long as all sensors stay below their individual safety thresholds. Even then, the CPU is supposed to throttle to prevent overheating. You don't have to memorize all the threshold temperatures - typically, whatever monitoring software you use should warn you when they are exceeded. Edited March 26, 2023 by Streetwind
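The overload mechanism described above is easy to put rough numbers on. A quick sketch, assuming the connector's 600W rating, a 12V rail, and six power pins (illustrative figures for the arithmetic, not measurements of any real card):

```python
# Why a partially seated 12VHPWR connector overloads its pins:
# the total current stays the same, but fewer pins have to carry it.
def amps_per_pin(watts, volts, pins_in_contact):
    """Current through each pin that still makes proper contact."""
    return (watts / volts) / pins_in_contact

seated = amps_per_pin(600, 12.0, 6)    # all six power pins contacting
degraded = amps_per_pin(600, 12.0, 2)  # cable bent, only two pins contacting

print(f"fully seated: {seated:.1f} A per pin")
print(f"degraded:     {degraded:.1f} A per pin")  # triple the per-pin load
```

Going from six contacting pins to two triples the current through each surviving pin, well past what the contacts are rated for - hence the melting.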
JoeSchmuckatelli Posted March 28, 2023 On 3/26/2023 at 4:54 PM, magnemoe said: 750 W corsair PSU died Check out @Nuke's problem with a 750W Corsair. Sounds like either 750W really isn't enough for this generation, or Corsair might have a problem with their 750s. His is a head-scratcher. I'm still thinking it is a problem with properly seated cables... which should not really be a problem. The user should have relative confidence that plug-n-play parts plug in well - and the only way we really have to determine that is the Mark 1 Eyeball. If it looks good, it should BE good. FWIW - I generally like overhead in the PSU department after frying a system with an underpowered one.
Nuke Posted March 28, 2023 (edited) 2 hours ago, JoeSchmuckatelli said: Check out @Nuke's problem with a 750W Corsair. [snip] FWIW - I generally like overhead in the PSU department after frying a system with an underpowered one. im really not sure its the psu's fault at all. so far ive only had a problem in one game, and i can solve it by turning down the settings, at least until they make a better sfx supply. it just doesnt have the headroom id like. i blame gpu manufacturers for putting performance above all costs, efficiency being the big one. im also not putting it beyond just being a ksp2 bug. so far ive only had crashes in the vab, and not the game actual. id be somewhat interested in seeing if i could make it crash in game, but since i currently have no budget for a replacement, i dont think id be doing that. i think the psu is working as intended, opting to shut down rather than to destroy itself is good design. some games have had problems with their ui overloading the renderer, i think starcraft 2 had this problem where the ui would render a lot faster than needed for a ui and could result in cooking your hardware. perhaps ksp2 is the new crysis. that game has melted more psus than any other. Edited March 28, 2023 by Nuke
JoeSchmuckatelli Posted March 28, 2023 58 minutes ago, Nuke said: crysis. that game has melted more psus Never heard of Crysis melting a PSU. GPU? Yep.
Nuke Posted March 29, 2023 (edited) 1 hour ago, JoeSchmuckatelli said: Never heard of Crysis melting a PSU. GPU? Yep. crysis was spikey. its cooked off many an overspec'd underperforming psu. i dumped so much money into that core 2 rig and the thing could heat a room in alaska. Edited March 29, 2023 by Nuke
JoeSchmuckatelli Posted March 29, 2023 (edited) 1 hour ago, Nuke said: spikey. its cooked off many an overspec'd underperforming psu. Gotcha - googled a bit and found a Crysis story of a guy who hadn't upgraded his PSU, where the power demand from the GPU made the PSU pop. Similar story to my several-times-upgraded comp with the original Dell PSU that fried my mobo and proc. But mine is not a Crysis story - it's an underpowered-PSU story, pushing something AAA back in the day. Edit - the whole thing is just weird. Edited March 29, 2023 by JoeSchmuckatelli
Nuke Posted March 29, 2023 (edited) the 7900xt is supposed to be pulling around 307w and the 5800x draws about 190w under load. between the two thats only 500ish watts, and assuming 100w for everything else the 600w should have been good enough, but i opted for an upgrade to 750 anyway to buy the headroom needed to allow for graceful aging of the hardware. unless the mobo, ssds, and the 5 cooling fans (all slim fans too, about 9.12 watts total if going full tilt, that includes the cpu fan) are somehow pulling a lot of power. e: i looked it up and it seems that the 7900xt can spike as high as 400w in some games. Edited March 29, 2023 by Nuke
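The budget above can be sanity-checked with quick arithmetic (the component draws are the ballpark estimates from the post, not measured values):

```python
# Back-of-the-envelope PSU budget using the estimates quoted above.
draws_w = {
    "gpu_7900xt": 307,       # typical gaming draw (estimate from the post)
    "cpu_5800x": 190,        # under load (estimate from the post)
    "everything_else": 100,  # mobo, SSDs, fans - a generous allowance
}
psu_w = 750

steady = sum(draws_w.values())
print(f"steady-state draw: {steady} W, headroom: {psu_w - steady} W")

# Transient spikes are the catch: if the GPU can spike to ~400 W,
# headroom shrinks and a marginal PSU may trip its protection circuitry.
spike = steady - draws_w["gpu_7900xt"] + 400
print(f"with GPU spike:   {spike} W, headroom: {psu_w - spike} W")
```

Steady-state the 750W unit has about 150W of margin, but a 400W GPU spike cuts that to roughly 60W - which is why oversizing for transients, not averages, is the safer call.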
ondrafialka Posted March 29, 2023 (edited) I plan on buying a new laptop this year. I need it mostly for programming work, so I need a powerful processor and lots of RAM anyway, but I'm wondering what laptop GPU is reasonable. I like the Lenovo Legion Pro because of the 16:10 screen, which is useful for programming and reading texts (and I also want a keypad). They come with a variety of GPUs - the RTX 3060 6GB, which I understand may not be enough to enjoy KSP 2 smoothly, and the RTX 3070 8GB, which might be better. I cannot afford anything higher. I am wondering if I should find a good deal on a laptop with a 3070 now, or rather wait for one with a 4060 (which offers slightly better performance than the 3070 according to tests). Or maybe all of these GPUs are still not powerful enough? To sum it up: could you advise me which laptop GPU is a good choice for KSP2? Edited March 29, 2023 by ondrafialka
Streetwind Posted March 29, 2023 The 4060 mobile variant is within striking distance of a desktop 3060 Ti - just about 10% slower. That means the 4060 mobile is above the KSP2 minimum system spec line. It also has the 8 GB VRAM you'll want for what I assume is 1920x1200. If the 3070 mobile is equal or only a little slower than the 4060 mobile, the same statements apply to it too, so either is fine. So, I'd say go for either the 3070 mobile or the 4060 mobile, whichever suits you better in terms of price and availability. I wouldn't take a 6GB model. Even disregarding KSP2, the VRAM needs of games in general have started going up steeply over the past half year, with things like the first UE5 games hitting the market and such. You can expect this trend to continue.
Nuke Posted March 29, 2023 (edited) i was looking at the pinouts for my gpu connectors and i noticed that the only difference between a 6-pin and an 8-pin is that the 8-pin has 2 extra grounds. i think the rule of thumb is 120 watts per 18 gauge wire pair, but im wondering what the advantage is of having an extra pair of grounds. whatever advantage the 12vhpwr gives you, it omits the extra grounds found on dual 8s, and you might as well run dual 6s. frankly im glad amd did not re-invent the wheel on this one and used 8-pins. i think id have done a mini-16, and the i2c lines in the housing, instead of this stupid overengineered (and therefore expensive) composite connector. id rather they just put a power limit slider in the gpu drivers so i can tell it what the power budget is without it needing to talk to the psu. and to think, while they were specifying a stupid new connector to replace something that was already ok, they could have specified new structural members in the gpu mounting hardware to support ever-growing render bricks. Edited March 29, 2023 by Nuke
Streetwind Posted March 29, 2023 (edited) @Nuke The 8-pins have a somewhat different setup than the 6-pins. They load the pins higher and use a second sense pin to keep tolerances tight. The 12VHPWR connector takes this to a whole other level and really pushes the tolerances *hard*. Really, looking at the spec, no one should be surprised that they melted down the moment they were minimally off perfectly seated. I'm honestly not sure I'll ever accept this thing in my case unless it gets revamped at least one more time. A simplified breakdown (the actual spec lists amps, not watts):
- PCIe x16 mainboard slot: 75W
- Each 6-pin PCIe power connector: 75W (2 power, 2 ground, 1 auxiliary, 1 empty)
- Each 8-pin PCIe power connector: 150W (3 power, 3 ground, 2 auxiliary)
- Each 16-pin 12VHPWR connector: 600W* (6 power, 6 ground, 4 auxiliary)
*The 12VHPWR connector usually comes as a 12+4 pin, to give PSU makers a head start with getting them out in time for the first video cards. The 12-pin configuration omits all sense pins that would let the video card communicate with the PSU and negotiate how much power it wants directly. Lacking that capability, max power draw is reduced for safety reasons. Thus, unless both video card and PSU are set up to use all 16 pins on each side, the connector will be limited to 450W max. ...which, as you will notice, is still 50% more current per pin than the 8-pin connector did with two active sense pins! Edited March 29, 2023 by Streetwind
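The per-pin loads implied by that breakdown can be computed directly (assuming a 12V rail and the power-pin counts listed in the post; a rough sketch, not the official spec figures, which are given in amps):

```python
# Watts and amps per 12V power pin implied by the connector ratings above.
connectors = {
    # name: (rated watts, number of 12V power pins)
    "6-pin PCIe": (75, 2),
    "8-pin PCIe": (150, 3),
    "12VHPWR, 12-pin (450W cap)": (450, 6),
    "12VHPWR, full 16-pin": (600, 6),
}

per_pin = {}
for name, (watts, pins) in connectors.items():
    w = watts / pins
    per_pin[name] = w
    print(f"{name}: {w:.1f} W ({w / 12.0:.1f} A) per power pin")

# Checking the "50% more current per pin" comparison from the post:
ratio = per_pin["12VHPWR, 12-pin (450W cap)"] / per_pin["8-pin PCIe"]
print(f"12VHPWR (450W cap) vs 8-pin: {ratio:.1f}x per-pin load")
```

Even capped at 450W, the 12VHPWR pins carry 75W each versus 50W on an 8-pin - the 1.5x figure the post points out, with far less margin for imperfect contact.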
Nuke Posted March 29, 2023 (edited) the pinouts i had were specifically for my supply (in fact the whole sf line). they were intended for cable modders i assume, and lack some details you would find in the full atx spec. everything's either labeled as ground or vcc, and one is not used. i think its up to the psu manufacturer to decide what aux means, and corsair decided it was ground. i would not be surprised if the aux lines on the squid connector just go to an i2c eeprom to fetch the psu spec data. this seems to be all the rage in the tech world; smart anything just means a lookup table. had it not been for that connector, nvidia wouldn't have lost a long-time customer. Edited March 29, 2023 by Nuke
Hlivno Posted March 29, 2023 My specs: GPU: GTX 1660 Super 6GB CPU: i3-10100F RAM: 16GB 3200MHz I haven't purchased the game yet, but I am thinking about it. Do you think I can achieve 30 FPS on low settings at 1080p, or even 720p? This game seems very fun by the looks of it, but I'm very worried my system won't get a playable FPS, and I was hoping one of you with similar specs or a good understanding of the game and my specs could tell me if it's adequate.
Streetwind Posted March 30, 2023 (edited) You will get >30 FPS in space, with only your craft in view of the camera and no celestial bodies in sight. You will also get >30 FPS in the VAB. Maybe even >60, but I'm not sure. On the space center screen, and when launching and landing or looking at a celestial body from orbit for any other reason, you'll tank to between 20 and 30 FPS. That's for 1080p. On 720p you might barely approach 30 FPS in some, but not all, of those bad situations. The reason is the 6 GB VRAM on your video card, and KSP2 using an extremely performance-intensive way to render planet surfaces at the moment. This chokes out any video card with 6 GB or less, especially at 1080p and up, regardless of what graphics settings you choose. I've got a 1060 6GB - same VRAM but a slower GPU - and that GPU isn't even being fully loaded, because the card is desperately out of memory at all times. Your card has a fair amount more memory bandwidth, which will definitely help alleviate some symptoms, but it won't change the fact that the VRAM is too small for the game to run well. So you can do what I'm doing - plan an upgrade in the next months - or hold off on buying the game until a few more patches have gone down the road. Performance already improved a bit going from initial release to the first patch. Intercept Games are also already trialing a replacement for the planetary surface renderer, but I would be surprised if it came out before summer. Such fundamental revamps take time. Your CPU and memory are fine. Both are within the range of published system specs. Edited March 30, 2023 by Streetwind