
KSP1 Computer Building/Buying Megathread


Leonov


4 minutes ago, XB-70A said:

Having often been among the lowest performers in computer classes, I couldn't answer this question without making a mistake. However, I think you may get even more answers and advice by asking in the KSP Unofficial Official Computer Building/Buying Megathread topic, which is still active :)

Thx. Much obliged.


Good morning all. I have an eGPU question.

I'm trying to downsize from my desktop to just my laptop. The only games I play are KSP and (rarely) Civ VI, so I don't need cutting edge graphics. 

I have a Lenovo ThinkPad X1 Carbon with an Intel i5. It's perfect for when I travel. I tried KSP on it and it's ok on the laptop screen but on my big monitor it crawls. Integrated graphics obviously.

I don't play KSP on trips so I just need something for home. I've been looking at getting an external GPU. Either the Lenovo Graphics Dock (Pros: tiny and quiet. Con: Not upgradeable.) or the Razer Core X Chroma (Pros: upgradeable GPU. Cons: Larger, more costly, and apparently not super-quiet).

Anyone have experience with eGPUs and KSP? Obviously the graphics performance will be orders of magnitude better, but will the laptop CPU become a big bottleneck?


 

 


6 hours ago, Starlionblue said:

Obviously the graphics performance will be orders of magnitude better, but will the laptop CPU become a big bottleneck?

Well, that depends on which CPU you actually have, how well your machine handles heat, which GPU you're going to use, and what resolution you're trying to play at. The GTX 1050 in the Lenovo dock is already outdated and was pretty underwhelming even when it was released. To see whether your CPU is the bottleneck, open the system monitor, then play the game at the lowest resolution with reduced details. Is one CPU core running at full speed? Then you're at the limit.
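Harry's pinned-core test can be expressed in a few lines. A minimal sketch (the function name `core_is_pinned` and the thresholds are illustrative, not from any KSP or monitoring tool); the per-core usage samples come from whatever monitor you prefer:

```python
# Sketch of the "is one core pinned?" check described above.
# Sampling source is up to you (Task Manager, Afterburner, etc.);
# here the per-core usage readings are simply passed in as lists.

def core_is_pinned(samples, threshold=95.0, min_fraction=0.8):
    """Return True if any single core sits above `threshold` percent
    in at least `min_fraction` of the samples.

    `samples` is a list of per-core usage readings, e.g.
    [[100.0, 12.0, 8.0, 10.0], [98.0, 15.0, 9.0, 11.0], ...]
    """
    if not samples:
        return False
    n_cores = len(samples[0])
    for core in range(n_cores):
        hot = sum(1 for s in samples if s[core] >= threshold)
        if hot / len(samples) >= min_fraction:
            return True
    return False
```

Readings where one core sits near 100% while the rest idle would return True, which is exactly the single-threaded physics bottleneck Harry describes.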


16 hours ago, Starlionblue said:

I have a Lenovo ThinkPad X1 Carbon with an Intel i5. It's perfect for when I travel. I tried KSP on it and it's ok on the laptop screen but on my big monitor it crawls. Integrated graphics obviously.

I'm no expert in this field, but I would recommend this video (if you haven't already watched it):

 

When it comes to eGPUs, it would be useful to know what resolution your monitor is, and how much money you are willing to spend.


 
 
 
4 hours ago, xXKSP_playerXx said:

I'm no expert in this field, but I would recommend this video (if you haven't already watched it):

 

When it comes to eGPUs, it would be useful to know what resolution your monitor is, and how much money you are willing to spend.

Thanks for the video. Useful info there.

My monitor is an LG very similar to the curved one shown in the video; resolution is 3440x1440. My desktop has a GTX 770 and has no problem running KSP at excellent frame rates fullscreen at full resolution.

I'd probably be willing to spend US$600-700 to get a decent eGPU solution, of course depending on if that is possible with my current laptop.


 
 
21 hours ago, Harry Rhodan said:

Well, that depends on which CPU you actually have, how well your machine handles heat, which GPU you're going to use, and what resolution you're trying to play at. The GTX 1050 in the Lenovo dock is already outdated and was pretty underwhelming even when it was released. To see whether your CPU is the bottleneck, open the system monitor, then play the game at the lowest resolution with reduced details. Is one CPU core running at full speed? Then you're at the limit.

 

Excellent testing idea. I turned the graphics down to 1024x768 with most of the detail settings off or on "low", then launched a 340-part vehicle. CPU usage by KSP stayed at 30-35% throughout, and everything was nice and crisp. (GPU usage was also quite low, of course.)

(Yes, I know bits are falling off during staging. I think some mod didn't make it through the latest update. :/)

This would indicate that there is quite some CPU headroom and an eGPU could be a workable solution.

Time to go shopping. :D

[screenshot: system monitor during the 340-part launch]

Edited by Starlionblue

1 hour ago, Starlionblue said:

CPU usage by KSP stayed at 30-35%

Maybe the system monitor was a bad idea, because your screenshot doesn't actually show per-core usage. Remember that KSP will in most cases only use one core for any given craft, so a CPU usage reading above 25% on a quad core could already be deep within the CPU limit. You should try that again with a fancier tool like MSI Afterburner.
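The quad-core arithmetic above is worth spelling out. A one-line sketch (`single_core_saturation_possible` is a hypothetical helper, not an Afterburner or Task Manager API):

```python
def single_core_saturation_possible(total_percent, n_cores):
    """An aggregate CPU reading of `total_percent` across `n_cores`
    can hide one fully loaded core whenever it is at least
    100/n_cores. E.g. 30% total on a quad core leaves room for one
    core pegged at 100% while the other three sit near idle."""
    return total_percent >= 100.0 / n_cores
```

So the 30-35% aggregate reading from the screenshot does not rule out a pinned physics core on a quad-core CPU; only a per-core view can settle it.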

Edited by Harry Rhodan

1 minute ago, Harry Rhodan said:

Maybe the system monitor was a bad idea, because your screenshot doesn't actually show per-core usage. Remember that KSP will in most cases only use one core for any given craft, so a CPU usage reading above 25% on a quad core could already be deep within the CPU limit. You should try that again with a fancier tool like MSI Afterburner.

Good info thanks. I'll try to find time for more testing in the next few days.

Either way, it ran nice and smoothly. Night and day compared to high graphics resolution and settings.


18 hours ago, Starlionblue said:

 I'd probably be willing to spend US$600-700 to get a decent eGPU solution, of course depending on if that is possible with my current laptop.

I would recommend the Gigabyte GTX 1070/1080 Gaming Box (which version depends on what you can find for your money, since the price jumps around a lot); it's a graphics card plus enclosure combo. Even the 1070 version is fine for 3440x1440.

[product images: Gigabyte Gaming Box]

I don't see any problems with your laptop, apart from your CPU maybe getting a bit slow when you have large crafts (like 250+ parts).

Edited by xXKSP_playerXx

Moving from another thread:

3 hours ago, kcs123 said:

Slightly off-topic, but each time I purchase a CPU, motherboard, or RAM, I put the PC through a stress test. The simplest and most broadly available free test is the 7-Zip archiver. It has a benchmark function with the ability to choose the dictionary size (which influences RAM usage) and the number of CPU cores used. CPU usage is hammered to 100% most of the time, and I aim to use as much of the RAM as I can (some needs to be left for the OS itself).

And I leave the PC running like that for 1-2 hours. Not only does it allow me to test the hardware for faults, it also checks that the cooling works properly.
For the GPU, if necessary, I use some gaming benchmark to test for faults in VRAM or overheating issues. After that, each PC has worked for years with no issues at all.
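As a sketch of the 7-Zip stress run described above: `7z b` is 7-Zip's built-in benchmark command, and `-mmt` (thread count) and `-md` (dictionary size) are real 7-Zip switches, but exact switch support varies by version, so treat the command line as an assumption to check against your build. The helper names are illustrative:

```python
import subprocess

def build_7z_benchmark_cmd(threads=4, dict_log2=26, passes=10):
    """Build a 7-Zip benchmark command line for a CPU/RAM soak.
    -mmt sets the thread count; -md sets the dictionary size as a
    power of two, so larger values push RAM usage up, as described
    above. Verify the switches against your 7-Zip version."""
    return ["7z", "b", str(passes), f"-mmt{threads}", f"-md{dict_log2}"]

def run_stress(threads=4, dict_log2=26, passes=10):
    # Run the benchmark; loop this for the 1-2 hour soak.
    subprocess.run(build_7z_benchmark_cmd(threads, dict_log2, passes),
                   check=True)
```

Watching temperatures while this loops is what catches the cooling problems kcs123 mentions.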

This is Kerbal Space Program. Physics works slightly differently here. :sticktongue:

Virtual machines like Mono and the JVM make things work differently. I would suggest moving this discussion to another thread, where I will gladly explain how exactly things work under the bonnet, but in a nutshell:

  • The GPU is marginally significant as long as you have a minimally decent one and you are not using any visual add-ons.
  • MOAR VRAM is good, but it's only significant if your install has a lot of new textures.
    • If you blow past the VRAM's capacity, main RAM is used for texturing and the PCIe bus bottlenecks everything.
  • Fewer cores at higher clocks beat more cores at lower clocks.
    • Concurrency in KSP is still incipient; each craft uses its own thread, so the biggest craft bottlenecks the whole frame, leaving the remaining cores idle for the rest of it.
  • The lowest framerate you find comfortable is the best one.
    • This is Mono's and Unity's fault. Each frame generates an awful amount of garbage that piles up until the Garbage Collector needs to act.
    • The faster the framerate, the more frequently the GC has to act.
    • Welcome to stuttering.
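The framerate/GC trade-off in the last bullets is just arithmetic. A toy model (all numbers and names illustrative, not measured from KSP or Unity):

```python
def gc_pauses_per_minute(fps, garbage_per_frame_kb, gc_budget_mb):
    """Toy model of the framerate/GC trade-off described above:
    higher fps means more garbage allocated per second, so the
    collector's budget fills faster and pauses come more often."""
    garbage_per_second_kb = fps * garbage_per_frame_kb
    seconds_per_gc = (gc_budget_mb * 1024) / garbage_per_second_kb
    return 60.0 / seconds_per_gc
```

Under these made-up numbers, 60 fps with 64 KB of garbage per frame and a 64 MB collection budget gives about 3.5 pauses per minute; capping the game to 30 fps halves that, which is why a lower framerate can feel smoother.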

My son runs games on a GeForce 9600 or something with 512 MB of VRAM on an old Xeon 3070 (a PE 850 I hacked into a game rig), and that damn thing IS FAST: easily the best machine in the house for gaming, KSP included (as long as you don't blow past the VRAM).

Edited by Lisias
Moving from another thread

13 minutes ago, Lisias said:

Moving from another thread:

This is Kerbal Space Program. Physics works slightly differently here. :sticktongue:

Virtual machines like Mono and the JVM make things work differently. I would suggest moving this discussion to another thread, where I will gladly explain how exactly things work under the bonnet, but in a nutshell:

  • The GPU is marginally significant as long as you have a minimally decent one and you are not using any visual add-ons.
  • MOAR VRAM is good, but it's only significant if your install has a lot of new textures.
    • If you blow past the VRAM's capacity, main RAM is used for texturing and the PCIe bus bottlenecks everything.
  • Fewer cores at higher clocks beat more cores at lower clocks.
    • Concurrency in KSP is still incipient; each craft uses its own thread, so the biggest craft bottlenecks the whole frame, leaving the remaining cores idle for the rest of it.
  • The lowest framerate you find comfortable is the best one.
    • This is Mono's and Unity's fault. Each frame generates an awful amount of garbage that piles up until the Garbage Collector needs to act.
    • The faster the framerate, the more frequently the GC has to act.
    • Welcome to stuttering.

My son runs games on a GeForce 9600 or something with 512 MB of VRAM on an old Xeon 3070 (a PE 850 I hacked into a game rig), and that damn thing IS FAST: easily the best machine in the house for gaming, KSP included (as long as you don't blow past the VRAM).

I agree with the above, and most of it was already known to me. My digression was more about how to detect faulty hardware, so you can get warranty service if something is wrong while the PC is still new, rather than about performance. So a GPU benchmark utility is good enough to occupy most of the VRAM and keep the GPU busy most of the time for testing when you are unable to watch the screen at all times.

The preferred way is to play KSP or some other game, of course, but that is not always possible. In the end, if the case and CPUs are not hot enough to fry eggs, it has passed quality control :D.


7 hours ago, MDZhB said:

Just bought a ThinkPad T530 for real serious business. Can't wait for it to get here on Thursday! Anyone here a ThinkPad fan? Anything I should know about them?

Love ThinkPads. I've had about 15-20 of them through the years. I've always found TrackPoint easier to use than a trackpad, but maybe that's just habit.

On 6/9/2019 at 11:37 PM, xXKSP_playerXx said:

I would recommend the Gigabyte GTX 1070/1080 Gaming Box (which version depends on what you can find for your money, since the price jumps around a lot); it's a graphics card plus enclosure combo. Even the 1070 version is fine for 3440x1440.

[product images: Gigabyte Gaming Box]

I don't see any problems with your laptop, apart from your CPU maybe getting a bit slow when you have large crafts (like 250+ parts).

Thanks for the tip. The big fan is a plus for noise reasons. However, it still has one of those dinky PSU fans, and I'm worried about a high-pitched whine.


  • 2 weeks later...
On 6/3/2019 at 6:00 PM, xXKSP_playerXx said:

Yep, I'm looking at the 3900X. If you can trust AMD, it should perform like the 9900K in single-threaded workloads like games, which can only use maybe 6 cores at best. I wouldn't really recommend the 3900X if you don't need all the cores; you can go for the 3800X, save yourself $100, and only lose 100 MHz of boost clock (overclocking can fix that). If you do go for the 3900X, I would recommend high-speed memory (3200+), because it is possible that the 12 cores are bottlenecked by RAM speed.

In gaming benchmarks that leaked, it performs slightly under its Intel counterparts: the 3600X trails the 9600K, etc. Where it WILL do well is in heavily threaded workloads.

Prices will be more or less equivalent, with the 3600X costing about $20 less than a 9600K; this holds true going up through the model numbers, aside from the 3900X, which costs exactly the same as a 9900K. But AMD's new motherboard chipsets are more expensive, which kind of negates that slight CPU price advantage.

Since you mention overclocking: that doesn't appear to be something the new Ryzen chips will actually be good at. You might hit 5 GHz for a single-core boost, maybe; it's not a given. And if you want a decent all-core overclock, it's reported that they needed 1.35 V to hit 4.5 GHz on all cores. None of this paints the CPU in a great light as far as overclocking goes.


2 hours ago, _Aramchek_ said:

Since you mention overclocking: that doesn't appear to be something the new Ryzen chips will actually be good at. You might hit 5 GHz for a single-core boost, maybe; it's not a given. And if you want a decent all-core overclock, it's reported that they needed 1.35 V to hit 4.5 GHz on all cores. None of this paints the CPU in a great light as far as overclocking goes.

Interestingly, my FX can hit 4.5 GHz on all 6 cores with 1.35-1.37 V and is stable doing so. But since you have much better IPC, 100 MHz more is still a good improvement.


51 minutes ago, xXKSP_playerXx said:

Interestingly, my FX can hit 4.5 GHz on all 6 cores with 1.35-1.37 V and is stable doing so. But since you have much better IPC, 100 MHz more is still a good improvement.

I mean, yeah, but you'd be hard-pressed to find any 8th/9th gen Intel CPU that won't do at least 5 GHz on all cores.

Mine's sitting at 5.1 GHz at 1.27 V, and it could go higher. In gaming, that's still a nice advantage, as most games value clock speed and IPC over more cores/threads, and it would seem Intel still has a slight edge in IPC and a definite advantage in clock speed, at least from what we've seen so far.

I look forward to seeing them out "in the wild", but I would guess, based on what we know, that not much has actually changed.


6 hours ago, _Aramchek_ said:

but AMD's new mb chipsets are more expensive, which kind of negates that slight cpu price advantage.

You won't need the X570 chipset for these CPUs; they will run just fine on most B350/X370 boards and all B450/X470 ones. You just don't get PCIe 4.0, which won't bring any noticeable performance improvements in the next few years.


4 hours ago, Elthy said:

You won't need the X570 chipset for these CPUs; they will run just fine on most B350/X370 boards and all B450/X470 ones. You just don't get PCIe 4.0, which won't bring any noticeable performance improvements in the next few years.

There's a difference between "working" and working well.

I wouldn't expect the mid-to-higher-end chips to work well on them, and I wouldn't really expect any of the older chipsets to be good for overclocking any of the new chips, due to power draw.

There's a reason why the newer chipsets require much more robust, even active, cooling and themselves draw around double the power of last-generation boards.

Intel isn't stupid either; they just announced price drops, which I think will be the main benefit for everyone stemming from the new Ryzens. I think people were, and in some cases still are, expecting way too much from AMD.

Don't get me wrong, I think it's great that they've gotten much closer, but in gaming they were already 20-25% behind Intel; a 15% IPC increase can't fix that, and didn't. They needed these chips to reliably clock higher than they do.

 

Edited by _Aramchek_

On 6/27/2019 at 4:12 AM, _Aramchek_ said:

a 15% ipc increase can't fix that

[benchmark screenshot: 4.2 GHz Ryzen vs 5 GHz 9900K]

See that? The 4.2 GHz Ryzen beats the 5 GHz 9900K. Doing some quick math, you would get: 3116 for the 4.4 GHz 3700X, 3187 for the 4.5 GHz 3800X, 3258 for the 4.6 GHz 3900X, and 3329 for the 3950X.
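For what it's worth, the "quick math" above appears to be linear scaling of the benchmark score with boost clock, starting from the 3700X figure. A sketch under that assumption (`scale_score` is an illustrative name; real scaling is sublinear once memory becomes the limit, and the quoted 3329 for the 3950X presumably assumes a higher boost clock than the 3900X):

```python
def scale_score(base_score, base_clock_ghz, target_clock_ghz):
    """Linear clock scaling: assumes the benchmark score is
    proportional to boost clock at equal IPC. This is only an
    approximation; memory bottlenecks make real scaling sublinear."""
    return round(base_score * target_clock_ghz / base_clock_ghz)
```

Scaling the 3700X's 3116 at 4.4 GHz up to 4.5 GHz and 4.6 GHz reproduces the 3800X and 3900X figures quoted in the post.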

Edited by xXKSP_playerXx

1 hour ago, xXKSP_playerXx said:

[benchmark screenshot: 4.2 GHz Ryzen vs 5 GHz 9900K]

See that? The 4.2 GHz Ryzen beats the 5 GHz 9900K. Doing some quick math, you would get: 3116 for the 4.4 GHz 3700X, 3187 for the 4.5 GHz 3800X, 3258 for the 4.6 GHz 3900X, and 3329 for the 3950X.

In AMD's own gaming benchmarks, every single one of their CPUs was around 5-10 fps slower than its Intel counterpart in MOST, not all, but MOST scenarios.

The 3600X trailed the 9600K, the 3700X trailed the 9700K, etc., in AMD's very own benchmarks, which were undoubtedly set up to show AMD in the best possible light.

You cherry-picking a single scenario in a synthetic benchmark doesn't prove anything.

I'll say it again: a 15% IPC increase, which is exactly what AMD claims to have achieved, can't possibly make up for a 20-25% difference in gaming.


1 hour ago, xXKSP_playerXx said:

Doing some quick math, you would get: 3116 for the 4.4 GHz 3700X, 3187 for the 4.5 GHz 3800X, 3258 for the 4.6 GHz 3900X, and 3329 for the 3950X.

Based on your own assumptions and nothing else; anyone can just make numbers up. Also, I just ran this on my PC.

[benchmark screenshot]

Edited by _Aramchek_
