
I'm worried about the possible system requirements of KSP2


Recommended Posts

 

On 12/26/2020 at 3:05 AM, Kerbal Productions said:

My laptop's processor is an i3-1005G1, which is reportedly stronger than the ones in my other laptops here, an i3-8145U and an i3-10110U.

I'm worried: will my laptop be enough to run KSP2 as it is today? ...

To stop worrying about it, just assume the minimum system specs call for a Ryzen 9 7950X3D.

Yes, I know this is an older post. It's just that I find some of the people who stress about it semi-amusing.

I tend to respond to them by taking the specs of a not-yet-released processor and doubling the number of cores for every year the game's launch is still out.

 

Somehow, given that Starfield is now going on a year and a half delayed from the first time I did that, you can actually get at least one card close to what I predicted back then by next year... Oops.

Edited by Drakenred65

I feel that most developers now are running up against a wall with GPU pricing as well.

While there are cutting-edge things that could go in the game, likely 70% of their playerbase cannot afford GPUs with the newest specs. Likely 40% or more cannot afford GPUs of even the second-to-latest generation.

Right now, nVidia has just released the 40 series. They're selling at over $2,500 retail, over $1,000 more than MSRP. They're also setting up the 50 series for a 2023 release, which will see GPUs skyrocket into the $4-5k range for a single card. Neither the 40 series, nor the 50 series about to release next year, will be available for regular consumers at reasonable prices until 2024/2025. Part of nVidia's strategy has been banking on this. By the time the 30 series hits the $600-700 price range for the really good iterations of it, we won't blink twice at paying it, overpriced as that may actually be, because we're seeing consumer extortion so blatantly paraded with the 40/50 series prices that it seems like a screaming deal. (It isn't.) About that time, the 30 series will drop below its original MSRP, to become 'affordable' for the vast majority of folks needing to upgrade.

This means that, in designing a game for mass reach, you've got to keep it low enough to run at least 3-4 years behind the current technology curve, because that's where most consumers really are.

So don't worry about it being compatible with your system. If they're developing the game in a smart way, it will be. Only a foolish developer would try to create a game that locked out folks playing on 'older' systems. It's a bit of an incorrect idea that new technology means a new system. A 'newer' computer system, spec-wise for the common consumer this year, would be a 10th/11th gen processor with a 2070/2080 graphics card. "Cutting edge" cards like the 30 series or 40 series are simply not commercially available for most people yet, and won't be for another year. They've been printed, and they're on the market the way a luxury mega-yacht is on the market. No one buys it, and no one plans on having one any time soon.

Truly - the newer cards and processors coming out are a lot like science articles saying "We've finally achieved fusion!" Well, technically, yes, they did get a positive fusion reaction... but they still haven't figured out how to make it commercially viable and available. The graphics-card and processor comparison holds: just because gen-13/14 processors, or 40/50 series cards, exist doesn't mean they're available or commercially viable for the market to digest yet. We read articles about fusion and dismiss them as not relevant to anything important in the next 20 years, because fusion is still this nebulous thing no average person will interact with in any way - and that's what new graphics cards are like to the market. The 40 series is out, but I wouldn't bank on getting one for another year or two, because the 50 series needs to knock the price of the 40s down quite a bit before it's approachable.

(I've never understood the computer-gear market. In any other industry, if you made a new product to release to only 10% of your consumer base, through price exclusivity, your model would fail and you'd be out of business - but somehow, it works with GPUs and processors. It baffles me.)

So game developers will still be developing for systems running on 9th/10th gen processors, and 10- and 20-series graphics cards, as their 'Full Graphics' capability. No one wants to buy a game they cannot fully enjoy without spending a house down payment on a new computer.

Edited by Bosun

16 minutes ago, Bosun said:

I've never understood the computer-gear market. In any other industry, if you made a new product to release to only 10% of your consumer base, through price exclusivity, your model would fail and you'd be out of business - but somehow, it works with GPUs and processors. It baffles me.

It's because there will always be those consumers who will pay the "bleeding" edge tax generation after generation just to say they have the latest and greatest rig. That's the type of consumer the CPU and GPU manufacturers are banking on. Not the consumer who saved for a long time to buy "bleeding" edge gear every five to ten years or so.


27 minutes ago, Bosun said:

I feel that most developers now are running up against a wall with GPU pricing as well. ... So game developers will still be developing for systems running on 9th/10th gen processors, and 10- and 20-series graphics cards, as their 'Full Graphics' capability. No one wants to buy a game they cannot fully enjoy without spending a house down payment on a new computer.

The game that put the fear of god into me recently is The Witcher 3 next-gen update. It has performance issues on my i9-9900K with an RTX 3070 Ti. Cyberpunk 2077 performs well with all the ray tracing on, but The Witcher 3 does not. Cyberpunk 2077 is also getting an update with even better ray tracing to show off the 40 series. I have to ask: will my PC still be able to play it at the old settings when that comes out, or will this so-called improvement actually cause a drop in performance at the same detail I was getting beforehand?

So people should be concerned about performance issues in upcoming games. I bought the RTX 3070 Ti in 2022, when prices dropped to what I considered acceptable levels, and yet it's already failing me on The Witcher 3. I doubt that KSP2 will be that much of an issue, but for other AAA games I am starting to get concerned.

 


1 hour ago, Bosun said:

(I've never understood the computer-gear market. In any other industry, if you made a new product to release to only 10% of your consumer base, through price exclusivity, your model would fail and you'd be out of business - but somehow, it works with GPUs and processors. It baffles me.)
 

Individual computer chips are fairly easy to store. With multi-core designs it's easy for them (or was) to print a chip, test it, rate it below the performance it actually tested at, and say "yep, it's actually this chip." If necessary, they turn off any underperforming cores and sell it as an older chip, or as a bargain version of a chip with as many cores as it has left enabled.

 

That way they are selling to as many levels of the market as they have chips for. Basically, it's like we're dealing with an airline that has more than one class of seats on its planes.
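For illustration, here's a toy sketch of that binning flow; a minimal model where the SKU names, core counts, and defect rate are all made-up assumptions:

```python
# Toy model of chip binning: test each core on a die, fuse off
# failures, and assign the die to a SKU by surviving core count.
# SKU names, core counts, and defect rate are invented for illustration.
import random
from collections import Counter

SKUS = {8: "flagship-8core", 6: "midrange-6core", 4: "budget-4core"}

def bin_die(num_cores=8, defect_rate=0.1):
    # Simulate per-core testing: each core independently passes or fails.
    working = sum(random.random() > defect_rate for _ in range(num_cores))
    # Sell at the highest tier the working cores can honestly support,
    # disabling any extra cores above that tier's count.
    for cores, name in sorted(SKUS.items(), reverse=True):
        if working >= cores:
            return name
    return "scrap"  # too many dead cores to sell at any tier

# Rough yield per SKU across a simulated batch of 1,000 dies.
print(Counter(bin_die() for _ in range(1000)))
```

Every fabricated die sells at some price point instead of being scrapped, which is the airline-seating economics in miniature.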

Edited by Drakenred65

On 12/31/2022 at 1:59 PM, shdwlrd said:

It's because there will always be those consumers who will pay the "bleeding" edge tax generation after generation just to say they have the latest and greatest rig.

Each year, as the 'bleeding' of the edge gets gushier, the consumer base shrinks. Five years ago, bleeding edge was around $800-1,000. Now it's approaching $3,000 and climbing fast. Already we've seen reports of scalpers unable to get rid of cards, and a sales slump where, despite sell-outs, a majority of folks who want one still don't have it - partly because of scalpers, but also because they're just too expensive.

I think, if the price trends hold for the 50 series, and they ship the same number as the 40s, they'll be lucky to see them all sell. Scalpers don't think it's a great deal any more, and most consumers simply can't make it there until the prices drop back down.

That's what I was referencing. Are there folks who'll pay for bleeding edge? Absolutely. I'm one of them. I didn't blink at $600. I splurged at $1,200. But I'm priced out now. I cannot justify spending $3-4k on a GPU, because I'm not a professional gamer who can write that gear off on my taxes. So yes, there are folks willing. But as a business model, every iteration hooks a smaller, ever-decreasing consumer base.

As an investor - I'd be worried about my return with a business model like that. nVidia can do it because GPUs are not their main revenue source and money maker. But they've effectively blockaded the Intel-based GPU market.

I'm super excited to see where Intel's ARC chips go. They're not massively competitive yet - but I bet they will be, and I hope they make nVidia rethink the market when they are.


10 hours ago, Bosun said:

As an investor - I'd be worried about my return with a business model like that. nVidia can do it because GPUs are not their main revenue source and money maker. But they've effectively blockaded the Intel-based GPU market.

The crux is that nVidia's main business is GPUs, but in the B2B space. So their B2C is effectively a marketing arm of the operation, meaning they don't need to sell quantity; they need to sell prestige. Saying "something has to give" about the way nVidia has been cranking up the prices on their flagship GPUs is a bit like expecting Lamborghini to start marketing cars to the general population to stay in business. The only reason we have a mid-range market on nVidia GPUs at all is that you end up with a lot of dies that don't quite make the cut for the top tier, so it makes sense to have a tiered product line to avoid taking a pure loss on sub-standard dies.

So I don't expect any of this to change. But yes, it does leave room for alternative mid-range and low-end chips. I'm just not sure how much interest Intel and AMD have in that market either, and besides them, there aren't really any serious alternatives.

10 hours ago, Bosun said:

I'm super excited to see where Intel's ARC chips go.

Likewise, but with a healthy dose of skepticism mixed in as well. I happen to have crossed paths with a lot of Intel's graphics engineers, both professionally and socially, and I still don't think I have any better idea of what exactly Intel's plan for ARC is than a generally well-informed consumer does. Intel is clearly invested in making sure their integrated graphics can handle at least entry-level gaming, but beyond that, I have no clue.

Fortunately, I don't think any of this impacts KSP2 a whole lot. It doesn't look to be a graphics beast. I'm sure there will be a lot of bells and whistles you can crank up to make a mid-to-high graphics card work to keep up in 4K, but I haven't seen anything that can't run on a five-year-old, solid mid-range graphics card with some settings turned down.


I'm not worried about the CPU or the RAM. I'm worried about the GPU. My 670 is old, while the rest of my computer is only 1-2 years old.
The problem is, the GPU market is impossible right now; I can't spend that much money on a card.


3 hours ago, Sesshaku said:

I'm not worried about the CPU or the RAM. I'm worried about the GPU. My 670 is old, while the rest of my computer is only 1-2 years old.
The problem is, the GPU market is impossible right now; I can't spend that much money on a card.

That's an old GPU. Have you tried the second-hand market? You can find really good deals on 20 and 30 series cards there. If you want new, a 1060 or above will be able to handle the game. Granted, you may not be able to run with all settings maxed out, but you should be able to get decent frames at medium to high settings.


6 hours ago, Sesshaku said:

I'm worried about the GPU. My 670 is old

The game might run OK on this. It kind of depends on how many visual settings Intercept will add. 2GB of VRAM is not a lot to work with, but if Intercept lets you cut texture quality and you don't mind running at something like 720p, that's the easiest part to fix.

The 670 should have the bells and whistles necessary to run all of the game's visuals, but it's on the slow side for some of them. Not hopelessly so, however. Just like with memory, a lot of these scale really well, and turning down some rendering distances and the fidelity of things like clouds might make the game run well enough. Hard to say for sure, but we should have a better idea once early access is out.
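To see why texture quality and resolution are the big levers on a 2GB card, here's a rough back-of-the-envelope estimate; a minimal sketch where every size and count is an illustrative assumption, not an actual KSP2 budget:

```python
# Back-of-the-envelope VRAM estimate: render targets plus textures.
# Every number below is an illustrative guess, not a real KSP2 figure.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # Assume color + depth + one intermediate render target, RGBA8.
    return width * height * bytes_per_pixel * buffers / 2**20

def textures_mb(side, count, bytes_per_texel=4):
    # Square uncompressed textures with full mip chains (~4/3 overhead).
    return side * side * bytes_per_texel * (4 / 3) * count / 2**20

for (w, h), tex_side in [((1280, 720), 1024),
                         ((1920, 1080), 2048),
                         ((3840, 2160), 4096)]:
    total = framebuffer_mb(w, h) + textures_mb(tex_side, count=100)
    print(f"{w}x{h}, 100 textures at {tex_side}px: ~{total:,.0f} MB")
```

Halving texture resolution cuts the texture footprint to a quarter, so dropping texture quality and render resolution together can pull a multi-gigabyte budget back under 2GB.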


  • 2 weeks later...
39 minutes ago, OnlyLightMatters said:

In your opinion, how much VRAM should be the target for a modded KSP2 install? Let's say with RSS, in several months.

We don't know yet whether KSP2 is CPU-heavy or GPU-heavy, so asking about VRAM is kind of premature. But the more VRAM you have, the better, in most cases.

