
How long before a performance increase?


Motokid600


Has there ever been any direct developer response to this issue?

With their sales clearly through the roof now that they're on Steam, a lot more people are going to start getting rather frustrated when their creativity is extremely stifled by a game that doesn't appear to make use of the hardware sitting in their box. We can all agree that it's no small matter to fix this, but the more burning question in my mind is: will they?

The problem is, as I understand it, the game engine, which Squad isn't changing.


@Ming

What exactly do you expect from an alpha game? As so many have stated, no one even predicted that people would build ships using thousands of parts. And even with the little optimization we have had since 0.17, there has been a considerable boost in framerates with larger ships.

My last project was sending up a giant 1700+ part ship/bus to Duna, and that was on a four-year-old Phenom II X4 system.

That thread you linked claims it will lag after 300 parts, yet I am still able to play with 1700 parts on a computer that is seen as outdated.

A 300-part ship would give me pretty smooth framerates.

And with time it might actually get multicore support, which would give us a real boost.

When would people be happy anyway? The way it has gone until now is that people just build to the new limits once the game has been optimized. Who is to say this won't happen again if it one day supports 5000+ parts before you reach that limit?

No matter what engine they used, there would still be performance limits like this unless they put a hard cap on how many parts we can build with.

I just think people should stop expecting impossible things from a damn alpha game.


Has there ever been any direct developer response to this issue?

With their sales clearly through the roof now that they're on Steam, a lot more people are going to start getting rather frustrated when their creativity is extremely stifled by a game that doesn't appear to make use of the hardware sitting in their box. We can all agree that it's no small matter to fix this, but the more burning question in my mind is: will they?

They said earlier that they were looking into some kind of partial multicore support, even if Unity in its current state doesn't allow full use of multiple cores.

But I haven't heard any update on that in a while.


Oh crap, I just realised something... and a reason why they are probably reluctant to change engines.

They would NEED to find an engine that runs on Windows, Macs and Linux... because the Alpha purchasers have bought the game for those three platforms.

Imagine if they switched to an engine that only ran on Windows and Macs... Or Windows and Linux. Think of the screams of "REFUND, REFUND!" they would get from the purchasers who wouldn't get the finished game on their platform. I'm thinking that Squad have accidentally painted themselves into a corner here.

Edit: Wow... just found a list of all the game engines on Wikipedia... and Unity has a free version. Does anyone know if they are using the Free version or the Pro version?

http://en.wikipedia.org/wiki/List_of_game_engines


Right after 0.20 came out, I was experiencing similar FPS issues. With a moderately sized rocket, I was getting 10 FPS (per FRAPS) on the launchpad. Turning down all the settings to low would only pick up about 5 FPS or so. Once launched, if you moved the camera so there was no terrain on screen, FPS shot up to the 40-50 range. I turned the physics delta way, way down; I was still getting 10 FPS, but the camera movement was smooth.

Luckily, I was already in the process of a video upgrade. My original setup (used with the above) was a Radeon HD 4870 X2 (DX10). The new setup is a Radeon HD 7950 (DX11). The rest of the computer is identical. I now get 50+ FPS (per FRAPS) using the same rocket as above, with all the settings maxed. Whatever the issue is, it seems pretty clear it has something to do with the new terrain. For whatever reason, my DX11 card handles it with ease.

Off topic: Oh hai, a performance issue thread devolved into a language/engine flame war!


Luckily, I was already in the process of a video upgrade. My original setup (used with the above) was a Radeon HD 4870 X2 (DX10). The new setup is a Radeon HD 7950 (DX11). The rest of the computer is identical. I now get 50+ FPS (per FRAPS) using the same rocket as above, with all the settings maxed. Whatever the issue is, it seems pretty clear it has something to do with the new terrain. For whatever reason, my DX11 card handles it with ease.

I've got a GTX 670, 12 GB of RAM, and a [email protected], and my 150-part ships still lag like hell (5-10 FPS). I don't have anything to upgrade my card to. The game doesn't come close to using all my GPU power, and never passes 20% CPU usage while in flight.


I've got a GTX 670, 12 GB of RAM, and a [email protected], and my 150-part ships still lag like hell (5-10 FPS). I don't have anything to upgrade my card to. The game doesn't come close to using all my GPU power, and never passes 20% CPU usage while in flight.

Get a quantum computer, only 10 million USD. Maybe one of those could run a 300-part KSP ship at 40 FPS, but that's wishful thinking.


I've got a GTX 670, 12 GB of RAM, and a [email protected], and my 150-part ships still lag like hell (5-10 FPS). I don't have anything to upgrade my card to. The game doesn't come close to using all my GPU power, and never passes 20% CPU usage while in flight.

Uh... I have less RAM and a worse video card, with the same CPU, and I don't lag until after 350 parts.


Uh... I have less RAM and a worse video card, with the same CPU, and I don't lag until after 350 parts.

Define lag, because to some, lag is anything less than 120 FPS or 60 FPS. Some play the game with 10 FPS and think that's smooth enough for them; some turn Max Physics Delta-Time per Frame all the way up, play the game at 1/20th speed, and consider that not lag.


I don't drop below 60 FPS (the point where it becomes noticeable to me) until 820 parts after the last patch.

AMD Phenom II X6 1055T currently clocked at 3.2 GHz, 6 GB of DDR3 memory (1024 bandwidth), and an Nvidia GeForce 8800 Ultra.


Define lag, because to some, lag is anything less than 120 FPS or 60 FPS. Some play the game with 10 FPS and think that's smooth enough for them; some turn Max Physics Delta-Time per Frame all the way up, play the game at 1/20th speed, and consider that not lag.

True, true. I was, admittedly unscientifically, defining lag as being visual. That is to say, the game lags so much that it appears to be individual frames.


I don't have much lag on my laptop, but I fly simple machines.

Maybe Squad should finish this game as a neat package with career mode, more parts, a planet or two, better maneuver nodes, etc., release it for sale, and with the profits make:

"Kerbal 2, what we wanted to do, but had only 5 dudes and no budget to start with".

Pretty much what Maxis did with SimCity before it sucked.


True, true. I was, admittedly unscientifically, defining lag as being visual. That is to say, the game lags so much that it appears to be individual frames.

In other words, when you don't know if the game is running off Unity or PowerPoint.


Pretend that Unity is an engine that threw a rod right through the block, had all its valve keepers let go simultaneously, and that hasn't had its oil changed for 50k miles because the owner was too dumb to know that you need to maintain your car.


Only if by KSP's API you mean the .cfg files.

The scripting API is almost the same as Unity's, so we can't say it could stay the same after an engine switch; most likely it couldn't. And I don't see why KSP's API has to provide mechanisms that are already available in Unity.

Modders don't usually deal with Unity if we don't have to; the only commonly used things that really belong to Unity are MonoBehaviour, Texture2D, and the pointless WWW class... (Err, right, and the GUI graphics... actually Unity /is/ used a bit more than I initially thought.)

Has there ever been any direct developer response to this issue?

With their sales clearly through the roof now that they're on Steam, a lot more people are going to start getting rather frustrated when their creativity is extremely stifled by a game that doesn't appear to make use of the hardware sitting in their box. We can all agree that it's no small matter to fix this, but the more burning question in my mind is: will they?

The only game I've ever heard of that switched engines mid-project was Duke Nukem Forever. They did that four times, actually.

I think we all know how that turned out. :/

It's not impossible, certainly, but I think it would be more accurate to call it cancelling the first project, and starting a new one on the new engine.

In any case, we're all very happy with Unity here. It's never stopped us from doing anything (all our workarounds would have been necessary on any other engine, unless we had written our own, in which case we would be a middleware company and KSP wouldn't exist), and it's generally a very good all-around engine, not even mentioning the fact that it's practically free compared to what other popular game engines out there can cost.

Cheers


Pretend that Unity is an engine that threw a rod right through the block, had all its valve keepers let go simultaneously, and that hasn't had its oil changed for 50k miles because the owner was too dumb to know that you need to maintain your car.

If that were the case, then yes, replace the engine. Unfortunately, engines are specific to the vehicle. True, you can put a larger engine into a car, but you usually have to change out a lot more to upgrade from a small, inefficient engine to a powerhouse; at that point the repair isn't worth the effort, and you scrap the car. I imagine the same holds true with a game.


The human eye sees fluid motion at 20-25 FPS; this is the typical rate at which we watch movies, TV, and other media. Under 20 FPS the eye begins to see the segmentation of the images, and you are effectively looking at a slide show.

So if anyone is not getting a minimum of 20 FPS, then yes, it can be called lag.

Notes of interest:

- Early silent films had a frame rate of 14 to 24 FPS, which was enough to give a sense of motion, but the motion was perceived as jerky.

- Thomas Edison said that 46 frames per second was the minimum: "anything less will strain the eye."

- In the mid- to late-1920s, the frame rate for silent films increased to between 20 and 26 FPS.

- From 1927 to 1930, the rate of 24 FPS became standard for 35 mm sound film.

- In the motion picture industry, where traditional film stock is used, the industry standard filming and projection formats are 24 frames per second.

- The first 3D first-person shooter game for a personal computer, 3D Monster Maze, had a frame rate of approximately 6 FPS.

- Modern action games, including popular console shooters such as Halo 3, are locked at 30 FPS maximum.

Source:

http://en.wikipedia.org/wiki/Frame_rate
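
To put those rates in terms of frame time, which is what you actually perceive, here is a quick sketch in plain C; the rates are simply the ones quoted above, hard-coded for illustration.

#include <stdio.h>

/* Frame time is just 1000 / FPS milliseconds, which is why a drop from
   30 FPS to 20 FPS adds more frame time (+17 ms) than a drop from
   120 FPS to 60 FPS (+8 ms). */
int main(void)
{
    const double rates[] = { 6.0, 20.0, 24.0, 30.0, 46.0, 60.0, 120.0 };
    const int n = (int)(sizeof rates / sizeof rates[0]);

    for (int i = 0; i < n; i++)
        printf("%6.0f FPS -> %6.2f ms per frame\n", rates[i], 1000.0 / rates[i]);

    return 0;
}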


To carry on the car analogy...

Squad made a VW Beetle for driving the family around; fans have pulled the poor Beetle apart and added a Porsche chassis with electric windows, mirror-adjusting motors, ICE systems, LEDs, hydraulically controlled suspension...

... and other fans are complaining that the Beetle engine can't match the speed of the Porsche it now looks like.

I would imagine the game got a lot bigger than Squad had originally planned, and they got caught up trying to add features to keep the fanbase happy, but it is now near, or actually beyond, what the engine is capable of providing.


To carry on the car analogy...

Squad made a VW Beetle for driving the family around; fans have pulled the poor Beetle apart and added a Porsche chassis with electric windows, mirror-adjusting motors, ICE systems, LEDs, hydraulically controlled suspension...

... and other fans are complaining that the Beetle engine can't match the speed of the Porsche it now looks like.

I would imagine the game got a lot bigger than Squad had originally planned, and they got caught up trying to add features to keep the fanbase happy, but it is now near, or actually beyond, what the engine is capable of providing.

Regardless, a 400-part ship causes lag and is considerably less fun to launch and pilot than a 100-part one. This has always been the case. The above quote by HarvesteR doesn't acknowledge this as a serious problem, and that is what I was looking for.


To be accurate: Assembly is the low-level language (machine-level code), C is a high-level language, and Lua is a scripting language.

Assembly is hyper fast but is NOT portable between makes of CPU (not even between AMD and Intel PC CPUs) and is extremely difficult to read, even with years of experience.

C is a human-readable language that tries to produce code that is fast but also portable, though slower than assembly.

Lua scripts are very slow but offer the ability to change parameters in the game without needing to recompile the code each time you try a change. Say you have a variable that controls the Isp of an engine, var_Engine_ISP = 220, but something isn't quite right: in C you would change that value in the code, recompile, and then run the program; in Lua you would simply change the value in the script and run the program again. Scripts also let you change parameters without access to the source files, and without needing to provide command-line options to adjust each setting.

*Mind you, I am not a coder (just a budding hobbyist game programmer), so it is possible I am entirely wrong, but that's how I understand things.
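
To make that compiled-versus-scripted distinction concrete, here is a minimal sketch in C. The file name engine.cfg and the variable engine_isp are made up for illustration; they are not anything KSP or Unity actually uses.

#include <stdio.h>

#define ENGINE_ISP_COMPILED 220.0   /* baked in at compile time: changing it means recompiling */

int main(void)
{
    double engine_isp = ENGINE_ISP_COMPILED;

    /* engine.cfg is a hypothetical one-line text file such as: engine_isp = 370
       Editing that file and re-running the program is all a "scripted" tweak takes. */
    FILE *cfg = fopen("engine.cfg", "r");
    if (cfg != NULL) {
        double value;
        if (fscanf(cfg, " engine_isp = %lf", &value) == 1)
            engine_isp = value;     /* override the compiled-in default, no rebuild needed */
        fclose(cfg);
    }

    printf("Engine Isp in use: %.1f s\n", engine_isp);
    return 0;
}

The .cfg files mentioned earlier in the thread give KSP modders the same kind of knob: values that can be tuned in a text file without touching compiled code.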

ALMOST.

The "Assembly language" is still a human readable language. It is, however, literally only one step up from real machine language (ML). The CPU knows only ones and zeros in groups of 4, 8, 16, 32 and 64 BITS (Not bytes), and we humans don't remember ones and zeros in 4, 8 16, 32, or 64 bit format easily, and also don't easily remember what the ML byte code for NOP is versus JMP. (Some of us don't even remember our PIN for our bank cards when standing in line in a grocery store. .. .. ahem....) Hence its called "Assembly Language". Assembly is also a compiled language, unlike Basic which is a purely interpreted language. C/C++ is also a compiled language, but, it knows how to take something like "printf" and convert it into a lot of ML that will output something to a screen, file, or whatever the device in question is. The "Assembly Language" is also a type of direct mapping to whatever instruction set is available for a set of CPUs and its mapping is directly set for a particular CPU type. That is how close to "ML" ASM is.

Assembly *is* also portable between makes of CPUs, but it depends on the instruction set itself, and/or whether or not the compiler itself knows how to convert the assembled code into a ML instruction set for that CPU. If you use a really, really, REALLY basic instruction set, the compiler will be able to translate what you've done into something the CPU of any flavor can handle, provided ALL CPUs know how to deal with the instructions in question, AND you tell the compiler WHAT language to compile for. The ML itself, however, IS NOT cross CPU compatible as far as AMD to Intel to ARM to whatever else. I would bet my lunch money on the fact that every CPU out there knows what a NOP and JMP instruction is, but some may not be smart enough to know what a CMPX is (Extended compare - Typically reserved for 32-bit and better CPUs IIRC). You wouldn't find CMPX on a 16-bit type of CPU. I'll also bet tomorrows lunch money that NOP and JMP are both different instruction IDs as far as the CPU is concerned.

My weigh in on Unity and KSP itself, I'm on the boat that more FPS is better. But KSP isn't even alpha, so I give it a very VERY wide margin for my tolerances in regards to speed. Sooner the better, IMO, but in a battle of stability versus performance, to me, is much MUCH more important.

The human eye sees fluid motion at 20-25 FPS; this is the typical rate at which we watch movies, TV, and other media. Under 20 FPS the eye begins to see the segmentation of the images, and you are effectively looking at a slide show.

I've seen this spewed all over the place, and it is absolutely, most definitely, completely, entirely 99% false. First, the human eye doesn't work on an FPS system, and neither does the brain. If it did, we'd see the spokes on a wheel hub while driving down the highway instead of a blur, or a solid mass with the calipers behind said "solid mass". The brain makes a best guess at what the situation is and blurs things together if necessary so it "makes sense" and reduces strain. Second, it depends on the specific individual (which is where that 1% is) and what they're used to seeing on a monitor (yes, real life is different from a refresh rate on a computer monitor). Third, put two monitors side by side, one running at 60 Hz and another at 120 Hz, and do anything, even just running Calculator on your desktop and dragging it around, and you will see a difference in the response you get. The 120 Hz one will look smoother. This is, of course, assuming that the video card is set up to work at 120 Hz.

Before I bought my 42" TV about a year and a half ago, I stood in Walmart, Best Buy, Future Shop, and other electronics stores, watching a wall of TVs for HOURS in total (and yes, I literally stood and watched commercials for a good 30-45 minutes at a time at different stores, sometimes the same store on a different week) to decide on what model I wanted. SPECIFICALLY, I was looking to see if there was ANY difference in frame rates, or at least what the difference would look like, just to see if there was a difference between 60 and 120 Hz, as well as picture quality between manufacturers and models, and THEN went up close to look at specific capabilities. I noticed the difference even BEFORE I looked at the individual capabilities of each TV. Standing about 20 feet back, I could see that the 60 Hz TV was a bit more "jerky" between refreshes. What was showing on the wall of TVs was just fed through a video repeater, so it wasn't like I was looking at two different signals. Now, there are a few stipulations that I don't have the answer to, and that someone can come bite me in the rear for...

1> First, was the repeater working at 120 Hz? It doesn't matter, as the stream was at whatever the repeater's capabilities were AT MINIMUM. So if the repeater put it out at 60 Hz, the 120 Hz sets displayed it at 60 and may have "flushed" the two different images together. Point is, I could see a difference.

2> Were the TVs themselves set up for 120 Hz? One would like to assume that TVs come out of the box at their top rated speed, not less. And even if some weren't, it's unlikely that all of them across several walls of TVs were configured that way.

3> Were they hooked up via HDMI, component, or composite? I COULD see this making a significant difference, especially in the composite case, but this isn't the 90s, and not many TVs put out on display are run off composite cables anymore, ESPECIALLY to show off their high-end capabilities. (Composite = a single cable carrying the combined video signal; component = the signal split across three cables.)

Now, if you asked the guy NEXT to me, who was wondering WTF I was doing, whether he could see a difference, he MIGHT have said no. Why? With absolutely no disrespect, his brain may not have been wired well enough to SEE the difference, or it didn't know HOW to tell the difference, or he just wasn't paying attention to that kind of detail. Even with all that said, the fact that I could see a difference AT ALL indicates that either I'm special, like my mommy and daddy told me, or the whole FPS thing is just an excuse for bad performance.


The human eye sees fluid motion at 20-25 FPS; this is the typical rate at which we watch movies, TV, and other media. Under 20 FPS the eye begins to see the segmentation of the images, and you are effectively looking at a slide show.

You're comparing apples to mangos. TV shows and movies are generally recorded with a camera; in a movie at 24 FPS, a single frame was exposed for about 1/24th of a second. Fast motions become blurry, as you can easily see for yourself if you have a camera with a variable exposure time. TV shows recorded at 60 FPS tend to look very sterile compared to movies, because 1/60th of a second almost eliminates this natural motion blur.

As Pontiac above me has already mentioned, your eyes work in a similar way: fast movements appear to you as a blur (very subjective, however, since this varies with each person and with alertness (adrenaline, alcohol, etc.)).

In a video game, however, each frame is a snapshot of the game scene without temporal extent; it has an infinitely short exposure time, if you will. If you show 24 of these per second, you will easily see individual frames, as opposed to a 24 FPS movie scene.
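
As a rough worked example of that difference, with made-up numbers (a 1920-pixel-wide display and an object crossing it in two seconds), here is a short C sketch:

#include <stdio.h>

/* The per-frame displacement is the same either way; the difference is that a
   film frame smears it across its ~1/24 s exposure, while a game frame shows it
   as a hard jump between two perfectly sharp images. */
int main(void)
{
    const double screen_width_px = 1920.0;  /* assumed display width */
    const double crossing_time_s = 2.0;     /* object crosses the screen in 2 s */
    const double fps             = 24.0;

    double speed_px_per_s = screen_width_px / crossing_time_s;  /* 960 px/s */
    double px_per_frame   = speed_px_per_s / fps;               /* 40 px every 1/24 s */

    printf("Film at %.0f FPS: ~%.0f px of motion blur within each frame\n",
           fps, px_per_frame);
    printf("Game at %.0f FPS: ~%.0f px jump between consecutive sharp frames\n",
           fps, px_per_frame);
    return 0;
}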

There are, of course, attempts to introduce motion blur in games, but at this point that's more to mask awful console graphics than anything else, and it looks awful.

Hope that cleared something up.

