
The Linux compatibility thread!



Same here, CTD on the launch pad whenever I am using a Kethane detector. It looks like a memory allocation issue, but the KSP logs reveal nothing, so there is nothing to pass on to the developers.


I would have thought so, but it failed to keep the execute bit when I moved it to an NTFS drive off my desktop. Moving it back onto my desktop fixed it. YMMV, but it's not the first time I've had it happen.

You're right that NTFS and VFAT don't preserve permissions on files - if you copy everything to a USB stick and back then all of the permissions will be reset. But whether you can execute files on an NTFS drive depends on a few mount options.

First of all, there are the exec and noexec options. They do pretty much what they sound like.

Then there are the umask and fmask options. These set the default permissions for all files on the filesystem: umask applies to everything, fmask applies to regular files (there's also a dmask for directories).

And then there's the showexec option, which only allows files with common Windows executable extensions (like .EXE) to have the execute bit.

Looking at the Debian system I'm on now, with an NTFS USB drive and VFAT USB stick plugged in, they both have default mount options that won't let me run anything useful.
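Putting those options together, a mount invocation that keeps files executable might look like the sketch below. The device node /dev/sdb1 and mount point /mnt/games are placeholders for illustration; substitute your own.

```shell
# Hypothetical example: mount an NTFS partition so files stay executable.
# /dev/sdb1 and /mnt/games are placeholders - adjust to your system.
sudo mount -t ntfs-3g -o exec,uid=$(id -u),gid=$(id -g),fmask=0022,dmask=0022 /dev/sdb1 /mnt/games
```

ntfs-3g defaults to mode 777 for everything; fmask=0022 trims regular files to rwxr-xr-x, so the execute bit survives while group/other write is dropped.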


Same here, CTD on the launch pad whenever I am using a Kethane detector. It looks like a memory allocation issue, but the KSP logs reveal nothing, so there is nothing to pass on to the developers.

It seems I understated things upthread. a.g.'s fix not only eliminated my .x86_64 CTDs on initial load (so far, anyway), but also made ISA Map and Kethane work perfectly.


It seems I understated things upthread. a.g.'s fix not only eliminated my .x86_64 CTDs on initial load (so far, anyway), but also made ISA Map and Kethane work perfectly.

THANKS! That seems to have solved the issue. I've only tried at the launch pad, but at least I can fly ships with Kethane detectors now. I'll post a link to this fix in the Kethane thread after testing.

$ xxd -s +0x7cebc7 -l 1 KSP.x86_64

07cebc7: 01

$ echo "7cebc7: 00" | xxd -r - KSP.x86_64

$ xxd -s +0x7cebc7 -l 1 KSP.x86_64

07cebc7: 00

$ xxd -s +0x7cebcc -l 1 KSP.x86_64

07cebcc: 01

$ echo "7cebcc: 00" | xxd -r - KSP.x86_64

$ xxd -s +0x7cebcc -l 1 KSP.x86_64

07cebcc: 00


So, just run those on the terminal in the KSP folder?

Basically. "xxd -s +0x7cebc7 -l 1 KSP.x86_64" should return "07cebc7: 01", and after running "echo "7cebc7: 00" | xxd -r - KSP.x86_64", the same xxd command should return "07cebc7: 00".

So, the commands to run are the lines starting with "echo".

And it seems to work.
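To spell the procedure out, the two byte edits above can be wrapped in a small helper that keeps a backup and echoes the byte back for verification. This is just a sketch: patch_byte is my name for it, not part of any tool, and the offsets are the ones posted above (the calls are left commented out so nothing is modified by accident).

```shell
# Sketch: patch one byte of a binary in place, keeping a backup first.
# Usage: patch_byte FILE HEX_OFFSET NEW_HEX_BYTE  (offset without the 0x prefix)
patch_byte() {
    file=$1; off=$2; val=$3
    cp -n "$file" "$file.bak"                  # backup only if none exists yet
    printf '%s: %s\n' "$off" "$val" | xxd -r - "$file"   # write the byte at the offset
    xxd -s "+0x$off" -l 1 "$file"              # print the byte back for verification
}

# The two patches from the post above (run from the KSP folder):
# patch_byte KSP.x86_64 7cebc7 00
# patch_byte KSP.x86_64 7cebcc 00
```

xxd -r with a seed offset patches the output file in place without truncating it, which is why this works on a large executable.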


Next time I get the chance I'll be bringing this up with the devs, as they can forward this to Unity on the Unity bug tracker.

But how come this does not affect Ubuntu?

It does; it has been happening to me regularly on Ubuntu 12.04 LTS today. a.g.'s patch fixed it and everything.


It does; it has been happening to me regularly on Ubuntu 12.04 LTS today. a.g.'s patch fixed it and everything.

Same here, Ubuntu 13.04 64-bit. I think the only commonality is when a 64-bit OS is used to launch KSP's 64-bit executable.


Stock, so I guess that's it.

As for libpng I'm using libpng12-0 (1.2.49-1ubuntu2) and libpng12-0:i386.

Same libpng versions here. I reinstalled KSP fresh for the 0.20 release and played stock/64-bit for about 4 days with no problems, and only started seeing CTDs as I added mods. No problems at all since applying a.g.'s fix.

Other than this thing that's fixed, the 64-bit version is noticeably smoother than the 32-bit version for me, and really solid. Please thank the appropriate people for their efforts. :)


Hi everybody,

I have this type of laptop, running the latest version of Mint 64-bit:

Processor: Intel® Core™ i7 720QM
Chipset: Intel® HM55 Express Chipset
Memory: DDR3 1333/1066 MHz SDRAM, 2 x SO-DIMM sockets for expansion up to 4 GB SDRAM *1
Display: 15.6" 16:9 Full HD (1920x1080) LED Backlight
Graphics: NVIDIA® GeForce® GTS 360M 1GB GDDR5 VRAM

But when I'm in orbit I see some very disturbing lag.

I run the game in 1080p; why is my configuration not able to run the game with almost full graphics options?

Thanks in advance for your answer.


But when I'm in orbit I see some very disturbing lag.

I run the game in 1080p; why is my configuration not able to run the game with almost full graphics options?

Can you be more specific? Orbit of which planet? Lag as in 'low fps' or 'stutter spikes'? What are 'almost full graphics options' - you did not put the light count setting to the far right, did you?

Also, is your Linux Mint Ubuntu-based or Debian-based? Which graphics driver (NVIDIA binary or nouveau, see /var/log/Xorg.0.log)? How much RAM do you have?

The more info you provide, and the more precise it is, the better your chances that someone helps you. Also, Mint is not officially supported.
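As an aside, the driver check mentioned above can be scripted. A rough sketch, assuming the usual Xorg log path (the function name is mine, and the string matching is only a heuristic - both driver names can appear in either log):

```shell
# Rough heuristic: guess which graphics driver the X server loaded
# by scanning its log. Default path may differ per distro.
detect_gl_driver() {
    log=${1:-/var/log/Xorg.0.log}
    if grep -qi 'nouveau' "$log" 2>/dev/null; then echo nouveau
    elif grep -qi 'nvidia' "$log" 2>/dev/null; then echo nvidia
    else echo unknown
    fi
}
```

Running detect_gl_driver with no argument scans /var/log/Xorg.0.log; pass another path if your distro logs elsewhere.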


Is there no way to get anti-aliasing to work in Ubuntu 12.04 with an AMD graphics card? I'm using the 12.11 beta drivers for my AMD 6870. If not, when are the devs going to fix this issue?


Is there no way to get anti-aliasing to work in Ubuntu 12.04 with an AMD graphics card? I'm using the 12.11 beta drivers for my AMD 6870. If not, when are the devs going to fix this issue?

Unfortunately it's not something the devs can fix, there's something wrong with Unity on Linux with AMD graphics hardware.

It's not possible to get AA to work in KSP with AMD cards and the proprietary driver at this time, not till there's a fix from Unity anyway, and the open source driver does not support the AA shader used by Unity.


Unfortunately it's not something the devs can fix, there's something wrong with Unity on Linux with AMD graphics hardware.

It's not possible to get AA to work in KSP with AMD cards and the proprietary driver at this time, not till there's a fix from Unity anyway, and the open source driver does not support the AA shader used by Unity.

Ah ok. Well, I'm dual-booting with W7 anyway, so might as well play it on Windows. It looks much better with AA on. Great game!


So after updating from Fedora 18 KDE to Fedora 19 MATE, and KSP 0.19.x to 0.20.x, my game's performance has decreased substantially (when I was expecting the exact opposite). I tried a whole bunch of stuff (different window managers, tweaks, drivers, etc. - no mods, different settings), but there are still too many variables to account for and I can't figure out what's causing such a slowdown, when Windows always runs without a hitch.

Anyway, I found a small tweak that's given me a very noticeable performance gain and allowed me to use my space stations on Linux again, which is more than I was hoping for after all the hassle. It's called "Threaded Optimizations" in the NVIDIA proprietary drivers, which theoretically "offloads computations from the CPU, which typically benefits CPU-intensive applications" (LINK, see at the bottom). To enable it, you should use this command to start KSP:

LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 ./KSP.x86_64

Be warned, however, that it's apparently still in experimental stages (the reason it's not enabled by default), and for some people it could reduce performance instead. For me, it's been magic. Try it out!


So after updating from Fedora 18 KDE to Fedora 19 MATE, and KSP 0.19.x to 0.20.x, my game's performance has decreased substantially (when I was expecting the exact opposite). I tried a whole bunch of stuff (different window managers, tweaks, drivers, etc. - no mods, different settings), but there are still too many variables to account for and I can't figure out what's causing such a slowdown, when Windows always runs without a hitch.

Anyway, I found a small tweak that's given me a very noticeable performance gain and allowed me to use my space stations on Linux again, which is more than I was hoping for after all the hassle. It's called "Threaded Optimizations" in the NVIDIA proprietary drivers, which theoretically "offloads computations from the CPU, which typically benefits CPU-intensive applications" (LINK, see at the bottom). To enable it, you should use this command to start KSP:

LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 ./KSP.x86_64

Be warned, however, that it's apparently still in experimental stages (the reason it's not enabled by default), and for some people it could reduce performance instead. For me, it's been magic. Try it out!

Thanks! I'd completely forgotten about NVIDIA's threaded optimizations. It gave me a bit of a performance boost in Witcher 2 as well.

Any idea how to make it fire from within Steam?

LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 ./Steam

Would launching the client like that also apply it to any games it launches?


Any idea how to make it fire from within Steam?

LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 ./Steam

Would launching the client like that also apply it to any games it launches?

Environment variables should be inherited by child processes, yeah. But I haven't actually tested it.
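The inheritance claim is easy to check in a throwaway shell; DEMO_VAR below is just a stand-in for __GL_THREADED_OPTIMIZATIONS:

```shell
# Exported variables are copied into every child process's environment,
# so setting them before launching Steam also reaches the games it starts.
export DEMO_VAR=1
in_child=$(sh -c 'printf %s "$DEMO_VAR"')   # a child shell, like a game Steam spawns
echo "$in_child"   # prints 1
```

The same mechanism is why prefixing the Steam launch command with the variables should carry them into every game Steam starts.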

Another way is to modify Steam's launch properties as described in the first post:

Instead, you can add the command in Steam on a per-game basis; this is also useful for things like optirun with Bumblebee.

To do this, you need to add the following to the "Set Launch Options..." box in the Properties page for KSP:

LC_ALL=C %command%

Skarhu informs us that:

If you want to start the (experimental) 64-bit Steam version by default, just add the following line to the game launch options:

LC_ALL=C %command%_64

Don't know where to do this? See here and here.

Just replace the LC_ALL=C with the LD_PRELOAD and __GL_THREADED_OPTIMIZATIONS variables.
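In other words, following that substitution, the KSP launch options box would end up containing the line below (same variables as the terminal command above, with %command% standing in for the game binary):

```shell
LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 %command%
```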


This topic is now closed to further replies.