Search the Community

Showing results for tags 'unity'.




Found 51 results

  1. Memory Mapped Files

The closest example to this technique is kRPC, which I experimented with about a year ago while developing my F-22 drone racer. I discovered there was a significant delay between the server and client, however. I was drawing a Vector3 representation at my vessel's position every frame, and it would fall further and further behind the craft as its velocity increased. I have used Unity's uNet client with a local server and never experienced this lag before, which has me stumped on the cause. I would be interested to chat with someone who knows more about this mod, or anyone who has run into the same thing.

Because of the lag I was experiencing with kRPC, I decided to build my own "bridge" using memory-mapped files. These allow multiple processes to share memory space, which can be used to store several different types of data structures. While these are easily the fastest way to send data, there is one major complication for this project: memory-mapped files are only supported in .NET 4+, while KSP targets 3.5. My solution is to start a custom process at launch, which handles the memory-map "bridge", and then send/receive values via that process's I/O stream. This lets me send hundreds of values back and forth each frame at 50+ KSP fps, which is perfect for the time being.

My next goal, however, is to send much larger data structures over the bridge. I really want to get camera feeds from KSP into Unity so that I can begin implementing VR control in my mods. I have successfully sent a Texture2D byte array from a camera across the bridge on a frame-by-frame basis, but the problem comes when I need to incorporate the process I/O: converting the array to a string every frame gives less than 2 fps in Unity. The solution is to get memory-mapped files working in .NET 3.5. I tried many different approaches before settling on the additional process, with no luck. I do have a potential solution, however, but could use some input from any .NET gurus out there.
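The bridge itself is C# on the KSP/Unity side, but the mechanism is easy to illustrate. Here is a minimal, language-agnostic sketch in Python of two "sides" sharing a fixed layout of doubles through a memory-mapped file; the file name and the three-double "vessel position" layout are invented for the example:

```python
# Minimal sketch of sharing a fixed layout of floats through a
# memory-mapped file -- the same mechanism as the bridge described
# above, in Python purely for illustration. The file name and the
# three-double "vessel position" layout are invented.
import mmap
import os
import struct
import tempfile

LAYOUT = "<3d"                       # three little-endian doubles: x, y, z
SIZE = struct.calcsize(LAYOUT)
PATH = os.path.join(tempfile.mkdtemp(), "bridge.bin")

# Create the backing file once, sized to the layout.
with open(PATH, "wb") as f:
    f.write(b"\x00" * SIZE)

# "KSP" side: map the file and publish the vessel position.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as mm:
        mm[:SIZE] = struct.pack(LAYOUT, 1.0, 2.5, -3.0)

# "Unity" side (normally a second process): map the same file and read.
with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as mm:
        x, y, z = struct.unpack(LAYOUT, mm[:SIZE])

print((x, y, z))   # (1.0, 2.5, -3.0)
```

Because both sides map the same file, neither copies data through a pipe or socket, which is what makes this approach so fast per frame.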
Inverse Kinematics

The IK code is all custom, but it's pretty amateur. I rely on SOHCAHTOA and the law of cosines for everything. Unity gives me a lot of cheats that aren't available in real life: Vector3.Distance, for example, easily gives me the distance between two points. It all works, though, and I plan to expand it out to more joints. The neck arm also allows rotational target matching.

PID Controller Servos

Each servo has its own tunable PID loop, which uses the error between the actual and IK servo angles as its input. The output sets the speed of the servo on a frame-by-frame basis. I only have a basic knowledge of PID tuning, so if anyone out there would like to share some advice, it would be greatly appreciated.

Gait Sequencing & Steering

Right now the only gait sequence that exists is walking at a set speed. Steering is the last thing I still need to do. I wrote some simple stride-length adjustment code, which allows setting the desired stride length for both the right and left legs at the beginning of each stride. The actual steering is adjusted by a PID loop which decreases one side's stride length by a percentage of its default length. So if my stride length is 2 meters, the steering PID can shorten that by up to 5%.

Terrain Mapping & Active Balance

The hexapod now has full terrain-mapping and navigation capabilities. Instead of using the vessel height to position the ground and foot targets, each foot target is now placed on the ground directly beneath it. Each hip target is now set according to the ground position of its foot. There are two ways of setting the hip targets. Right now the hip targets are set at the average height of all the foot ground positions plus the target height. Since recording the video, I've realized it would be better to set the hip targets so that the vessel is always the target height above the ground, or some combination of the two.
The first method helps going uphill, while the second is preferable downhill. Also, setting the hip height halfway between the highest (foot ground + target height) and the lowest would be the way to ensure each foot has the best chance of hitting its target when going over extremely rugged terrain. The vessel also knows whether it is traversing up a hill, is level, or is traversing down a hill, and sets the color of the ground accordingly. Right now nothing changes based on that, but eventually the gait will respond to the different slopes. I tried taking it out to the mountains, but I still need to find a way to orient the gyroscope to a level surface even when the launch point is not level. The triangle being drawn on the ground represents the leg group that is actually moving the hexapod forward.

VR Implementation

As soon as I can get the camera feed from KSP to display to a render texture in Unity, I will be able to start moving forward with VR implementations. I have several things that should be pretty easy to accomplish right off the bat: I will be able to have the robotic head match the rotation of the HMD, and display the camera feed to a HUD. I will also be able to control the arms of the mech using touch controllers. I have some thoughts on getting VR to work natively in KSP as well. This has been done before, with mixed results; I'd like to see if I can do better.

Collaboration and Testing

Let me know if you'd like to contribute in any way. If you'd like to help and know any Unity/C#, that would be great, but I could use help with tuning and design as well. There are a lot of variables that need to be tweaked, as well as design changes to the mechs themselves. Let me know if you have any interest in helping and I can probably find something for you to do. There is also the potential for a real-life counterpart. This is something I would definitely need help with, as my hardware skills are almost non-existent.
I am planning on buying a frame in the near future, and would love to have some help implementing my Unity controller in the real world. If this interests you, there is lots of info already on this thread page.

TODO:
  • Veering while walking
  • Turning in place
  • Terrain mapping
  • Traversing slopes
  • Quadrupedal/bipedal mech
  • Jumping/extended flight

Thanks for checking it out, lots more to come.
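To make the IK and servo sections above concrete, here is a rough Python sketch of the two building blocks described: a law-of-cosines knee angle for a two-segment leg, and a per-servo PID step whose output would set the servo speed each frame. All gains and segment lengths are invented for illustration; the mod itself is Unity/C#.

```python
import math

def knee_angle(femur, tibia, reach):
    """Interior knee angle (radians) for a two-segment leg whose
    hip-to-foot distance is `reach`, via the law of cosines."""
    # Clamp for floating-point safety when the leg is fully extended.
    c = (femur**2 + tibia**2 - reach**2) / (2 * femur * tibia)
    return math.acos(max(-1.0, min(1.0, c)))

class ServoPID:
    """One tunable PID loop per servo: error = target IK angle minus
    actual angle; output = commanded servo speed, stepped once per frame."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, target, actual, dt):
        error = target - actual
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Fully extended leg: knee angle is pi (straight).
print(round(knee_angle(1.0, 1.0, 2.0), 3))   # 3.142

# One frame of servo control with made-up gains and angles (radians).
pid = ServoPID(kp=2.0, ki=0.1, kd=0.05)
speed = pid.step(target=0.8, actual=0.5, dt=0.02)
```

Driving servo speed (rather than position) from the PID output is what makes the motion smooth: the servo decelerates naturally as the angle error shrinks.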
  2. Hey guys. I've been having some trouble with Unity lately. (I'm using Beale's beginner modelling tutorial.) Whenever I drag the model and collider OBJs with their textures into Unity, the materials folder does not appear. I don't know if there is a workaround (i.e., whether Beale's way isn't the only way), but if anyone has a clue as to why it's not working, that would be great.
  3. So, whilst trying to put together some new models for a current mod, I opened the project in Unity (5.4.1), which I had built the entire mod in originally, only to find that the 'Part Tools' in the inspector is broken. I have no functions for the KSP Part Tools (script): no option to set 'game data directory', no option to 'write', nothing! The little expand arrow expands nothing; there are no options. As you can imagine I find this concerning, as presently I can create no new models as a result. So far I have tried: re-installing the latest PartTools (both onto my machine and into the project); restarting my machine (numerous times!); uninstalling and re-installing my current version of Unity (5.4.1); and installing the latest version of Unity (5.5.1) and importing the project there. The result is the same: my KSP Part Tools script in the inspector has no functions/options. I have Googled and scoured the site and I can't see any solutions. Any clues?
  4. I'm using Unity 2017.1.3p1 Personal and am seeing different .anim file structures based on the following: 1. Project Settings ---> Editor ---> Asset Serialization Mode = Force Text (KSP fails to load the part). 2. Project Settings ---> Editor ---> Asset Serialization Mode = Force Binary, and then Project Settings ---> Editor ---> Asset Serialization Mode = Force Text (the part and animation work fine after editing). In scenario 2, I remove the material from the attribute and change classID from 23 to 21 for it to work. Is this a known Unity problem, a known workaround, or am I just doing something wrong? Thanks [snip]
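For anyone curious what the scenario-2 hand edit might look like: in Force Text mode the .anim file is plain YAML, so the classID change is just a text substitution. A hedged sketch in Python (the sample snippet below is invented; the poster's actual file and the material-attribute removal may differ):

```python
import re

def retarget_class_id(anim_text):
    """Replace 'classID: 23' with 'classID: 21' in a text-serialized
    .anim file. Purely illustrative of the edit described above; the
    poster also removes a material attribute, which is not shown here."""
    return re.sub(r"\bclassID:\s*23\b", "classID: 21", anim_text)

# A made-up fragment resembling a text-mode animation curve binding.
sample = "  - path: 2309994250\n    attribute: m_Color\n    classID: 23\n"
fixed = retarget_class_id(sample)
print(fixed)
```

Running this over the whole file (and writing it back) automates the edit, though it does not answer whether the serializer difference itself is a Unity bug.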
  5. This thread will instruct you on how to create Kerbal parts using animated meshes, known in Blender as Shape Keys and in Unity as Blend Shapes. Disclaimer: This guide assumes you already have a basic understanding of modeling, animation, and creating simple Kerbal parts. Why??? The benefit of animated meshes over regular scaling or rotating of a part is that they let you produce complex animations from a single object, regardless of its shape. In the Kerbalverse, these parts are most noticeable as inflatable habitats. To my knowledge, the only few on the market are those created by myself. With that said, I am hoping this will encourage others to take advantage of our new ability to use Blend Shapes in Kerbal, thanks to the hard work of @Shadowmage and his Textures Unlimited plugin, which allows for the loading of asset-bundle-based models. Asset-bundle-based model loading is currently the only way to get these parts into Kerbal. For example, it would not be possible to create the egg shape of the KEAM (BEAM module recreation) by just scaling a cylinder; the resulting animation and shape would not be appropriate. See the wireframe example of the KEAM's expansion using a single object: To start, create your model as normal. Create your desired animation with your object using Shape Keys (if in Blender; please Google simple tutorials on this if you are unsure about animating with Shape Keys. The idea is very similar to animating a regular object: where you would hit the I key for a keyframe over the timeline, you instead do it over a new Shape Key in the side menu). If using Blender, it's best to save the .blend file within your Unity project location and work from that file within Unity. If you prefer or require FBX format for any reason, ensure you have the "Baked Animation" option checked under the Animation tab in your export options. Check "Key All Bones" if you are using bones for anything.
Inside Unity, you only need to verify that your Blend Shape numbers under the Skinned Mesh Renderer component of the animated object match the numbers in Blender. Warning: I noticed some issues above 9 Shape Keys; it seems that Unity only recognizes a total of 10, so avoid going above 9. It is very important to note that if you are using a model that contains any sort of part triggers, layers, or tags, like an airlock or a ladder, it is MANDATORY that you set up your part tags EXACTLY as shown in the following picture or they WILL NOT WORK when loaded in Kerbal: The final result on your model must still contain the proper tags and layers, for example with an airlock/hatch: To create the prefab ("Asset Bundle object file," as I call it), simply drag the ROOT part of your model FROM the hierarchy into the project folder window below. This action will create a prefab file. Select this prefab file so that you are viewing it in the preview window. There is an AssetBundle "tag" location at the bottom of this window. Select the dropdown and create a new name for your asset bundle. Click KSPAssets on the top toolbar of Unity and select Asset Compiler. Click "Create" for your asset bundle, followed by Update and then Build. This outputs an assetbundlename.ksp file to your Unity project's "Asset Bundle" folder, one directory up from your Assets folder. The final step is to rename your finished asset bundle file to have an extension of .smf instead of .ksp. Finally, drop your .smf file into your Kerbal addon folder inside GameData along with its textures and config file, and as long as you have Textures Unlimited installed, the .smf file will be loaded like a normal part. I hope this can be of great help to the community, or even just one person, as I would love to see more complex animations in the Kerbal universe! Thanks!!!
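The final rename step lends itself to a tiny script. Here is a sketch in Python that renames every built .ksp bundle in a directory to .smf (the directory and bundle name in the demo are placeholders standing in for the Unity project's "Asset Bundle" output folder):

```python
from pathlib import Path
import tempfile

def rename_bundles(bundle_dir):
    """Rename every *.ksp asset bundle in bundle_dir to *.smf (the
    extension loaded as described above), returning the new paths."""
    renamed = []
    for bundle in sorted(Path(bundle_dir).glob("*.ksp")):
        target = bundle.with_suffix(".smf")
        bundle.rename(target)
        renamed.append(target)
    return renamed

# Demo in a throwaway directory standing in for the output folder.
demo = Path(tempfile.mkdtemp())
(demo / "habitat.ksp").write_bytes(b"")
names = [p.name for p in rename_bundles(demo)]
print(names)   # ['habitat.smf']
```

Hooking something like this into the build routine saves the manual rename after each Asset Compiler run.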
  6. Sgt Doomball

    Unity Blueprints

    How should I install blueprints like the one for the REESES seat? It is a Unity file and not made completely into a mod.
  7. Hi, I'm a student games programmer in my final year of University. I absolutely love KSP as it has given me a greater understanding of the difficulty and science behind real life space missions! I really want to make a simple 2D space sim game in order to buff out my portfolio (and for fun of course!). I've looked everywhere on the internet and forums for suggested methods of programming realistic gravity physics for orbiting etc. The problem is the methods I try are inconsistent or unrealistic... Looking at KSP it looks like they calculate trajectories and then put the ship on rails effectively so (relatively) wonky game engine physics don't change the orbit paths. I was just wondering if anyone knew how Squad achieved this, or if any developers who saw this could shed some light? Thank you very much, Jason.
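On the "on rails" point above: the standard way to get that behavior is to integrate nothing at all. Store orbital elements and evaluate the closed-form two-body solution each frame by solving Kepler's equation, so engine physics never perturbs the path. A minimal 2D Python sketch (element names and units are generic, not anything from KSP's code):

```python
import math

def orbit_position(a, e, t, mu=1.0, t0=0.0):
    """2D position on a Keplerian ellipse at time t, from elements:
    a = semi-major axis, e = eccentricity, mu = gravitational parameter.
    The craft is 'on rails': no numerical integration, so the orbit
    never drifts no matter how large the time step is."""
    n = math.sqrt(mu / a**3)              # mean motion
    M = n * (t - t0)                      # mean anomaly
    E = M                                 # solve Kepler's equation M = E - e*sin(E)
    for _ in range(50):                   # Newton's method
        E -= (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
    x = a * (math.cos(E) - e)             # position in the orbital plane
    y = a * math.sqrt(1 - e**2) * math.sin(E)
    return x, y

# Circular orbit (e = 0) of radius 1: at t = 0 the craft is at (1, 0).
print(orbit_position(1.0, 0.0, 0.0))
```

Thrust or other perturbations are handled by converting the current position/velocity back into fresh elements ("going off rails" briefly), then resuming the closed-form propagation.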
  8. Hi modding gurus, Inspired in part by MacLuky and the undersized MHdlc LEM, I want to add a seat into the stock Mk1 and Mk2 Landercan IVAs, as it has been proven that there is plenty of space! Yes, I can add a seat via crew capacity, but I want to see the little green lasses and laddies properly. I'm also not interested in additional props like ASET. I have installed Unity and Part Tools, added a passenger seat prop (or a pilot seat from the props), and 'wrote' the .mu file. Changing the part's CFG for the number of seats and for the internal model doesn't seem to work, as no IVA shows up. I'm assuming it isn't as simple as I assumed. Do I need to be working with both the part and the internal in the same scene? I've only been loading in the internal. Do I need to include texture files as well? Or do I need to add the seat to the internal in Blender and then export to Unity? Kinda really starting from scratch here, sorry. I've seen that there are a few hour-plus videos, but I don't have much time, like 30 minutes a day, for any 'work' or gameplay, so pointing me at the best tutorial for this would be appreciated if that would be more beneficial. Thank you! Peace.
  9. So I made my texturing on a mesh directly in Unity, without UV mapping in 3D software. I set my material to a KSP shader and hit Write. Unity Part Tools exported the image textures used on the materials, but in game it didn't work. Can it be done without UV maps, or with untextured UVs, with the material being an image texture applied to the mesh in Unity?
  10. Forgive me, I'm brand-new to Unity. I want to create new cockpit internals using RPM and ASET, and I'm encountering a problem when I import the Mk 1-2 cockpit, or at least that's what I assume I'm importing when I choose "PodCockpit." When I do that, I end up with the props and the "Size2PodBorder" located outside of the interior model. On top of that, the Size2PodBorder seems skewed. Here's a screenshot: https://ibb.co/e9BUAn If you zoom in to the location of the Move Tool, you'll see the base internal props. I haven't done anything else to the basic structure of it, and it runs fine when I boot up KSP. The same thing happens when I import it with RPM and ASET props included. I tried it on others, and it's happening on the Cupola but, strangely enough, not on the Mk 1 pod cockpit. Any ideas?
  11. Eleusis La Arwall

    Unity for Linux

    I'm currently setting up a new computer and want to get back to KSP modding. I'd like to use Unity on Debian 9 and I've got some experience with Unity 5.4.1f1 on Debian 8 but it did not end very well last time. Before I install anything I'd like to know if other modders are running Unity on Linux (successfully)? If so, what version of Unity? Does anyone know where to get Unity 5.4.0p4 for Linux? Couldn't find it here.
  12. Hey guys, so I'm trying to add a few parts to KSP for fun. It all works great, but now that I want to add a bumpy texture I've run into an issue. It works in Blender and Unity but is very bright and flickering in KSP (the control thing up top). I generate a normal map from a bump map in Unity and, as mentioned, it works well inside the Unity editor, as you can see below. The texture I use besides the normal map is just the regular one I use for the other parts as well. Maybe that's the problem? I read another forum post where somebody said I would need to somehow mix the specular and the color texture into one; I'm not sure what that means. Thanks for your help! The Solution: Somehow the Unity internal normal-map generator for greyscale images doesn't work well with KSP. I downloaded the software xNormal and it works flawlessly when I generate normal maps with it. What a pain that was... Nope, I changed two things at once, and the real reason it didn't work before was that I exported my textures to TGA files. That works for regular textures, but bump maps apparently go wild using them. Using MBM as the texture format works!
  13. Just trying to gauge best practices while I'm upgrading the Unity projects for a large-ish mod I'm maintaining (I'm not very experienced with Unity). Do people use a single Unity project into which they put all of their parts (albeit in different scenes)? Or do people use one Unity project per part? Or something entirely different? The current layout is a bit of a mish-mash; some parts are grouped into a single Unity project, others are in their own, so I'm trying to see what the best way forward is. As far as I can see:

Keeping each part (or small set of related parts) in separate projects:
  • Pros: fast loading; multiple people can work on the project as long as they don't work on the same parts; no need to worry about a global namespace.
  • Cons: need to maintain Unity project settings, add-ons, and plugins (such as PartTools) separately for each project, taking up additional space in the repository, with a bigger chance of misconfiguration.

Maintaining a single Unity project (and separating parts into suitable directories):
  • Pros: a single Unity project makes it easy to double-check global settings (e.g., "Visible Meta Files" or the PartTools "GameData"); a single copy of plugins/add-ons; potential for shared assets.
  • Cons: slow loading; more chance of conflicts if multiple people are working on the project; more care needed to keep the namespace "clean" (i.e., all assets must be named uniquely).

What do people think?
  14. Hello all. Unfortunately, my attempts so far have been without success. Blender shows the model's geometric center quite differently from Unity, so the Kerbal pilot does not sit on the chair but higher up. What am I doing wrong? Is there a solution? Thanks a lot in advance! Full album: https://imgur.com/gallery/CHcF3
  15. Skarch

    Replacing default sun

    Does anyone know how/if you can replace the default sun texture in 1.3.x? A few releases ago I was able to edit an asset file to get this... I've tried editing the sharedassets10.asset file (which I *think* is the right one), but I'm not having any luck. I'm thinking I just need the right editor; maybe the ones I've tried (UAE, UAV) aren't compatible with the newest version of Unity? I've also tried various sunflare mods (which are all awesome!), but the flares are still superimposed on top of the default sun texture. Or maybe there's a way to create a ghost in Scatterer that blocks the default sun, but I haven't figured it out yet. It seems the default sun texture always has priority. Anyway, thoughts and suggestions very much appreciated, thanks!!! -S
  16. I've been modding silently in the background here for several months. My latest creation is an RL10-B1 engine based on two photographs I found online. It's made to scale and has the same performance characteristics. The RL10-B1 is a big engine for a deep-space thruster: its nozzle is 2.24 meters across, it is 4.11 meters long, and it produces 110 kN of thrust at 4600 m/s exhaust velocity, at an amazing weight of 277 kilograms. The concept of this engine is that if you are using ISRU to refuel your engines on things like dirty ice balls, comets, or asteroid-belt objects, the end products are going to be hydrogen and oxygen; this is the engine to have. http://www.aerospaceguide.net/rocketengines/RL10B-2.html This version has the extended nozzle. I have a lot of detail in the nozzle, composed of conic sections with 96 sides and 192 tris per section, interior and exterior, plus the aluminum rings that hold the radiators on the outside. Each engine has about 4 conic-section profiles, so this is around 3000 tris, plus the tris for connections and the piping going in and out of the engine. The photo shows irregularity in the lip of the bell; I used this to my advantage so that 24 of the pieces form the exterior limit. This lowered the collider tri count considerably. All in all, the RL10-B1 engine is unlike anything in the stock game, and it works. But I need a base engine for a big off-world base lander. Blender has an edit tool called Duplicate that allows one to make exact copies of the center engine at six equilateral hexagon positions. I kept my collider mesh count low by enshrouding the array so that only a few points on the engine stick out. So I figured I would make a three-engine combo and a seven-engine combo. The three-engine combo worked, but the seven-engine combo gave me this when I imported it into Unity: Meshes may not have more than 65534 vertices or triangles at the moment. Mesh 'Circle' will be split into 2 parts: 'Circle_MeshPart0', 'Circle_MeshPart1'.
UnityEditor.DockArea:OnGUI() Uh-oh, I broke Unity. No meshes were made. Apparently Unity could not calculate the collider mesh; too many tris.
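The 65,534 figure in that warning comes from Unity (at the time) addressing mesh vertices with 16-bit indices. A back-of-the-envelope Python check of why the seven-engine cluster gets split in two; the per-engine vertex count below is a hypothetical round number, not measured from the actual model:

```python
import math

VERTEX_LIMIT = 65534   # Unity's 16-bit mesh index limit (at the time)

def mesh_parts_needed(vertex_count):
    """How many sub-meshes Unity must split a model into to stay
    under the 16-bit vertex index limit."""
    return math.ceil(vertex_count / VERTEX_LIMIT)

# Hypothetical: if each engine's interior + exterior shells come to
# roughly 12,000 unshared vertices, seven engines overflow the limit
# and the combined mesh is split, exactly as the editor warned.
print(mesh_parts_needed(7 * 12000))   # 2
```

The practical takeaways are the usual ones: share vertices between adjacent faces where possible, or keep each engine as its own mesh instead of joining the cluster into one object.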
  17. I'm experimenting with Kopernicus's ring shader, which means I am finally forced to use the Unity Editor. The shader source file itself is in the Kopernicus repository, but the project used to compile the shader is in a separate shader-export repository. To work properly, the compilation project needs to access the source, but I don't want to have to worry about manually copying files back and forth as I make changes. The natural solution is a symbolic link from the compilation project to the real file in Kopernicus. At first this worked fine, but at some point something unknown to me changed, and now when I compile the shader, Unity also replaces my symbolic link with a regular file copy of the shader source. As far as I can tell, nothing I've done has requested this behavior. Since I'm editing the real one in the Kopernicus repository, the next time I trigger a compile my latest changes aren't reflected, which is inhibiting my efforts to "learn by doing." Is there a way to tell Unity not to mess with files I haven't asked it to touch? Alternatively, can I compile shaders for Unity without using the Unity Editor (a very attractive option, since it also seems to consume 100% CPU when idle)? A command-line solution would be ideal.
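Not a known Unity setting, but one blunt workaround is to let Unity clobber the link and put it back after each compile. A sketch in Python (the file names below are placeholders for the two repositories, and the demo directory stands in for both):

```python
import os
import tempfile
from pathlib import Path

def restore_symlink(link_path, real_path):
    """If the build replaced link_path with a plain file copy, delete
    the copy and re-create the symbolic link to the real source file."""
    if os.path.exists(link_path) and not os.path.islink(link_path):
        os.remove(link_path)
        os.symlink(real_path, link_path)

# Demo with throwaway files standing in for the two repositories.
tmp = Path(tempfile.mkdtemp())
real = tmp / "ring.shader"        # the real source in the Kopernicus repo
link = tmp / "exported.shader"    # what should be a symlink in shader-export
real.write_text("// shader source")
link.write_text("// flat copy left behind by the build")
restore_symlink(link, real)
print(os.path.islink(link))   # True
```

Run after each compile (or from a file watcher), this keeps subsequent builds pointed at the live Kopernicus copy, though it does not explain why Unity started dereferencing the link in the first place.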
  18. Hello guys! I'm currently working with @HuXTUS on a part. Adding a diffuse and a bump/normal map is very easy, but what about specular? The shaders have no slot for that. I just found a one-year-old discussion about it: Is that still the way to add specular? And moreover, does anyone have a good tutorial on how to add this alpha channel to the diffuse? (I'm not sure it works with JPG, and PNG should be a mess for the game, no?) Hope to see some help soon. Thanks
  19. Hello, I am now trying to configure @EmbersArc's KRE NDS docking port for my mod, but without the lid, in Unity. I made the changes in Blender, and now I'm trying to get the docking node to work. What is happening is the docking ports don't dock, but just rest on each other. This is my config:

PART
{
    name = CapsuleDockingPort1
    module = Part
    author = EmbersArc
    mesh = model.mu
    rescaleFactor = 1.07
    node_stack_bottom = 0.0, -0.385, 0.0, 0.0, -1.0, 0.0, 1
    node_attach = 0.0, -0.385, 0.0, 0.0, -1.0, 0.0, 1
    TechRequired = advMetalworks
    entryCost = 8400
    cost = 400
    category = Coupling
    subcategory = 0
    title = KASA Docking System
    manufacturer = 7D Exploration Technologies Inc. & Kerbobulus Corp.
    description = A very advanced docking port. Fully compatible with the Clamp-O-Tron Docking Port system.
    attachRules = 1,1,1,1,0
    mass = 0.05
    dragModelType = default
    maximum_drag = 0.25
    minimum_drag = 0.25
    angularDrag = 0.5
    crashTolerance = 10
    maxTemp = 2000 // = 3400
    bulkheadProfiles = size1, srf
    tags = berth capture connect couple dock fasten join moor socket
    stagingIcon = DECOUPLER_VERT
    stagingIcon = DECOUPLER_VERT
    MODULE
    {
        name = ModuleDockingNode
        referenceAttachNode = dockingNode
        deployAnimationController = 1
        nodeType = size1
        staged = False
        stagingEnabled = False
    }
    MODULE
    {
        name = ModuleAnimateGeneric
        animationName = Extend Docking Ring
        actionGUIName = Toggle Docking Ring
        startEventGUIName = Extend Docking Ring
        endEventGUIName = Retract Docking Ring
        allowAnimationWhileShielded = False
    }
}

Unity file (Unity 5.4.2p4) Texture Model
  20. With a normal modern object oriented system, you can inspect the public members of a class to understand what it can do and how to access those capabilities (and if all else fails, you check the documentation). If new capabilities are needed on an existing object, new public members are added, and if the new capabilities that you need are not a good fit for any existing object, a new class can be created instead. In Unity, as I understand it, the world is made up of GameObjects, each of which has Components to add capabilities. Consequently, many tasks in Unity boil down to finding the right GameObject, accessing the right Component of that GameObject, and then manipulating the Component's properties. But since a Component can be of any underlying type, and added and manipulated at run time by any code that has a reference to the GameObject, this breaks the in-code link between the outer object's type and its own capabilities; in effect, Unity encourages application developers to escape from the self documenting properties of languages like C#. (On the plus side, this is far more mod-friendly because the stock code can be added to without requiring a recompile.) I'm currently struggling to get my bearings in this environment. Specifically: Given a GameObject, is it possible to inspect its Components at run time to figure out what the object can do? There's no public Components collection, and GetComponents requires a type parameter, meaning you must already know the type you're looking for. Given a GameObject and a task that you know it can handle, how would you figure out which Component to access? I guess this might be the same as #1, but maybe there's a way to do it without listing them. Given a GameObject and a task that you know it can't handle, how would you find the appropriate Component to add? Is there an index of them somewhere, or a base class that would show up as the root of inheritance trees in online documentation? 
If you'd like an example to work from, I have a DialogGUIImage (because the DialogGUI* API is easy for simple pop-ups), and I'd like to change it to display a different image. I have tried a few things so far and not gotten good results.

No effect, as expected because of how DialogGUI* works (visibleTexture is my DialogGUIImage, nameToTexture works when the popup is initially loaded, and 's' is a valid name):

    visibleTexture.image = nameToTexture(s);

Compile error (the texture property is read-only):

    Image img = visibleTexture.uiItem.GetComponent<Image>();
    img.sprite.texture = nameToTexture(s);

The desired image appears on screen, offset outside of the window to the left and slightly up, and the original remains visible (based on @TaxiService's helpful post about adding and removing UI elements):

    Stack<Transform> stack = new Stack<Transform>();
    stack.Push(visibleTexture.uiItem.gameObject.transform);
    visibleTexture.Create(ref stack, UISkinManager.defaultSkin);

Image is hidden (technically a successful action, but I don't know how to replace it after this):

    visibleTexture.uiItem.gameObject.DestroyGameObjectImmediate();

I feel like there must be a GameObject.GetComponent<SOMETHING>().SetTexture() call somewhere out there that I should use, but I don't know what the SOMETHING is or how to find it, hence my questions above. Thanks for any answers or tips!
  21. Hello guys. I have a problem with DDS in KSP. In Blender and Unity (previewing with the DDS textures) it looks very nice (with KSP shaders); in Blender there is only one UV. So I place PNG textures on the model, compile with PNG, and then replace them with DDS textures (or so it seems), and the texture looks bad; but if I don't replace the main texture, it works as it should (though without specular). I checked the sizes and they match. How do I fix this? I really need DDS for a specular map in the alpha channel.
  22. Imasundaj

    Why Unity?

    I was wondering why KSP was built using the Unity engine. If someone could explain to me why Unity, and why not a different game engine, that would be much appreciated. Why is Unity better than other engines? Why is it worse? Why won't KSP move to a different engine? What would be the biggest challenge? (Apart from basically starting from scratch, obviously.) I'm simply curious. That is all.
  23. I'm coming across what I think is a bug in Unity's font system that makes it hard for me to "be nice" to other mods and not break them. This problem is weird, and what I've learned so far comes from a few weeks of on/off trial and error and experimentation. I could be wrong about the cause, but I've already barked up a lot of wrong trees trying to find other possible causes before settling on what I describe below. So what's the problem? These two Unity methods, Font.GetOSInstalledFontNames and Font.CreateDynamicFontFromOSFont, can break fonts that are also in a Resource/Asset file if these steps happen in this order:

1. Unity loads a font from Resource or Asset files, but hasn't had any occasion to draw anything in that font yet.
2. Using Font.CreateDynamicFontFromOSFont(), you create another Font instance for a font in the same font family as the one from step 1 (i.e. loading "Arial bold" when "Arial" was loaded in step 1).
3. The font from step 2 (DynamicFontFromOSFont) gets rendered into some text.
4. The font from step 1 (from the Resources or Asset file) gets rendered into some text.

When you do the steps in that order, Unity gets confused and seems to wipe out all the glyphs of BOTH instances of that font from then on (i.e. both the one from Resources and the one from the OS). From now on it will render all text in that font as blank (and it now claims that all text drawn in that font is 0 pixels wide and 0 pixels high, so things like GUILayout buttons get shrunken to minimum size, in addition to not being able to show their labels because the font is blank). Note that if you swap steps 3 and 4, so the Resources font gets exercised in some way before the DynamicFontFromOSFont does, the bug does not happen!
It only happens when the first attempt to draw something in the font instance built from the OS font call happens prior to the first attempt to draw something in the Resources instance of that font. Note that it's the order in which the fonts get USED to draw something that matters here, not the order in which they first get loaded (i.e. you can swap steps 1 and 2 and it doesn't change the outcome). As you can tell from the fact that I used "Arial" as my example above, this means that when we do this in kOS, I have the chance to break every other mod that uses Unity's default GUI.skin for something. Oh, and this isn't just about the legacy IMGUI. Using the font *anywhere* in Unity is affected: even when I draw 3D hovering text in Arial in the game scene, if the Arial font has had this bug trigger, then that 3D text won't show up. I can trigger the bug by rendering font text into a Texture2D in memory that I don't even show on screen anywhere; even rendering it that way triggers the same problem, so long as I do it in the order shown above.

Why did I want to do this?: At this point the reader might be thinking, "Well then just don't do it! Stop using the OS fonts and instead ship with one and hardcode it." So I feel I have to defend my desire to support doing this: I'm trying to let the user use any font on their OS as the kOS terminal font, and move away from our current technique of cutting and pasting regions from a texture file that contains images of the 128 ASCII chars. (For 2 reasons: using a real font scales a lot better than stretching a bitmap image for those users who prefer the terminal to use a bigger font, and more importantly it would let you print to the terminal in your preferred language, for which you probably already have a font installed on your computer that's better for that purpose than whatever we might ship with.)
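For illustration, the failing order described above could be reproduced with something like the sketch below. This is a hedged reconstruction, not the reporter's actual code: "MyAssetFont" is a hypothetical font shipped in a Resources folder, and RequestCharactersInTexture() stands in for "rendering something with the font" (it forces Unity to bake glyphs, which is assumed here to be enough to trigger the bug).

```csharp
using UnityEngine;

public class FontClobberRepro : MonoBehaviour
{
    void Start()
    {
        // Step 1: load the bundled font, but don't draw with it yet.
        Font bundled = Resources.Load<Font>("MyAssetFont"); // e.g. an "Arial" asset

        // Step 2: create an OS-font instance from the same font family.
        Font osFont = Font.CreateDynamicFontFromOSFont("Arial Bold", 14);

        // Step 3: exercise the OS font FIRST. Per the report above,
        // this ordering is what triggers the bug.
        osFont.RequestCharactersInTexture("Hello", 14);

        // Step 4: now exercise the bundled font. From this point on,
        // both instances reportedly render blank with 0x0 glyph metrics.
        bundled.RequestCharactersInTexture("Hello", 14);
    }
}
```

Swapping steps 3 and 4 (exercising the bundled font first) reportedly avoids the problem.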
But wait, isn't it only a conflict when you actually try to RENDER the font? Isn't the user just picking one font, not every font on the OS? True, but Unity does not expose any of the metadata about a font until after you load it, and even then you still have to actually render a few characters with it before everything you need to know manifests itself. If you haven't loaded a font from the OS yet, then the font's string name is literally the only thing you know about it. You don't know if it's bold, italic, etc. (except by making a heuristic guess from substrings in the font's name, like "this font's name has the word 'bold' in it, so I guess it must be a bold font"). Most importantly for my case, you can't tell if it's monospaced or proportional until after you load it and try rendering a few characters with it; the font metadata isn't available through Unity. So I was doing a quick dummy render of a short string containing some wide and some narrow characters, and counting the pixels Unity reported it took to do so, to find out whether the font is monospaced. This is relevant since I use the font to paint a terminal very fast by drawing each line of the terminal as a single string, so I need to restrict the user's picks to monospaced fonts only. It's that test for monospace that mandates that I actually give each font an experimental test render, and doing that is what triggered the bug this post is talking about. I thought this would be really slow at first (test-rendering every font), but it turns out that even on a computer with a few thousand fonts installed it only takes a couple of seconds, and I only have to do it once and never again (and I can throw away a font after testing it, so it's not eating up memory once I've learned it's proportional).

So why not just avoid it by forcing the order to come out the "safe" way?
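(As an aside before getting to that question: the monospace probe described above might look roughly like the sketch below. This is a reconstruction under assumptions, not kOS's actual code; it compares the advance widths of a baked narrow glyph and a baked wide glyph via Unity's Font.GetCharacterInfo.)

```csharp
using UnityEngine;

public static class FontProbe
{
    // Returns true if the font looks monospaced: a narrow glyph ('i')
    // and a wide glyph ('W') advance the pen by the same amount.
    public static bool LooksMonospaced(Font f, int size = 14)
    {
        // Glyphs are lazy-loaded, so bake them before measuring.
        f.RequestCharactersInTexture("iW", size);

        CharacterInfo narrow, wide;
        if (!f.GetCharacterInfo('i', out narrow, size)) return false;
        if (!f.GetCharacterInfo('W', out wide, size)) return false;

        return narrow.advance == wide.advance;
    }
}
```

A font that fails this check would be filtered out of the terminal-font picker.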
An obvious fix presents itself: before trying to use any Font that comes from CreateDynamicFontFromOSFont, kOS could just make sure to iterate over every Font object that it finds in the Resources and perform a dummy rendering with each of them (i.e. tell it to render "Hello" into a Texture2D, then throw away the Texture2D, just to exercise the font a bit first, which seems to prevent the bug). I have tried that and it does work... but... read on: I'm not in control of the order that OTHER mods do things in, nor am I in control of what order Unity chooses to call the Awake() and Start() methods of all the MonoBehaviours from all the mods, nor am I in control of whether other mods might wait and lazy-load a font dynamically from an asset bundle later on during the game. This means there is no point in time when I can reliably answer "yes" to the question: "At this point, have all the fonts that will ever get loaded from any Resource/Asset, during the life of this process, been loaded, with no more to come?" To use this workaround reliably, I have to apply it at a point in time when that is true; otherwise there will be a Resources/Asset font that I missed when I performed the "foreach Resource font, render something small with it" code.

So now to the questions for other modders:

(1) How many mods actually bother trying to ship with their own font? Could it be so few that the solution is to simply see if we happen to break another mod, and if so, react to that and work with the other modder to come up with a scheme to force a known loading order between the fonts used by our two mods? Then again, with SQUAD doing localizations in the next release, who knows, maybe even THEY might load a font later on after game initialization, so I can't rely on knowing whether they will do so.

(2) Do I need to consider splitting this work off into a standalone font manager mod and making kOS require it as a dependency?
Then any modder that wants to load fonts would have to work through it instead of doing it on their own (i.e. similar to other library mods like the CommunityResourcePack, the goal of such a mod would be to make sure all font loading happens in one place, where the order can be enforced to prevent the bug).

(3) Any suggestions for a workaround I might not have tried? I'm really not a Unity expert at all; the only things I know about it, I know from doing kOS dev work. Yes, I am aware that Unity lazy-loads font glyphs (I found that out when implementing other parts of this system) and therefore of the need to use Font.RequestCharactersInTexture() before attempting a test render to look at character size. But I suspect the bug above is somehow related to this lazy-loading feature misfiring, so that the two different instances of the same-named font step on each other's toes; or maybe Unity is getting fooled into thinking it already performed all the lazy-load work for both versions of the similarly named font when it really only did so for one of them. (Thus one font's data never gets populated because Unity thought it already did so?)

(4) As KSP gets more international users, will more mods start shipping their own fonts, so that even if this isn't a problem today it will become one soon, meaning I still have to worry about it?

(5) Is this a known Unity bug that was already fixed in a later release of Unity, which we don't have yet because KSP is a few revisions behind? If so, might the problem magically fix itself in the next KSP release? I searched Unity's issue tracker for font-related bugs and spent a long time walking through them without finding anything that seemed related, before giving up on that approach.
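For reference, the "exercise every bundled font first" workaround described earlier could be sketched as below. This is a hedged sketch, not a tested fix: it assumes Resources.FindObjectsOfTypeAll can see every Font currently loaded from Resources and asset bundles, which is exactly the assumption the post explains cannot be guaranteed for fonts loaded later.

```csharp
using UnityEngine;

public static class FontPreloader
{
    // Call before the first CreateDynamicFontFromOSFont render.
    public static void TouchAllLoadedFonts()
    {
        foreach (Font f in Resources.FindObjectsOfTypeAll<Font>())
        {
            // A tiny dummy glyph bake is enough to "claim" the bundled
            // font before any same-named OS font gets drawn.
            f.RequestCharactersInTexture("Hello", 14);
        }
    }
}
```

The ordering problem remains: this only protects fonts that are already loaded at the moment it runs.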
  24. Is BackgroundWorker safe to use in KSP mods? I'm using it to offload intensive calculations, but I need it to trigger a UI update when done, and whenever I do this by any means other than setting a bool that I check later in the main thread's Update(), the whole game crashes, including via the RunWorkerCompleted event. https://msdn.microsoft.com/en-us/library/hybbz6ke(v=vs.110).aspx https://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker(v=vs.110).aspx I tried logging Thread.CurrentThread.ManagedThreadId, and that seemed to confirm what I suspected: the RunWorkerCompleted handler is running on a background thread, sometimes the worker thread, sometimes a completely new thread. I expected each of those thread IDs to be 1, since that's the thread that kicked off the worker. Are there any known tricks that will make this work the way the C# documentation says it does? People on Stack Overflow talk about SynchronizationContext.Current, which is null the very first time but non-null after that, but setting it to a new instance of SynchronizationContext doesn't change the logged thread IDs at all.
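The symptom described is consistent with how BackgroundWorker marshals its completion event: RunWorkerCompleted is posted through the captured SynchronizationContext, and the base SynchronizationContext class just runs callbacks on a ThreadPool thread; only a UI-aware context (like WinForms' or WPF's) posts back to the original thread, and Unity of that era does not install one. A common workaround, sketched here (not a KSP or Unity API, and kept to .NET 3.5-era types, so a locked Queue rather than ConcurrentQueue), is to generalize the "bool checked in Update()" idea into a queue of callbacks drained on the main thread:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

public class MainThreadDispatcher : MonoBehaviour
{
    private readonly object gate = new object();
    private readonly Queue<Action> pending = new Queue<Action>();

    // Safe to call from any thread, e.g. at the end of a DoWork handler.
    public void Enqueue(Action callback)
    {
        lock (gate) { pending.Enqueue(callback); }
    }

    void Update()
    {
        // Drain on Unity's main thread, where UI calls are legal.
        while (true)
        {
            Action callback;
            lock (gate)
            {
                if (pending.Count == 0) break;
                callback = pending.Dequeue();
            }
            callback();
        }
    }
}
```

The background worker then does its heavy math in DoWork and calls dispatcher.Enqueue(() => UpdateMyWindow(result)) instead of touching Unity objects directly.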
  25. I'm calculating the time from periapsis, but for some reason it always returns a positive number. So in this case the calculation comes out correct: it gives me 5 seconds since the rocket passed periapsis. However here: And in this case it also gives me 5 s since we passed periapsis; it should be -5 s (or orbital period - 5), but no, it gives me the absolute value. Here is the code (C#):

GetStaticProperties(); // calculates all the static parameters, like eccentricity, semiMajorAxis, etc.
double e = _eccentricity; // works fine till here
trueAnomalyAtStart = -Math.Acos( Double3.Dot(_eccV, posIn) / (e * Double3.Magnitude(posIn)) );
print( Double3.Dot(_eccV, posIn) / (e * Double3.Magnitude(posIn)) );
eccentricAnomalyAtStart = 2d * Math.Atan( Math.Tan(trueAnomalyAtStart / 2d) / Math.Sqrt( (1d + e) / (1d - e) ) );
anomalyToPeriapsis = eccentricAnomalyAtStart - e * Math.Sin(eccentricAnomalyAtStart);
// ^ this is the anomaly to periapsis, this is what I actually use in the orbit math
timeToPeriapsis = anomalyToPeriapsis / meanMotion; // this is used just to show the player time to periapsis

If anyone knows what the problem is, I would really appreciate it.
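One likely culprit in the code above: Math.Acos only returns values in [0, pi], so the true anomaly it yields has no sign information, and negating it unconditionally just flips every result the same way. The standard fix is to recover the sign from the radial velocity: dot(r, v) is negative while falling toward periapsis and positive after passing it. A hedged sketch of that fix follows, reusing the poster's names; velIn (the velocity vector at the same instant as posIn) is an assumption, since it does not appear in the posted snippet.

```csharp
// Clamp guards Acos against floating-point values just outside [-1, 1].
double cosNu = Double3.Dot(_eccV, posIn) / (e * Double3.Magnitude(posIn));
cosNu = Math.Max(-1d, Math.Min(1d, cosNu));

double trueAnomaly = Math.Acos(cosNu);          // in [0, pi], sign unknown
if (Double3.Dot(posIn, velIn) < 0d)             // moving toward periapsis
    trueAnomaly = -trueAnomaly;                 // before periapsis: negative

// Tan and Atan are odd functions, so the sign carries through to E
// and to the mean anomaly.
double E = 2d * Math.Atan(Math.Tan(trueAnomaly / 2d)
                          / Math.Sqrt((1d + e) / (1d - e)));
double meanAnomaly = E - e * Math.Sin(E);

// Now signed: negative before periapsis, positive after.
double timeSincePeriapsis = meanAnomaly / meanMotion;
```

With this, a craft 5 s short of periapsis should come out as -5 s (equivalently, orbital period minus 5), rather than the absolute value.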