Shadowmage

  1. IDK, maybe? It should have 1.9.0 as the min version; I don't support older KSP versions (that doesn't mean it won't work, I just don't test on them, and I'm not offering support).
  2. The volume and pitch entries listed are key=value pairs; specifically, points on a FloatCurve defining how the pitch and volume will be affected by... likely 'power' in this case (but it could also be time within the audio clip). volume = <key> <value> defines a single entry in the volume curve, where 'key' is the input time/power and 'value' is the output volume at that input. These are all based on Unity Animation Curves, documented here: https://docs.unity3d.com/Manual/animeditor-AnimationCurves.html
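For example, a stock-style AUDIO effect node using such curves might look like the sketch below (the clip name and curve values are made up purely for illustration):

```
AUDIO
{
    clip = sound_rocket_hard
    volume = 0.0 0.0    // at zero power, volume is 0
    volume = 1.0 1.0    // at full power, volume is 1
    pitch = 0.0 0.8     // at zero power, pitch is lowered slightly
    pitch = 1.0 1.5     // at full power, pitch is raised
}
```

Each repeated `volume`/`pitch` line adds one more point to the respective curve, and the game interpolates between the points at runtime.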
  3. @Karussko Also make sure you don't have TextureReplacer or TextureReplacerReplaced installed, as those are known to disable the stock reflection system (and cause other conflicts). (It may still be possible to use them with specific changes to their configurations, but I'm not personally sure what those would be; I've never used either of them.)
  4. I believe the config you posted should be valid/usable, but the texture-use side of the effects was mostly untested; as I was unsure what they should actually do, I had no way to know if it was functioning correctly. If you want to make the texture available for selection from the in-game UI, you would need to define a config node for it, such as can be seen here for the bloom lens-dirt textures: https://github.com/shadowmage45/TUFX/blob/master/GameData/TUFX/Textures/BuiltinTextures.cfg Please let me know what, if anything, you find in your investigations; I'll be glad to fix up anything that might need some attention code-wise.
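As a rough sketch only: the node and property names below are from memory and may not match the real syntax, so treat the linked BuiltinTextures.cfg as the authoritative reference. Registering a texture for UI selection would look something like:

```
// Hypothetical example; verify node/property names against BuiltinTextures.cfg
TUFX_TEXTURES
{
    TEXTURE
    {
        effect = Bloom                          // which effect can select this texture
        texture = MyMod/Textures/MyLensDirt     // path to the texture under GameData
    }
}
```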
  5. Yeah, I suspect you are correct; or rather, they now include previously un-rendered layers in the ReflectionProbe, and/or increased its far-clip to cover the larger distance. Should work fine in... KSP 1.6+? (dunno, I don't test older versions...) Yes, there is a typo in the .version file: the last line (KSP_VERSION_MAX) should also reference KSP 1.9.9.
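For reference, a KSP-AVC-style .version file with that max-version line corrected would look roughly like this (the mod name and version numbers are placeholders, not the actual shipped values):

```json
{
    "NAME": "TUFX",
    "VERSION": { "MAJOR": 1, "MINOR": 0, "PATCH": 0, "BUILD": 0 },
    "KSP_VERSION": { "MAJOR": 1, "MINOR": 9, "PATCH": 0 },
    "KSP_VERSION_MIN": { "MAJOR": 1, "MINOR": 9, "PATCH": 0 },
    "KSP_VERSION_MAX": { "MAJOR": 1, "MINOR": 9, "PATCH": 9 }
}
```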
  6. @Manwith Noname Sorry, got distracted there for a bit, but I do still intend on releasing an update to TU that has the recently added 'multi-TUPartVariant' module support in it, hopefully sometime over the weekend. Have you run into any issues with the system as currently implemented? It looked like it was working, but as I said when I posted it, I did zero testing on the changes; I just wanted to see if there were any problems, or if it was all working as expected.
  7. You are looking for the KSP.log file, which should be located in the same directory as the KSP executable/binary (at least on Windows). In the log file, your exported profile should be near the bottom. Copy the profile config node out and save it into a new text file somewhere in your KSP/GameData directory. Change the 'name =' line in the config to give it a new and unique name, and then launch KSP. Your newly created profile should show up in the profile selection list.
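The copied node should look roughly like the sketch below (property and node names here are from memory and the effect contents are omitted; only the name line needs editing):

```
TUFX_PROFILE
{
    name = MyCustomProfile    // changed from the exported name; must be unique
    // ...exported effect settings and EFFECT nodes, left as-is...
}
```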
  8. Neither Scatterer nor EVE is updated for 1.9.1, last I checked. Kopernicus is likely not updated either (but I don't use it, so I don't check). Unsure on the others in your list, except TUFX (which requires KSP 1.9.x). You'll need to either revert to KSP 1.8 or wait for those mods to update.
  9. More fun with post-process rendering; this time... something different (but still a bunch of spheres...). What is being rendered there is a 'shadow depth' value, from the perspective of the light. The 'standard' view of the scene can be seen below:

This is actually driven by a secondary camera that renders high-precision floating-point depth (from the light's perspective) to a render texture, which is dynamically positioned, ranged, and configured to keep the 'planet' skybox in its view. From there, the world-space rendered position is basically raycast into this depth texture, to tell how far the surface is from the nearest object to the light. The process is explained best here (quite a good read, actually): https://medium.com/@shahriyarshahrabi/custom-shadow-mapping-in-unity-c42a81e1bbf8 , though I'm doing things quite a bit differently than that implementation.

The question again on all of this might be 'why?' Well, more cool effects, of course. This one is what is needed for volumetric shadows, part of the volumetric lighting implemented in the scattering shader: https://en.wikipedia.org/wiki/Volumetric_lighting. Again, I originally intended not to do any such work, but realized that many of the artifacts currently seen in the scattering implementation are due to the lack of shadow-depth information being fed into it, so it illuminates things improperly in some cases. It can/will also be used for the ocean rendering, likely to very good effect.

Still just playing around at this point; I haven't started creating anything 'usable' out of all of this. Exploring the concepts, developing some code, learning some techniques, and learning a ton about the Unity rendering pipeline and shaders/graphics in general. Fun stuff...
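The depth-comparison at the heart of that technique can be sketched in isolation. This is a minimal, hypothetical stand-in (plain Python, not the actual shader code): the 'shadow map' is a grid of nearest-surface depths as seen from the light, and a fragment is shadowed when it lies farther from the light than the recorded occluder at the same texel.

```python
def shadow_test(depth_map, u, v, fragment_depth, bias=0.001):
    """Core of shadow mapping: compare a fragment's light-space depth
    against the nearest depth the light 'saw' at the same texel.
    A small bias avoids self-shadowing ('shadow acne') from precision
    error. (Illustrative sketch only, not TU/TUFX code.)"""
    nearest = depth_map[v][u]
    return fragment_depth > nearest + bias  # True -> in shadow

# A tiny 'shadow map': one occluder at depth 2.0 covering texel (1, 1);
# everywhere else the light sees out to depth 10.0.
depth_map = [[10.0, 10.0, 10.0],
             [10.0,  2.0, 10.0],
             [10.0, 10.0, 10.0]]

print(shadow_test(depth_map, 1, 1, 5.0))  # behind the occluder -> True
print(shadow_test(depth_map, 0, 0, 5.0))  # nothing nearer there -> False
```

The real per-pixel version additionally projects the world-space position into the light's clip space to find (u, v) and the fragment depth, which is the part the linked article walks through.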
  10. More work over the weekend, still just doing research, playing with things, and learning some new techniques... (lighter and darker areas are 'higher' and 'lower' in the heightmap; blue highlighting is from specular lighting, showing bumpmap application; background is random colors based on view direction...) (this is the same scene, but rendering the world-space normals) (same scene again, showing just the generated height-map... which is little more than simplex noise at this point)

This is a 'screen-space bump-mapping' technique (there is no geometry being rendered...); the entire thing exists only in the shader code and the parameters fed to the shader. A bunch of trig determines where the sphere should be drawn based on the input values for size and location. The world-space position on the surface of the sphere is used to sample a 3D noise function to generate a seamless height-map (no distortions, no pinching, no stretching, no weird stuff at the poles). A tangent frame is calculated for the world-space position on the sphere and used to derive a 'sphere-surface tangent-space normal' from the sampled noise point and its neighboring points. The tangent-space normal is transformed back into world space and used to adjust the existing analytically derived surface normal. Finally, the combined world-space surface normal is used to calculate diffuse and specular lighting for the sphere, based on the primary light in the scene.

One of the best parts is that it can all have dynamic LOD applied based on view depth; surfaces can appear smooth from a distance, with the details manifesting (smoothly, without pop-in) as the zoom level increases, and with no maximum zoom level defined. You want sub-mm resolution of details? Fine. The noise sampling point-distance actually decreases as the camera nears the surface, enabling per-pixel sampling at all zoom levels and maintaining a very smooth look all the way down to the finest detail levels visible in a KSP camera, based on its near-clip distance.

Why would I do such a crazy thing, you might ask? Ocean/water rendering... specifically, rendering of the ocean surface and the optical effects of waves and surface distortions. Originally I had intended to not do anything with oceans, but after realizing that it is the stock KSP oceans that cause the issues with HDR, I figured it might be worthwhile to include something quick and functional (as HDR can make the rest of the effects so.. much.. better..). Currently it uses a quick-and-dirty shader-based simplex noise implementation for generating the height-map, but all of the functionality is in place to enable use of FFTs or other more realistic water-surface calculation methods; I merely need to get to a point where it would be practical to implement them. Furthermore, as it is all just a screen-space effect, it will likely be more efficient than KSP's existing mesh-based solutions on some hardware.

I'm planning on joining the screen-space water-surface effect with a volumetric-water-lighting shader; basically atmospheric scattering, but for water (possibly even just adapting the scattering system to do oceans as well...). It will handle the optical effects of underwater fog and light-color extinction based on depth and the distance the rays travel through the medium. It should all be very 'physically based' when it is done, but should also be configurable to allow for non-realistic uses (e.g. EVE... why is it purple?). So far I'm not planning on implementing any complex reflection or refraction setups, nor caustics or anything special in regards to lighting (no volumetric shadows... yet). I want to get the basic effect set up and working before I try to add in the more complex functions.

Yes, I'm also still working on the scattering shader. It functions, but with caveats and issues I'd like to track down before I release anything. I need to implement a full config-file-based system to specify the bodies on which to use the effect, and the settings for each of those bodies. I also need to do further cleanup and validation of the loading code and the configuration functions, and give the entire thing a good pass for source-code issues in the shaders (it was a very hacky port to PPPv2; much stuff commented out, done terribly inefficiently, or simply incorrect and not yet updated). I'm also still working on the configuration UI in regards to adding the 'spline parameter' editing functionality; this might still take a while, as I've yet to find a workable approach under the Unity IMGUI system. Likely, though, I will have this finished for the next major 'feature' release (excluding any bugfix release that may happen before then).
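The height-to-normal step described above can be sketched on its own. This is a plain-Python stand-in for the shader logic, under stated assumptions: the noise function here is a trivial smooth placeholder (not simplex noise), and the surface is treated as a flat patch so the tangent frame is just the x/y axes; the real version does this per pixel on the sphere's tangent frame.

```python
import math

def placeholder_height(x, y):
    """Stand-in for the 3D noise sample; any smooth scalar field works
    for demonstrating the normal derivation."""
    return 0.1 * math.sin(3.0 * x) * math.cos(2.0 * y)

def surface_normal(x, y, eps=1e-4):
    """Derive a tangent-space normal from the height field by central
    differences: the heightmap gradient tilts the flat (0, 0, 1) normal,
    and the result is re-normalized to unit length."""
    dhdx = (placeholder_height(x + eps, y) - placeholder_height(x - eps, y)) / (2 * eps)
    dhdy = (placeholder_height(x, y + eps) - placeholder_height(x, y - eps)) / (2 * eps)
    n = (-dhdx, -dhdy, 1.0)
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return tuple(c / length for c in n)

print(surface_normal(0.0, 0.0))  # tilted in -x where the slope rises in +x
```

Shrinking `eps` (and the noise sampling distance) as the camera approaches is what gives the smooth, unbounded LOD behavior described above.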
  11. No. TUPartVariant is for working alongside the stock PartVariant system. What? I'm not sure what you are asking here. No, that little sample will not 'work'; it was just enough info to show @Manwith Noname how to use the additions to the system -- to give him the syntax to use for his configs. If you are not writing configs, then you don't need to worry about it. One works with the stock PartVariant system; the other does not.
  12. @Manwith Noname I've added support for multiple TUPartVariant modules, as outlined above: https://github.com/shadowmage45/TexturesUnlimited/commit/c44539197b9a95c9fe49308e0c2e602a349de6e4#diff-2878e404bd26e80b4b5bd93e6116a98eR32-R294 Changes are available on the dev branch if you are interested (grab the .dll from there), though I've not given it any testing yet. As you inferred, it isn't going to work exactly like your combo of B9 + multiple TUTextureSwitch modules, as with part variants you wouldn't be able to switch the texture sets individually (they are tied to whatever was specified in the variant), but it should still allow for multiple recoloring section entries on a single part.
  13. @Manwith Noname I still might be able to do something about this in TU, to allow for multiple TUPartVariant modules to reside on a part, each referencing their own configs (somehow). They would all still be driven by the stock part-variant system, but target different meshes, and each add a new 'section' to the recolor GUI. It would just need a bit of deliberation on how to best implement it, and likely could be some very simple config changes to make it all work. For instance, right now you add an EXTRA_INFO block to the variant config, with a 'textureSet = XXXX' reference. What if we added an index/number to the TUPartVariant module (default = 0), and appended (optionally) a number onto the texture-set names in the info block? Something like the config below (just random samples, not actually a viable config...):

//the extra-info blocks would look something like:
EXTRA_INFO
{
    textureSet = textureSetForFirstGroupOfMeshes
    textureSet1 = textureSetForSecondGroupOfMeshes
    textureSet2 = textureSetForThirdGroupOfMeshes
}
// - OR - (the '0' on the default textureSet property name is optional)
EXTRA_INFO
{
    textureSet0 = textureSetForFirstGroupOfMeshes
    textureSet1 = textureSetForSecondGroupOfMeshes
    textureSet2 = textureSetForThirdGroupOfMeshes
}
MODULE
{
    name = TUPartVariant
    sectionName = FirstGroupOfStuff
    moduleIndex = 0
}
MODULE
{
    name = TUPartVariant
    sectionName = SecondGroupOfStuff
    moduleIndex = 1
}
MODULE
{
    name = TUPartVariant
    sectionName = ThirdGroupOfStuff
    moduleIndex = 2
}

Thoughts? Is that, in essence, what you are trying to accomplish? (If yes, I could very likely have this in the next release of TU; it would require fairly minimal changes to the code, I think (barring stock weirdness interfering with simplicity).) There should be no practical limit to the number of TUPartVariant modules you could add; you are only limited by how the model was split into meshes and how you wanted to group them up. I wholeheartedly agree.
If only it were an option, and if only they would stop making even more parts use it. Or if they would fix the code-side API of it to not be full of holes. Or if they would fix their shader loading to recognize external shaders (you wouldn't even need TU modules then, aside from recoloring). This would actually be a very easy fix for them: simply stop using Shader.Find("shaderName"); instead, create a list of 'loaded shaders', let mods add shaders to that list during startup, and then search that list when looking for a specific one by name.
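That registry pattern is simple enough to sketch; the version below is a hypothetical illustration in Python (class and method names are made up, and the real thing would be C# against Unity's Shader type), showing the lookup falling back to the engine's built-in search when nothing was registered:

```python
class ShaderRegistry:
    """Sketch of the proposed fix: a mod-extensible list of loaded shaders,
    searched by name before falling back to the engine's built-in lookup
    (the equivalent of Shader.Find). Names here are hypothetical."""

    def __init__(self, builtin_find):
        self._shaders = {}
        self._builtin_find = builtin_find  # fallback lookup function

    def register(self, name, shader):
        """Called by mods during startup to add their external shaders."""
        self._shaders[name] = shader

    def find(self, name):
        """Registered shaders win; otherwise defer to the built-in search."""
        if name in self._shaders:
            return self._shaders[name]
        return self._builtin_find(name)

# Usage: a mod registers its shader once, then any lookup by name sees it.
registry = ShaderRegistry(builtin_find=lambda name: None)
registry.register("TU/Metallic", object())
print(registry.find("TU/Metallic") is not None)  # True
print(registry.find("Missing/Shader"))           # None
```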
  14. Argh! How could you do this to me. I'm not seeing any messed-up normals or weird rendering; it's almost like things are working like they should (sarcasm...). Top-right uses two-way symmetry for its plate; the others are using 4-way. You can tell because the latches on all of them are 'messed up', but the latches on the top-right are less messed up than the others. It's all about those little details... (that SQUAD misses, or messes up...). Note that one way to do it properly would be no symmetry/no mirroring (2x/4x the texture space used); the other would be, instead of mirroring, to simply repeat, but then you need to make each segment/wedge 'seamless', which is harder than it sounds for non-UV-aligned textures.
  15. No. You have to use PartTools (in the Unity Editor) to compile the model to be compatible with and loaded by KSP. KSP will not load the standard .fbx, .obj, .dae, etc. model formats; it will only load its special .mu model format (incompatible with everything else and used only by KSP; a binary-encoded model format with no published spec for the encoding/decoding). In the past there was a '.mu importer/exporter' plugin for Blender (it's still around), but I don't think the export functions are compatible with newer KSP versions. You could give it a try (it might work), but the real answer is still 'you have to use Unity; it cannot be avoided'. (Or, if you are a C# wizard, you can whip up your own model-loader KSP plugin and then use whatever model format you want...)