
Johannes

Members
  • Posts

    16
  • Joined

  • Last visited

Reputation

125 Excellent

Profile Information

  • About me
    Former KSP2 Dev
  • Location
    Seattle, WA


  1. You got this, I'm rooting for all of you
  2. That is practically impossible, for anyone…
  3. Yeah, computer graphics can often appear far simpler than their actual implementation needs to be. It's generally far easier and faster to draw a ton of straight lines that together appear like a curve than to draw a completely accurate curve. And this is really only an issue in our specific use case because of the extreme camera perspectives these lines need to be viewed from. Additionally, as I touched on at the end of the post, we use the generated point data for more than just rendering, such as mouse interaction raycasting. So while doing it all on the GPU, throwing an obscene number of points at it, and not worrying about efficiently spacing them would have been a possible solution, for our use case a smaller but more carefully placed set of points ended up being a solid compromise.
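To make the "many straight lines approximating a curve" idea concrete, here is a minimal Python sketch of adaptive curve tessellation using a midpoint-deviation flatness test. This is only an illustration of the general technique, not KSP2's actual code; the function names, tolerance, and 2D setup are all assumptions.

```python
import math

def tessellate(f, t0, t1, tol=0.01, max_depth=12):
    """Adaptively sample a parametric curve f(t) -> (x, y) on [t0, t1].

    An interval is subdivided whenever the curve's midpoint strays too
    far from the straight chord between its endpoints, so flat
    stretches get few points and tight bends get many.
    """
    points = [f(t0)]

    def subdivide(ta, tb, pa, pb, depth):
        tm = 0.5 * (ta + tb)
        pm = f(tm)
        ax, ay = pa
        bx, by = pb
        mx, my = pm
        # Perpendicular distance from the curve midpoint to the chord.
        chord = math.hypot(bx - ax, by - ay)
        if chord > 0:
            dev = abs((bx - ax) * (ay - my) - (ax - mx) * (by - ay)) / chord
        else:
            dev = math.hypot(mx - ax, my - ay)
        if dev > tol and depth < max_depth:
            subdivide(ta, tm, pa, pm, depth + 1)
            subdivide(tm, tb, pm, pb, depth + 1)
        else:
            points.append(pb)

    subdivide(t0, t1, f(t0), f(t1), 0)
    return points

# A half circle needs many points; a straight line needs almost none.
half_circle = tessellate(lambda t: (math.cos(t), math.sin(t)), 0.0, math.pi)
straight = tessellate(lambda t: (t, 2 * t), 0.0, 1.0)
```

The straight line collapses to just its two endpoints, while the half circle is refined until every segment's deviation from the true curve drops below the tolerance.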
  4. We are not using the full DOTS stack. We specifically use jobs to parallelize calculations in a few select systems where there are clear performance benefits. For example, using jobs allows PQS+ to be considerably faster than the original system in KSP1; without it, the quality of terrain that we are working on would not really be feasible. Your concerns are absolutely valid. We try to be very careful when evaluating any new technologies that we might use in this project. The Unity Jobs System has been a part of our codebase since fairly early in development, but its uses are also isolated enough that, if it were really necessary, it could be replaced with another multithreading mechanism. So far, however, it has suited our needs well.
  5. Without going into too many details, we are using Unity Jobs in conjunction with the Burst Compiler and Unity.Mathematics to speed up some of the more expensive calculations.
  6. One thing I did not address in the original post is that this is less of an issue in KSP1, since it fades out orbits based on context: when you zoom in close enough on a celestial body, its orbit fades away - for example, when looking at Kerbin closely enough to see its moons, as I do in my mockup. This is absolutely a valid approach which makes the issue far less noticeable in most cases.
  7. The kind of tessellation you are referring to is GPU-accelerated mesh tessellation, which is part of the rendering pipeline on modern GPUs. GPU tessellation is specifically designed to take a mesh and subdivide its triangles, which are then used in subsequent shader steps and ultimately rendered. The key takeaway is that the mesh generated by this tessellation operation is intended to be used for rendering, and otherwise cannot really be accessed directly anymore by code running on the CPU. It also requires a sufficiently detailed input mesh that you pass to the GPU first. The kind of "tessellation" discussed in my article is fairly different: we use the CPU to generate a relatively small set of points (rarely more than 100 per orbit) and their bounding volumes, which we need to be able to access later for more than just rendering. This all happens before any triangle mesh has been generated; if you want to render the path defined by these points, you still need to generate a mesh first to pass to the GPU. For example, in my sample code that would be performed in the implementation of the Draw.Line call itself. GPU tessellation is generally better suited for large, complex geometry that you want to look smoother up close, such as terrain. While a more generalized compute shader could be used to perform some of these calculations, there is always overhead when passing data to and from the GPU, so it generally makes more sense in cases where you have a large enough number of simple calculations to really benefit from the GPU's parallelization. We do use compute shaders for some systems where there are clear performance benefits, but not for generating the points of our orbits.
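As a rough illustration of the CPU-side approach described here (a small set of points plus bounding volumes that other systems, like mouse picking, can query before any mesh exists), consider this hedged Python sketch. Every name and number is hypothetical: the real system generates points adaptively rather than uniformly, works in 3D, and uses its own bounding-volume scheme.

```python
import math

def orbit_point(t, a=2.0, b=1.0):
    """Parametric ellipse standing in for an orbit."""
    return (a * math.cos(t), b * math.sin(t))

def generate_points(f, n=64):
    """Uniformly sample one full revolution (adaptive in the real thing)."""
    return [f(2 * math.pi * i / n) for i in range(n + 1)]

def segment_aabbs(points, pad=0.05):
    """One axis-aligned bounding box per segment, padded for easier picking."""
    boxes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        boxes.append((min(x0, x1) - pad, min(y0, y1) - pad,
                      max(x0, x1) + pad, max(y0, y1) + pad))
    return boxes

def pick_segments(boxes, px, py):
    """Coarse mouse-picking query: which segments' boxes contain the point?"""
    return [i for i, (x0, y0, x1, y1) in enumerate(boxes)
            if x0 <= px <= x1 and y0 <= py <= y1]

points = generate_points(orbit_point)
boxes = segment_aabbs(points)
hits = pick_segments(boxes, 2.0, 0.0)   # a point right on the ellipse
```

The point is that `points` and `boxes` live on the CPU and are queried directly; turning `points` into a renderable line mesh is a separate, later step.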
  8. Joking aside though: as far as the orbit tessellation system is concerned, multiplayer orbit trajectories are no different from local ones.
  9. Wait, this game has multiplayer? (yes that was a joke)
  10. The tessellation algorithm can work for anything you can define as a parametric function that is contiguous between a given start and end parameter. The truth may surprise you! Since you asked, this one!
  11. I don't think I can get into too much detail there, but you've got some good intuition. Only position is considered in the smoothness heuristic; color and thickness are just stored after a point is generated, using its parameter. Any effect other than a smooth gradient would probably be best done in a shader on the mesh ultimately generated from the points. But you are right: in some edge cases, such as when points are too close to collinear, you may need more than just the triangle heuristic to get enough points for it to look good.
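A minimal sketch of the "attributes follow the parameter" idea described above: point placement looks at position only, and color and thickness are simply evaluated at each parameter the smoothness pass already chose. All names and gradient values here are made up for illustration.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def color_at(t, start=(0.2, 0.6, 1.0), end=(1.0, 0.3, 0.1)):
    """Smooth gradient along the curve's parameter range."""
    return tuple(lerp(s, e, t) for s, e in zip(start, end))

def thickness_at(t, near=3.0, far=1.0):
    """Taper the line from 'near' to 'far' along the parameter."""
    return lerp(near, far, t)

# Parameter values produced by the (position-only) smoothness pass;
# the attribute functions never influence where these land.
params = [0.0, 0.1, 0.35, 0.7, 1.0]
attributes = [(t, color_at(t), thickness_at(t)) for t in params]
```

Because the attributes are pure functions of the parameter, any smooth gradient "just works"; effects that vary faster than the point spacing are exactly the edge cases where this breaks down.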
  12. Yep! When first prototyping new tech like this it can often be faster to iterate on it in isolation in a separate project until it is ready, and since I already had that it's what I used to make these visuals. The 'icons' here are just 3D spheres that scale to a specific size in screen space (which was trivial to do with the drawing library I was using), so there is really no logic that prioritizes one over the other. You may have heard of the term 'separation of concerns' in programming. The portion of what you see here that can be found in the game is just the equivalent of the GenerateParametricPoints function in the post: It's a dedicated class that you give parametric functions for position, and optionally color and thickness, as well as anything else it might need such as the camera it should use, and it is only responsible for generating the points (and some additional metadata). How those points are actually drawn to the screen once you have them is the responsibility of a whole different bit of code that is far more basic here than it is in the actual game (and that could probably fill a blog post all on its own).
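The division of responsibilities described here could be sketched roughly like this. GenerateParametricPoints is the name from the blog post; the Python class and functions below are hypothetical stand-ins, with uniform sampling in place of the adaptive heuristic and a string in place of real mesh generation.

```python
class ParametricPointGenerator:
    """Only responsible for generating points from user-supplied
    parametric functions; it knows nothing about drawing."""

    def __init__(self, position_fn, color_fn=None, thickness_fn=None):
        self.position_fn = position_fn
        self.color_fn = color_fn or (lambda t: (1.0, 1.0, 1.0))
        self.thickness_fn = thickness_fn or (lambda t: 1.0)

    def generate(self, t0, t1, count):
        """Uniform sampling stands in for the adaptive heuristic."""
        step = (t1 - t0) / (count - 1)
        out = []
        for i in range(count):
            t = t0 + i * step
            out.append({
                "position": self.position_fn(t),
                "color": self.color_fn(t),
                "thickness": self.thickness_fn(t),
            })
        return out

def draw_points(points):
    """A separate renderer would build a real mesh from the points;
    this placeholder just describes what it would produce."""
    return f"mesh with {len(points)} vertices"

gen = ParametricPointGenerator(lambda t: (t, t * t))
print(draw_points(gen.generate(0.0, 1.0, 8)))   # -> mesh with 8 vertices
```

The generator and the renderer only share the point list, so either side can be swapped out independently - which is the "separation of concerns" being described.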
  13. Excellent - I really enjoyed writing it and I hope I get more opportunities to talk about some of the other cool new systems we're working on! Heck, I could easily write another post about this system alone - I really only described the basic implementation that my first prototype used. If you were to implement it as is, you might notice some edge cases where the triangle heuristic doesn't quite cut it. If you look closely enough you might even notice some slight differences between what I described and the behavior in the renders (which use the current version of the system). That's honestly the biggest compliment I could ask for - over the years I've learned so much from blog posts that covered topics that were way over my head at the time, but were written well enough for me to still get something out of them. I just want to pay it forward.