
Developer Insights #12 – Planet Tech


Intercept Games


Just now, Pthigrivi said:

The color is tilted toward the red spectrum but it's not red-red. I could see the color skewed orange 

And this would be satisfying enough to me. White sunlight would be disappointing however. It'd be interesting with sunlight skewed toward orange or even red. Everything would be different from what we see in the Kerbol system. You'd never see the system in true color except at night on the surface with white lights turned on. Even the moon that was briefly glimpsed over the horizon would be orange or red looking, not the white that we saw.


Actual colors of the stars:

http://www.vendian.org/mncharity/dir3/starcolor/

At best, with a red dwarf, we'd get orange-ish light. Though reflected light may not necessarily be that color, as irregular surfaces distort light.

Edit: what is "true color" anyway? It's just what we perceive as the sum of the visible spectrum. It's pure luck that our sun (and Kerbol for that matter) shines in white light. Which would probably explain why we can't see in infrared, as our eyes didn't need it. The "true color" from another star could look like that to some theoretical other species. And perhaps our orange is their white. We're very limited as humans, aren't we?

Edited by The Aziz

You can actually get a pretty good sense of how starlight would appear to the human eye from how incandescent bulbs, candles, or a campfire appear, since the spectra of both are basically blackbody spectra and the physiology of the eye responds the same way. And since the coolest red dwarfs normally have surface temperatures no greater than 2300 K and the warmest no more than about 3700 K (see this, looking at SpT of M0.5V through M9.5V for red dwarfs), most red dwarfs wouldn't actually look that red to the naked eye; they would look more orange than anything. For reference, a candle's effective color temperature is in the 1500-2000 K range and regular incandescent bulbs are generally less than 3500 K (filaments are mostly made of tungsten, and tungsten's melting point is around 3600 K).
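If you want to play with the numbers yourself, plugging a few visible wavelengths into Planck's law already shows the trend. A rough Python sketch (three wavelengths only, not a proper colorimetric conversion):

```python
import math

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law), arbitrary overall scale."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = math.exp(h * c / (wavelength_m * k * temp_k)) - 1.0
    return a / b

# Rough representative wavelengths for blue, green, and red light
bands = {"blue 450 nm": 450e-9, "green 550 nm": 550e-9, "red 650 nm": 650e-9}

for temp in (3000, 5800):   # ~cool red dwarf vs. ~Sun surface temperature
    radiance = {name: planck(lam, temp) for name, lam in bands.items()}
    green = radiance["green 550 nm"]
    ratios = {name: val / green for name, val in radiance.items()}
    print(temp, "K:", {name: round(r, 2) for name, r in ratios.items()})

# At 3000 K the red band comes out several times stronger than the blue band
# (orange-ish light); at 5800 K the three bands are nearly equal (white-ish).
```

It's crude, but it matches the intuition above: a ~3000 K star skews strongly toward the red end without being pure red.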

Here's a nifty chart:
 

[Image: color temperature chart]

 


2 hours ago, daninplainsight said:

 

[Image: color temperature chart]

 


That's fun. I'm an architect, so we specify a lot of this stuff. What's funny is that though natural daylight is close to 5000 K, most people find 5000 K bulbs painfully cold looking. We usually shoot for 2700 K in living spaces and 3000 K in kitchens and baths. The big issue for me is that LEDs, while better than fluorescents, still aren't producing an even spectrum, which is why even in warmer tones they have that slightly underdeveloped, disconcerting feel compared to natural light.

I say this for folks trying to understand light temperature compared to what they're used to.


[Image: led-lighting-2.jpg]

Edited by Pthigrivi

2 hours ago, The Aziz said:

Edit: what is "true color" anyway? It's just what we perceive as the sum of the visible spectrum. It's pure luck that our sun (and Kerbol for that matter) shines in white light. Which would probably explain why we can't see in infrared, as our eyes didn't need it. The "true color" from another star could look like that to some theoretical other species. And perhaps our orange is their white. We're very limited as humans, aren't we?

My head went here too @The Aziz. Still, whether kerbal or human, if you grew up in white sunlight I'd expect there to be a noticeable (probably not huge) difference when perceiving visible light from a red dwarf.

Now what about a kerbal born in that red dwarf system who never experienced the light of any other star? Fascinating. It'd be like a fish not knowing what it means to be wet.

Oh no... @Pthigrivi is an architect. 

Civil engineer here. Let me know when you're ready to set your building on the existing world.


  • 2 weeks later...
On 12/10/2021 at 11:00 AM, Intercept Games said:

Hi, I’m Eric DeFelice, a graphics engineer on the KSP2 team. My job is to create technical solutions to the graphics features we have on KSP2. One of the most obvious of these systems is how we generate, position, and render the planets in the game.

We need a system to render the planets while in orbit and interstellar travel, as well as up close, on the planet surface. We want the transition between these distances to appear as you would expect: as you get closer to the planet surface, you just see more detail. How do we solve all the problems associated with a graphics feature such as this? Can we just use traditional approaches for level of detail?

Let's dive a bit deeper into how we solve this problem in KSP2. I'll try to give as much detail as I can without having this take an hour to read…

 

Basic mesh rendering & LOD systems
Let's start by looking at how most meshes are rendered in KSP2 (and most games for that matter). Generally, the mesh data is sent from system memory over to the GPU, where shaders read it, place it at the correct pixels on screen, and output the correct color given some material properties. We could try to use this approach for our planets, but there are a couple of big issues we would hit when trying to achieve the level of detail we would like.

[Image: 001_traditional_rendering.png]
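For reference, the per-vertex work in that traditional pipeline amounts to multiplying each position through model, view, and projection matrices. A tiny numpy sketch of the chain (purely illustrative, not our actual code; the projection step is omitted):

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

# A vertex defined in model space (e.g. a point on a planet mesh)
vertex_model = np.array([1.0, 2.0, 3.0, 1.0])

# The model matrix places the object in the world, the view matrix moves the
# world in front of the camera; a projection matrix would normally follow.
model = translation(600_000.0, 0.0, 0.0)      # object far from the world origin
view = translation(-600_000.0, 0.0, -10.0)    # camera sitting near the object

vertex_view = view @ model @ vertex_model     # model -> world -> view space
print(vertex_view[:3])                        # back to small numbers: [1. 2. -7.]
```

Note how the intermediate world-space coordinates become huge even though the final camera-space values are small; that detail comes back in the positioning section below.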

 

The biggest issues revolve around the memory it would take to store all that vertex data for planets as large and detailed as the ones we have in the game. We could mitigate these problems with level-of-detail approaches, and perhaps by breaking the planet up into chunks, so we only load in the chunks that are relevant. GPU tessellation is also a possibility, but that wouldn't really give us much control over the terrain height. One other big issue has to do with the size of our planets and precision issues when trying to position the planet in camera projection space. I'll talk more about this shortly.
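To get a feel for the scale of the memory problem, here is a quick back-of-the-envelope estimate (the radius, vertex spacing, and bytes-per-vertex below are illustrative assumptions, not our actual asset numbers):

```python
import math

radius_m = 600_000          # Kerbin-scale planet radius (assumption)
vertex_spacing_m = 1.0      # desired ground resolution near the surface (assumption)
bytes_per_vertex = 32       # position + normal + UVs, roughly (assumption)

surface_area = 4 * math.pi * radius_m ** 2           # ~4.5e12 m^2
vertex_count = surface_area / vertex_spacing_m ** 2  # one vertex per square metre
total_bytes = vertex_count * bytes_per_vertex

print(f"{vertex_count:.2e} vertices, ~{total_bytes / 1e12:.0f} TB of vertex data")
# => trillions of vertices and on the order of a hundred terabytes, which is why
#    storing a full-resolution planet mesh up front is a non-starter.
```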

Pre-Alpha, Not Final

Given these problems, we don't use this basic approach when rendering planets up close. We do, however, use it when rendering planets from further away. This allows artists to have full control over the look of the planet from that distance, and it gives us a good starting point to which we add more detail as you approach the planet surface.

 

Planet Positioning
Another core gameplay feature we have to keep in mind when rendering the planets is that their position may be moved around relative to our floating origin (for more info, see the previous dev blog by Michael Dodd). For our planet rendering purposes, this means that our planet center will usually be further from the origin than its radius. If we defined the planet vertex data in model space, then during rendering, when we transformed its position to camera projection space, we could be dealing with some very large transformation values. If we are then viewing the terrain while it is close to the camera, creating very small distances in camera space, we may get some visual artifacts (as seen above).

How do we deal with this possible problem? Well, one simple solution is to generate the vertex data so it is relative to the floating origin already. That way we don’t have to deal with the model to world transformation, keeping the position values in a reasonable range.
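To see why this helps, compare the smallest step a float32 can represent at planet-scale magnitudes versus near the origin (a Kerbin-scale radius is assumed here purely for illustration):

```python
import numpy as np

planet_radius = 600_000.0   # Kerbin-scale radius in metres (assumption)

# Smallest representable step of a float32 at a given magnitude (1 ULP).
step_at_surface = np.spacing(np.float32(planet_radius))  # ~0.0625 m
step_near_origin = np.spacing(np.float32(100.0))         # ~7.6e-6 m

print(f"planet-centre-relative resolution: {step_at_surface} m")
print(f"floating-origin-relative resolution: {step_near_origin} m")
# Vertices expressed relative to the floating origin stay in a small numeric
# range, so the later camera/projection transform never sees huge coordinates.
```

Keeping the vertex data origin-relative keeps every coordinate the GPU touches in that small, well-resolved range.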

So now that we have our key concerns listed, we can finally look at how we solved these problems in KSP2.

 

PQS System Overview for KSP2
In KSP2 we use a PQS (procedural quad sphere) system very similar to the one used in KSP1 (there is much more detail on the basics of that system here). We have made some updates to the system, namely that we now generate all of the planet mesh data in compute shaders. This planet vertex data never gets sent back to the CPU; we just send a procedural draw call to the GPU to render the mesh with the compute buffer data.

[Image: 003_pqs_rendering.png]

 

We determine quad subdivisions in a similar way to KSP1, but we generate the output mesh positions relative to our floating origin instead of relative to the planet center. When calculating each vertex position, we also calculate the height, slope, and cavity for the mesh so that we can perform procedural texturing in the planet shader. One caveat we needed to account for in our procedural parameter calculation is that we need stable values for any given position on a planet. This is needed because we don't want the texturing to visually change at a given position, which could occur if the slope changed at that position because of mesh tessellation.
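To picture that requirement, compare a slope derived from the height function with a fixed world-space step (stable per position) against one derived from the current quad size (which shifts whenever the mesh subdivides). A toy Python sketch with a stand-in height function, not our actual shader code:

```python
import math

def terrain_height(x, z):
    """Stand-in procedural height function (purely illustrative)."""
    return 40.0 * math.sin(x * 0.01) + 15.0 * math.cos(z * 0.02)

def stable_slope(x, z, step=1.0):
    """Slope from the height field itself, using a fixed world-space step,
    so the value at (x, z) does not depend on the current tessellation level."""
    dhdx = (terrain_height(x + step, z) - terrain_height(x - step, z)) / (2 * step)
    dhdz = (terrain_height(x, z + step) - terrain_height(x, z - step)) / (2 * step)
    return math.sqrt(dhdx ** 2 + dhdz ** 2)

def mesh_dependent_slope(x, z, quad_size):
    """Same finite difference, but using the current quad size as the step:
    this value changes as the quad subdivides, so the texturing would 'pop'."""
    return stable_slope(x, z, step=quad_size)

pos = (123.0, 456.0)
print("stable:", round(stable_slope(*pos), 4))
for quad in (64.0, 16.0, 4.0):
    print(f"quad {quad:>4} m:", round(mesh_dependent_slope(*pos, quad), 4))
```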

For tessellation, we have to balance the level of detail we want at various distances with the performance concerns of generating more vertex data. The goal is to bump the level of detail for the terrain at a distance that isn’t really noticeable, so we don’t have a ton of visual detail popping in. We are constantly improving in this area (for reference, here is some previous footage of our planet tessellation tech).

Pre-Alpha, Not Final

 

Pre-Alpha, Not Final
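The distance-based split decision described above can be pictured as a simple quadtree rule: subdivide a quad while it is both large and close to the camera. A toy sketch of that rule (the thresholds are made up, not our actual tuning values):

```python
def should_split(quad_center, quad_size, camera_pos, detail_factor=2.5, min_size=4.0):
    """Split a terrain quad when the camera is closer than a multiple of its size."""
    distance = sum((a - b) ** 2 for a, b in zip(quad_center, camera_pos)) ** 0.5
    return quad_size > min_size and distance < detail_factor * quad_size

def subdivide(center, size, camera_pos, out):
    """Recursively build the list of leaf quads for the current camera position."""
    if not should_split(center, size, camera_pos):
        out.append((center, size))
        return
    half = size / 2
    for dx in (-half / 2, half / 2):
        for dz in (-half / 2, half / 2):
            subdivide((center[0] + dx, center[1], center[2] + dz), half, camera_pos, out)

quads = []
subdivide((0.0, 0.0, 0.0), 1024.0, camera_pos=(10.0, 5.0, 10.0), out=quads)
print(len(quads), "leaf quads, finest size:", min(s for _, s in quads))
```

The terrain directly under the camera ends up with the smallest quads, while distant quads stay coarse, which is exactly the behaviour we want from the LOD scheme.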

One other feature we have to help improve performance is basic frustum culling. Since we don’t have a bunch of mesh data on the CPU we can’t rely on traditional approaches for culling, so we have to do this on our own. Since we already have a bunch of quad data, we might as well just use their positions for this purpose. On the CPU we can determine which quads are within the camera frustum, and only generate visual mesh data for those. This prevents us from doing a bunch of work on the GPU that we know will be thrown away later, since that part of the mesh isn’t even visible.
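In its simplest form the test keeps a quad only if its bounding volume isn't completely behind any frustum plane. A minimal sketch of such a CPU-side check with bounding spheres (hypothetical plane and quad data, not the actual implementation):

```python
import numpy as np

def sphere_in_frustum(center, radius, planes):
    """Keep a quad if its bounding sphere is not fully outside any frustum plane.
    Each plane is (normal, d) with the normal pointing into the frustum,
    i.e. points inside satisfy dot(normal, p) + d >= 0."""
    for normal, d in planes:
        if np.dot(normal, center) + d < -radius:
            return False            # completely behind this plane -> cull
    return True                     # potentially visible -> generate its mesh

# A toy frustum: just a near plane and a left/right pair (a real one has six planes).
planes = [
    (np.array([0.0, 0.0, 1.0]), -1.0),     # near plane at z = 1, facing +z
    (np.array([0.707, 0.0, 0.707]), 0.0),  # plane bounding the left side
    (np.array([-0.707, 0.0, 0.707]), 0.0), # plane bounding the right side
]

quads = {
    "ahead":    (np.array([0.0, 0.0, 50.0]), 10.0),
    "behind":   (np.array([0.0, 0.0, -50.0]), 10.0),
    "far left": (np.array([-200.0, 0.0, 20.0]), 10.0),
}
for name, (center, radius) in quads.items():
    print(name, "->", "keep" if sphere_in_frustum(center, radius, planes) else "cull")
```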

Pre-Alpha, Not Final

PQS Collider System Overview
Terrain colliders need to be created by this system as well, since they rely on the mesh data for the planet. There are a few differences in the requirements for collision however. We no longer want to tessellate collision mesh data based on distance from the camera, but rather on distance from possible colliders that could hit that terrain. Because of this, we need to keep track of separate collision quad data.

We also can't perform the same frustum culling that we do for the visual mesh, as a vessel could be out of view when it collides with the terrain. Can we still do some sort of culling though? You guessed it, we can. We just cull any terrain colliders that we deem too far away to possibly have a collision in that frame. This does the same job as frustum culling does for the visual mesh: it prevents us from doing a bunch of work on the GPU that we know is useless.
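A sketch of that idea: skip generating a collision mesh for any quad that nothing could plausibly reach this frame (the margins and frame time below are made up for illustration, not our actual heuristic):

```python
def needs_collider(quad_center, quad_radius, vessels, frame_dt=0.02, margin=50.0):
    """Generate a collision mesh only for quads that some vessel could plausibly
    touch this frame: within the distance it can travel plus safety margins."""
    qx, qy, qz = quad_center
    for pos, speed in vessels:
        dist = ((pos[0] - qx) ** 2 + (pos[1] - qy) ** 2 + (pos[2] - qz) ** 2) ** 0.5
        reach = speed * frame_dt + quad_radius + margin
        if dist <= reach:
            return True
    return False

vessels = [((0.0, 120.0, 0.0), 240.0)]   # one vessel 120 m up, moving at 240 m/s
print(needs_collider((0.0, 0.0, 0.0), 80.0, vessels))      # True: close enough
print(needs_collider((5000.0, 0.0, 0.0), 80.0, vessels))   # False: unreachable this frame
```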

Pre-Alpha, Not Final

Pre-Alpha, Not Final

Everything coming together
Hopefully I gave you some more insight into how we generate and render our planets in KSP2. The key goals of the system are to provide a high level of detail of the planet at all distances while maintaining a solid frame rate. There are many unique problems in KSP2 compared to most other games I’ve worked on, so we definitely had to get creative with our solutions.

One final tidbit I'll leave you with is our system for how we transition to our PQS-generated mesh from the low-LOD mesh. Borrowing a technique from basic LOD systems, we actually just perform a cross-fade dither between the two meshes.
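Screen-door cross-fading works by letting each mesh draw a complementary subset of pixels during the transition, with an ordered dither pattern supplying the per-pixel threshold. A tiny sketch of the idea using a 4x4 Bayer matrix (an assumed pattern and fade value, not necessarily the ones we use):

```python
import numpy as np

# 4x4 Bayer matrix, normalised to thresholds in [0, 1)
bayer = np.array([[ 0,  8,  2, 10],
                  [12,  4, 14,  6],
                  [ 3, 11,  1,  9],
                  [15,  7, 13,  5]]) / 16.0

def keep_pixel(x, y, fade, pattern=bayer):
    """During a cross-fade, the incoming mesh draws pixels whose threshold is
    below 'fade'; the outgoing mesh draws the complementary pixels."""
    return fade > pattern[y % 4, x % 4]

fade = 0.5   # halfway through the transition
mask = np.array([[keep_pixel(x, y, fade) for x in range(8)] for y in range(8)])
print(mask.astype(int))
# Half of the pixels show the new (PQS) mesh and the rest still show the
# low-LOD mesh, so the swap never produces a hard pop.
```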

Pre-Alpha, Not Final

And lastly, all the systems coming together!

Pre-Alpha, Not Final


I'm curious why Unity and not Unreal Engine with Nanite? Was UE4/UE5 ever considered, and why Unity? What about Vulkan? Also, you are going to need a pretty stable positioning system, especially in multiplayer. Things like being stopped on a runway but slowly drifting along it because the planet is moving. We saw that in KSP 1 using the multiplayer mod and interpolation.

Edited by Redneck

 

I'm curious why Unity and not Unreal Engine with Nanite? Was UE4/UE5 ever considered, and why Unity? What about Vulkan? 

Unity is used for KSP 1, and despite most of the code being rewritten, a full port to a different game engine was out of scope for the initially planned release; since then, the infrastructure was already built up in Unity, so a conversion to Unreal would have taken too many months. I'm sure that Unreal was considered, but dropped early on in development. Vulkan is a graphics API, and its main purpose here would be to bring compatibility to Mac. If everything was coded well, the shader implementations should work with Vulkan, MoltenGL, or even Metal directly, which should give Mac users access to the game.


6 hours ago, t_v said:

Unity is used for KSP 1, and despite most of the code being rewritten, a full port to a different game engine was out of scope for the initially planned release; since then, the infrastructure was already built up in Unity, so a conversion to Unreal would have taken too many months. I'm sure that Unreal was considered, but dropped early on in development. Vulkan is a graphics API, and its main purpose here would be to bring compatibility to Mac. If everything was coded well, the shader implementations should work with Vulkan, MoltenGL, or even Metal directly, which should give Mac users access to the game.

Hopefully with the entire codebase being reworked we will have more stability and fewer bugs.


On 12/14/2021 at 2:49 AM, Ahres said:

Will the lighting stay white even though the star is red? The sunlight should be red in this system, shouldn't it?

I recognize that, yes, this is in development. I'm commenting more for discussion. With the lack of comments on the subject, I get the sense that most players would have no problem with the sunlight being white even though it'd drive me crazy if it was.

Isn't the sun in the local system going to be the brightest light source, setting the white point for the view? So if the light is a bit red, would you even notice until late twilight, when a lamp takes over as the white point?

Well, unless it is a binary with a brighter star.  

 


  • 6 months later...

I have a question about this technique.

I think the procedure for generating the mesh of a quad sphere is: 1. generate a mesh for a cube (or one quad face of it); 2. normalize its vertex positions; 3. multiply the vertex positions by the sphere radius.

While generating the vertices for the terrain, the vertex positions are relative to the floating origin, but before that, the vertex positions are relative to the celestial body center during generation (like the step 2 normalization), so the floating-point precision problem is still there. If the celestial body is the Earth, the error of a surface vertex position is around 0.25 m (0.5 ULP of fp32 at Earth's radius). If the viewing perspective is a human standing on the ground, the error will be very obvious.
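For example, the fp32 spacing at Earth's radius can be checked directly:

```python
import numpy as np

earth_radius = np.float32(6.371e6)          # metres
ulp = np.spacing(earth_radius)              # 1 ULP of fp32 at that magnitude
print(ulp, ulp / 2)                         # 0.5 m step, so ~0.25 m rounding error
```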

I think the solution to this problem is using fp64 during the generation of the vertex positions: calculate the relative position between the vertex and the floating origin, and convert that relative position to fp32 format; then the error is gone. But the mesh generator of KSP2 is a compute shader, and the fp64 performance of most desktop GPUs is very weak (e.g. 1/32 of fp32 FLOPS on a GTX 1080), so my solution is not suitable for the GPU.

I wonder how did you resolve this problem. Thanks!


10 hours ago, Velctor said:

I think the procedure for generating the mesh of a quad sphere is: 1. generate a mesh for a cube (or one quad face of it); 2. normalize its vertex positions; 3. multiply the vertex positions by the sphere radius.

That only generates a sphere, not featured terrain. And it is relative to centre, not to camera viewpoint.

10 hours ago, Velctor said:

While generating the vertices for the terrain, the vertex positions are relative to the floating origin, but before that, the vertex positions are relative to the celestial body center during generation (like the step 2 normalization), so the floating-point precision problem is still there. If the celestial body is the Earth, the error of a surface vertex position is around 0.25 m (0.5 ULP of fp32 at Earth's radius). If the viewing perspective is a human standing on the ground, the error will be very obvious.

If I read it correctly, there's no "before" for the vertex data. Meshes are generated on the fly as the camera moves; they aren't stored anywhere. That's why they mention that they have to keep the values stable, independent of camera position. With small planets, there is also less concern about floating-point precision than there would be with real-size planets, because the horizon is not that far away. The depth range can always (per frame) be adjusted to the situation.

The only thing done on the CPU is the frustum tests, i.e. determining which parts are in the camera's field of view and need to be calculated and rendered. That's done with bounding boxes that would need pre-calculation in world space, but an axis-aligned bounding box is just 24 bytes, or 48 with double values. I don't know if there is more metadata that describes how exactly the surface looks, whether there are craters, mountains, etc.

10 hours ago, Velctor said:

I think the solution to this problem is using fp64 during the generation of the vertex positions: calculate the relative position between the vertex and the floating origin, and convert that relative position to fp32 format; then the error is gone. But the mesh generator of KSP2 is a compute shader, and the fp64 performance of most desktop GPUs is very weak (e.g. 1/32 of fp32 FLOPS on a GTX 1080), so my solution is not suitable for the GPU.

I wonder how did you resolve this problem. Thanks!

Yes, FP64 is too slow on consumer-grade graphics cards. As I understand it, they do this (relative to camera position, or relative to eye) to get around the precision problems, and thus the jittering and other artefacts of lacking precision, by calculating directly from the camera position instead of doing it in world space first and transforming afterwards, thus avoiding the super-large numbers. And they do this each time one visits (looks at) a point, meaning possibly each frame. Thus they also avoid the additional memory/storage traffic of having to store positions for future use. It is just generated again and again.

That's pretty performant :-)

 

If you're up to rendering "real" planets as so many have tried and few have accomplished, that's indeed quite a different feat.
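For illustration, here is a minimal Python sketch of one common way around the "bend it onto the sphere without fp32 planet-centre coordinates" problem (my guess at the general technique, not necessarily what KSP2 does): compute each quad's reference point in doubles on the CPU, make it origin-relative there, and let the per-vertex work deal only with small offsets that are safe to store as fp32. All names and numbers are illustrative assumptions.

```python
import numpy as np

planet_radius = 6.371e6          # Earth-sized body, metres (assumption)
floating_origin = np.array([150.0, -40.0, planet_radius])   # camera near the surface

def sphere_point_f64(u, v):
    """Cube-face point (u, v) projected onto the sphere, in planet-centre space,
    computed entirely in float64 on the CPU."""
    p = np.array([u, v, 1.0])                 # +Z cube face
    return p / np.linalg.norm(p) * planet_radius

# Reference point for one small quad, made origin-relative while still in float64.
quad_ref = sphere_point_f64(0.0, 0.0) - floating_origin

# Per-vertex work only ever sees small offsets from the quad reference point.
corners = [(0.0, 0.0), (1e-5, 0.0), (0.0, 1e-5), (1e-5, 1e-5)]
exact, rounded = [], []
for (u, v) in corners:
    offset = sphere_point_f64(u, v) - floating_origin - quad_ref   # small, float64
    exact.append(offset)
    rounded.append(offset.astype(np.float32))                      # safe fp32 cast

error = max(np.abs(r.astype(np.float64) - e).max() for r, e in zip(rounded, exact))
print("largest rounding error after the float32 cast:", error, "m")  # micrometres
# The offsets are tens of metres at most, so casting them to float32 loses only
# micrometres, instead of the ~0.25 m you get with planet-centre-relative fp32.
```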

Edited by Pixophir

  • 2 weeks later...
On 6/29/2022 at 3:02 PM, Pixophir said:

That only generates a sphere, not featured terrain. And it is relative to centre, not to camera viewpoint.

If I read it correctly, there's no "before" for the vertex data. Meshes are generated on the fly as the camera moves; they aren't stored anywhere. That's why they mention that they have to keep the values stable, independent of camera position. With small planets, there is also less concern about floating-point precision than there would be with real-size planets, because the horizon is not that far away. The depth range can always (per frame) be adjusted to the situation.

The only thing done on the CPU is the frustum tests, i.e. determining which parts are in the camera's field of view and need to be calculated and rendered. That's done with bounding boxes that would need pre-calculation in world space, but an axis-aligned bounding box is just 24 bytes, or 48 with double values. I don't know if there is more metadata that describes how exactly the surface looks, whether there are craters, mountains, etc.

Yes, FP64 is too slow on consumer-grade graphics cards. As I understand it, they do this (relative to camera position, or relative to eye) to get around the precision problems, and thus the jittering and other artefacts of lacking precision, by calculating directly from the camera position instead of doing it in world space first and transforming afterwards, thus avoiding the super-large numbers. And they do this each time one visits (looks at) a point, meaning possibly each frame. Thus they also avoid the additional memory/storage traffic of having to store positions for future use. It is just generated again and again.

That's pretty performant :-)

 

If you're up to rendering "real" planets as so many have tried and few have accomplished, that's indeed quite a different feat.

Thank you for your reply.

What I don't understand is... for example: when generating a quad (or a grid mesh) of the terrain, we want its vertex positions to be relative to the camera (or the floating origin, or anywhere around the camera), and we sample the heightmap texture value to offset the vertices. But the result is a flat terrain quad with a heightmap offset; it can't be part of a sphere. If we want to "bend" it to become part of a sphere, we have to compute the position relative to the planet center, and we get the floating-point error back. I don't know how to resolve this problem.

Edited by Velctor
Google page translate was used for the quote
