
Where's the scattering?



While that is a completely valid question, what difference does it make?

The game will be the game they produce. What comes out of the studio on the 24th may only have a loose correlation to what has been previewed in past years. Projects change over time. Sacrifices and compromises have to be made. What gets released is what gets released.

The video gaming industry has been doing this for 40 years: showing promotional material that builds hype, and then releasing a completely different product. This is not a new thing. If these kinds of differences would affect whether or not you purchase the game, then it's on the consumer to wait a couple of days after release to confirm the features you want are actually in the product. There are professional reviewers out there who will do the initial purchase and review for you if that's the case. Otherwise, it's buyer beware.

I have no inside info on what is or isn't in the game. I'm just a fellow kerbonaut who is happily awaiting the release of a sequel to a favored game, without any expectations or prejudices.

Edit: Not speaking as a moderator or with any association with PD/TTi/etc., just a fellow gamer.

Edited by Gargamel

4 hours ago, mcwaffles2003 said:

The scatter shown has never been on Kerbin. I'm guessing it's an effect reserved for planets with denser atmospheres. It's a pretty effect, but it doesn't belong at the KSC, imo.

Pretty sure it's this. The KSC likely uses materials that got implemented before the scattering was added, and the artists simply haven't gotten around to updating the materials. It'll get fixed some time either before or during the early access.


3 hours ago, K^2 said:

Pretty sure it's this. The KSC likely uses materials that got implemented before the scattering was added, and the artists simply haven't gotten around to updating the materials. It'll get fixed some time either before or during the early access.

It makes sense, because the KSC itself looks inferior, material- and shader-wise, to the rest of the game we have seen so far. Probably these were the first models and materials they added during development as well.


5 hours ago, Vl3d said:

It's just beta, guys; they're optimizing and adding features gradually. Be patient, it's just one more month to go, and then we can start giving feedback through official channels.

They have mentioned multiple times that the footage they show us can be, and likely often is, created from various test builds with different features enabled, and that, in the end, all of it will be recombined.

I do think we'll see scattering on Kerbin and at the KSC.


If there isn't any scattering at all on Kerbin but there is on other planets, that will look pretty visually inconsistent. Even if the scattering effects we have seen are for denser atmospheres, there should still be a lesser degree of scattering on Kerbin. I'm hoping it is just a matter of making the materials compatible with the scattering (and the lighting; that top photo looks amazing), but even if it isn't present at launch, it'll probably be put in before 1.0, if only because players will ask why different planets look split between high and low graphics settings.


10 hours ago, Vl3d said:

It's just beta, guys; they're optimizing and adding features gradually. Be patient, it's just one more month to go, and then we can start giving feedback through official channels.

Well, in general, we will be playing the beta version; the date of the full release is unknown.


13 hours ago, mcwaffles2003 said:

The scatter shown has never been on Kerbin. I'm guessing it's an effect reserved for planets with denser atmospheres. It's a pretty effect, but it doesn't belong at the KSC, imo.

Atmospheric scattering happens to an extent on any body with an atmosphere. It's definitely noticeable at medium to long distances on Earth. Kerbin has always been an Earth analogue, and I don't see how not giving it realistic scattering would make sense at all. 


1 hour ago, treesniper12 said:

Atmospheric scattering happens to an extent on any body with an atmosphere. It's definitely noticeable at medium to long distances on Earth. Kerbin has always been an Earth analogue, and I don't see how not giving it realistic scattering would make sense at all. 

Not to mention that it's just a simpler way to implement sky light in general, once you have everything else set up.

I don't know if there's an easy way to hook it up in Unity (the slightly harder way is with a custom shader node), but a standard approach in custom engines is that you generally already have a system to light a surface from a light probe, and you use that. A light probe encodes the directional information about light at a point, and it's usually stored as spherical harmonics. The advantage is that you can usually get light quality almost as good as from a full cube map while storing only a dozen or so "pixels" (a bit more for specular probes) instead of a full cube map, which allows you to place the probes in a grid around your map. That's the standard way of doing global illumination (GI). The materials then have a pre-computed response for each spherical harmonic, which means the shader just has to take the light harmonics from the nearest probe, rotate them to local coordinates (taking normals and binormals into account), multiply them by the response function, use a lookup table to get the value for that response pointed towards the camera, and return that value. It's a hell of a lot of math, but because it's all pre-computed and pre-tabulated, the actual runtime is blazing fast, and you get indirect illumination of your surfaces in most realistic conditions.
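To make that less abstract, here's a minimal numpy sketch of just the evaluation step, using the standard L2 irradiance formulation (the constants are the usual cosine-lobe convolution weights; all names here are illustrative, not from any engine):

```python
import numpy as np

# Sketch: diffuse lighting from an L2 spherical-harmonic probe, following
# the standard irradiance formulation (Ramamoorthi & Hanrahan style).
# `sh` is a (9, 3) array: nine SH coefficients (bands l = 0..2) per RGB.

A = np.array([np.pi, 2.0 * np.pi / 3.0, np.pi / 4.0])  # cosine-lobe weight per band

def sh_basis(n):
    """Real SH basis (l <= 2) evaluated at unit direction n = (x, y, z)."""
    x, y, z = n
    return np.array([
        0.282095,                                     # l = 0
        0.488603 * y, 0.488603 * z, 0.488603 * x,     # l = 1
        1.092548 * x * y, 1.092548 * y * z,           # l = 2
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ])

BAND = np.array([0, 1, 1, 1, 2, 2, 2, 2, 2])          # band index per coefficient

def irradiance(sh, normal):
    """Diffuse irradiance on a surface with the given unit normal."""
    return (A[BAND] * sh_basis(normal)) @ sh          # -> (3,) RGB

# A probe lit mostly from straight up (+z):
sh = np.zeros((9, 3)); sh[0] = 0.5; sh[2] = 0.3
print(irradiance(sh, np.array([0.0, 0.0, 1.0])))      # facing the sky: bright
print(irradiance(sh, np.array([0.0, 0.0, -1.0])))     # facing the ground: dim
```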

So what you do for scattered sky light is a single pass computing spherical harmonics from the sky cube map. Then you pass these to the material shader as if they were one of the light probes, and you let the material do its thing. It's basically zero overhead, since the update of the spherical harmonics for the sky can be spread across multiple frames. If you have a unified system that has clouds baked into it, you'll even get occlusion from those. And the best part is that spherical harmonics preserve the directionality of light. So if you have a sunset/sunrise situation, the side of a building facing the sun will be washed in auburn orange, while the side away from the sun will be bluish gray, exactly as you expect in that sort of a scene, and it takes almost zero work once you have the skybox computed.
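The projection step is equally compact in outline. A toy sketch of it, reusing sh_basis from the sketch above, with a procedural sky_color standing in for the scatterer's cube map output (a real engine would integrate over cube-map texels with solid-angle weights rather than Monte Carlo sampling):

```python
import numpy as np

def sky_color(n):
    # Toy sky: warm orange at the horizon fading to blue overhead.
    t = float(np.clip(n[2], 0.0, 1.0))
    return (1 - t) * np.array([1.0, 0.5, 0.2]) + t * np.array([0.2, 0.4, 1.0])

def project_sky_to_sh(samples=4096, rng=np.random.default_rng(0)):
    """Monte Carlo projection of the sky onto L2 SH: sh_lm = integral of Y_lm * L."""
    sh = np.zeros((9, 3))
    for _ in range(samples):
        v = rng.normal(size=3); v /= np.linalg.norm(v)   # uniform unit direction
        sh += np.outer(sh_basis(v), sky_color(v))        # sh_basis from the sketch above
    return sh * (4.0 * np.pi / samples)                  # Monte Carlo estimator

sky_probe = project_sky_to_sh()  # hand this to materials as the "sky light probe"
```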

Unity has a light probe system, and it has material nodes designed to work with these. What I don't know is whether you can create fake light probe data to pass to the shader, so that you can just recycle these material nodes and make them work with your custom skybox. If not, you'll have to do the work of building that shader node by hand. That said, from the footage from other planets, it's clear that the work has already been done, whichever way they ended up doing it. So now it's literally just a matter of updating the materials used on Kerbin and on structures, including the KSC.

The point here is that if you don't do the above, you still have to account for sunlight, and if you just do direct illumination from the Sun, it looks like crap. Well, by modern standards - that's what people used to do back in 2000, and it was fine then. Since at least the mid-2000s, people started adding an environment map to the scene. You usually would bake two of these. One would be just a cube map of the light sources typical to the scene. For the indoors, it might have some fake lights on it, and for outdoors it'd typically be just your skybox. You'd use it as-is for your specular light. Then for diffuse, you literally pre-compute the convolution with a cosine function, to get that dot-product factored in from every possible angle. Now, instead of doing anything fancy, your entire environment light is just two texture lookups.

Finally, if you wanted a day-night cycle, you'd have three of these - day, night, and dusk/dawn maps - that you rotate with the movement of the Sun and blend together as time moves from morning to day, to evening, to night. Cheap, reasonably believable, and easy to implement. I think KSP did something similar. Problem is, the moment you have clouds, rain, or any other weather effects, guess what: you have to add a new map variation for each of these. And it never looks quite right with fancy materials, especially once you start adding shinier-looking surfaces. And if you already have GI working, it's not easier at all to implement! So you might as well go for proper scatter light on everything and have a single set of materials that accounts for it all.
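The blending part of that old-school trick is trivial; something like this, with the width of the dusk band being an invented tuning constant:

```python
import numpy as np

def env_blend_weights(sun_elevation_deg):
    """Blend weights for (day, dusk, night) environment maps; sums to 1."""
    t = float(np.clip((sun_elevation_deg + 10.0) / 20.0, 0.0, 1.0))
    day = t * t                    # ease toward full day above +10 deg
    night = (1.0 - t) ** 2         # ease toward full night below -10 deg
    dusk = 1.0 - day - night       # peaks while the sun straddles the horizon
    return day, dusk, night

for e in (-30, -5, 0, 5, 30):
    print(e, env_blend_weights(e))
```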

tl;dr: You have to do work for skylight illumination either way, and if you already set up scatter illumination for other worlds, you get it for free on all bodies, even those without an atmosphere at all! So there is zero reason not to.


18 hours ago, Gargamel said:

While that is a completely valid question, what difference does it make? [...] What gets released is what gets released.

Or they're just using test builds, as previously stated :P


2 hours ago, K^2 said:

So if you have a sunset/sunrise situation, the side of a building facing the sun will be washed in auburn orange, while the side away from the sun will be bluish gray, exactly as you expect in that sort of a scene, and it takes almost zero work once you have the skybox computed.

If I'm reading this right, this means scatter and sunrise/sunset lighting are interlinked.

This is consistent with the stream; there was no coloured lighting at sunrise/sunset.


2 hours ago, K^2 said:

Not to mention that it's just a simpler way to implement sky light in general, once you have everything else set up. [...]

giphy.gif?cid=ecf05e47xowogqp4pt7xxfpf56


47 minutes ago, Luriss said:

If I'm reading this right, this means scatter and sunrise/sunset lighting are interlinked.

I mean, it can depend heavily on the implementation, but as far as modern techniques that give a good, realistic look go, yes. And that's what I'm seeing in all of the examples we have from other planets, so I expect this to be hooked up for Kerbin and the structures.

52 minutes ago, whatsEJstandfor said:

giphy.gif?cid=ecf05e47xowogqp4pt7xxfpf56

Sorry, hazard of talking about something that you work on professionally. I try to correct for it, but as XKCD points out, it's hard to do effectively. Always happy to clarify anything specific if you're interested, though.


5 hours ago, K^2 said:

I don't know if there's an easy way to hook it up in Unity (the slightly harder way is with a custom shader node), but a standard approach in custom engines is that you generally already have a system to light a surface from a light probe, and you use that.

As a person who's dabbled a lot in graphics programming (and Unity): light probes are only for static lights! :O Wouldn't you have to recalculate and reposition the light probes (quite expensive because of the spherical-harmonic decompositions) many times every second if you intend to use them with the sun? I also don't see light probes used at scales like Kerbin's. They seem much more suited to indoor or semi-open spaces with a lot of indirect lighting and many (or complex) GI light sources, in order to keep lighting consistent on moving objects. The L2 spherical harmonics resolution Unity uses also makes the probe's lighting very diffuse, which makes it ill-suited to 'sharper' lighting like the sun, I think. If they were using such an approach, I think they would just forgo probes altogether, skip the expensive decomposition, and use a cube map directly for lighting in a custom PBR shader.
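To put a number on how diffuse L2 is: project a delta-function 'sun' onto bands l <= 2 and look at the reconstructed lobe. A quick numeric check (not engine code):

```python
import numpy as np

def l2_delta_lobe(theta_deg):
    """Truncated-at-L2 reconstruction of a zonal delta light, at angle theta
    from the light direction: sum_l (2l+1)/(4*pi) * P_l(cos(theta))."""
    x = np.cos(np.radians(theta_deg))
    p0, p1, p2 = 1.0, x, 0.5 * (3.0 * x * x - 1.0)   # Legendre polynomials
    return (1.0 * p0 + 3.0 * p1 + 5.0 * p2) / (4.0 * np.pi)

for theta in (0, 30, 60, 90):
    print(theta, round(float(l2_delta_lobe(theta)), 4))
# ~0.716 at 0 deg but still ~0.149 at 60 deg, and it even rings negative
# near 90 deg: the "sun" smears across a huge chunk of the sky.
```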

However, the light probe (and similar) formats are very, very ill-suited for, say, being in space over a sunset: the ship shouldn't have orange lighting, but specific portions of the ground below should, because they are inside the atmosphere where the light is being scattered. You would have to recalculate all light probes in view many times over and lag your device to death computing the decompositions... (not to mention it would look bad; the lighting would appear very patchy, with each patch corresponding to a light probe). Therefore a different approach is typically used:

I'm speculating that the reason behind the sunlight not being scattered is a minor disconnect between the implementations of the scatterer and the PBR shading, which doesn't natively accept the results of the atmospheric scatter. The PBR stuff expects standard types of direct illumination and/or light probes and light maps, while the scatterer is a post-processing effect. Thus, in order to have the scatter influence lighting everywhere, you would have to use deferred rendering and apply the scatter to everything at once: render everything as encoded information (normals, depth, base color, etc.) baked into multiple textures, and afterwards run all the lighting passes, with the sun itself being part of the scatterer pass rather than a separate light source. This means the scattering shader would also have to compute the PBR material response, since only it knows the specific color of sunlight as it strikes geometry. Quite a challenge!
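As a tiny sketch of that pipeline shape - G-buffer first, then a lighting pass in which the sun color comes from the scatterer - with scattered_sun_color as an invented stand-in for the scatterer's transmittance lookup:

```python
import numpy as np

H, W = 4, 4                                            # a tiny "screen"
gbuf_albedo = np.full((H, W, 3), 0.5)                  # base color
gbuf_normal = np.zeros((H, W, 3)); gbuf_normal[..., 2] = 1.0  # all facing +z
gbuf_depth = np.linspace(10.0, 5000.0, H * W).reshape(H, W)   # view distance, m

sun_dir = np.array([0.0, 0.7071, 0.7071])              # unit vector toward the sun

def scattered_sun_color(depth_m):
    # Fake transmittance: more distant texels get a redder sun, sunset-style.
    t = np.exp(-depth_m / 4000.0)
    return np.array([1.0, 0.6 + 0.4 * t, 0.3 + 0.7 * t])

def lighting_pass():
    out = np.zeros((H, W, 3))
    for i in range(H):
        for j in range(W):
            ndotl = max(float(gbuf_normal[i, j] @ sun_dir), 0.0)  # Lambert term
            out[i, j] = gbuf_albedo[i, j] * ndotl * scattered_sun_color(gbuf_depth[i, j])
    return out

image = lighting_pass()   # deferred: geometry never talks to the light directly
```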

Currently it seems to me that the scatterer is rendered first, and there is a 'placeholder' direct-lighting sun applied to the materials and a placeholder area light for the blue sky light. This then leads to all the weird effects, like the water not reflecting the daytime sky (it turns black at sunsets rather than, say, reflecting the sky's orange color) and the incorrectly white sunset lighting.

 


 

Edited by Xelo

1 hour ago, Xelo said:

As a person who's dabbled a lot in graphics programming (and Unity): light probes are only for static lights! :O Wouldn't you have to recalculate and reposition the light probes (quite expensive because of the spherical-harmonic decompositions) many times every second if you intend to use them with the sun?

Nah, I'm not talking about indirect lighting from the terrain. There are, actually, tricks for recalculating that dynamically for outdoor environments too, but that's getting into way more complex interactions. That would take us way off topic, so if you want to chat about that, feel free to DM me.

The idea is that the info the light probe gives you is how much light you're getting from every direction. To get good skybox illumination, you want the light to be coming from every direction as well. So if you take the skybox and convert the illumination information from it into spherical harmonics, you can use it as one special light probe that's common to all of your scene geometry. In fact, you don't have to use actual GI with this - you can literally just have this one "light probe" generated from the skybox, and use that to get soft lighting that's dynamic with the time of day. And while computing all of the light probes for a scene is very, very expensive, computing just one from a given cube map over multiple frames is almost free.
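Spreading that single probe's update across frames is about a dozen lines. A sketch, reusing sh_basis and sky_color from the sketches above; the chunk size is an arbitrary tuning knob:

```python
import numpy as np

class AmortizedSkyProbe:
    """Integrates the sky into one SH probe a slice at a time, swapping the
    finished result in only when a full pass over the samples is done."""
    def __init__(self, dirs, chunk=256):
        self.dirs = dirs                   # precomputed uniform unit directions
        self.chunk = chunk                 # samples integrated per frame
        self.cursor = 0
        self.accum = np.zeros((9, 3))
        self.active = np.zeros((9, 3))     # the probe materials actually read

    def tick(self):
        lo, hi = self.cursor, min(self.cursor + self.chunk, len(self.dirs))
        for v in self.dirs[lo:hi]:
            self.accum += np.outer(sh_basis(v), sky_color(v))
        self.cursor = hi
        if self.cursor == len(self.dirs):  # full integration pass complete
            self.active = self.accum * (4.0 * np.pi / len(self.dirs))
            self.accum = np.zeros((9, 3))
            self.cursor = 0
```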

1 hour ago, Xelo said:

The PBR stuff expects standard types of direct illumination and/or light probes and light maps, while the scatterer is a post-processing effect.

That's precisely the gap we're trying to close here. You can do the sky as just a screen space post-effect, of course, but then you don't get any of the other benefits. You can, instead, compute that scattering process as a cube map, ignoring screen space entirely, which, again, you only need to do once every so often, since the sun and the clouds aren't moving all that fast. On average, you end up spending about the same amount of time doing either, because you're doing ray casts on more pixels but over more frames, so this doesn't increase rendering costs. You can then use it directly with your PBR as a cube map, of course, but then you have to at least bake a diffuse environment map from it, which is pretty expensive (O(n² ln n) if you use FFT and the convolution theorem). Alternatively, you can compute spherical harmonics from that cube map (O(kn²), where k is the number of harmonics) and feed that to the PBR as if it were a light probe. It's basically the same computational cost, but now you have better specular highlights on metallic surfaces, support for anisotropic materials if you want it, etc.

There are extra steps you'd have to do if you wanted sky occlusion to work well, which involve actual light probes and, again, a lot more math. But then you actually do have to compute light probes, which, as you point out, is a problem for a planet-sized world. The conservative approach is to simply ignore occlusion for scattered light: you take the horizon at exactly 0° elevation (so you only really have half of a scatter cube map for the sky) and only do occlusion for direct light from the sun, using cascade shadows as normal.


4 minutes ago, K^2 said:

Nah, I'm not talking about indirect lighting from the terrain.

I wasn't either, sorry if my words aren't clear. I also understand how light probes work; I have made my own implementations before and read the papers. This is specifically about the sky scatter only (which gives the sky its color and makes the sun orange). The failure case I describe is this:

[diagram: a sunset viewed from high altitude - (1) sunlit terrain, (2) a narrow band of orange sunset terrain, (3) terrain in darkness, (4) clouds still lit higher in the atmosphere, (5) a craft above the atmosphere]

A single light probe cannot handle all these cases at once. Thus multiple probes would need to be updated constantly on the ground, in the sky, in local space, etc. You don't even have to be particularly high to see this effect. This is somewhat impractical. And, as you know, each light probe needs to sample an entire separate cube map for each 'value' in the light probe to generate the polynomials for the spherical harmonics (FFT doesn't help much at the low n used in light probes, I think). This is why dynamic light probes aren't discussed much; it's a lot of work.
Using deferred rendering handles all these cases at once, in one pass, without the need to construct and recalculate probes constantly. Only the scatterer shader knows the color of the sun lighting at any texel in the scene, and thus such an approach is what I would use.
 

23 minutes ago, K^2 said:

You can, instead, compute that scattering process as a cube map, ignoring screen space entirely, which, again, you only need to do once every so often, since the sun and the clouds aren't moving all that fast

I originally thought that, but then I realized time warp exists. Such a luxury doesn't exist with that mechanic. Things have to be dynamically computed more often than not if you can make a day last fractions of a second.

 

27 minutes ago, K^2 said:

You can do the sky as just a screen space post-effect, of course, but then you don't get any of the other benefits.

That's what sets deferred rendering apart: unlike forward rendering, every piece of geometry visible on the screen is known to the scatterer through a texture, and the shader shades it accordingly (and with perfect specular). The water is simply included in the textures as normals and depth, the base color, and finally its material properties, and nothing extra is needed for the water to have orange reflections. (Since the scatterer is a form of raymarcher in implementation, you just march along the water's reflection vector from the camera and grab the color.)

And while, yes, sky occlusion still remains a problem, I don't think dynamic sky occlusion can be solved with light probes at these scales [0]. And Kerbin, Duna, and Eve are very dynamic environments. Most implementations I see go straight to raycasting approaches.
You can get a much cheaper version of dynamic sky occlusion by combining screen-space (for craft) [1] / baked (for buildings) ambient occlusion with a simple 'special ambient light' on the terrain that dims near the sunset area (optimally this would actually be part of planetshine). Since this ambient lighting is generally radially symmetric across the planet, you can store it as a 2D texture, making it extremely cheap to calculate.
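A rough sketch of that LUT idea (falloff curve, resolutions, and constants all invented for illustration; a real version would be baked from the scatterer):

```python
import numpy as np

ANGLES = np.linspace(0.0, 180.0, 64)     # degrees from the subsolar point
ALTS = np.linspace(0.0, 70000.0, 32)     # altitude, meters

def baked_ambient(angle_deg, alt_m):
    # Bright on the day side, dim band near the terminator (~90 deg),
    # dark beyond; thinner air at altitude scatters less ambient light.
    day = 1.0 / (1.0 + np.exp((angle_deg - 95.0) / 6.0))
    thin = np.exp(-alt_m / 30000.0)
    return day * (0.2 + 0.8 * thin)

LUT = baked_ambient(ANGLES[:, None], ALTS[None, :])   # the "2D texture"

def sample_ambient(angle_deg, alt_m):
    """Bilinear LUT sample, like a GPU texture fetch with linear filtering."""
    a = np.interp(angle_deg, ANGLES, np.arange(len(ANGLES)))
    h = np.interp(alt_m, ALTS, np.arange(len(ALTS)))
    ia, ih, fa, fh = int(a), int(h), a % 1.0, h % 1.0
    ia1, ih1 = min(ia + 1, len(ANGLES) - 1), min(ih + 1, len(ALTS) - 1)
    return ((1 - fa) * (1 - fh) * LUT[ia, ih] + fa * (1 - fh) * LUT[ia1, ih]
            + (1 - fa) * fh * LUT[ia, ih1] + fa * fh * LUT[ia1, ih1])

print(sample_ambient(45.0, 1000.0))    # mid-morning, near sea level
print(sample_ambient(100.0, 1000.0))   # just past the terminator
```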


[0] You may have one locally at your vessel only, but at that point I would just use a blurred cube map instead o-o. With a light probe you'd need to render the scene to a cube map, have each texel do a suboptimal GPU convolution or a slow CPU one, then pass it along to the material shader again. You may sample the same number of pixels as with a blur, but the important thing is that each texel doesn't do a lot of work individually, making the blur much, much faster to calculate compared to the light probe, where each texel needs to sample the entire texture for the convolution. Unity also doesn't let you edit an existing light probe, so much as requiring you to replace the entire array every time, which implies this is meant to be done rather rarely and only between scenes, possibly because it's stored on the GPU?

[1] Screen-space AO also happens to be very well suited to, and very common in, deferred rendering.

 


7 minutes ago, Xelo said:

A single light probe cannot handle all these cases at once.

Oh, I'm talking strictly surface rendering, not an orbital ship in flight passing through penumbra. I don't think there's a general approach that solves all situations without RT. But if we separate this into three cases: surface when camera is below clouds, surface + clouds when camera is above, and anything above the cloud layer, we can attack this gracefully.

When the camera is above the cloud layer, I would honestly just skip the scatter from clouds to ground. Most of the areas that would be affected aren't going to be particularly visible because of the clouds in the way. So the sky light for the planet when the camera is above clouds can be pre-computed. Scale intensity by distance from the star to account for elliptical orbits, and otherwise the light distribution is perfectly consistent when taken relative to the star's position. (This does break down with multiple stars, but I assume all relevant planets will have only one star we can take as the dominant effect, and we do only a direct contribution from the other.)

For the camera below the clouds, the cloud effect is significant, and you'll want to do the actual honest computation by factoring clouds into your scatter. The transition can be tricky, but I think blending is fine. Likewise, aircraft below the clouds can use the surface lighting. It's not exactly right, but I think it's close enough that nobody will notice the difference. You just have to be careful about aircraft actually flying through a cloud.

Finally, for anything located very high in the atmosphere or outside of it, you have some options. You can still do a pre-computed approach by utilizing the cylindrical symmetry, meaning you can still get away with a 2D texture that takes an angle and an altitude. (Beyond a certain altitude this light stops changing and depends on the angle only, so you can clamp the texture.)

23 minutes ago, Xelo said:

I originally thought that, but then I realized time warp exists.

Point, but if you're viewing things from the surface at high enough warp, you can start cutting a lot of corners. It does need to be handled to avoid very weird artifacts, though. And per above, I never considered this for other situations.

26 minutes ago, Xelo said:

That's what sets deferred rendering apart: unlike forward rendering, every piece of geometry visible on the screen is known to the scatterer through a texture, and the shader shades it accordingly (and with perfect specular).

Yeah, but that doesn't help you with rendering surface illumination. If I'm standing on the surface and looking at the terrain, every point in front of me has the sky above it. Even without a single pixel of sky on screen, each pixel of the terrain still needs to know how much light it receives, and even ignoring occlusion, that means casting a scatter ray in every possible direction towards the sky, because the clouds can be in any direction - and that way lies madness.

Yes, you can cast just a few rays from every point and then denoise the picture. But that still means doing 5-6 samples of the scatterer for every pixel on the screen to get decent results. And you absolutely must do this every frame. This is not cheaper than building a low-resolution skybox once.
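Back-of-the-envelope numbers for that comparison (made up but representative):

```python
# Per-pixel scatter sampling, every frame:
pixels = 1920 * 1080
rays_per_pixel = 6                        # samples needed before denoising
print(pixels * rays_per_pixel)            # ~12.4 million scatter samples/frame

# Amortized low-res sky cube map:
face = 64                                 # cube-map face resolution
texels = 6 * face * face                  # ~24.6k texels per full rebuild
frames_to_spread = 30                     # sun/clouds move slowly
print(texels / frames_to_spread)          # ~820 scatter samples/frame
```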

So yeah, if you always had a perfectly crystal clear sky, you just have a lookup map for each planet, and do deferred. But the moment you have weather you basically have to start thinking about it as a dynamic cube map. I don't see a way to step around that.


28 minutes ago, K^2 said:

Oh, I'm talking strictly surface rendering, not an orbital ship in flight passing through penumbra.

I wasn't talking only about the ship passing through penumbra in that diagram. I was talking about viewing a sunset on the ground from, like, 30 km up: part of the terrain is sunlit (1), part of the terrain is in orange sunset in a narrow band (2), part of the terrain is shrouded in darkness (3). The clouds' sunset is delayed because they're higher up in the atmosphere (4). Something is above the atmosphere entirely (5). A single cube map or probe or what-have-you cannot account for (1)(2)(3)(4)(5) viewed at the same time, or even just (1, 2, 3), especially as it moves across the surface very fast. :c


Having the scatterer shade surfaces would solve the issue of sun lighting, no worries, but the ambient sky light is not practically handled by a single light probe across all the surfaces you can typically see (unless you simply never leave the KSC or something, like in most games with limited kilometer-wide maps, in which this technique is viable).
 

35 minutes ago, K^2 said:

Most of the areas that would be affected aren't going to be particularly visible because of the clouds in the way.

I don't think you can just dismiss it because of cloud cover, unless Kerbin is permanently overcast like Eve. You can still very much see the ground from above the cloud layer. This would also not apply to Duna. Clouds are usually generated from a b&w texture heightmap [1]; you can sample that once to get vague blurry shadows and such with little performance hit. This is also, I figure, how they have cloud shadows right now.
 

46 minutes ago, K^2 said:

Point, but if you're viewing things from the surface at high enough warp, you can start cutting a lot of corners. It does need to be handled to avoid very weird artifacts, though. And per above, I never considered this for other situations.

The warp isn't like normal speed or maximum overdrive with no in-between, though. Simply having an absurdly fast velocity is also a thing you do regularly in this game and few others. KSP by nature doesn't leave a lot of room for shortcuts that don't feel obvious.
 

1 hour ago, K^2 said:

Yeah, but that doesn't help you with rendering surface illumination. If I'm standing on the surface and looking at the terrain, every point in front of me has the sky above it. Even without a single pixel of sky on screen, each pixel of the terrain still needs to know how much light it receives, and even ignoring occlusion, that means casting a scatter ray in every possible direction towards the sky, because the clouds can be in any direction - and that way lies madness.

Yes, you can cast just a few rays from every point and then denoise the picture. But that still means doing 5-6 samples of the scatterer for every pixel on the screen to get decent results. And you absolutely must do this every frame. This is not cheaper than building a low-resolution skybox once.

So yeah, if you always had a perfectly crystal clear sky, you just have a lookup map for each planet, and do deferred. But the moment you have weather you basically have to start thinking about it as a dynamic cube map. I don't see a way to step around that.

As per the previous comment, you aren't building a single low-res skybox once; you are building it every frame, and it's only valid for rendering the immediate area on the surface.
This is OK if it's literally just your vessel (see footnote [0] in the previous comment) and/or you are just computing blurry reflections. This 'skybox' isn't valid 10 km, 20 km, 50 km away, distances you can very much see, especially as you rise higher.
I also did not suggest raycasting for diffuse surfaces (which would require multiple rays), only reflective ones, which would just continue the existing ray in a different direction.
Thus assuming the sky is clear seems the most practical to me, and considering your case of clouds, just use a cheap sample of the cloud texture to check for overcast areas, which can be integrated into the cheap ambient light: if the terrain/ship underneath is under a brighter area of a pre-blurred cloud texture, the light renders this area greyer to compensate.
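A sketch of that correction (cloud map, blur kernel, and constants all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
coverage = rng.random((64, 128))          # stand-in lat/lon cloud-coverage map

# "Pre-blur": a 5x5 box filter with wraparound, so a single fetch stands in
# for a neighborhood average.
blurred = sum(np.roll(np.roll(coverage, i, 0), j, 1)
              for i in range(-2, 3) for j in range(-2, 3)) / 25.0

def overcast_ambient(ambient_rgb, lat01, lon01):
    c = blurred[int(lat01 * 63), int(lon01 * 127)]   # one texture fetch
    grey = np.full(3, ambient_rgb.mean())            # desaturate toward grey
    return (1 - c) * ambient_rgb + c * 0.8 * grey    # and dim under cloud

print(overcast_ambient(np.array([0.3, 0.45, 0.9]), 0.5, 0.25))
```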

[1]
[image: KSP2 cloud layer]
The blocky shapes of the KSP2 clouds heavily imply texture sampling with bilinear blending. Here, upscaled noise as a comparison:
[image: upscaled noise]
Eve Redux also uses pre-made 2D textures. (Was asking about using SDFs for the clouds.)
[image]


1 minute ago, Xelo said:

I was talking about viewing a sunset on the ground from, like, 30 km up: part of the terrain is sunlit (1), part of the terrain is in orange sunset in a narrow band (2), part of the terrain is shrouded in darkness (3). The clouds' sunset is delayed because they're higher up in the atmosphere (4).

So long as we get to ignore the clouds, we can just precompute all of this, including the difference between (1, 2, 3) and (4).

4 minutes ago, Xelo said:

I don't think you can just dismiss it because of cloud cover, unless Kerbin is permanently overcast like Eve.

If your camera is on the ground, you absolutely can't. And that's the problem. If you're above the clouds, though, I would submit that nobody's going to notice the difference. So long as you still do cloud shadows for direct light, the lighting difference due to weather is going to be too subtle when viewed from above the cloud layers. There might be edge cases where you're not high above clouds and looking at the terrain below through a cloud break, but I don't think these are common enough and obvious enough to go for a far more expensive technique just to get these right.

7 minutes ago, Xelo said:

As per the previous comment, you aren't building a single low-res skybox once; you are building it every frame, and it's only valid for rendering the immediate area on the surface.
This is OK if it's literally just your vessel (see footnote [0] in the previous comment) and/or you are just computing blurry reflections. This 'skybox' isn't valid 10 km, 20 km, 50 km away, distances you can very much see, especially as you rise higher.

We only need the dynamic skybox for the first 10 km. From there up, we no longer need to recompute it. It's literally always the same and depends only on one angle, because there are no clouds anymore. We can bake all of the light info into a 2D texture.

10 minutes ago, Xelo said:

I also did not suggest raycasting for diffuse surfaces (which would require multiple rays), only reflective ones, which would just continue the existing ray in a different direction.
Thus assuming the sky is clear seems the most practical to me, and considering your case of clouds, just use a cheap sample of the cloud texture to check for overcast areas, which can be integrated into the cheap ambient light: if the terrain/ship underneath is under a brighter area of a pre-blurred cloud texture, the light renders this area greyer to compensate.

So how do you pick up the orange hue on one side of a mountain and gray on the other during sunset/sunrise? And how is it better than getting all of that and the correct cloud illumination when viewed from the ground? As far as I can tell, our approach above the clouds is identical, except I want to use look-up textures instead of doing ray casts. (Because I'm not sure why we're doing ray casts if we're ignoring clouds.)

 

I have a feeling we're about to go to the Shadertoy round.

