[WIP][1.9.x-1.12.x] Scatterer-atmospheric scattering (0.0838 - 14/08/2022) Scattering improvements, in-game atmo generation and multi-sun support


blackrack


[quote name='blackrack']Btw is your celestial object shadow shader functional in the newest EVE?[/QUOTE]

Nope. Some of the stuff is there, but not enabled. I have to go a completely new route, as many things wouldn't have worked properly had I pursued the multi-projector approach (e.g. transparent things like clouds wouldn't be handled properly, scaled-space shadows shake violently with camera movement, and multiple bodies obscuring the sun would result in multiple, very unrealistic darkening passes). So I'm pursuing the single-projector, multi-body shader for macro space, and an additional body material that handles multiple casts. I may turn this into a full-scale lighting-pass shader.

[quote name='blackrack']And another question about projectors: how do you generate them from the cloud texture for the shadows?[/QUOTE]

Oooh... a loaded question... basically, we take the vertex, the sun Direction, the body position and the cloud sphere radius, and use sphere collision math to determine at what distance a ray from that vertex would hit the cloud sphere. Then we take that, and get the cloud position with vertex+(sunDir*distance) and from there, we use the body position to get the point of the sphere that resolves to. From there, we just plug that into our existing formula to fetch the texture. You can take a look in [url]https://github.com/rbray89/EnvironmentalVisualEnhancements/blob/master/ShaderLoader/Shaders/CloudShadow.shader[/url] to get a better peek.
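
In shader terms, the core of it is roughly something like this (a simplified sketch with made-up parameter names, not the exact code from the repo):

[code]
// Sketch: where does a ray from the shadowed vertex towards the sun pierce the
// cloud sphere, and which direction from the body centre does that correspond to?
// worldVert   - world-space position of the vertex receiving the shadow
// sunDir      - normalised direction towards the sun
// bodyPos     - world-space centre of the body
// cloudRadius - radius of the cloud sphere (the vertex is assumed to be inside it)
float3 cloudSphereDir(float3 worldVert, float3 sunDir, float3 bodyPos, float cloudRadius)
{
    float3 L = bodyPos - worldVert;              // vertex -> body centre
    float tc = dot(L, sunDir);                   // distance along the ray to the closest approach
    float d2 = dot(L, L) - tc * tc;              // squared distance from the centre to the ray
    float td = sqrt(cloudRadius * cloudRadius - d2);
    float dist = tc + td;                        // distance at which the ray exits the cloud sphere
    float3 cloudPos = worldVert + sunDir * dist; // point hit on the cloud sphere
    return normalize(cloudPos - bodyPos);        // plug this into the usual cloud texture lookup
}
[/code]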

[quote name='rbray89']Oooh... a loaded question... basically, we take the vertex, the sun Direction, the body position and the cloud sphere radius, and use sphere collision math to determine at what distance a ray from that vertex would hit the cloud sphere. Then we take that, and get the cloud position with vertex+(sunDir*distance) and from there, we use the body position to get the point of the sphere that resolves to. From there, we just plug that into our existing formula to fetch the texture. You can take a look in [url]https://github.com/rbray89/EnvironmentalVisualEnhancements/blob/master/ShaderLoader/Shaders/CloudShadow.shader[/url] to get a better peek.[/QUOTE]

That seems nifty. In the end, do you just look at the alpha value of the texture? Do the shadows have variable alpha depending on the clouds? I probably shouldn't look at the code right now though, as I'm way too tired to understand anything.

Also, I wanted to ask your opinion about logarithmic depth buffers. I've been meaning to implement a custom logarithmic depth buffer for use with the "post-processing" shader, to solve what I think are depth-buffer inaccuracies that arise mostly in dx11 and OpenGL when looking at the water surface from the limit altitude where the PQS is disappearing; I think it also causes a flickering (z-fighting?) effect in dx9.

There are a few resources on this, notably

[URL="http://outerra.blogspot.fr/2009/08/logarithmic-z-buffer.html"]http://outerra.blogspot.fr/2009/08/logarithmic-z-buffer.html[/URL]

[URL="http://outerra.blogspot.fr/2013/07/logarithmic-depth-buffer-optimizations.html"]http://outerra.blogspot.fr/2013/07/logarithmic-depth-buffer-optimizations.html[/URL]

I haven't gone into too much detail yet, but it seems feasible, although I think reconstructing the world position from this new depth buffer might be a bit more of a pain (it was already a pain for me to get working).

I plan to just start from the provided Unity depth shader and make changes to it, then use it as a replacement shader and obtain a depth buffer in a rendertexture. Though it sounds straightforward, I know that if too much matrix math gets involved I will lose my sanity.
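
For reference, the core of the trick from those articles is overriding the projected z in the vertex shader; something along these lines, untested and with made-up names, adapted from the Outerra posts:

[code]
// Sketch of an Outerra-style logarithmic depth buffer in a Unity CG vertex
// shader. _LogZFarPlane and the struct name are placeholders for this example.
// Depth becomes log(C*w + 1) / log(C*far + 1), which spreads precision far more
// evenly over planetary distances than the usual 1/z distribution.
#include "UnityCG.cginc"

float _LogZFarPlane;        // far distance the buffer should cover
static const float C = 1.0; // trades near-camera precision against far-range precision

struct v2f_logz {
    float4 pos : SV_POSITION;
    float2 depth : TEXCOORD0;
};

v2f_logz vert(appdata_base v)
{
    v2f_logz o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    // after projection, pos.w is the eye-space depth
    float logz = log(C * o.pos.w + 1.0) / log(C * _LogZFarPlane + 1.0);
    // write the remapped value into clip z, pre-multiplied by w so the
    // hardware perspective divide (z/w) gives logz back
    o.pos.z = logz * o.pos.w;      // note: GL's -1..1 clip z wants (2*logz - 1) * w instead
    o.depth = float2(logz, 1.0);   // carry it along if the fragment shader outputs depth itself
    return o;
}
[/code]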

[quote name='Nhawks17']Fixed the black land but the sky is still black :([/QUOTE]

Oh well, no idea what to do now; as far as I know, it should be ignoring the projectors now.

Good news, hopefully.

Latest EVE, Scatterer 0.0191 plus fix 3. It appears all the visual glitches are gone... at least for me. The only thing that remains is that the atmo effect is lost when returning to the space center from map view, which is presumably something to do with refreshing on scene change... though you two (Blackrack and rBray89) are far more qualified to diagnose that.

[quote name='Manwith Noname']Good news, hopefully.

Latest EVE, Scatterer 0.0191 plus fix 3. It appears all the visual glitches are gone....at least for me. The only thing that remains is the atmo effect is lost when returning to space center from map view, which is presumably something to do with refreshing on scene change...though you two (Blackrack and rBray89) are far more qualified to be able to diagnose that.[/QUOTE]

That is just a bug with the latest version that happens sometimes. Not important, I'd say; it's just the KSC view, nothing game-breaking. However, I'm testing the fix myself right now, and it seems that, with scatterer only, the sky still goes black randomly.

[quote name='blackrack']That seems nifty, in the end do you just look at the alpha value of the texture? Do the shadows have variable alpha depending on the clouds? I probably shouldn't look at the code right now though as I'm way too tired to understand anything.[/QUOTE]

Close... we first subtract the alpha off the color, then use the alpha to lerp between 1 and the color. That is then multiplied with the DstColor. This way, if the cloud is colored, we get a multiplicative effect that reflects the color passing through. We subtract the alpha to ensure that a fully opaque cloud is always opaque, but if it isn't, some color should still show through (maybe we should use *(1-color.a) instead...), and then that value gets filtered once again by alpha to determine intensity.
[code]
Blend Zero SrcColor  // multiplicative blend: framebuffer = dst * shader output
...
fixed4 color = _Color * main.rgba * lerp(detail.rgba, 1, detailLevel); // combined cloud color from main + detail textures
color.rgb = saturate(color.rgb - color.a);             // darken by opacity so a fully opaque cloud blocks light completely
color.rgb = lerp(1, color.rgb, _ShadowFactor*color.a); // fade the darkening in by shadow strength and cloud alpha
return lerp(1, color, shadowCheck);                    // only apply where the sphere test says the shadow actually falls
[/code]
[quote name='blackrack']Also, I wanted to ask your opinion about logarithmic depth buffers. I've been meaning to implement a custom logarithmic depth buffer for use with the "post-processing" shader to solve what I think are depth buffer inaccuracies that arise mostly in dx11 and OpenGL when looking at the water surface from the limit altitude where the PQS is disappearing, but also I think it causes a flickering (z-fighting?) effect in dx9.

There are a few resources on this, notably

[URL="http://outerra.blogspot.fr/2009/08/logarithmic-z-buffer.html"]http://outerra.blogspot.fr/2009/08/logarithmic-z-buffer.html[/URL]

[URL="http://outerra.blogspot.fr/2013/07/logarithmic-depth-buffer-optimizations.html"]http://outerra.blogspot.fr/2013/07/logarithmic-depth-buffer-optimizations.html[/URL]

I haven't gone too much in detail yet but it seems feasible, although I think reconstructing world position from this new depth buffer might be a bit more of a pain (it already was a pain for me to get working).

I plan to just start from the provided unity depth shader and make changes to it, then use it as a replacement shader and obtain a depth buffer in a rendertexture. Thought it sounds straightforward, I know if too much matrix math gets involved I will lose my sanity.[/QUOTE]

If it is just the water surface, then do what I was doing; I didn't want to enable depth processing on the ocean:
[code]
_OceanRadius ("Ocean Radius", Float) = 63000
_PlanetOrigin ("Sphere Center", Vector) = (0,0,0,1)
...
float _OceanRadius;
float3 _PlanetOrigin;
...
[vertex]
o.L = _PlanetOrigin - _WorldSpaceCameraPos.xyz;   // camera -> planet centre
...
[surface]
half3 worldDir = normalize(IN.worldVert - _WorldSpaceCameraPos.xyz);   // view ray
float tc = dot(IN.L, worldDir);                  // distance along the ray to the closest approach to the centre
float d = sqrt(dot(IN.L,IN.L)-(tc*tc));          // distance from the centre to the ray
float3 norm = normalize(-IN.L);
float d2 = pow(d,2);
float td = sqrt(dot(IN.L,IN.L)-d2);

float oceanRadius = _Scale*_OceanRadius;
half sphereCheck = step(d, oceanRadius)*step(0.0, tc);   // 1 if the ray passes through the ocean sphere, in front of the camera

float tlc = sqrt((oceanRadius*oceanRadius)-d2);          // half-chord length through the sphere
float oceanSphereDist = lerp(depth, tc - tlc, sphereCheck); // near intersection distance on a hit, existing depth on a miss

depth = min(oceanSphereDist, depth);                     // ocean wins wherever it is closer than the terrain
[/code]

This will allow you to get the distance to the ocean if it is a "hit", or the existing depth in the "miss" case. From there, you just take the min of the depth and the ocean distance. It gets you a VERY accurate distance to the ocean (floating-point loss rather than depth-buffer compression).

I dreamed this up when thinking about how to go about doing stuff without using the depth buffer, and found that spherical collision is actually a very common math problem apparently. I've since used the same approach in many shaders. We are very lucky that things are perfect spheres in KSP.

[COLOR="silver"][SIZE=1]- - - Updated - - -[/SIZE][/COLOR]

Oh, and you shouldn't have to use shaderreplacer any more! The newest KSP added the shader tags for RenderType.
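
For context, shader replacement keys off exactly that tag: when a camera renders with "RenderType" as the replacement tag, each object is drawn with the subshader whose RenderType value matches the one on the object's own shader, and objects with no matching tag are skipped. A minimal illustration (not actual Scatterer/EVE code):

[code]
// Illustration only: a depth-only replacement shader keyed on RenderType.
// Opaque objects use the first SubShader, cutout objects the second; anything
// whose shader has no matching RenderType tag simply isn't rendered.
Shader "Custom/ReplacementTagExample" {
    SubShader {
        Tags { "RenderType" = "Opaque" }
        Pass { ColorMask 0 }   // writes depth only, no colour
    }
    SubShader {
        Tags { "RenderType" = "TransparentCutout" }
        Pass { ColorMask 0 }   // cutout geometry would ideally alpha-test here too
    }
}
[/code]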

[quote name='rbray89']Close... we first subtract the alpha off the color, then use the alpha to lerp between 1 and the color. It is then multiplied to the DstColor. This way, if the cloud was colored, we get a multiplicative effect that reflects the color passing through. We subtract the alpha to ensure that a fully opaque cloud is always opaque but if not some color should still show through (maybe we should use *(1-color.a) instead...) and then that value gets filtered once again by alpha to determine intensity.
[code]
Blend Zero SrcColor
...
fixed4 color = _Color * main.rgba * lerp(detail.rgba, 1, detailLevel);
color.rgb = saturate(color.rgb - color.a);
color.rgb = lerp(1, color.rgb, _ShadowFactor*color.a);
return lerp(1, color, shadowCheck);
[/code]


If it is just water surface, than do what I was doing. I didn't want to enable depth processing on the ocean.:
[code]
_OceanRadius ("Ocean Radius", Float) = 63000
_PlanetOrigin ("Sphere Center", Vector) = (0,0,0,1)
...
float _OceanRadius;
float3 _PlanetOrigin;
...
[vertex]
o.L = _PlanetOrigin - _WorldSpaceCameraPos.xyz;
...
[surface]
half3 worldDir = normalize(IN.worldVert - _WorldSpaceCameraPos.xyz);
float tc = dot(IN.L, worldDir);
float d = sqrt(dot(IN.L,IN.L)-(tc*tc));
float3 norm = normalize(-IN.L);
float d2 = pow(d,2);
float td = sqrt(dot(IN.L,IN.L)-d2);

float oceanRadius = _Scale*_OceanRadius;
half sphereCheck = step(d, oceanRadius)*step(0.0, tc);

float tlc = sqrt((oceanRadius*oceanRadius)-d2);
float oceanSphereDist = lerp(depth, tc - tlc, sphereCheck);

depth = min(oceanSphereDist, depth);
[/code]

This will allow you to get the distance to the ocean if it is a "hit" or depth in the "miss" case. From there, you just take the min of the depth and the ocean dist. It allows you to get the VERY accurate distance to the ocean (floating point loss rather than depth buffer compression)

I dreamed this up when thinking about how to go about doing stuff without using the depth buffer, and found that spherical collision is actually a very common math problem apparently. I've since used the same approach in many shaders. We are very lucky that things are perfect spheres in KSP.[/QUOTE]

Well, it's mostly the ocean; it starts at around 10 km up. But at higher altitudes other artifacts start appearing over the land, so this won't be enough. I noticed that forcing deferred rendering seems to get rid of all these artifacts, but it causes many more issues with the rendering that I don't feel like looking at. Do you know if deferred rendering uses some kind of higher-precision depth buffer? Maybe 32-bit for deferred vs 24-bit for forward?


[quote name='rbray89']
Oh, and you shouldn't have to use shaderreplacer any more! The newest KSP added the shader tags for RenderType.[/QUOTE]

So wait, how do I re-render everything so as to obtain a custom depth buffer without a replacement shader?

Edit: I might still use your snippet for the ocean, as I still get some kind of barrier around the horizon on oceans in OpenGL.

[quote name='blackrack']Well it's mostly the ocean, starts at 10 KMs up. But at higher altitudes other artifacts start appearing over the land so this won't be enough. I noticed that forcing deferred rendering seems to get rid of all these artifacts but it causes many more issues with the rendering that I don't feel like looking at. Do you know if deferred rendering uses some kind of higher precision depth buffer? Maybe 32bit for deferred vs 24bit for forward?[/QUOTE]

Not that I'm aware of.

In case you didn't see:

Oh, and you shouldn't have to use shaderreplacer any more! The newest KSP added the shader tags for RenderType.

[COLOR="silver"][SIZE=1]- - - Updated - - -[/SIZE][/COLOR]

[quote name='blackrack']Well it's mostly the ocean, starts at 10 KMs up. But at higher altitudes other artifacts start appearing over the land so this won't be enough. I noticed that forcing deferred rendering seems to get rid of all these artifacts but it causes many more issues with the rendering that I don't feel like looking at. Do you know if deferred rendering uses some kind of higher precision depth buffer? Maybe 32bit for deferred vs 24bit for forward?




So wait, how do I render this as to obtain a custom depth buffer without a replacement shader?[/QUOTE]

ShaderReplacer being the entity that substitutes KSP's shader code with modified code that adds the shader tags they omitted :)
EDIT: This guy: [url]https://github.com/LGhassen/Scatterer/blob/master/scatterer/ShaderReplacer.cs[/url]

[quote name='rbray89']Not that I'm aware of.

In case you didn't see:

Oh, and you shouldn't have to use shaderreplacer any more! The newest KSP added the shader tags for RenderType.

[COLOR="silver"][SIZE=1]- - - Updated - - -[/SIZE][/COLOR]

ShaderReplacer being the entity that substitutes KSPs shader code with the modified code that adds the Shader Tags that they ommitted :)[/QUOTE]

But deferred rendering definitely fixes all the artifacts, which is weird.


Ah okay, I'm not talking about ShaderReplacer, but about using replacement shaders ([URL="http://docs.unity3d.com/Manual/SL-ShaderReplacement.html"]http://docs.unity3d.com/Manual/SL-ShaderReplacement.html[/URL]) to re-render a scene and obtain a depth buffer, for example.

[quote name='blackrack']Ah okay, I'm not talking about ShaderReplacer, but about using replacement shaders [URL="http://docs.unity3d.com/Manual/SL-ShaderReplacement.html"]http://docs.unity3d.com/Manual/SL-ShaderReplacement.html[/URL] to re-render a scene and obtain a depth buffer for example.[/QUOTE]

Right. I just wanted to mention that you can remove this entity and no longer have to worry about updating it going forward.

As a cool by-product, if you do make your own depth buffer, you could use a camera with a huge clipping area; then you could use the near camera instead of the far camera for rendering and get much denser atmo rendering.

[quote name='rbray89']Right. I just wanted to mention that you can remove this entity and no longer have to worry about updating it going forward.

As a cool by-product, if you do make your own depth buffer you could use a camera with a huge clipping area and then you could use the near camera instead of the far camera for rendering and make much more dense atmo-rendering.[/QUOTE]

This is a cool idea, and I was thinking about doing this to simulate a really thick atmo on Eve, but for scatterer it wouldn't work that well because the Proland atmo is really rigid in many ways. Also, won't this make it render on top of the EVE clouds again?

However, I thought about making a custom shader based on the Unity global fog shader to simulate dust or really dense fog on Eve.

[quote name='blackrack']This is a cool idea, and I was thinking about doing this to simulate a really thick atmo on Eve, but for scatterer it wouldn't work that well because the proland atmo is really rigid in many ways. ALso, won't this make it render on top of the EVE clouds again?

However I thought about making a custom shader based on the unity global fog shader to simulate dust or really dense fog on Eve.[/QUOTE]

Yeah, but that would be where the idea of writing alpha could save us.

Me too...

[COLOR="silver"][SIZE=1]- - - Updated - - -[/SIZE][/COLOR]

[quote name='blackrack']But deferred rendering definitely fixes all the artifacts which is weird.[/QUOTE]

Oh, and I think that might be the case. I think Unity might actually render to two separate buffers for depth and normals to achieve quality results, while you are reading from a depth+normal map, right?

[quote name='rbray89']Yeah, but that would be where the idea of writing alpha could save us.

Me too...

[COLOR="silver"][SIZE=1]- - - Updated - - -[/SIZE][/COLOR]



Oh, and I think that might be the case. I think that unity might actually render to two seperate buffers for depth and normal to achieve quality results, while you are reading from a depth+normal map right?[/QUOTE]

Actually no, I'm using the depth only map that forward gives and it has artifacts compared to deferred.

The depthNormals map was a lot worse and I couldn't even achieve the effects I wanted with the normals so I just got rid of it.

[quote name='blackrack']Actually no, I'm using the depth only map that forward gives and it has artifacts compared to deferred.

The depthNormals map was a lot worse and I couldn't even achieve the effects I wanted with the normals so I just got rid of it.[/QUOTE]

Hmmm... Then I'm not sure. It is still probably producing a better depth buffer though.

[quote name='rbray89']Hmmm... Then I'm not sure. It is still probably producing a better depth buffer though.[/QUOTE]

I've looked around a bit; it appears that when using forward, Unity uses whichever depth buffer is the default for the platform and hardware (32-bit for dx9, 24-bit for OpenGL), and when using deferred it uses replacement shaders to render a 32-bit depth buffer for both. So this should work :cool:

And although dx9 seems to have slight flickering at some altitudes, it's much better than the combined patterns and flickering OpenGL has. This will be a first step before I try a logarithmic depth buffer.

[quote name='blackrack']I've looked around a bit, it appears that when using forward, unity uses whichever depth buffer is default for the platform and hardware (32bit for dx9, 24bit for OpenGL) and when using deferred it uses replacement shaders to render a 32bit depth buffer for both. So this should work :cool:

And although dx9 seems to have slight flickering at some altitudes it's much better than the combined patterns and flickering OpenGL has. This will be a first step before I try a logarithmic depth buffer.[/QUOTE]

Ah that makes sense. I'd probably like a better depth buffer anyways and being able to send that off to all cameras rather than relying on them to have a unified buffer.

The custom depth buffer is working; all high-altitude artifacts and moiré patterns are gone. The barrier effect at the edge of the ocean remains, but I already have an idea for that.

[IMG]http://i.imgur.com/HjQX71H.jpg[/IMG]

The maximum depth allowed is actually 24-bit. It turns out OpenGL was using 16-bit by default, which explains why it was so bad.

Now to fix the ground to orbit transition.

Edited: for those who want to try it: [URL="https://mega.nz/#!6JYDSKQJ!JoSWw_pr7wH1wBbknI3k7Nfl5xhRJ7tUcYCCcx3xHyA"]ArtifactsFix.zip[/URL]

I will probably add a few more features before making a complete new release/reupload to kerbalstuff.

[quote name='blackrack']Custom depth buffer is working, all high altitude artifacts and moire patterns are gone. The barrier effect at the edge of the ocean remains but I already have an idea for that.

[url]http://i.imgur.com/HjQX71H.jpg[/url]

The maximum depth allowed is actually 24bit. Turns out OpenGL was using 16bit by default which explains why it was so bad.

Now to fix the ground to orbit transition.

Edited: for those who want to try it: [URL="https://mega.nz/#!6JYDSKQJ!JoSWw_pr7wH1wBbknI3k7Nfl5xhRJ7tUcYCCcx3xHyA"]ArtifactsFix.zip[/URL]

I will probably add a few more features before making a complete new release/reupload to kerbalstuff.[/QUOTE]

Sweet! What is involved with generating the new depth texture?

[quote name='rbray89']Sweet! What is involved with generating the new depth texture?[/QUOTE]

It's actually pretty simple, using this as a replacement shader

[CODE]
Shader "Custom/DepthTexture" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            Fog { Mode Off }
            CGPROGRAM

            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f {
                float4 pos : SV_POSITION;
                float2 depth : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                UNITY_TRANSFER_DEPTH(o.depth);
                return o;
            }

            half4 frag(v2f i) : COLOR {
                UNITY_OUTPUT_DEPTH(i.depth);
            }
            ENDCG
        }
    }
}
[/CODE]

And then it's just a matter of creating a camera that renders the scene with this replacement shader to a rendertexture. This produces a regular (linear?) depth buffer, but the rendertexture depth can be set manually when creating it. This seems to fix my OpenGL issues, but dx9 still seems to have some flickering, so I'll probably experiment with the logarithmic depth buffer later on.
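
For completeness, reading it back in the post shader is then just a texture fetch plus the usual linearization helpers; roughly like this (the _CustomDepthTexture / viewRay names are just what I'd bind them as, nothing official):

[CODE]
// Sketch: sampling the custom depth rendertexture in a post-processing shader
// and reconstructing a linear distance / world position from it.
#include "UnityCG.cginc"

sampler2D _CustomDepthTexture;   // the rendertexture the replacement camera renders into

struct v2f_post {
    float4 pos     : SV_POSITION;
    float2 uv      : TEXCOORD0;
    float3 viewRay : TEXCOORD1;  // camera-to-far-plane-corner ray set up in the vertex shader
};

float4 frag(v2f_post IN) : COLOR
{
    // the replacement shader writes the same non-linear z/w encoding as Unity's
    // built-in depth texture, so the stock helpers should apply
    float zdepth   = tex2D(_CustomDepthTexture, IN.uv).r;
    float linear01 = Linear01Depth(zdepth);   // 0 at the camera, 1 at the far plane
    float eyeDepth = LinearEyeDepth(zdepth);  // view-axis distance in world units

    // frustum-corner trick: scaling the far-plane ray by the 0..1 depth
    // gives the world position without any matrix inversions
    float3 worldPos = _WorldSpaceCameraPos + IN.viewRay * linear01;

    // from here, eyeDepth / worldPos feed the scattering math as before
    return float4(worldPos, eyeDepth);
}
[/CODE]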

[quote name='blackrack']It's actually pretty simple, using this as a replacement shader

[CODE]
Shader "Custom/DepthTexture" {
SubShader {
Tags { "RenderType"="Opaque" }
Pass {
Fog { Mode Off }
CGPROGRAM

#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"

struct v2f {
float4 pos : SV_POSITION;
float2 depth : TEXCOORD0;
};

v2f vert (appdata_base v) {
v2f o;
o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
UNITY_TRANSFER_DEPTH(o.depth);
return o;
}

half4 frag(v2f i) : COLOR {
UNITY_OUTPUT_DEPTH(i.depth);
}
ENDCG
}
}
}

[/CODE]

And then just creating a camera that renders the scene with this replacement shader to a rendertexture. This produces a regular (linear?) depth buffer but the rendertexture depth can be set manually when creating it. This seems to fix my OpenGL issues but dx9 still seems to have some flickering so I'll probably experiment with the logarithmic depth buffer later on.[/QUOTE]

Very cool!

[IMG]http://i.imgur.com/vsRPpXB.png[/IMG]
I am seeing a noticeable lag now on scene changes before scatterer kicks in; it was always behind the scene load, but now it's more pronounced. No other effect of note. Linux 64-bit, Intel with nvidia graphics and overridden hardware antialiasing. Happens for a second before physics kicks in. Glorious.

[quote name='blackrack']
Edited: for those who want to try it: [URL="https://mega.nz/#!6JYDSKQJ!JoSWw_pr7wH1wBbknI3k7Nfl5xhRJ7tUcYCCcx3xHyA"]ArtifactsFix.zip[/URL]
[/QUOTE]

I think this just killed my framerate. I need to sleep, so I haven't had time to check everything. I had also just installed SCANsat, so that could be adding to it, though no satellites were out mapping.
