
Normal map / Lighting direction problems


Recommended Posts

Flipping the mesh normals will cause the mesh to turn inside-out in Unity. How does it look in Unity? It may simply be a display quirk in Blender. Some 3D apps don't display mirrored normal maps correctly, requiring you to give unique UV coordinates to the entire mesh, but Unity doesn't mind flipped normals and overlapping UVs. I wouldn't worry about it as long as the model looks fine in Unity.

Edited by nli2work

Not sure I get the question, but I'll take a stab anyway. Are you sure all your normals are pointing in the direction they're supposed to?

I made a normal map to test this. Look at the dimples near the red arrows. Look where the shading and highlights are. It's not consistent.

Anyway, the face normals and vertex normals seem to be correct.

Flipping the mesh normals will cause the mesh to turn inside-out in Unity. How does it look in Unity? It may simply be a display quirk in Blender. Some 3D apps don't display mirrored normal maps correctly, requiring you to give unique UV coordinates to the entire mesh, but Unity doesn't mind flipped normals and overlapping UVs. I wouldn't worry about it as long as the model looks fine in Unity.

Unity shows the same problem. It must be something to do with the model.

I have other mirrored meshes in there. They don't reverse the normal map. I noticed that joining one of the problematic meshes to a new untextured mesh changes the direction of the shading and highlights. I'm just not sure how to fix it so that it's all consistent.

Edited by Cpt. Kipard

Hard to say without knowing how you got to this step. I haven't run into this issue for a while now, and the last time I encountered it, it was Max not displaying mirrored normals correctly.

Is there a backface-cull display so you can see whether all the faces are really facing outward? Is the shader set to tangent space? Is there some operation in Blender to unify or reset the normals for the entire model? You could try flipping the UV patch, but that probably won't fix it. I don't see any tools that might help in the Blender manual. :/

Edited by nli2work

The texture space is set to "tangent" by default. I don't know what I did to get it like that. The only things that come to mind are mirroring, and scaling to negative 1 along an axis and then recalculating the normals to point outside. I'm not really sure that's what caused it, because I do it all the time, and when I pay attention this problem doesn't occur.

I was able to fix it by joining objects to another object that behaved correctly.

I'd still like to know what happened there, though. It's weird, and frankly joining meshes like that is more work, because now I have to separate them again and move the origins around. There has to be a simpler way.

edit.

It's as if the normal map coordinates are flipped along either X or Y, or both, but the texture itself is fine.

Edited by Cpt. Kipard

It's probably mirroring. I get similar issues in Max if I mirror and forget to reset the object's scale: it exports into Unity with the normals flipped, and the mesh looks inside out.

I do the attach/detach thing often. Generally I just make a primitive, align it to the original object's pivot, attach the original object to the primitive, then delete the primitive sub-object. That fixes the issues without having to deal with shifted pivots.


  • 1 month later...

I'm still having this problem and the workaround isn't working anymore. This is getting really annoying.

In Blender, under "Mapping" in the texture tab, setting the Y size to negative 1 fixes the shading across the board. I still don't understand why. In the past, exporting a normal map from GIMP without flipping any of the axes would make it appear correct in Blender, but Y would have to be flipped for it to appear right in Unity. Now I have to flip Y in Blender as well.


No, I spoke too soon. It doesn't work. What works is inverting Y when generating the normal map in the first place. AFAIK this is not normal behaviour for Blender; up until now I only had to do that for the final Unity work. Seriously, I'm getting pretty sick of this, so if no one comes up with something I'm going to abandon this mod.

Edited by Cpt. Kipard

No, I haven't started working with Unity yet, except to check how the normal map behaves there. I generate the normal map with the NormalMap plugin in GIMP. They are compressed TGAs. I have to check "Invert Y" for the shading and highlights to appear in the correct direction; AFAIK that's expected for Unity. The only good thing about this is that now I don't have to re-generate the map after previewing the models in Blender, but I don't like the fact that Blender isn't behaving consistently any more. In the past, Blender displayed normal maps correctly without inverting Y. Now it doesn't.

I've tried every combination of mirroring, scaling the meshes to negative 1 and recalculating the normals to point outside, scaling the objects to negative 1 and recalculating the normals, and applying object rotations and scale, but whatever I do it always ends up looking the same now... unless I invert Y in the map.

After that last post I checked a few of the objects with the normal map in Unity, and they reflect light correctly, so that's something at least. I just wish I knew what I did to break things like this. Google isn't very helpful, but this is such a specific problem that maybe my search-fu isn't good enough, and I've already clicked and slid a bunch of settings in Blender to no avail. This WILL happen again if I continue to mod complex things like the Skylon.
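As an aside, the "Invert Y" checkbox boils down to a single channel operation: subtracting the green value from 255. A minimal, dependency-free sketch (operating on a plain list of RGB tuples rather than an actual image file, purely to illustrate the math):

```python
def invert_y(pixels):
    """Flip the Y (green) channel of tangent-space normal map pixels.

    `pixels` is a list of (r, g, b) tuples with 8-bit values; this
    mirrors what GIMP's "Invert Y" option does when generating the map.
    """
    return [(r, 255 - g, b) for (r, g, b) in pixels]
```

A real workflow would read and write the TGA with an image library, but the channel math is the same, and applying it twice gets you back where you started.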


Well, bingo. That's the issue right there.

Unity's native normal map format is a weird one: 4 channels, and inverted.

All of R, G, and B should be the old G channel (inverted), and Alpha the old R channel.

If you export a texture as a normal map from the Unity editor, the editor will take care of all that. If you do it yourself, you...have to do it yourself.

Open some Squad normal maps (i.e. convert mbm to png) to see what I mean.
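To make that shuffle concrete, here is a rough sketch in plain Python of the conversion described above, acting on a list of (R, G, B) pixels from a standard normal map (X in red, Y in green). It is only an illustration of the channel mapping, not Unity's actual importer:

```python
def to_unity_normal(pixels):
    # Standard tangent-space map: X in R, Y in G, Z in B.
    # Unity-style: R, G and B all become the inverted old G (Y),
    # and alpha becomes the old R (X); Z is reconstructed in the shader.
    out = []
    for r, g, b in pixels:
        inv_g = 255 - g
        out.append((inv_g, inv_g, inv_g, r))
    return out
```

That is also why the converted maps look uniformly gray: all three color channels carry the same value.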


Well, bingo. That's the issue right there.

Unity's native normal map format is a weird one: 4 channels, and inverted.

All of R, G, and B should be the old G channel (inverted), and Alpha the old R channel.

If you export a texture as a normal map from the Unity editor, the editor will take care of all that. If you do it yourself, you...have to do it yourself.

Open some Squad normal maps (i.e. convert mbm to png) to see what I mean.

Could you go into more detail into the make-up of normal maps?

I took a look at the Squad ones and...

Mine

http://puu.sh/bl90V/b310719804.jpg

Squad's

http://puu.sh/bl93B/4c316f7549.jpg

I generate my normal maps using a height-map like this, maybe I'm doing it wrong.

http://puu.sh/bl9dl/46060daa33.jpg

Thanks! :)

Edited by Beale

Could you go into more detail into the make-up of normal maps?

I took a look at the Squad ones and...

Mine

http://puu.sh/bl90V/b310719804.jpg

Squad's

http://puu.sh/bl93B/4c316f7549.jpg

I generate my normal maps using a height-map like this, maybe I'm doing it wrong.

http://puu.sh/bl9dl/46060daa33.jpg

Thanks! :)

What do you use to generate the normal map with? Unity's normal map generated from your third bump map is identical to the nDo PS plugin or Max: green is up. But in your first normal map, the green channel is inverted: green is down.

Edited by nli2work

What do you use to generate the normal map with? Unity's normal map generated from your third bump map is identical to the nDo PS plugin or Max: green is up. But in your first normal map, the green channel is inverted: green is down.

Hi!

I generate the normal map outside of Unity, using software called SSBump_Generator. The generated map is then put straight into Unity for export.


That's a bingo?

I don't get it. I've never exported a texture from Unity. This doesn't seem relevant to what I'm talking about at all.

It just means that whatever you are using to create the normal map uses the same reversed Y as the Unity engine. Everything is working fine. If you want to keep using what you've been using to generate normal maps, invert the Y before feeding it to KSP.

Edited by nli2work

Cpt. Kipard:

Let's try this again:

Unity expects normal maps in a certain format. If you do not give Unity a normal map in that format, and you do not tell it to convert to that format, it will not appear correctly ingame.

The format Unity expects normal maps in is:

The Green channel has the Y value, and needs to be inverted. (Note that this is usually also copied to the R and B channels.)

The Alpha channel has the X value.

You can see the result in the pic Beale posted of a SQUAD normal map; it's gray because the R, G, and B channels are identical, since they all express inverted Y, and X is in alpha.

The format you are passing has X in the Red channel, Y in the Green channel, and Z in the Blue channel, with no Alpha channel.

If you export your textures from Unity (i.e. to MBM), this conversion is handled automatically. If you don't, and you are not using ATM, it is usually handled automatically when the texture is loaded. If you don't, and you are using Active Texture Management but do not have the texture specified as a normal map (i.e. in the NORMAL_LIST node) in an ATM config, then it will not be converted automatically on load by KSP, and it will fail to work.
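For reference, the ATM side of that looks roughly like the fragment below. This is a sketch from memory of ATM's cfg conventions; the top-level node name, the folder value, and the texture path are placeholders, so check ATM's own documentation for the exact syntax:

```
ACTIVE_TEXTURE_MANAGER_CONFIG
{
	folder = MyMod
	NORMAL_LIST
	{
		texture = MyMod/Parts/SomePart/model000_NRM
	}
}
```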

Since Unity has this weird format, you should only do this conversion *after* you have finished getting everything looking right in Blender, since Blender is not going to know what to do with weird Unity-style normal maps.


SQUAD normal map; it's gray because the R, G, and B channels are identical, since they all express inverted Y, and X is in alpha.

I've been doing pretty much the same as Cpt. Kipard, the only exception being that +Y is up. They work fine in Unity and KSP, whether directly from Photoshop, Max, or through PartTools, with or without ATM. This RGB = Y; Alpha = X setup is a funky one I have never heard of before, and seems rather wasteful. Is it MBM only?


No, it's how Unity stores normal maps internally. All textures (that Unity is told will be used for a normal map) are converted to that format on load.

However, if Unity doesn't know it's supposed to be used as a normal map (i.e. if it is not a normal-map-flagged MBM, you're using ATM, and ATM isn't told to treat it as a normal map), it will not be converted, and there will be a line in the log saying that it failed to load.


Since Unity has this weird format, you should only do this conversion *after* you have finished getting everything looking right in Blender, since Blender is not going to know what to do with weird Unity-style normal maps.

Yes, I know this. The problem I've been trying to describe in this thread is that NOW, for some reason, Blender behaves like Unity in the way it handles normal maps. Unity is completely irrelevant here. I don't understand how I can make this any clearer. It's most likely a problem with the model itself, because that's all I've been working on.

