Everything posted by AdmiralStewart

  1. One more thing about the F-Curves: you can use the editor to change the interpolation type from smooth to linear. Might help. If the part is "floating off the axis," there are three things to look at.
     #1 - The object origin is what the object should rotate around. Also check that you are actually rotating around the object origin and not around the 3D cursor.
     #2 - If it floats off only after being animated, your first keyframe is not set correctly. You can use Ctrl-A to apply rotation, location, and scale before you start on frame 1. I myself prefer to move to frame one, click record, select all, then move by 0, rotate by 0, and scale by 1. I find that gives me a better starting point for setting up my actions (it puts all nine tracks on my keyframe/F-Curve/NLA/Dope Sheet views).
     #3 - This is probably where you are if you're reasonably decent at Blender but stuck on getting the part into KSP: make sure the origin is aligned with your axis of rotation, or Unity will not accept the animation correctly. Blender may animate it the way you want to see it, but if your object origin is not aligned to your axis of rotation, Unity will animate it way out of whack. Blender's object data includes an entire stack of transforms (movements and rotations) on objects and can interpolate between them when your transforms bounce from local space to world space. So far I've only found Unity reading the object data as origin-based: location on three axes, rotation on three axes (and maybe scaling on three axes, not sure). There's probably a way to change the tracks in Unity to follow world space instead of local space, but you'd have to ask somebody who knows Unity.
  2. A new skysphere will be awesome... I hope we keep the handy-dandy Minmus guide star.
  3. I'd have to reinstall '05 to see if I can figure it out... I'm leaning towards "probably not." IIRC just about every shape in SolidWorks is procedural. According to "Inside Solidworks 2003," the closest export format might be STL, which would require another package to re-export for Unity import. Texture mapping is also going to be insanely difficult, because UV mapping is not really a feature (at least in 2k5, idk about 2k11). I can't imagine textures are a priority, since SolidWorks is a CADD/CAM product and not at all a graphics program. I think CATIA may have some of those capabilities for presentation graphics. SolidWorks is kind of like "CATIA Lite," if you can call any $8,000-$20,000 software package a "lite" version of a $25K-$50K one. I (personally, and based on my own experience) would not spend any time attempting to create parts in SolidWorks. I also have Max 2.5R2, and I didn't much care to use that either (mostly because I cannot find my parallel-port dongle o_O). I sat down and learned Blender to make parts. Don't be embarrassed not to know. You're one of very few people in the world who knows how to use SolidWorks or even has any reason to have heard of it. That's definitely something. Where it will benefit you is by giving you a leg up when following tutorials and learning an actual graphics program designed for "digital art," because that is what games ultimately are.
  4. It's obviously a Romulan warbird with a skull head and horns like an antelope.
  5. Did you use the record feature with auto keyframes? Also, you can switch to the F-Curve view - anything wobbly should show up as a very "noisy" line. You can PM me the blend file if you want.
  6. Well, that's too bad. One can add a lot of detail with a little alpha. No biggie though; truthfully I'd rather they stay on top of core features than spend weeks worrying about my expanded-steel walkway. I'm left wondering whether it's KSP that doesn't handle transparency well, or Unity itself.
  7. Note the difference between the screenshots http://imgur.com/a/OJ6Ph particularly the Crew Tank below my base piece. Is it supposed to be visible through the part? Note that the second screenshot only displays the command module above because I deleted the polys beneath. I had set them to black alpha, and they were causing a weird streak effect, kind of like the Predator when he was invisible. I do notice that the little dude's helmet has some transparency to it, but the crew module shows up fine through it, so maybe I'm doing transparency wrong. Also, the Kerbal shows up fine through the expanded-steel grate when he is standing below it; it's just the rest of the ship that doesn't. EDIT: Apparently his faceplate does not show up through the grate. The normals are showing up fine, if a little misdirected, because this is a 6x rotationally arrayed model; I'll eventually have to rebake the normals for the whole thing at once. Is this maybe a limitation of my video card (8800 GTX)? That would suck. Additionally, are two-sided polygons an option in Unity?
  8. This is how I do it - maybe not the correct way, but pretty good nonetheless: do your AO bake on a white, matte, clay-like material (default white minus specularity, low intensity, I guess). Open your texture in GIMP. Open your AO map in GIMP and paste it onto a layer above your texture. Set the AO layer to Multiply. It does an amazing job adding that finishing touch! Like I said, maybe not the right way, but it certainly looks good (imo). (The multiply math is sketched in the first code example after this list.)
  9. Well, as far as modeling goes, the only important 'tech speak' for you is the bit about low and high poly models. Low poly means few faces (those triangles and squares in your model); high poly means many. You can make a really awesome high poly model, but it really wouldn't be appropriate for a game. For this game, 'they' say to shoot for 1000 faces. You can tell how many faces you have by looking at the top bar while in Edit Mode (it will say something like 250-1500 Fa, for a total of 1500). Keep in mind that if most of your faces are quads (four-sided), the game actually sees twice as many, because a square is two triangles put together - so a 1000-quad model is really 2000 triangles to the engine. In any case, you want to get kind of clever when emulating a lot of detail on a model that doesn't have much. Texture maps help a lot. A key texture map now available to us is the 'normal map,' which can be created from the high polygon model and 'baked' onto the low poly model. That's why you want to create two copies of the same model. A short explanation of the normal map is that it changes the way light bounces off the object. As for baking, it has a lot of different uses not limited to textures. You'll learn more about baking as you go through more tutorials and learn more about the capabilities of modeling software.
  10. Unfortunately, Blender is pretty out there compared to most applications people are used to - not something one can really pick up intuitively. Shift-D to copy and paste, right-click where you want to paste? What? There are approximately 600 bajillion keyboard shortcuts, and little explanation of where the same tools can be reached by mouse clicking instead. I highly recommend this dude's first dozen or so tutorials; spend the few hours following along and it will get you up and running:
  11. What are you modeling in? If it's Blender, I could take a look and see about whipping up some textures. Cycles texturing-cum-baking has become a great interest of mine over the past few weeks. Also, make sure you're creating low and high poly models, too. Normal bakes, Cycles bakes, and AO bakes make something dull really pop. I'm still a little unsure about making good spec maps, though; I'm not really sure how to proceed there. YouTube tutorials, I guess.
  12. Another thing that can have a significant impact on unwraps is the view you're in. Press 1, 3, or 7 to give the other views a try.
  13. Make sure snap-to-pixels is on in the UV editor window. In GIMP (or whatever), do a Select by Color with threshold 0 and select the transparent part; this will select exactly the polygons in your unwrap. Use a combination of add/subtract with box, free, or shape select to cut out the portions you don't intend to paint-bucket. Create a new layer and paint there. Re-export as something other than your UV-export .png file. And to save yourself heartache, use GIMP 2.8 -> Windows -> Single-Window Mode.
  14. As far as I can tell, the Unity part tools' built-in colorings are troublesome to say the least. A lot of the time the part will simply come out white (though occasionally they work). You might work on texturing the parts in an image application, as has been suggested. Blender 2.63 + GIMP 2.8 work together AMAZINGLY if you follow this dude's tutorials (Round One and Round Two). Use a combination of Blender's built-in paint plus Quick Edit import/export with GIMP.
  15. After piddling with the loft path shape, fiddling with the bevel object scale, and dinking with the taper axes (tapers are killer), select the final product and hit Alt-C, then 'Mesh from Curve' (or 'Curve to Mesh,' I forget). Remember to turn your path, bevel, and taper subdivisions down somewhat, because the conversion gives you a mesh with exactly the faces you see. A tapered engine cryo-recirculation tube is exactly what you'd see on a real engine.
  16. A few questions: now that we can easily make multi-bodied parts for animation purposes, and it appears we can apply multiple textures, is there a minimum texture size? I'd like my 'main' model to be duller but well textured, while for the shiny metal components I'm thinking something like 64x64 would be great to reduce the texture load of a 'showpiece' part. Does the converted 'mbm' format require that all texture sizes match? Does this mean I can create an emissive-specular texture as well as an alpha-diffuse texture? My goal is to make a simple plane object that appears to be an array of faux shiny lights - multiple pretty lights out of two triangles, thus reducing the face load. It's possible that I'm thinking about this one the wrong way... [Edit] I think I was going about it the wrong way: the emissive effect only appears in the lighter-than-black areas of the grayscale emissive map, so there's no need for a second object.
  17. Oh man, was I ever frustrated by flipped/rotated colliders. Here's what I find works every time: build your model and reset all your origins to a common central point. Drag/drop your .blend file into your Unity project, then drag/drop that underneath the PartObject you created. There should be two models to expand underneath it. Here's where I kept getting messed up: don't delete your collider mesh. Simply select it and, in the Inspector window, uncheck the renderer along with Cast Shadows and Receive Shadows. Select the actual model (third level down in the game object hierarchy: PartObject --> BlenderImport --> TheActualModel), then add Component -> Physics -> Mesh Collider. Then drag the wireframe body of the collider (in the Project window, below the BlenderImport - the icon with wires, not the blue one) into the Mesh line of the Mesh Collider that is applied to your model. (A script equivalent of these steps is sketched after this list.) If needed, I'll post a screenshot when I get back on my computer.
  18. One more question: my animation is fine when 'played automatically' - it plays until it stops. In fact, it plays in the part selection window the moment the part is selected. It even animates the second collider as expected (correctly allowing for surface attachments). Animate Physics is currently checked. Having unchecked automatic playback, how can I play the animation manually from my module? Currently

         protected override bool onPartActivate()
         {
             //this.animation.enabled = true;            //commented because this is now deprecated
             this.animation.Play();                      //does nothing
             //this.animation.playAutomatically = true;  //also does nothing
             return base.onPartActivate();
         }

     doesn't do anything when the part is activated (except play the activation sound effect correctly). The only thing I can think of is that the second collider shares space with the main body collider; per C7's earlier statement that colliders on a single game object will not collide with each other, I assume this is fine. Additionally, as was asked earlier, how can I run it in reverse if I want to? (See the reverse-playback sketch after this list.)

     [Edit]: So close!

         Animation[] allAnimations = this.FindModelAnimators( "Default Take" );
         if( allAnimations != null && allAnimations.Length > 0 )
         {
             print( allAnimations.Length.ToString() );
             allAnimations[0].animatePhysics = true;
             allAnimations[0].Play();
         }
         else
             print( "\nNone Found\n" );

     My bomb didn't move with the arm, but it's only a matter of time and debug prints (hopefully). I'm pretty sure that to run it in reverse, I may have to set up a second 'take,' although I'm not sure how to set up a second animation in Blender that will be properly imported (probably as 'Default Take.001' or something similar).

     [Edit]: Well, here's how I got it to play the second part of the animation on command. I set up the entire animation in Blender. Note that I went through the whole Dope Sheet --> Action Editor --> NLA Editor route since I never had before, but I'm reasonably certain you don't have to. Just animate the full range of motion with a beginning, middle, and end, where the middle is the deployment and the end is the return to the initial state. Split the animations in Unity by selecting the main part import in Projects, then inspecting it: choose Split, choose the deployment animation, and add a second one as the retraction animation. Keep in mind that it doesn't seem to matter what you call them. Finally, in whatever code block you are using to run the animation clips, you can put something like this (I don't doubt there's a more correct way to do it, but if there is, I haven't found it yet):

         Animation[] allAnimations;
         string[] clipNames;

         protected override void onPartStart()
         {
             this.stackIcon.SetIcon( DefaultIcons.STRUT );

             //Find clips on part creation
             allAnimations = this.FindModelAnimators();
             clipNames = null;
             int counter = 0;
             //Kludge: a counter is used because I cannot see the
             //AnimationStates in allAnimations for simple iteration!
             //Also there is no string[] GetClipNames(), which there
             //probably should be o_O
             if( allAnimations != null )
             {
                 clipNames = new string[allAnimations[0].GetClipCount()];
                 foreach( AnimationState animState in allAnimations[0] )
                 {
                     clipNames[counter] = animState.name;
                     counter += 1;
                 }
             }
             else
                 print( "\nNone Found\n" );
             base.onPartStart();
         }

     then select and play...
         protected override bool onPartActivate()
         {
             if( allAnimations != null && allAnimations.Length > 0 )
             {
                 allAnimations[0].Play( clipNames[0] );    //<-- "Deploy"
                 //or allAnimations[0].Play( clipNames[1] ); //<-- "Retract"
                 //you probably want some error checking here, but I don't
             }
             return base.onPartActivate();
         }

     I still can't get my bomb to move with it, so if anyone has any tips, that'd be awesome.
  19. The Blender 2.6 Tutorial: 01 - ??? may be the easiest series to follow:
  20. I have a question which is probably more appropriately asked elsewhere, but the part creation video raises it for me, and you may know the answer. You mention in your video simply generating the collider mesh in Unity and selecting Convex (which builds more tris than I would expect). I've noticed the PhysX simulation is basically the performance sink in this game (and, I imagine, in any simulation-based game built on Unity). I don't have enough experience to know: is the number of polys in the collider going to add 'significantly' to that performance hit? If I were to create, say, a 12-tri cube to surround the gear bay, would it boost performance (measurably)? I'm mostly referring to assemblies with many parts. Is the number of colliders more important than the number of triangles building up those colliders? For instance, considering six 72-triangle colliders versus thirty-six 12-triangle colliders, would there be a noticeable difference? Should I take considerable time to optimize the collider?
  21. This guy has one of the best sets of Blender tutorials out there: getting through the first dozen should be more than enough.
  22. I didn't seem to have trouble importing from Blender 2.62 into Unity (Free) 3.5.2f2. Attached is a very basic .blend file that seemed to work fine. I deleted the node_collider and highpoly_zTank objects from the tree after import.
  23. Aha! Thanks! Screenshots for other noobs: http://imgur.com/a/1BZTV
  24. Never used Unity; it's taken me a couple of hours just to get my first test part displayed and in the correct orientation (without the node_collider showing up - o_O). If I might ask, how are you applying the normal maps? I've got both maps imported and their material shaders set (texture map as Diffuse, normal map as KSP/Bumped), but I seem to only be able to apply one at a time to the zTank mesh. See pics: http://imgur.com/a/hPg5R#1 (A sketch of putting both maps on one material follows after this list.)
  25. Have any of you pro modelers run into trouble when trying to import/export the lander leg in Blender? A straight import/export doesn't work. It behaves like a model that has no material assigned, i.e., it appears in the list of parts one can use in the VAB, but when selected it does not appear in the main view to be placed. I ran into the same thing with basic parts that weren't properly materialed. I suspect it has something to do with the colliders and anchors coming in as 'empty' objects, of which the four submodels are children. Troubleshooting thus far:
     - Material assignments to the meshes themselves.
     - Material assignments to the meshes and the colliders/anchors.
     - Re-UV-mapping the meshes.
     - Deleting the parent empty objects (nodeCollider, nodeLegCollider, objAnchor, objLeg).
     - Recreating the parent empty objects.
     - Recreating the parent objects as meshes themselves.
     Have you guys run into this? Is there a different object type that should be used? Is this simply Blender (2.6) having a compliance issue with COLLADA (as in, the empty objects should be coming in as a standardized object for which Blender has no support yet)? Any tips? Thanks.
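
A minimal sketch of the Multiply blend described in post 8, with channels in the 0..1 range; the class and method names here are hypothetical, but the math is the standard multiply mode:

    using UnityEngine;

    public static class AOBlend
    {
        // Multiply blend, per channel: result = base * ao.
        // A white AO value (1.0) leaves the texture untouched; darker
        // values darken it, which is what adds the baked-in "shading."
        public static Color MultiplyAO(Color baseColor, float ao)
        {
            return new Color(baseColor.r * ao,
                             baseColor.g * ao,
                             baseColor.b * ao,
                             baseColor.a); // alpha is left alone
        }
    }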
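
For post 17, a rough script equivalent of those manual Inspector steps, assuming the Unity 3.5-era API (Renderer.castShadows was renamed in later versions); colliderObject and visibleModel are placeholders for whatever your own hierarchy names them:

    using UnityEngine;

    public static class ColliderSetup
    {
        public static void Apply(GameObject colliderObject, GameObject visibleModel)
        {
            // Don't delete node_collider; just stop it from rendering.
            MeshRenderer rend = colliderObject.GetComponent<MeshRenderer>();
            if (rend != null)
            {
                rend.enabled = false;       // the "uncheck the renderer" step
                rend.castShadows = false;
                rend.receiveShadows = false;
            }

            // Add a Mesh Collider to the visible model and point it at
            // the collider mesh (the "wireframe" asset in Project).
            MeshCollider mc = visibleModel.AddComponent<MeshCollider>();
            MeshFilter mf = colliderObject.GetComponent<MeshFilter>();
            if (mf != null)
                mc.sharedMesh = mf.sharedMesh;
        }
    }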
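
On the reverse-playback question in post 18, the usual trick with Unity's legacy Animation component is a negative playback speed starting from the clip's last frame. An untested sketch meant to be dropped into the same part module; the clip name is whatever your animation split produced:

    // Play a named clip backwards: negative speed, start at the end.
    protected void PlayInReverse( string clipName )
    {
        Animation[] anims = this.FindModelAnimators(); // same KSP helper as above
        if( anims == null || anims.Length == 0 )
            return;

        AnimationState st = anims[0][clipName];
        st.speed = -1.0f;      // negative speed = backwards
        st.time = st.length;   // begin from the last frame
        anims[0].Play( clipName );
    }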
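
And for post 24: one material can carry both the diffuse and the normal map as long as the shader exposes both slots, which is what a bumped shader is for. A hedged sketch; the "_MainTex" and "_BumpMap" property names are the usual Unity conventions and are an assumption for the KSP/Bumped shader:

    using UnityEngine;

    public static class BumpedMaterialSetup
    {
        // Build one material carrying both maps, then hand it
        // to the mesh's renderer.
        public static void Apply(Renderer target, Texture2D diffuse, Texture2D normal)
        {
            Material mat = new Material(Shader.Find("KSP/Bumped"));
            mat.SetTexture("_MainTex", diffuse); // diffuse slot (assumed name)
            mat.SetTexture("_BumpMap", normal);  // normal slot (assumed name)
            target.sharedMaterial = mat;
        }
    }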