Memory Mapped Files

The closest example to this technique is kRPC, which I experimented with about a year ago when developing my F-22 drone racer. I found there was a significant delay between the server and client, however. I was drawing a Vector3 representation at my vessel's position every frame, and it would fall further and further behind the craft as its velocity increased. I have used Unity's uNet client and local server and have never experienced this lag, which has me stumped on the cause. I would be interested to chat with someone who knows more about this mod, or anyone who has also experienced this.

Because of the lag I was experiencing with kRPC, I decided to build my own "bridge" using memory mapped files. These allow multiple processes to share memory space, which can be used to store several different types of data structures. While these are easily the fastest way to send data, there is one major complication for this project: memory mapped files are only supported in .NET 4+, while KSP targets .NET 3.5. My solution is to start a custom process at launch which hosts the memory map "bridge", then send/receive values via that process's I/O stream. This lets me send hundreds of values back and forth each frame at 50+ KSP fps, which is perfect for the time being.

My next goal is to send much larger data structures over the bridge. I really want to get camera feeds from KSP into Unity, so that I can begin implementing VR control into my mods. I have successfully sent a Texture2D byte array from a camera across the bridge on a frame-by-frame basis, but the problem comes when I need to incorporate the process I/O: converting the array to a string every frame gives less than 2 fps in Unity. The real solution is to get memory mapped files working in .NET 3.5. I tried many different approaches with no luck before settling on the additional process. I do have a potential solution, but could use some input from any .NET gurus out there.

Inverse Kinematics

The IK code is all custom, but it's pretty amateur. I rely on SOHCAHTOA and the law of cosines for everything. Unity gives me a lot of cheats that aren't available in real life: Vector3.Distance, for example, easily gives me the distance between two points. It all works though, and I plan to expand it out to more joints. The neck arm also allows rotational target matching.

PID Controller Servos

Each servo has its own tunable PID loop, which uses the error between the actual servo angle and the IK target angle as its input. The output sets the speed of the servo on a frame-by-frame basis. I only have a basic knowledge of PID tuning, so if anyone out there would like to share some advice it would be greatly appreciated.

Gait Sequencing & Steering

Right now the only gait sequence that exists is walking at a set speed. Steering is the last thing I still need to do. I wrote some simple stride length adjustment code, which allows setting the desired stride length for both the right and left legs at the beginning of each stride. The actual steering is adjusted by a PID loop which decreases one side's stride length by a percentage of its default length. So if my stride length is 2 meters, the steering PID can shorten one side's stride by up to 5%. Rough sketches of the bridge process, the IK solve, the servo PID, and the stride-length steering are below.
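To make the bridge idea concrete, here is a minimal sketch of what the .NET 4+ helper process could look like: it opens a named memory-mapped file and relays values between the map and its stdin/stdout, which is the only channel the .NET 3.5 plugin can reach. The map name "KspBridge", the 4 KB capacity, the byte offsets, and the "velocity=" key are all made up for illustration, not taken from the mod.

```csharp
using System;
using System.IO.MemoryMappedFiles;

class BridgeHost
{
    static void Main()
    {
        // Create (or open) a named shared-memory region that Unity can also open.
        using (var mmf = MemoryMappedFile.CreateOrOpen("KspBridge", 4096))
        using (var accessor = mmf.CreateViewAccessor())
        {
            // Relay values between the memory map and the parent process (the
            // KSP plugin) over stdin/stdout, since the plugin itself is stuck
            // on .NET 3.5 and cannot open the map directly.
            string line;
            while ((line = Console.ReadLine()) != null)
            {
                // Parent sends e.g. "velocity=123.4"; write the float at offset 0.
                var parts = line.Split('=');
                float v;
                if (parts.Length == 2 && parts[0] == "velocity" &&
                    float.TryParse(parts[1], out v))
                {
                    accessor.Write(0, v);
                }

                // Echo back whatever the other side (e.g. Unity) last wrote at
                // offset 4, so the plugin can read it from this process's stdout.
                float reply = accessor.ReadSingle(4);
                Console.WriteLine(reply);
            }
        }
    }
}
```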
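For the IK, this is a minimal sketch of a two-joint law-of-cosines solve, assuming a simple hip/knee leg in a plane. The class and parameter names are illustrative; the real legs involve more (orientation, the extra joints mentioned above).

```csharp
using UnityEngine;

public static class TwoLinkIK
{
    // Solve a planar two-joint leg (upper length a, lower length b) so the
    // foot reaches a target a given distance from the hip.
    // Returns the interior hip and knee angles in degrees.
    public static void Solve(float a, float b, Vector3 hip, Vector3 target,
                             out float hipAngle, out float kneeAngle)
    {
        // Unity gives the straight-line distance for free.
        float reach = Vector3.Distance(hip, target);

        // Clamp so the target is never beyond full extension or inside the hip.
        reach = Mathf.Clamp(reach, Mathf.Abs(a - b) + 0.001f, a + b - 0.001f);

        // Law of cosines: c^2 = a^2 + b^2 - 2ab*cos(C)
        // Interior knee angle between the two links.
        float knee = Mathf.Acos((a * a + b * b - reach * reach) / (2f * a * b));

        // Angle at the hip between the upper link and the hip->target line.
        float hipInner = Mathf.Acos((a * a + reach * reach - b * b) / (2f * a * reach));

        hipAngle = hipInner * Mathf.Rad2Deg;
        kneeAngle = knee * Mathf.Rad2Deg;
    }
}
```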
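A bare-bones version of the per-servo PID could look like the following; the gains are placeholders and the real mod's fields will differ.

```csharp
using UnityEngine;

// One tunable PID loop per servo: the error between the IK target angle and
// the servo's actual angle drives the commanded servo speed each frame.
public class ServoPid
{
    public float kp = 1f, ki = 0f, kd = 0.1f;   // placeholder gains
    float integral;
    float previousError;

    public float Update(float targetAngle, float actualAngle, float dt)
    {
        // Signed shortest angular difference, so the servo never takes the
        // long way around.
        float error = Mathf.DeltaAngle(actualAngle, targetAngle);

        integral += error * dt;
        float derivative = (error - previousError) / dt;
        previousError = error;

        // The output is interpreted as a servo speed.
        return kp * error + ki * integral + kd * derivative;
    }
}
```

Each frame the output would be fed to the servo, along the lines of `servo.speed = pid.Update(ikAngle, currentAngle, Time.deltaTime);` (field names hypothetical).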
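And a sketch of the stride-length steering, reusing the ServoPid class from the previous sketch. The 5% clamp matches the description above; everything else (gains, field names, the sign convention) is illustrative.

```csharp
using UnityEngine;

public class GaitSteering
{
    // Steering by asymmetric stride length: a PID loop on heading error
    // shortens one side's stride by at most 5% of its default length.
    public float defaultStride = 2f;   // metres per stride
    ServoPid steeringPid = new ServoPid { kp = 0.01f, ki = 0f, kd = 0f };

    public void StrideLengths(float currentHeading, float targetHeading,
                              out float leftStride, out float rightStride)
    {
        // PID output, clamped to the +/- 5% band described above.
        float correction = Mathf.Clamp(
            steeringPid.Update(targetHeading, currentHeading, Time.deltaTime),
            -0.05f, 0.05f);

        // A positive correction shortens the left strides (turning left),
        // a negative one shortens the right strides (turning right).
        leftStride  = defaultStride * (1f - Mathf.Max(0f,  correction));
        rightStride = defaultStride * (1f - Mathf.Max(0f, -correction));
    }
}
```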
Terrain Mapping & Active Balance

The hexapod now has full terrain mapping and navigation capabilities. Instead of using the vessel height to position the ground and foot targets, each foot target is now placed on the ground directly beneath it, and each hip target is set according to the ground position of its foot. There are two ways of setting the hip targets. Right now the hip targets are set at the average height of all the foot ground positions plus the target height. Since recording the video I've realized it would be better to set the hip targets so that the vessel is always the target height above the ground, or to use some combination of the two: the first method helps going uphill, while the second is preferable going downhill. Setting the hip height halfway between the highest (foot ground + target height) and the lowest would be the way to ensure each foot has the best chance of hitting its target when going over extremely rugged terrain. (A sketch of the foot and hip placement is at the end of this post.)

The vessel also knows whether it is traversing up a hill, is level, or traversing down a hill, and sets the color of the ground accordingly. Right now nothing changes based on that, but eventually the gait will respond to the different slopes. I tried taking it out to the mountains, but I still need to find a way to orient the gyroscope to a level surface even when the launch point is not level. The triangle being drawn on the ground represents the leg group that is actually moving the hexapod forward.

VR Implementation

As soon as I can get the camera feed from KSP to display to a render texture in Unity, I will be able to start moving forward with VR implementations. I have several things that will be pretty easy to accomplish right off the bat: I will be able to have the robotic head match the rotation of the HMD and display the camera feed to a HUD, and I will be able to control the arms of the mech using touch controllers. I have some thoughts on getting VR to work natively in KSP as well. This has been done before, with mixed results; I'd like to see if I could do better.

Collaboration and Testing

Let me know if you'd like to contribute in any way. If you know any Unity/C# that would be great, but I could use help with tuning and design as well. There are a lot of variables that need to be tweaked, as well as design changes to the mechs themselves. Let me know if you have any interest in helping and I can probably find something for you to do.

There is also the potential for a real-life counterpart. This is something I would definitely need help with, as my hardware skills are almost non-existent. I am planning on buying a frame in the near future, and would love to have some help implementing my Unity controller in the real world. If this interests you, there is lots of info in this thread already.

TODO

  • Veering while walking
  • Turning in place
  • Terrain Mapping
  • Traversing slopes
  • Quadruped/Bipedal Mech
  • Jumping/Extended flight

Thanks for checking it out, lots more to come.
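Returning to the terrain mapping section above, here is a rough sketch of the foot and hip placement: each foot target is ray-cast straight down onto the ground, and the hip height is derived from the foot ground heights (average, or halfway between highest and lowest). This is written in plain Unity terms; in KSP the "down" direction would be the radial-in vector rather than Vector3.down, and the layer mask would need to match the terrain colliders. All names and numbers are illustrative.

```csharp
using UnityEngine;

public class TerrainFootPlacement
{
    public float targetHeight = 1.5f;      // desired body clearance in metres (placeholder)
    public LayerMask groundMask = ~0;      // placeholder; should match the terrain layer

    // Drop a foot target straight down onto the terrain.
    public Vector3 GroundUnder(Vector3 footTarget)
    {
        RaycastHit hit;
        if (Physics.Raycast(footTarget + Vector3.up * 10f, Vector3.down,
                            out hit, 50f, groundMask))
            return hit.point;
        return footTarget;                 // fall back to the original target if nothing is hit
    }

    // Derive a hip height from all the foot ground positions.
    public float HipHeight(Vector3[] footGroundPositions)
    {
        float sum = 0f;
        float highest = float.MinValue, lowest = float.MaxValue;
        foreach (var p in footGroundPositions)
        {
            sum += p.y;
            highest = Mathf.Max(highest, p.y);
            lowest  = Mathf.Min(lowest,  p.y);
        }

        // Method 1 (current): average ground height of all feet plus clearance.
        float average = sum / footGroundPositions.Length;

        // Method 2 (for very rugged terrain): halfway between the highest and
        // lowest foot ground heights, plus clearance.
        float halfway = (highest + lowest) * 0.5f;

        return average + targetHeight;     // or: halfway + targetHeight
    }
}
```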