Everything posted by VR_Dev

  1. Rigged my walking hexapod's neck up with IK so the head can remain stable. More videos and dev thread here -
  2. Crazy week at work for me, but I did get some progress over the weekend. I rigged up the arm "neck" with IK and PID servos. In terms of design, I added another servo at the base of the head, which allows me to control the pitch of the head in addition to its position. Unfortunately I realized too late that the servo only has a range of 60 degrees, so it will have to be replaced. I also added a rotator to control the roll of the head so that it can remain level along its z axis when the body is not. The base of the whole neck has a rotator as well, which will swing the whole arm around. Neither of those are hooked up yet. Eventually I will use a combo of joystick/buttons to maneuver the head target around, as well as preset positions. You can see from the bounciness that the servos still have to be tuned. I also haven't figured out the extender yet; that presents an interesting challenge.

The other big thing I've been working on is sending a render texture through my bridge. I got a camera feed down to a byte array, which I can send via the bridge, but the problem comes from a critical flaw in the design: KSP targets .NET 3.5, and memory-mapped files aren't supported until 4+. I tried a lot of ways to get around this, but eventually ended up starting a custom process at launch, then using that process's I/O to send data over the bridge to Unity. This works fine for single values, but encoding a texture's byte array to strings every frame dropped the frame rate to under 2 fps. Really the only solution is to eliminate the custom process and get memory-mapped files to work in 3.5. There seem to be a couple of ways to do this, but they are very difficult. If there are any .NET gurus out there, I think I have a solution, but I could use some help. I really want to get a camera feed from KSP into Unity, because it opens a whole new world of things I could do in VR. Specifically: having the neck arm mirror the head's rotation, and displaying the feed to a HUD of some sort.
  3. It would be cool to use WiFi, as that would probably be the most robust option. I definitely would want to include sensors, but I've always had the idea to stick one of these on a mech/quadcopter for super accurate room tracking. I have a couple lying around, and the Vive's room tracking is accurate enough that you could program complex maneuvers (as long as it's in a room rigged with lighthouses). You could also then view the mech in VR, and someday AR, allowing for floating UIs around the robot.
  4. Yeah, I saw Hexy, but the whole goal of this robotics adventure was to learn how to develop for the Raspberry Pi/Arduino. I eventually want to program quadcopters, but I figured it would be smarter to start off on the ground. I'd like to get into stuff like this. I've already done a lot of flying programming in both kOS and native C#; Wingman kOS script

Yeah, a quadrapod that doubles as a quadcopter is essentially my end goal in KSP. It would be nuts to see IRL too. The problem is mostly the weight. I didn't know that about real servos; I thought for sure you would get a position back. Then again, I know nothing about hardware. We could just drop the speed PID controls and have a fixed speed.

Yeah, mine only walks in a straight line right now, barely. So there are still a lot of leg target commands/algorithms to be written. Three targets rotate around their center point for the lift-and-move-forward cycle, while the other three leg targets just translate straight backwards for the forward motion. You can see the pink box that is the leg target anchor; then there is a forward position and a rear position that the target moves between.

That game is awesome. The IK is super smooth, and they have the gait figured out for every direction, even for pushing things. That goes along with an idea I had where the hexapod could lift its two front arms for grabbing/shooting, then walk around like a quadrapod.

Yeah, I don't know much about any of that. Like I said, I'm all software, but I wanna learn Arduino/Raspberry Pi. I also don't know C++, but I really want to learn it; otherwise I could just create a wrapper for my C# library. Yeah, this was always in the back of my mind, controlling a real bot from my Unity controller. I always imagined it would be wired, but Bluetooth would probably be the way to go. I never really mentioned it in my main post, but there is a plugin running on the craft in KSP, which acts as the driver. Interesting, I'll def look into it.

You guys got me excited for a real-world counterpart.
  5. I responded to this in the dev thread, if anyone is interested. https://forum.kerbalspaceprogram.com/index.php?/topic/170110-wip-unity-controlled-robotic-plugins/&do=findComment&comment=3286357
  6. @ZodiusInfuser Glad to hear you like the work so far. I was actually going to buy a hexapod kit like the ones you have, but I didn't want to spend hundreds of dollars right off the bat, so I decided to build one in KSP. Also, in KSP it will be able to fly, which would be next to impossible IRL. The skeletons both serve a purpose as well, in addition to showing the actual and desired positions. The speed of each servo is set every frame, based on the error between the actual skeleton and the IK skeleton, using a PID controller. This could absolutely be used in the real world. All it needs as input is the actual servo position, and it outputs what the servo's speed should be set to. I use a couple of built-in Unity features (Vector3.Distance, for example) which would have to be replaced with real-world counterparts, but that wouldn't be hard to do. The only development you would need to do is for the actual walking. The legs are coded to move to a target position, and then you use logic to move that target position around, allowing the mech to walk. In mine, the target rotates around an anchor, then slides along the ground for the forward movement. You also have to set the target along the y axis to move the hip to its desired location.
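The per-servo scheme described above (a speed command computed from the error between the actual skeleton and the IK skeleton) can be sketched roughly as follows. The class name, field names, and the usage snippet are my illustrative assumptions, not the mod's actual code:

```csharp
// Minimal sketch of a per-servo PID speed controller, assuming the
// process variable is (IK target angle - actual servo angle) and the
// output is a speed command applied each frame. Gains need tuning.
public class ServoPid
{
    public float Kp, Ki, Kd;        // tunable gains
    float integral, lastError;

    public ServoPid(float kp, float ki, float kd)
    {
        Kp = kp; Ki = ki; Kd = kd;
    }

    // Call once per frame with the current angle error and frame time.
    public float Update(float error, float dt)
    {
        integral += error * dt;
        float derivative = (error - lastError) / dt;
        lastError = error;
        return Kp * error + Ki * integral + Kd * derivative;
    }
}

// Hypothetical per-frame usage in Unity:
//   float error = ikAngle - actualAngle;
//   servo.Speed = pid.Update(error, Time.deltaTime);
```

In a real-world port, only the source of the angle feedback would change; the loop itself stays the same.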
  7. This is truly awesome @Tirehtoori R.I.P, as well as everything you have posted to your channel. You truly are the king of mechs. I have something that might interest you as well. Video and Dev thread -
  8. Nice, dude, I'm happy to hear you're jazzed about it. You know I've been doing the same for years with not-great results. Def not as good as your Boston Dynamics-type dog or transformer, though. I'd love for you to incorporate this technique into your mechs. Do you happen to have any C#/dev experience? This isn't even close to being plug-and-play release ready. I could rig up a mech for you, but you'd probably have to figure out the walking logic.
  9. You could try and do a better job than I did if you were interested.
  10. https://forum.kerbalspaceprogram.com/index.php?/topic/170110-wip-unity-controlled-robotic-plugins/ Video & dev thread
  11. Hey thanks a lot. I know the robotics part isn't super useful to people, but I wanted to show off the memory mapped bridge. I have lots of plans to implement it in other non-robotic projects that people may find more useful.
  12. Memory Mapped Files

The closest example to this technique is kRPC, which I experimented with about a year ago when developing my F-22 drone racer. However, I discovered there was a significant delay between the server and client. I was drawing a Vector3 representation at my vessel's position every frame, and it would fall further and further behind the craft as its velocity increased. I have used Unity's uNet client with a local server and never experienced this lag, which has me stumped on the cause. I would be interested to chat with someone who knows more about this mod, or anyone who has also experienced this.

Because of the lag I was experiencing with kRPC, I decided to build my own "bridge" using memory-mapped files. These allow multiple processes to share memory space, which can be used to store several different types of data structures. While these are easily the fastest way to send data, there is one major complication for this project: memory-mapped files are only supported in .NET 4+, while KSP targets 3.5. My solution is to start a custom process at launch, which handles the memory-mapped "bridge", then send/receive values via that process's I/O stream. This lets me send hundreds of values back and forth each frame at 50+ KSP fps, which is perfect for the time being.

My next goal, however, is to send much larger data structures over the bridge. I really want to get camera feeds from KSP into Unity so that I can begin implementing VR control into my mods. I have successfully sent a Texture2D byte array from a camera across the bridge on a frame-by-frame basis, but the problem comes when I need to incorporate the process I/O. Converting the array to a string every frame gives less than 2 fps in Unity. The solution is to get memory-mapped files working in .NET 3.5. I tried many different approaches before settling on the additional process, with no luck. I do have a potential solution, however, but could use some input from any .NET gurus out there.
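The .NET 4+ side of a memory-mapped bridge can be sketched like this. The map name, capacity, and float-slot layout are assumptions for illustration; under .NET 3.5 (KSP's target) System.IO.MemoryMappedFiles simply doesn't exist, which is what forces the helper-process workaround described above:

```csharp
// Sketch of one end of a memory-mapped "bridge" (.NET 4+ only).
// Both processes open the same named map and read/write fixed
// float slots. Named maps like this are a Windows feature.
using System;
using System.IO.MemoryMappedFiles;

public class Bridge : IDisposable
{
    readonly MemoryMappedFile mmf;
    readonly MemoryMappedViewAccessor view;

    public Bridge(string mapName, long capacityBytes)
    {
        // CreateOrOpen lets whichever process starts first create the map.
        mmf = MemoryMappedFile.CreateOrOpen(mapName, capacityBytes);
        view = mmf.CreateViewAccessor();
    }

    public void WriteFloat(int slot, float value)
    {
        view.Write(slot * sizeof(float), value);
    }

    public float ReadFloat(int slot)
    {
        return view.ReadSingle(slot * sizeof(float));
    }

    public void Dispose()
    {
        view.Dispose();
        mmf.Dispose();
    }
}
```

One design note: each process should keep the map and view open for its whole lifetime, because a named map is destroyed once every handle to it closes.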
Inverse Kinematics

The IK code is all custom, but it's pretty amateur. I rely on SOHCAHTOA and the law of cosines for everything. Unity gives me a lot of cheats that aren't available in the real world; Vector3.Distance, for example, easily gives me the distance between two points. It all works though, and I plan to expand it out to more joints. The neck arm also allows rotational target matching.

PID Controller Servos

Each servo has its own tunable PID loop, which uses the error between the actual and IK servo angles as its process variable. The output sets the speed of the servo on a frame-by-frame basis. I only have a basic knowledge of PID tuning, so if anyone out there would like to share some advice, it would be greatly appreciated.

Gait Sequencing & Steering

Right now the only gait sequence that exists is walking at a set speed. Steering is the last thing I still need to do. I wrote some simple stride-length adjustment code, which allows setting the desired stride length for both the right and left legs at the beginning of each stride. The actual steering is adjusted by a PID loop, which decreases one side's stride length by a percentage of its default length. So if my stride length is 2 meters, the steering PID can shorten that by up to 5%.

Terrain Mapping & Active Balance

The hexapod now has full terrain mapping and navigation capabilities. Instead of using the vessel height to position the ground and foot targets, each foot target is now placed on the ground directly beneath it. Each hip target is now set according to the ground position of its foot. There are two ways of setting the hip targets. Right now the hip targets are set at the average height of all the foot ground positions, plus the target height. Since recording the video, I've realized it would be best to just set the hip targets so that the vessel is always the target height above the ground, or some combination of the two.
The first method helps going uphill, while the second is preferable for downhill. Also, setting the hip height halfway between the highest (foot ground + target height) and the lowest would be the way to ensure each foot has the best chance of hitting its target when going over extremely rugged terrain. The vessel also knows if it is traversing up a hill, is level, or traversing down a hill, and sets the color of the ground accordingly. Right now nothing changes according to that, but eventually the gait will respond to the different slopes. I tried taking it out to the mountains, but I still need to find a way to orient the gyroscope to a level surface, even when the launch point is not level. The triangle being drawn on the ground represents the leg group that is actually moving the hexapod forward.

VR Implementation

As soon as I can get the camera feed from KSP to display to a render texture in Unity, I will be able to start moving forward with VR implementations. I have several things that will be pretty easy to accomplish right off the bat. I will be able to have the robotic head match the rotation of the HMD, and display the camera feed to a HUD. I will also be able to control the arms of the mech using Touch controllers. I have some thoughts on getting VR to work natively in KSP as well. This has been done before, with mixed results; I'd like to see if I could do better.

Collaboration and Testing

Let me know if you'd like to contribute in any way. If you'd like to help and know any Unity/C#, that would be great, but I could use help with tuning and design as well. There are a lot of variables that need to be tweaked, as well as design changes to the mechs themselves. Let me know if you have any interest in helping and I can probably find something for you to do. There is also the potential for a real-life counterpart. This is something I would definitely need help with, as my hardware skills are almost non-existent. I am planning on buying a frame in the near future, and would love some help implementing my Unity controller in the real world. If this interests you, there is lots of info already on this thread page.

TODO

  • Veering while walking
  • Turning in place
  • Terrain mapping
  • Traversing slopes
  • Quadrapod/bipedal mech
  • Jumping/extended flight

Thanks for checking it out, lots more to come.
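The IK section above leans on SOHCAHTOA and the law of cosines; a planar two-segment leg solve in that style looks roughly like the following. The segment names, angle conventions, and clamping are my assumptions for illustration, not the actual mod code:

```csharp
// Two-joint planar IK via the law of cosines. For femur length a,
// tibia length b, and hip-to-target distance d, the interior knee
// angle satisfies d^2 = a^2 + b^2 - 2ab*cos(knee).
using System;

public static class LegIk
{
    // elev is the elevation angle of the target from the hip (radians).
    // Outputs are interior hip and knee angles in radians.
    public static void Solve(float a, float b, float d, float elev,
                             out float hip, out float knee)
    {
        // Clamp targets that are out of reach to a (nearly) straight leg.
        d = Math.Min(d, a + b - 1e-6f);

        float cosKnee = (a * a + b * b - d * d) / (2f * a * b);
        knee = (float)Math.Acos(Clamp(cosKnee));

        // Angle between the femur and the hip-to-target line.
        float cosHip = (a * a + d * d - b * b) / (2f * a * d);
        hip = elev + (float)Math.Acos(Clamp(cosHip));
    }

    static float Clamp(float v)
    {
        return v < -1f ? -1f : (v > 1f ? 1f : v);
    }
}
```

In Unity, d can come straight from Vector3.Distance between the hip and the foot target; on real hardware it would have to be computed from the leg geometry instead.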
  13. It's all about PID loops. That's how the stock SAS works, and you can use them to control just about anything. I have them for pitch/yaw/roll, then the rest is just logic to get it to do what you want. Feel free to ask any questions; I'll help explain where I can.
  14. Slowly working on converting my kOS script into a native KSP mod. The kOS pilot script completed the course maybe 1 in 10 times, but I'm looking for it to complete the course every time. I have drastically improved the flight mechanics, which give a much more predictable turn. More details can be found here. https://forum.kerbalspaceprogram.com/index.php?/topic/160990-racing-drone-plugin-w-potential-ml-integration/ I slow time down around the first gate; that's not the frame rate dropping.
  15. Hey, I'm super interested in this, potentially for streaming render textures/Texture2Ds. Did you happen to figure anything out?
  16. Friendly bump. I moved past this for a while, but now it's a roadblock. childParts = hipRotServo.servo.HostPart.children; This gets you the child attachment node; I just need to figure out the logic to search the entire hierarchy.
  17. Solved it: .FindChildParts<Part>(true); I'm trying to create a list/array of the highlighted parts. Part.attachNodes gets me both the parent and child of the root highlighted part, but I only want the children, everything stack- and surface-attached. Part.children doesn't seem to do the trick either. It may be a weird hierarchy in these IR rework parts, though.
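For reference, passing true to FindChildParts<Part> is what makes it recurse through the whole hierarchy rather than returning only direct children. The same idea, sketched against a stand-in node type (since Part only exists inside the game assembly), is just a depth-first walk over .children:

```csharp
// Depth-first collection of every descendant, mirroring what
// Part.FindChildParts<Part>(true) does inside KSP. "Node" here is a
// stand-in type, since Part itself only exists in the game assembly.
using System.Collections.Generic;

public class Node
{
    public string Name;
    public List<Node> children = new List<Node>();
}

public static class Hierarchy
{
    public static List<Node> Descendants(Node root)
    {
        var results = new List<Node>();
        Collect(root, results);
        return results;
    }

    static void Collect(Node node, List<Node> results)
    {
        foreach (Node child in node.children)
        {
            results.Add(child);       // the child itself...
            Collect(child, results);  // ...then everything below it
        }
    }
}
```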
  18. Hey, thanks for the info. If I open up the roll to adjust pitch, the radius of the turn will vary greatly. I actually have it tuned pretty well right now, and because the yaw only makes small adjustments, it seems to work OK. The problem is I have two PIDs controlling the yaw: one to control the actual input, and one to set the target vector to close the gap. Getting two PIDs to work together is always difficult, but it seems to be working OK. I used a PID library I found online, so I'm not sure if it's a good one or not. Also, I miss BDArmory; I used to do tons of dogfights with my jet and my gundam. I don't have much time for KSP anymore, and when I do, it's just this project. I had some older drone kOS scripts I would also like to make into plugins. No video, but I had an awesome wingman kOS script back in the day. IR Tracking Cannon, skip to like 3 minutes in.
  19. Got the autopilot working; now it's just tuning the PIDs. I'm beginning to look into machine learning to see if I could set up KSP to loop the first gate while having some kind of library set the PIDs. The main difference between this and the kOS script is that the autopilot is set to close the x & y proximity gap to its target, not point at the target. With the plane rolled, the yaw is responsible for closing the x (formerly y) distance, but using yaw for pitch is proving to be difficult. I'm thinking about having the PID unroll the plane to control altitude in the turns.
  20. So I actually fixed it. I copied an existing .cfg file and changed the name/values, which got it to load. When I created my "SavedValues.cfg" I must have saved it as a text file. I knew it was something stupid I did. Thanks for the help, everyone.
  21. Oh sorry, I totally missed that suggestion the first time. Doesn't fix the problem, though.
  22. So I have tried this and every other way to get the correct path as a string. I'm starting to think that isn't the problem, however. If the path were wrong, I assume I would get a null ref exception when I use:

fullPath = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "SavedValues.cfg").Replace("\\", "/");
ConfigNode savedValues = ConfigNode.Load(fullPath);

But I don't. I'm starting to think I am just accessing the node incorrectly. So again, here is my .cfg file:

VALUES
{
    pitchP = 10;
    pitchI = 10;
    pitchD = 10;
}

So I get the root node of the cfg using the code above, then get the VALUES node with this, which is when the first null ref gets thrown:

if (savedValues.HasNode("VALUES"))
{
    savedValues = savedValues.GetNode("VALUES");
}

Then I should be able to access the values in the node via:

pitchIValue = float.Parse(savedValues.GetValue("pitchI"));

Is this not the correct way to get info from the nodes?