VR_Dev

[WIP] Unity Editor Plugin


Ever wish you could play with KSP from the Unity editor? I have developed a way to do just that, using .NET's Memory Mapped Files. This allows most of the development work to be done in the Unity editor, meaning you have full debugging and real-time control via the editor, and you eliminate the constant rebuild-and-restart cycle. Personally, my progress has exploded compared to building native KSP plugins. As you can see, I am currently using this technique to control Infernal Robotics, but there is no limit on what it can control in KSP. Flight controls, part manipulation, and KSP world settings are all easily driven from the Unity editor.

Memory Mapped Files

The closest example to this technique is kRPC, which I experimented with about a year ago when developing my F-22 drone racer. However, I discovered there was a significant delay between the server and client. I was drawing a Vector3 representation at my vessel's position every frame, and it would fall further and further behind the craft as its velocity increased. I have used Unity's uNet client and local server without experiencing this lag, which has me stumped as to the cause. I would be interested to chat with anyone who knows more about this mod, or who has also experienced this.

Because of the lag I was experiencing with kRPC, I decided to build my own "bridge" using Memory Mapped Files. These allow multiple processes to share memory space, which can be used to store several different types of data structures. While this is easily the fastest way to send data, there is one major complication for this project: memory-mapped files are only supported in .NET 4+, while KSP targets 3.5. My solution is to start a custom process at launch, which handles the memory-map "bridge", and then send/receive values via the process's I/O stream.
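For anyone curious what the shared-memory side boils down to, here is a minimal sketch using .NET's MemoryMappedFile API. The file name and offset are invented for illustration; this is not the mod's actual layout. A file-backed map is used so the sketch runs cross-platform; on Windows, MemoryMappedFile.CreateOrOpen("name", size) would let two processes attach by name without a backing file.

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class BridgeSketch
{
    static void Main()
    {
        // "ksp_bridge.mmf" is an illustrative file name, not the mod's real one.
        string path = Path.Combine(Path.GetTempPath(), "ksp_bridge.mmf");

        // Both processes open the same file-backed map; a write by one
        // is visible to the other without copying through a socket.
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   path, FileMode.Create, null, 1024))
        using (var view = mmf.CreateViewAccessor())
        {
            view.Write(0, 47.5f);             // e.g. a servo target angle
            float angle = view.ReadSingle(0); // the peer reads the same offset
            Console.WriteLine(angle);
        }
        File.Delete(path);
    }
}
```

The same pattern scales to hundreds of floats per frame: each value simply gets a fixed offset in the mapped region.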

This allows me to send hundreds of values back and forth each frame, at 50+ KSP fps, which is perfect for the time being. My next goal, however, is to send much larger data structures over the bridge. I really want to get camera feeds from KSP into Unity so that I can begin implementing VR control in my mods. I have successfully sent a Texture2D byte array from a camera across the bridge on a frame-by-frame basis, but the problem comes when I need to incorporate the process I/O. Converting the array to a string every frame gives less than 2 fps in Unity. The solution is to get memory-mapped files working in .NET 3.5. I tried many different approaches before settling on the additional process, with no luck. I do have a potential solution, but could use some input from any .NET gurus out there.
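As a rough illustration of why raw bytes beat a per-frame string conversion, the same view accessor can move a whole buffer with a small length header in front. The buffer size, file name, and header layout below are invented for the example:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class FrameSketch
{
    static void Main()
    {
        // stand-in for a Texture2D's raw bytes; a real camera frame is much larger
        byte[] frame = new byte[256];
        for (int i = 0; i < frame.Length; i++) frame[i] = (byte)i;

        string path = Path.Combine(Path.GetTempPath(), "ksp_frame.mmf");
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   path, FileMode.Create, null, 4 + frame.Length))
        using (var view = mmf.CreateViewAccessor())
        {
            view.Write(0, frame.Length);                // length header at offset 0
            view.WriteArray(4, frame, 0, frame.Length); // raw bytes, no string round-trip

            // the Unity side would do the reverse each frame
            int len = view.ReadInt32(0);
            byte[] received = new byte[len];
            view.ReadArray(4, received, 0, len);
            Console.WriteLine(len);
        }
        File.Delete(path);
    }
}
```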

Inverse Kinematics

The IK code is all custom, but it's pretty amateur. I rely on SOHCAHTOA and the law of cosines for everything. Unity gives me a lot of cheats that aren't available in the real world; Vector3.Distance, for example, easily gives me the distance between two points. It all works though, and I plan to expand it to more joints. The neck arm also allows rotational target matching.
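For reference, the law-of-cosines part of a two-segment leg comes down to a few lines. The segment lengths and the clamping below are illustrative, not the mod's actual code:

```csharp
using System;

class TwoLinkIK
{
    // Law of cosines: d^2 = a^2 + b^2 - 2ab*cos(knee), solved for the
    // knee angle of a leg with segment lengths a and b reaching a
    // target at distance d. Returns degrees.
    static double KneeAngle(double a, double b, double d)
    {
        double cos = (a * a + b * b - d * d) / (2 * a * b);
        cos = Math.Max(-1.0, Math.Min(1.0, cos)); // clamp for unreachable targets
        return Math.Acos(cos) * 180.0 / Math.PI;
    }

    static void Main()
    {
        // 1 m segments with the target sqrt(2) m away:
        // the knee should sit very near 90 degrees
        Console.WriteLine(KneeAngle(1.0, 1.0, Math.Sqrt(2.0)));
    }
}
```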


 

PID Controller Servos

Each servo has its own tunable PID loop, which uses the error between the actual servo angle and the IK target angle as its input. The output sets the speed of the servo on a frame-by-frame basis. I only have a basic knowledge of PID tuning, so if anyone out there would like to share some advice it would be greatly appreciated.
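A minimal sketch of that per-servo loop, with a toy one-line "servo" model standing in for KSP. The gains are invented, not tuned values from the mod:

```csharp
using System;

class PidServo
{
    // Error between the IK target angle and the actual servo angle
    // drives the servo's speed each frame.
    double kp, ki, kd, integral, prevError;

    public PidServo(double kp, double ki, double kd)
    { this.kp = kp; this.ki = ki; this.kd = kd; }

    public double Update(double error, double dt)
    {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }

    static void Main()
    {
        var pid = new PidServo(2.0, 0.0, 0.1);     // illustrative gains
        double actual = 0, target = 45, dt = 0.02; // 50 fps, matching the post
        for (int i = 0; i < 500; i++)
        {
            double speed = pid.Update(target - actual, dt);
            actual += speed * dt; // toy servo: speed integrates into angle
        }
        Console.WriteLine(Math.Abs(target - actual) < 0.1);
    }
}
```

In the real setup the "actual" angle would come back over the bridge from Infernal Robotics instead of being simulated.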

Gait Sequencing & Steering


Right now the only gait sequence that exists is walking at a set speed. Steering is the last thing I still need to do. I wrote some simple stride-length adjustment code, which allows setting the desired stride length for both the right and left legs at the beginning of each stride. The actual steering is handled by a PID loop that decreases one side's stride length by a percentage of its default length. So if the stride length is 2 meters, the steering PID can shorten one side by up to 5%.
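The stride-trim idea can be sketched in a few lines; the steering output value here is invented for illustration:

```csharp
using System;

class SteeringTrim
{
    static void Main()
    {
        double defaultStride = 2.0; // metres, from the post
        double maxTrim = 0.05;      // steering may shorten a side by up to 5%

        // steer is the clamped steering PID output in -1..1;
        // positive steers right by shortening the right leg group's stride
        double steer = 0.6;         // illustrative value
        double right = defaultStride * (1.0 - maxTrim * Math.Max(0.0, steer));
        double left  = defaultStride * (1.0 - maxTrim * Math.Max(0.0, -steer));
        Console.WriteLine(right + " " + left);
    }
}
```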


 


Terrain Mapping & Active Balance
The hexapod now has full terrain mapping and navigation capabilities. Instead of using the vessel height to position the ground and foot targets, each foot target is now placed on the ground directly beneath it.

Each hip target is now set according to the ground position of its foot. There are two ways of setting the hip targets. Right now the hip targets are set at the average height of all the foot ground positions, plus the target height. Since recording the video, I've realized it would be better to set the hip targets so that the vessel is always the target height above the ground, or to use some combination of the two: the first method helps going uphill, while the second is preferable downhill. Setting the hip height halfway between the highest and lowest (foot ground + target height) would also be the way to ensure each foot has the best chance of hitting its target when going over extremely rugged terrain.
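The two height strategies can be compared in a few lines. The foot ground heights below are made-up sample values, not data from the mod:

```csharp
using System;
using System.Linq;

class HipTargets
{
    static void Main()
    {
        // hypothetical ground heights under each of the six feet (metres)
        double[] footGround = { 0.1, 0.3, -0.2, 0.0, 0.4, -0.1 };
        double targetHeight = 1.0; // desired body clearance

        // current method: average of all foot ground positions plus clearance
        double avgHip = footGround.Average() + targetHeight;

        // rugged-terrain method from the post: halfway between the
        // highest and lowest foot ground positions plus clearance
        double midHip = (footGround.Max() + footGround.Min()) / 2.0 + targetHeight;

        Console.WriteLine(avgHip + " " + midHip);
    }
}
```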

The vessel also knows whether it is traversing uphill, level ground, or downhill, and sets the color of the ground accordingly. Right now nothing changes in response, but eventually the gait will adapt to the different slopes. I tried taking it out to the mountains, but I still need to find a way to orient the gyroscope to a level surface even when the launch point is not level. The triangle being drawn on the ground represents the leg group that is actually moving the hexapod forward.

 

VR Implementation

As soon as I can get the camera feed from KSP to display on a render texture in Unity, I will be able to start moving forward with VR implementations. Several things will be pretty easy to accomplish right off the bat: I can have the robotic head match the rotation of the HMD and display the camera feed to a HUD, and I can control the arms of the mech using touch controllers. I also have some thoughts on getting VR to work natively in KSP. That has been done before, with mixed results; I'd like to see if I could do better.


 

 

Collaboration and Testing

Let me know if you'd like to contribute in any way. Unity/C# knowledge would be great, but I could use help with tuning and design as well. There are a lot of variables that need to be tweaked, as well as design changes to the mechs themselves. If you have any interest in helping, I can probably find something for you to do.

There is also the potential for a real-life counterpart. This is something I would definitely need help with, as my hardware skills are almost non-existent. I am planning on buying a frame in the near future, and would love to have some help implementing my Unity controller in the real world. If this interests you, there is lots of info already in this thread.

TODO

  • Veering while walking
  • Turning in place
  • Terrain Mapping
  • Traversing slopes
  • Quadruped/Bipedal Mech
  • Jumping/Extended flight

Thanks for checking it out, lots more to come.

Edited by VR_Dev
On 1/26/2018 at 9:16 PM, Drew Kerman said:

:confused::0.0::confused: that's pretty sick. Where are all the robotic fanboys? We know you're out there

Hey thanks a lot. I know the robotics part isn't super useful to people, but I wanted to show off the memory mapped bridge. I have lots of plans to implement it in other non-robotic projects that people may find more useful.


This is epic!
Ok, after foaming at the mouth long enough to formulate a response, I want to say this is exactly what I've been looking for for years! I'll happily admit that my robots up to this point have been nothing more than fancy statues, with little progress to change that. This looks like it would be a thousand times easier than using IR Sequencer. I cannot express my gratitude; it escapes my ability for words.

Edited by Colonel Cbplayer
20 hours ago, Colonel Cbplayer said:

This is epic!
Ok, after foaming at the mouth long enough to formulate a response, I want to say this is exactly what I've been looking for for years! I'll happily admit that my robots up to this point have been nothing more than fancy statues, with little progress to change that. This looks like it would be a thousand times easier than using IR Sequencer. I cannot express my gratitude; it escapes my ability for words.

Nice, dude, I'm happy to hear you're jazzed about it. You know I've been doing the same for years with not-great results. Definitely not as good as your Boston Dynamics-type dog or transformer, though. I'd love for you to incorporate this technique into your mechs.

Do you happen to have any C#/dev experience? This isn't even close to being plug-and-play release-ready. I could rig up a mech for you, but you'd probably have to figure out the walking logic.


Now this is exciting!! I've been wanting a simulation environment for testing out the mechanics of walking robots for ages, but never thought that KSP could be it, much less with my own parts!!

I'm unfamiliar with Unity programming, but I have done C#. What would it take to implement a custom algorithm with this? I am thinking this would be a great tool for designing and verifying my various walking robot designs before spending IRL money to actually build them!

1 hour ago, ZodiusInfuser said:

Now this is exciting!! I've been wanting a simulation environment for testing out the mechanics of walking robots for ages, but never thought that KSP could be it, much less with my own parts!!

I'm unfamiliar with Unity programming, but I have done C#. What would it take to implement a custom algorithm with this? I am thinking this would be a great tool for designing and verifying my various walking robot designs before spending IRL money to actually build them!

//FROM ANOTHER THREAD

@VR_Dev I can't believe I didn't know about this before you posted. This is amazing!! I love how you've got the two skeletons overlaid in the view showing the intended and actual position of the robot. My question now is, could this be used separately from / in addition to KSP, to control, let's say, this real robot that is collecting dust in my flat? https://www.youtube.com/watch?v=sR4aj7tOwko

@ZodiusInfuser Glad to hear you like the work so far. I was actually going to buy a hexapod kit like the ones you have, but didn't want to spend hundreds of dollars right off the bat, so I decided to build one in KSP. Also, in KSP it will be able to fly, which would be next to impossible IRL.

The skeletons both serve a purpose as well, in addition to showing the actual and desired positions. The speed of each servo is set every frame, based on the error between the actual skeleton and the IK skeleton, using a PID controller.

This could absolutely be used in the real world. All it needs as input is the actual servo position, and it outputs what the servo should be set at. I use a couple of built-in Unity features (Vector3.Distance, for example) which would have to be replaced with real-world counterparts, but it wouldn't be that hard to do.
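As a trivial example of what such a replacement looks like, Unity's Vector3.Distance is just a few lines of plain math; the coordinates below are illustrative:

```csharp
using System;

class RealWorldMath
{
    // A stand-in for Unity's Vector3.Distance, the sort of engine
    // helper that would need replacing in an off-engine port.
    static double Distance(double ax, double ay, double az,
                           double bx, double by, double bz)
    {
        double dx = ax - bx, dy = ay - by, dz = az - bz;
        return Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }

    static void Main()
    {
        Console.WriteLine(Distance(0, 0, 0, 3, 4, 0)); // a 3-4-5 triangle
    }
}
```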

The only development you would need to do is for the actual walking. The legs are coded to move to a target position; you then use logic to move that target position around, allowing the mech to walk. In mine, the target rotates around an anchor, then slides along the ground for the forward movement. You also have to set the target along the y-axis to move the hip to its desired location.

Edited by VR_Dev
2 minutes ago, VR_Dev said:

@ZodiusInfuser Glad to hear you like the work so far. I was actually going to buy a hexapod kit like the ones you have, but didn't want to spend hundreds of dollars right off the bat. So I decided to build one in KSP.  Also in KSP it will be able to fly as well, which would be next to impossible IRL. 

If you're wanting a cheap kit then check out the Hexy for around $300. I won't say how much mine cost to build but it was more than that...

Being able to fly is certainly an advantage KSP has over IRL, although I'm sure some kind of drone + walker could be made.

6 minutes ago, VR_Dev said:

This could absolutely be used in the real world. All it needs as input is the actual servo position, and it outputs what the servo should be set at. I use a couple of built-in Unity features (Vector3.Distance, for example) which would have to be replaced with real-world counterparts, but it wouldn't be that hard to do.

The only potential issue I could foresee is that IRL servos often don't give you any angle back, meaning that the PID would not be able to compensate for weight due to gravity for instance. I guess a virtual model could be used to compensate for that though.

9 minutes ago, VR_Dev said:

The skeletons both serve a purpose as well, in addition to showing the actual and desired positions. The speed of each servo is set every frame, based on the error between the actual skeleton and the IK skeleton, using a PID controller.

The only development you would need to do is for the actual walking. The legs are coded to move to a target position; you then use logic to move that target position around, allowing the mech to walk. In mine, the target rotates around an anchor, then slides along the ground for the forward movement. You also have to set the target along the y-axis to move the hip to its desired location.

Ah good, you've abstracted the IK from the walking to become moving points around. This matches my mindset, in that the kinematics are "just" an animation on top of a gait generator that moves a set of points around according to some rules. I've been working on an idea for a while now of how to deal with lots of target points with different stride lengths to get full 2D motion control, like in a first-person shooter. Also, something that supports (omni-)wheels too. I'm nowhere near having something usable yet, although the code I've been writing is in C#.

I'm curious what you mean by rotating the target around an anchor. Is this related to the motion path you are wanting the body to take, such as turning on the spot?

 

Also, this may be of interest to you (https://youtu.be/RjAyq2kmGT8?t=59s). It's a trailer from a game that never got released, that has by far the best Quadruped IK and target control I've seen in a game. I hope to be able to replicate this one day.


Hmm, the first thing that pops into my mind is an Arduino kit as a possible more-or-less cheap controller for a real-life robot. I haven't gotten into Arduino microcontrollers deeply, due to lack of free time and need for it, but as far as I was able to find out, they mostly use C++ for source code. Not exactly C#, but close enough once you get used to it.

It should be just a matter of proper wiring, depending on what you need to trigger, with analog or digital outputs to control your robot. Some kind of "driver" would probably be necessary, in the form of relays or transistor switches that provide power for the motors and servos on the robot. Sequencer/timer groups should not be hard to do with Atmel microcontrollers, but the logic for executing a certain sequence is a different kind of animal.

But yes, proper logic made within the game should be more or less easy to transfer to a real-life application. That's one of the reasons why I like KSP and the IR mod.


Arduino would definitely be my go-to for such a project, as I've used them (and even designed my own) for other projects in the past. In fact my hexapod has one but not as the main servo driver unfortunately; that is a custom thing that accepts serial commands. So at minimum, if Unity can output serial data over bluetooth then it could remote control real hardware quite easily. Having things run on-board as you point out would require porting from C# to C++, which shouldn't be too hard as long as not too many libraries are used.

Indeed, that's what I like about KSP and IR, that a lot of the knowledge and ideas can be transferred to IRL. In this case, if an algorithm could be designed and verified with KSP, that would be a lot faster than having to deal with real hardware, plus you get a physics engine with it too!

Edited by ZodiusInfuser
typo
7 minutes ago, ZodiusInfuser said:

Indeed, that's what I like about KSP and IR, that a lot of the knowledge and ideas can be transferred to IRL. In this case, if an algorithm could be designed and verified with KSP, that would be a lot faster than having to deal with real hardware, plus you get a physics engine with it too!

Exactly.

If it is possible to create Bluetooth communication with C# (I'm almost 99% sure it is possible), then it should not be too hard to create a plugin compatible with the Unity game engine, or even a compatible plugin for KSP. People have been doing similar stuff for some time:

https://blog.arduino.cc/2018/01/24/create-a-custom-kerbal-space-program-cockpit-with-arduino/

Since they have already created input/output commands to and from a specialized KSP hardware controller, it should not be too difficult to extend it for the IR mod.

On 2/5/2018 at 1:37 PM, ZodiusInfuser said:

If you're wanting a cheap kit then check out the Hexy for around $300. I won't say how much mine cost to build but it was more than that...

Yeah, I saw Hexy, but the whole goal of this robotics adventure was to learn how to develop for the Raspberry Pi/Arduino. I eventually want to program quadcopters, but I figured it would be smarter to start off on the ground. I'd like to get into stuff like this.


 

I've already done a lot of flight programming in both kOS and native C#:


 

Wingman kOS script


On 2/5/2018 at 1:37 PM, ZodiusInfuser said:

Being able to fly is certainly an advantage KSP has over IRL, although I'm sure some kind of drone + walker could be made.

The only potential issue I could foresee is that IRL servos often don't give you any angle back, meaning that the PID would not be able to compensate for weight due to gravity for instance. I guess a virtual model could be used to compensate for that though.

Yeah, a quadruped that doubles as a quadcopter is essentially my end goal in KSP. It would be nuts to see IRL too. The problem is mostly the weight.

I didn't know that about real servos; I thought for sure you would get a position back. Then again, I know nothing about hardware. We could just drop the speed PID controls and run at a fixed speed.

On 2/5/2018 at 1:37 PM, ZodiusInfuser said:

Ah good, you've abstracted the IK from the walking to become moving points around. This matches my mindset, in that the kinematics are "just" an animation on top of a gait generator that moves a set of points around according to some rules. I've been working on an idea for a while now of how to deal with lots of target points with different stride lengths to get full 2D motion control, like in a first-person shooter. Also, something that supports (omni-)wheels too. I'm nowhere near having something usable yet, although the code I've been writing is in C#.

I'm curious what you mean by rotating the target around an anchor. Is this related to the motion path you are wanting the body to take, such as turning on the spot?

Also, this may be of interest to you (https://youtu.be/RjAyq2kmGT8?t=59s). It's a trailer from a game that never got released, that has by far the best Quadruped IK and target control I've seen in a game. I hope to be able to replicate this one day.

Yeah, mine only walks in a straight line right now, barely. So there are still a lot of leg target commands/algorithms to be written. Three targets rotate around their center point for the lift-and-move-forward cycle, while the other three leg targets just translate straight backwards for the forward motion. You can see the pink box that is the leg target anchor, and there is a forward position and a rear position that the target moves between.

That game is awesome. The IK is super smooth, and they have the gait figured out for every direction, even for pushing things. That goes along with an idea I had where the hexapod could lift its two front arms for grabbing/shooting, then walk around like a quadruped.

On 2/5/2018 at 1:39 PM, kcs123 said:

Hmm, the first thing that pops into my mind is an Arduino kit as a possible more-or-less cheap controller for a real-life robot. I haven't gotten into Arduino microcontrollers deeply, due to lack of free time and need for it, but as far as I was able to find out, they mostly use C++ for source code. Not exactly C#, but close enough once you get used to it.

It should be just a matter of proper wiring, depending on what you need to trigger, with analog or digital outputs to control your robot. Some kind of "driver" would probably be necessary, in the form of relays or transistor switches that provide power for the motors and servos on the robot. Sequencer/timer groups should not be hard to do with Atmel microcontrollers, but the logic for executing a certain sequence is a different kind of animal.

But yes, proper logic made within the game should be more or less easy to transfer to a real-life application. That's one of the reasons why I like KSP and the IR mod.

Yeah, I don't know much about any of that. Like I said, I'm all software, but I wanna learn Arduino/Raspberry Pi. I also don't know C++, but I really want to learn it; otherwise I could just create a wrapper for my C# library.

On 2/5/2018 at 1:52 PM, ZodiusInfuser said:

Arduino would definitely be my go-to for such a project, as I've used them (and even designed my own) for other projects in the past. In fact my hexapod has one but not as the main servo driver unfortunately; that is a custom thing that accepts serial commands. So at minimum, if Unity can output serial data over bluetooth then it could remote control real hardware quite easily. Having things run on-board as you point out would require porting from C# to C++, which shouldn't be too hard as long as not too many libraries are used.

Indeed, that's what I like about KSP and IR, that a lot of the knowledge and ideas can be transferred to IRL. In this case, if an algorithm could be designed and verified with KSP, that would be a lot faster than having to deal with real hardware, plus you get a physics engine with it too!

Yeah, this was always in the back of my mind, controlling a real bot from my Unity controller. I always imagined it would be wired, but Bluetooth would probably be the way to go. I never mentioned it in my main post, but there is a plugin running on the craft in KSP which acts as the driver.

On 2/5/2018 at 2:05 PM, kcs123 said:

Exactly.

If it is possible to create Bluetooth communication with C# (I'm almost 99% sure it is possible), then it should not be too hard to create a plugin compatible with the Unity game engine, or even a compatible plugin for KSP. People have been doing similar stuff for some time:

https://blog.arduino.cc/2018/01/24/create-a-custom-kerbal-space-program-cockpit-with-arduino/

Since they have already created input/output commands to and from a specialized KSP hardware controller, it should not be too difficult to extend it for the IR mod.

Interesting, I'll definitely look into it. You guys have me excited for a real-world counterpart.

Edited by VR_Dev

There are already complete modules for Arduino, for both Bluetooth and WiFi, to extend the capabilities of the main controller.

https://create.arduino.cc/projecthub/user206876468/arduino-bluetooth-basic-tutorial-d8b737

https://create.arduino.cc/projecthub/jeffpar0721/add-wifi-to-arduino-uno-663b9e

From the Unity coding side, as far as I was able to tell from a quick peek, Bluetooth is defined in Windows as an additional COM port in Device Manager. It should be possible to send/receive any kind of data on such a COM port, like any other COM port on the machine. It is just a matter of writing proper software on the Arduino controller, to determine the response to received commands and to transmit feedback to the PC communicating with it.

20 hours ago, ZodiusInfuser said:

The only potential issue I could foresee is that IRL servos often don't give you any angle back, meaning that the PID would not be able to compensate for weight due to gravity for instance. I guess a virtual model could be used to compensate for that though.

Wait. What? Why not develop some cheap, reliable way to determine the exact angle of the servos? :D

Jokes aside, it is probably easier said than done in real life. I haven't messed with electronics for ages, so the first things that popped into my mind were variable capacitors or variable resistors. Any kind of potentiometer might not be reliable enough, so a variable capacitor in a circuit with an oscillator providing an AC signal of variable frequency might give more precise results. Another way might be to use photodiodes. The initial thought would be to place photodiodes in a circle on the edge of the stator piece of the rotating robotic part, with a light source on the rotor. As the part rotates, only one photodiode (or maybe a few neighbors) would be lit. Mapping the lit photodiode to an array of digital inputs could give good feedback about the exact angle of the robotic part. Another possible solution might be to use a reflective surface of some kind on the rotating part instead of an active light source. Maybe even a combination of light source, optic fiber, and photodiode when you need to place a large number of diodes in a tight spot. Since optic fiber can be quite small in diameter (comparable to the thickness of a human hair), it could be possible to place the photodiodes elsewhere in the robot's central body and only deliver the light source through the optic fiber. That could give a higher-resolution reading of the true angle position of the servo motors.

Anyhow, that is just theoretical rambling; putting something like this to practical use in a real-life application is a different story. As I was writing this, I found that a similar thing is already available, using some kind of pattern on a reflective surface to give feedback on the servo angle position: an absolute rotary encoder. It is probably expensive due to the specialized microcontroller and software used, so it might be out of reach for hobby usage. But that is where Arduino kicks in; with the right kits the cost may be reduced to an acceptable level. Either way, it is out of my league and would end up as an expensive toy that only collects dust over time, which is why I stick to software/simulators only.

@VR_Dev, your WIP project with Unity integration might have good real-life uses beyond just controlling a robot through an Arduino interface. With the right sensors attached, the robot can send data back to Unity about what it can "see". Based on that data it should be possible to render the 3D environment in the Unity game engine, giving the human operator feedback that is as accurate as possible about what is going on in the robot's surroundings. In some situations, for example an area full of smoke or dust where an ordinary camera is useless, such data can be very valuable. It has a lot of potential uses beside just in-game fun; it is certainly something worth developing.

34 minutes ago, kcs123 said:

There are already complete modules for Arduino, for both Bluetooth and WiFi, to extend the capabilities of the main controller.

https://create.arduino.cc/projecthub/user206876468/arduino-bluetooth-basic-tutorial-d8b737

https://create.arduino.cc/projecthub/jeffpar0721/add-wifi-to-arduino-uno-663b9e

From the Unity coding side, as far as I was able to tell from a quick peek, Bluetooth is defined in Windows as an additional COM port in Device Manager. It should be possible to send/receive any kind of data on such a COM port, like any other COM port on the machine. It is just a matter of writing proper software on the Arduino controller, to determine the response to received commands and to transmit feedback to the PC communicating with it.

Oh yes, I've used Bluetooth as a COM port before from a C++ application, just never from C# or Unity. Writing a custom protocol would not be too hard, in my experience; it just requires a bit of forethought as to what you may want to support in the future.

34 minutes ago, kcs123 said:

Wait. What? Why not develop some cheap, reliable way to determine the exact angle of the servos? :D

Jokes aside, it is probably easier said than done in real life. I haven't messed with electronics for ages, so the first things that popped into my mind were variable capacitors or variable resistors. Any kind of potentiometer might not be reliable enough, so a variable capacitor in a circuit with an oscillator providing an AC signal of variable frequency might give more precise results. Another way might be to use photodiodes. The initial thought would be to place photodiodes in a circle on the edge of the stator piece of the rotating robotic part, with a light source on the rotor. As the part rotates, only one photodiode (or maybe a few neighbors) would be lit. Mapping the lit photodiode to an array of digital inputs could give good feedback about the exact angle of the robotic part. Another possible solution might be to use a reflective surface of some kind on the rotating part instead of an active light source. Maybe even a combination of light source, optic fiber, and photodiode when you need to place a large number of diodes in a tight spot. Since optic fiber can be quite small in diameter (comparable to the thickness of a human hair), it could be possible to place the photodiodes elsewhere in the robot's central body and only deliver the light source through the optic fiber. That could give a higher-resolution reading of the true angle position of the servo motors.

Anyhow, that is just theoretical rambling; putting something like this to practical use in a real-life application is a different story. As I was writing this, I found that a similar thing is already available, using some kind of pattern on a reflective surface to give feedback on the servo angle position: an absolute rotary encoder. It is probably expensive due to the specialized microcontroller and software used, so it might be out of reach for hobby usage. But that is where Arduino kicks in; with the right kits the cost may be reduced to an acceptable level. Either way, it is out of my league and would end up as an expensive toy that only collects dust over time, which is why I stick to software/simulators only.

Because most servos can be hacked to get their internal potentiometer value out, or you buy Dynamixels and be done with it :P. You are right that photodiodes and a reflective disc could also be used to get absolute position, but this can get really complex if done DIY, or be very expensive if not.

Edited by ZodiusInfuser

Using a COM port in C# should not be much different than it is in C++. Writing a plugin for Unity is a different story, though it should not be impossible. Hey, we already have telnet communication with kOS, which suggests that communication through a COM port is doable.

Yep, probably any idea for getting absolute position already exists on the market in some form, as a cheaper complete product than any hacky way you could build it for home use. I got carried away a bit; I haven't used much electronics in real life for a long time, so I don't know exactly what already exists on the market and what doesn't :)

 

5 hours ago, kcs123 said:

There is already a complete module for Arduino, for both Bluetooth and WiFi, to extend the capabilities of the main controller.

@VR_Dev, your WIP project with Unity integration might have good real-life uses beyond just controlling a robot through an Arduino interface. With the right sensors attached, a robot could feed back to Unity what it can "see". Based on that data it should be possible to render a 3D environment in the Unity game engine, giving the human operator feedback that is as accurate as possible about what is going on in the robot's surroundings. In some situations, for example in an area full of smoke or dust where an ordinary camera is useless, such data can be very valuable. It has a lot of potential uses besides just in-game fun. It is certainly something worth developing.

It would be cool to use WiFi, as that would probably be the most robust. I definitely want to include sensors, but I've always had the idea to stick one of these on a mech/quadcopter for super-accurate room tracking.

gallery-1483576689-vive-puck.png?resize=

I have a couple lying around, and the Vive's room tracking is accurate enough that you could program complex maneuvers (as long as it's in a room rigged with Lighthouses). You could also then view the mech in VR, and someday AR, allowing for floating UIs around the robot.

 

Edited by VR_Dev

Crazy week at work for me, but I did make some progress over the weekend. I rigged up the arm "neck" with IK and PID servos. In terms of design, I added another servo at the base of the head, which allows me to control the pitch of the head in addition to its position. Unfortunately I realized too late that the servo only has a range of 60°, so it will have to be replaced. I also added a rotator to control the roll of the head so that it can remain level along its z axis when the body is not. The base of the whole neck has a rotator as well, which will swing the whole arm around. Neither of those is hooked up yet. Eventually I will use a combination of joystick and buttons to maneuver the head target around, as well as preset positions.

You can see from the bounciness that the servos still have to be tuned. I also haven't figured out the extender yet; that presents an interesting challenge.

PWyhkTA.png

 

The other big thing I've been working on is sending a render texture through my bridge. I got a camera feed down to a byte array, which I can send via the bridge, but the problem comes from a critical flaw in the design. KSP targets .net 3.5, and memory mapped files aren't supported until .net 4. I tried a lot of ways to get around this problem, but eventually ended up starting a custom process at launch, then using that process's i/o to send data over the bridge to Unity. This works fine for single values, but encoding a texture's byte array to strings every frame dropped the frame rate to under 2 fps.
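To make the cost concrete, here is a rough illustration of the data volume involved. The frame dimensions are an assumption (1280x720 RGBA), and base64 stands in for whatever text encoding the process i/o stream would require; the point is simply that every frame forces a multi-megabyte allocate-and-encode pass before the data even leaves the process.

```python
import base64

# Illustration of why pushing a camera frame through a text i/o stream
# is costly: a single RGBA frame must be re-encoded to a string every
# frame. Assumed resolution: 1280x720, 4 bytes per pixel.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1280, 720, 4
frame = bytes(WIDTH * HEIGHT * BYTES_PER_PIXEL)   # ~3.7 MB of raw pixels

encoded = base64.b64encode(frame)                 # text-safe copy, ~33% larger
print(len(frame), len(encoded))                   # → 3686400 4915200
```

A shared memory mapped file avoids both the string conversion and the extra copy entirely, which is why getting mmap working under .net 3.5 matters so much here.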

Really the only solution is to eliminate the custom process and get mmap files working in 3.5. There seem to be a couple of ways to do this, but they are very difficult. If there are any .net gurus out there, I think I have a solution, but I could use some help.

I really want to get a camera feed from KSP into Unity, because it opens a whole new world of things I could do in VR. Specifically, having the neck arm mirror the head's rotation, and displaying the feed on a HUD of some sort.

 

Edited by VR_Dev

So this amazing @Colonel Cbplayer gfy really got me excited to go back to working on this guy.

Spoiler

 

 

G87yUjB.png

In terms of walking, I would like to build a quadruped first and develop some techniques for COM management. But in the short term I think I'll rig up the arms, so that I can control the mech in VR just like the Iron Man project I was working on. I am controlling the arm here with a Touch controller.

I also used to use a custom PID to control the engine mounts, but I want to hook the IR parts straight up to the SteeringManager PID, so the IR engines would effectively work just like stock flaps. Nothing against Cbplayer's transformer, but I would assume there is a fair amount of SAS controlling it. I would like the robotics to do all the work.

Spoiler

 

The main problem with getting this guy to walk is going to be the lack of motion along its back. @ZodiusInfuser, have you ever thought about some kind of ball joint?

Edited by VR_Dev

I updated the main thread with all my thoughts on this project, and I will try my best to keep it updated as the project progresses. If anyone is interested in collaborating on development, or on tuning/design of the mechs themselves, please let me know. I have a million ideas for this project, and I won't be able to get to all of them by myself. There is also the potential for a real-life counterpart. This is something I would definitely need help with, given my hardware skills. I am planning on buying a frame in the near future, and would love some help implementing my Unity controller.

Currently I am working toward a demo release of the project. I have started writing a custom editor that exposes all the tweakable variables the mech uses.

fyi19Nk.png

As well as a UI to track ground speed, which will be important when tuning the craft for top speed. You can see it in this simple VR demo video I made. Eventually you will be able to manipulate the head and arm targets with the VR hand. My end goal is to be sitting in a cockpit, with camera feeds as the cockpit windows and the mech skeleton displayed on a dash in front of me.

VR GIF

Spoiler

 

 

Edited by VR_Dev


So I finally got around to course correction while walking, which allowed the hexapod to walk the full length of the runway in about 15 minutes. I wrote some simple stride-length adjustment code, which allows setting the desired stride length for both the right and left legs at the beginning of each stride. The actual steering is handled by a PID loop that decreases one side's stride length by a percentage of its default length. So if my stride length is 2 meters, the steering PID can shorten it by up to 5%.
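The stride-steering scheme described above can be sketched as follows. This is a minimal illustration, not the mod's actual code: a P-only controller stands in for the full PID, and the gain value is invented. The one idea it demonstrates is real, though: heading error shortens one side's stride, clamped to a fixed percentage of the default length.

```python
# Differential stride steering: shorten one side's stride, by at most
# MAX_ADJUST of the default length, in proportion to the heading error.
DEFAULT_STRIDE = 2.0    # metres (the default stride from the post)
MAX_ADJUST = 0.05       # steering may shorten a stride by up to 5%
KP = 0.02               # illustrative proportional gain, per degree

def stride_lengths(heading_error_deg):
    """Return (left, right) stride lengths for the next step cycle.

    Positive error (drifted right) shortens the left stride so the mech
    turns back left; negative error shortens the right stride."""
    adjust = max(-MAX_ADJUST, min(MAX_ADJUST, KP * heading_error_deg))
    left = DEFAULT_STRIDE * (1.0 - max(adjust, 0.0))
    right = DEFAULT_STRIDE * (1.0 + min(adjust, 0.0))
    return left, right

print(stride_lengths(10.0))   # → (1.9, 2.0)  -- clamped at the 5% cap
print(stride_lengths(0.0))    # → (2.0, 2.0)  -- walking straight
```

Recomputing this once per step cycle (rather than per frame) matches the post's description of adjusting stride length at the beginning of each stride.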

You can see in the output log that at the beginning of every step cycle it prints the error and what the stride is being adjusted by. Right now it's just set to go straight, but once I start adjusting the launch vector, I will be able to control which direction the mech travels. The stride length, PID settings, and max % can all be tweaked from my custom editor. I have been using it to do speed tests, which is why the rockets got lost.

Next up is slope navigation. If you watch to the end of the video you'll see how great it is at it right now. One day I'll record with sound, but my gf and cats don't like the orchestra of IR servos moving.

Steering GIF

 

Edited by VR_Dev

Awesome! I find it funny that since I was made aware of your efforts (and a friend, for completely unrelated reasons, asked me about walker motion), I've been looking back at my 5+ year old notes and arrived at completely different solutions for steering and terrain adaptation. I actually wrote some code over the weekend for the latter that I think would work well, but I need to do more analysis to be sure.


 

My gf was gone, so I worked on this pretty much non-stop over the weekend (also why the recording actually has sound). The hexapod now has full terrain mapping and navigation capabilities. Instead of using the vessel height to position the ground and foot targets, each foot target is now placed on the ground directly beneath it.

Each hip target is now set according to the ground position of its foot. There are two ways of setting the hip targets. Right now they are set at the average height of all the foot ground positions plus the target height. Since recording the video I've realized it would be better to set the hip targets so that the vessel is always the target height above the ground, or use some combination of the two: the first method helps going uphill, while the second is preferable downhill. Also, setting the hip height halfway between the highest (foot ground + target height) and the lowest would ensure each foot has the best chance of hitting its target when going over extremely rugged terrain.
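Two of the hip-height strategies discussed above reduce to one-liners over the per-foot ground heights. This is an illustrative sketch (the function names and the example clearance value are mine, not the mod's), showing how the averaging strategy and the midrange strategy differ on the same terrain sample:

```python
# Hip-height strategies, given each foot's ground height in metres and
# the desired body clearance above the ground.
TARGET_HEIGHT = 1.5   # illustrative clearance value

def hip_height_average(foot_grounds):
    """Current strategy: average of all foot ground heights + clearance."""
    return sum(foot_grounds) / len(foot_grounds) + TARGET_HEIGHT

def hip_height_midrange(foot_grounds):
    """Rugged-terrain strategy: halfway between the highest and lowest
    ground contact, plus clearance, so every foot has the best chance
    of reaching its target."""
    return (max(foot_grounds) + min(foot_grounds)) / 2.0 + TARGET_HEIGHT

grounds = [0.0, 0.2, 0.4, 0.1, 0.3, 0.6]        # six feet on a slope
print(round(hip_height_average(grounds), 3))    # → 1.767
print(hip_height_midrange(grounds))             # → 1.8
```

On this sample the midrange sits higher because a single high contact (0.6 m) pulls the midpoint up more than it pulls the average, which is exactly why the midrange variant copes better with one foot on a rock.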

The vessel also knows whether it is traversing uphill, level ground, or downhill, and sets the color of the ground accordingly. Right now nothing changes based on that, but eventually the gait will respond to the different slopes. I tried taking it out to the mountains, but I still need to find a way to orient the gyroscope to a level surface even when the launch point is not level. The triangle being drawn on the ground represents the leg group that is actually moving the hexapod forward.
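The uphill/level/downhill classification could plausibly be derived from the same per-foot ground data. This sketch is an assumption about the criterion, not the mod's actual logic: it compares the average ground height under the front feet against the rear feet, with a dead zone so small bumps still read as level.

```python
# Hypothetical slope classification from per-foot ground heights.
SLOPE_DEAD_ZONE = 0.05   # metres; differences below this count as level

def classify_slope(front_grounds, rear_grounds):
    """Classify terrain by comparing front vs. rear foot ground heights."""
    delta = (sum(front_grounds) / len(front_grounds)
             - sum(rear_grounds) / len(rear_grounds))
    if delta > SLOPE_DEAD_ZONE:
        return "uphill"
    if delta < -SLOPE_DEAD_ZONE:
        return "downhill"
    return "level"

print(classify_slope([0.5, 0.6], [0.1, 0.2]))   # → uphill
```

The dead zone matters for the gait-switching idea: without it, sensor noise near flat ground would make the mech flicker between gaits every step.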

Building the terrain-mapping box was pretty difficult, as there are 36 vertices being set each frame. It took a while, but it was definitely worth it.

Developing this process, I found a couple of big mistakes in the gait code, which have now been fixed. The stride is now much smoother and more efficient than it used to be. I haven't done a top-speed test since the fixes, but average speeds have picked up across the board. Still lots of tuning to be done.

Next up is steering.

 

Edited by VR_Dev

Some slopes are still slippery for the bot, but it walks much, much better than before :) Great progress.


That is one impressive terrain-adaptation algorithm you've got there! Am I right in thinking you check the height of the leg relative to the ground on each update, rather than just on contact? I ask because if you were to translate this to real life, getting that height value won't be possible, so another approach to terrain adaptation would be required, which is something I'm currently looking into.

Good luck tackling steering, especially if you want to combine translation/strafing with turning. Not impossible, but doing it well is likely to require a lot of maths and may not fit with your current circular striding approach.

