
Can I use ROS# inside a KSP plugin?



So I'm totally new to C# development and have no idea what I'm doing. I want to create a bridge between KSP and the Robot Operating System (ROS). I know what I'm doing in ROS for the most part, but I'm having trouble determining whether it's worth it. ROS is a modular software framework that uses a publisher/subscriber model. To start, I'm just trying to control one axis of the rocket during ascent: the plugin would need to publish the current angle of the rocket and subscribe to a control input that would be fed into KSP as flight inputs.
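To make the goal concrete, here's a rough sketch of the traffic such a bridge would exchange, assuming the rosbridge JSON protocol that ROS# speaks over a websocket. The topic names and message types below are placeholders I made up for illustration, not anything from KSP or ROS#:

```python
import json

# Sketch only: rosbridge v2 "subscribe" and "publish" ops are real,
# but the /ksp/* topics and the Float64 choice are assumptions.
def make_subscribe(topic, msg_type):
    """Ask the rosbridge server to forward a topic's messages to us."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

def make_publish(topic, msg):
    """Push one message onto a topic via the rosbridge server."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# One-axis control loop traffic, as described above:
# receive a pitch command, report the current pitch angle.
sub = make_subscribe("/ksp/pitch_cmd", "std_msgs/Float64")
pub = make_publish("/ksp/pitch_angle", {"data": 12.5})
```

The KSP plugin itself would build the same JSON in C# and feed the subscribed command into the vessel's flight control state each physics frame.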

An employee from Siemens has created a ros-bridge package that focuses on using Unity for robot simulations (wiki here). I just want to know if I can use these plugins inside of a KSP plugin.


  • 1 month later...

I was just thinking about the same thing! 

Not sure if you've found anything or tried it yourself yet? I haven't, but I'm actually in the early process of learning ROS. However, I've been using Unity & C# for almost a decade (pretty much know them like the back of my hand at this point). Potentially a nice knowledge trade there if you wanna collaborate a little?

The only thing I'm not familiar with is KSP's API, its deployment method, and how (or how much of) the action system is exposed in the API (as it is in the "Actions" tab of the in-game editor, for instance). It looks like a good bit, but I'll look harder later. Getting simple two-way communication shouldn't be too bad at all, though.

Otherwise, I'm actually getting really excited about doing this!

However, I'm thinking a bit more ambitiously about the functionality (it's late, err 'early' here, but): if the action system is exposed in a way that is easily serialized, which I have to assume it is since it all gets broken down to XML at some point, it might actually be easier for ROS to control all available actions of every part than to do anything hacky with specific controls. Couple that with what looks like a pretty nice, simple example of Unity camera image messaging in the link you provided (thanks for that, btw!), and this could be a *really* nice tool for almost any robotics dev (perhaps with other future or existing extensions installed).
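The "control every exposed action" idea boils down to a dispatch table from topic names to action callbacks. A minimal sketch, with the caveat that the topic names, message shape, and handler are all hypothetical (a real plugin would do this in C# against whatever KSP's action API actually exposes):

```python
# Dispatch-table sketch: each exposed KSP action gets its own topic,
# and incoming messages are routed to the matching handler.
class ActionBridge:
    def __init__(self):
        self.handlers = {}
        self.log = []  # records invoked actions, for demonstration

    def register(self, topic, handler):
        """Map a topic name to a callable that triggers the KSP action."""
        self.handlers[topic] = handler

    def on_message(self, topic, msg):
        """Route one incoming message to its registered handler, if any."""
        handler = self.handlers.get(topic)
        if handler:
            handler(msg)

bridge = ActionBridge()
# Hypothetical action: toggle the landing gear when a message arrives.
bridge.register("/ksp/action/toggle_gear",
                lambda m: bridge.log.append(("gear", m["data"])))
bridge.on_message("/ksp/action/toggle_gear", {"data": True})
```

If the action list really is serializable, the `register` calls could be generated automatically from it rather than written by hand.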

This of course makes me immediately think of part opportunities (for instance, a RealSense camera to throw into the scene that ROS could then use), but I think that points out quite quickly how this would NEED to handle time acceleration properly. In other words, time could in *some* instances be slowed for accuracy's sake: the multiple renderings required for simulated robotics *will* hit the frame rate pretty hard, and fairly quickly (depending on how many cameras you need). (Notice, of course, I'm talking about vision-guided, rocket-powered aerial and land drones here, not just MechJeb-type stuff! :cool:)

That's why, in the Udacity self-driving car course simulator, they re-record training images of a car you drive, so you train your DL models on your recorded actions, *not* your 'live' driving; otherwise the lag (particularly as the physics engine tries to catch up after garbage collections and each image capture) would kill the real-time 'play'. Something like that might be the only truly "difficult" part of this, if in the future someone (like myself, actually!!!) wanted to use it to train and test any form of controller, including deep learning. KSP mods seem to have a difficult time 'forcing' objects to be where they want them (not sure if that's just a lack of Unity understanding on the part of those modders, or a true conflict with KSP's physics additions), so that's a potential issue for 'future' growth in regard to user-generated training data. Not something to be concerned with early on, though; all classical control methods would work ok (as 'ok' as they do in any quasi-real-time simulation, at least).

I think the real "work" here, though, is just going to be UI dev within KSP to allow defining which 'sensors' and which 'actuators' get put onto which topics (and that's really just learning how they did their editor). An expert user could of course make use of this far before that (perhaps much more to the point of your original question), but I think that's where this gets REALLY interesting! :)

I mean, if you have ROS experience, it stands to reason any well-documented code bridge between it and KSP could be used without all that, but to at least have a UI for linking parts to particular topics would greatly increase the utility and market, not only for ROS play, but for robotics dev almost as a whole.

Do you know how ROS nodes tend to handle time, or how closely they stick to a particular method? I know anyone can write an executable and tie it into ROS, in which case any timing mechanism could be used (kinda gonna be an issue for *some* things, I imagine), but are there any design patterns that use a specific topic or message element for "time" (or other things), instead of direct access to date/time APIs?
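For what it's worth, ROS does have a standard pattern for exactly this: set the `/use_sim_time` parameter and publish `rosgraph_msgs/Clock` on the `/clock` topic, and nodes that go through the ROS time APIs (rather than the OS clock) follow that simulated clock, so time warp would slow or pause "ROS time" for every node at once. Below is a sketch of what the KSP side might publish, assuming a rosbridge-style JSON transport (the transport choice and the sample value are my assumptions):

```python
import json

# Build a rosbridge "publish" envelope carrying a rosgraph_msgs/Clock
# message on the standard /clock topic. KSP's own simulation time
# (including warp) would be fed in as sim_seconds each frame.
def clock_message(sim_seconds):
    secs = int(sim_seconds)
    nsecs = int(round((sim_seconds - secs) * 1e9))
    return json.dumps({
        "op": "publish",
        "topic": "/clock",
        "msg": {"clock": {"secs": secs, "nsecs": nsecs}},
    })

msg = clock_message(42.25)  # 42 s and 0.25 s into the mission, say
```

Nodes that bypass ROS time and read the wall clock directly would still drift, of course, which is probably the "how well do they stick to it" part of the question.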

