TRIDENT EU Project final experiments

The TRIDENT EU project final experiments were carried out at Sóller harbour, Mallorca (Spain), in October 2012. An autonomous recovery of a black box was demonstrated, using an I-AUV composed of the Girona 500 vehicle, a 7-DOF electric arm built by GraalTech and a three-fingered hand made at the University of Bologna. The complete system ran in free-floating mode, meaning that all degrees of freedom were controlled simultaneously in order to reach and recover the target.


UFF – a.k.a. Unmanned Fish Feeder

Few people are at the lab these days. Only our two new fish members stay at the water tank, and they need to eat! Fortunately Arduino is here to make things easy. This is a home-made, Arduino-powered fish feeding device:

Home-made Fish Feeder

A pot containing delicious fish food is attached to a small SG90 servo that makes it turn. The screw top (here upside down) is fixed, and the screw has been removed. The pot also has a little hole with a flap that helps the food leave when the pot turns. The servo is connected to an Arduino that moves it between two pre-defined positions for a short time, and then waits 12 hours until the next meal 😉
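The whole sketch fits in a few lines. Something like the following (a sketch of the idea, not the exact code on the board: the servo pin, the two positions and the two-second tilt are assumptions, so adjust them for your setup):

```cpp
#include <Servo.h>

Servo feeder;                        // SG90 servo holding the food pot

const int SERVO_PIN = 9;             // assumed: servo signal wire on pin 9
const int REST_POS  = 0;             // pot upright, flap covering the hole
const int FEED_POS  = 90;            // pot tilted, food falls out
const unsigned long MEAL_INTERVAL_MS = 12UL * 60UL * 60UL * 1000UL; // 12 h

void setup() {
  feeder.attach(SERVO_PIN);
  feeder.write(REST_POS);
}

void loop() {
  feeder.write(FEED_POS);            // tilt the pot...
  delay(2000);                       // ...for a couple of seconds
  feeder.write(REST_POS);            // back to the rest position
  delay(MEAL_INTERVAL_MS);           // wait 12 hours until the next meal
}
```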

Hopefully this will keep our underwater friends well fed during August…


Loading URDF models in Android devices

These days I’ve been figuring out how to load URDF-based models in Android without having to re-write too much code. Instead, the idea was to take advantage of all the code already existing in ROS. Viewing URDF models in ROS basically involves four packages:

  • ogre, a wrapper around OGRE3D: the open-source graphics rendering engine used by rviz and gazebo.
  • ogre_tools, that contains code for rendering basic geometric primitives and loading meshes with OGRE3D.
  • urdf_interface and urdf_parser, which basically parse a URDF file and build up a data structure from it.
  • rviz, specifically the code under the robot/ folder, which calls ogre_tools and urdf_parser and actually builds the scene graph.

So, it wouldn’t be an easy task ;). Doing it with rosjava was one of the possibilities, but it would have involved re-writing much of the code, especially for URDF parsing and, even more importantly, 3D visualization. It is common practice in Android (and I guess on other platforms too!) to perform 3D rendering in native C/C++ code, mainly because of the high performance required.

Fortunately, OGRE3D has been improving its Android support in recent weeks (check this forum thread). This became my first option, since having OGRE on Android would allow re-using much of the existing ROS code. Cross-compiling Ogre with the Android NDK was not easy, but it worked fine after some fixes. I was able to get the demos running on my Galaxy Tab 10.1, and was actually surprised by the number of examples and nice features already supported!

The next step was to cross-compile ogre_tools, urdf_parser, urdf_interface and the robot part of rviz. These mainly depend on ros/console, TinyXML and…. libboost 😦 I got rid of the first dependency by re-defining the macros ROS_ERROR, ROS_WARN and ROS_DEBUG. TinyXML fortunately is tiny, as the name indicates 😉 I could cross-compile it without many problems (which doesn’t mean I didn’t have any!). My main concern was Boost, but guess what? There is a GitHub project that downloads and cross-compiles a number of common open-source libraries for you, and Boost is among them! That really saved me a lot of time.
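For illustration, the macro re-definition can be as simple as routing everything through one variadic helper. This is only a sketch of the idea, not the actual code I used: the `last_log` string is there just to make the redirection visible, and on a real device one would forward to Android’s logging facility instead of stderr:

```cpp
#include <cstdarg>
#include <cstdio>
#include <string>

// Last formatted line, kept so the redirection can be inspected.
static std::string last_log;

static void ros_log(const char *level, const char *fmt, ...) {
    char buf[256];
    va_list ap;
    va_start(ap, fmt);
    std::vsnprintf(buf, sizeof buf, fmt, ap);
    va_end(ap);
    last_log = std::string("[") + level + "] " + buf;
    std::fprintf(stderr, "%s\n", last_log.c_str());
}

// Drop-in replacements for the rosconsole macros used by the ported code.
#define ROS_DEBUG(...) ros_log("DEBUG", __VA_ARGS__)
#define ROS_WARN(...)  ros_log("WARN",  __VA_ARGS__)
#define ROS_ERROR(...) ros_log("ERROR", __VA_ARGS__)
```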

With all the dependencies installed I was able to cross-compile the ROS packages and put everything into a proof-of-concept OGRE demo that takes a .urdf file as input and renders the robot. Check out how the PR2 looks in Android!:

Next I would like to check how to run OGRE as a native part inside a rosjava app. rosjava could receive ROS messages from the ROS network and update the visualization on the native part.


OSG Interactive Markers moved to trunk

The Interactive Markers client for OpenSceneGraph has become usable, and has therefore been moved to the trunk of our uji-ros-pkg repository, inside the osg_interactive_markers package. This package is basically an OpenSceneGraph (OSG) adaptation of the Interactive Markers client written for rviz/Ogre. Most of the code has been taken from the rviz sources and adapted to use OSG data types and facilities where possible. It allows the creation of Interactive Markers in OpenSceneGraph applications. See the corresponding wiki page on the ROS website for more information.

The osg_interactive_markers package is part of a more general visualization_osg stack, together with two more packages: osg_markers and osg_utils. osg_markers can be used to create Marker geometry in OSG, whereas osg_utils contains some classes that may be useful in ROS-OSG applications, like a FrameManager (also adapted from rviz), utilities for searching the scene graph, etc. All three can be linked as libraries inside your OSG projects.
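In a rosbuild-era package, declaring the dependencies would look along these lines (just a sketch showing the depend entries, with the rest of the manifest omitted):

```xml
<!-- manifest.xml (fragment) of a package using the visualization_osg stack -->
<depend package="osg_markers"/>
<depend package="osg_utils"/>
<depend package="osg_interactive_markers"/>
```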


How to choose the rotation axis in an OSG RotateCylinderDragger

I’ve been looking for an easy way to change the default rotation axis of a RotateCylinderDragger in OpenSceneGraph, and didn’t find one that suited my needs. You can always introduce intermediary transforms, but that would be sub-optimal, and can cause problems when the RotateCylinderDragger is part of a more general CompositeDragger. Diving through the OSG API I found an easy solution that basically replaces the default constructor and customizes the Projector instead of using the default one. The new constructor takes a quaternion (osg::Quat) as argument, which indicates the orientation of the new rotation axis with respect to the default one (0,0,1) used when creating an osg::Cylinder. I post a snippet below in case it is useful to someone else:

#include <osgManipulator/Dragger>
#include <osgManipulator/RotateCylinderDragger>
#include <osgManipulator/Projector>
#include <osg/Shape>

/** A custom RotateCylinderDragger that lets the user select the rotation axis */
class CustomRotateCylinderDragger : public osgManipulator::RotateCylinderDragger {
public:
    CustomRotateCylinderDragger(const osg::Quat &rotation) {
        // Rotate the default axis (0,0,1) of the cylinder used by the projector
        osg::Cylinder *c = new osg::Cylinder();
        c->setRotation(rotation);
        _projector = new osgManipulator::CylinderPlaneProjector(c);
        setColor(osg::Vec4(0.0f, 1.0f, 0.0f, 1.0f));
        setPickColor(osg::Vec4(1.0f, 1.0f, 0.0f, 1.0f));
    }

    ~CustomRotateCylinderDragger() {}
};
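For instance, to get a dragger that rotates about the X axis instead of the default Z, something like this should do (a sketch assuming the class above; osg::PI_2 comes from osg/Math):

```cpp
// A 90-degree rotation about Y maps the default axis (0,0,1) onto the X axis
osg::Quat zToX(osg::PI_2, osg::Vec3(0.0, 1.0, 0.0));
osg::ref_ptr<CustomRotateCylinderDragger> dragger =
        new CustomRotateCylinderDragger(zToX);
dragger->setupDefaultGeometry();
```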

TRIDENT 2nd Annual Review

The IRSLab was in Girona last week, participating in the 2nd Annual Review of the TRIDENT FP7 European project. The main milestone to show was the mechatronic integration of the Girona 500 AUV, the GraalTech arm (University of Genoa) and the hand (University of Bologna). All the systems were successfully integrated and a recovery mission was carried out in a tele-operated manner. See the following video for a summary:

Simultaneously, two other demos were performed in simulation using UWSim. They showed the leader-following controller (Instituto Superior Técnico, Lisbon) and the free-floating controller (University of Genoa). Both integrated different parts of the whole TRIDENT architecture, such as target detection and tracking, pose estimation, arm/hand control, vehicle control and mission planning:


Interactive Markers with OpenSceneGraph

My last post was about visualization of ROS Markers with the OpenSceneGraph rendering library. I have now added the interactive part, thus allowing OSG-based applications to render Interactive Markers, and users to interact with them. There are still things to do, like returning feedback or supporting menus. The code, still experimental, is under the visualization_osg stack in our Google Code repository. More to come!


ROS Markers in OSG: Hello world

These days I’m trying to implement rendering of ROS Interactive Markers in OpenSceneGraph, so that we can start using them inside UWSim. Fortunately I’m able to reuse much of the code already implemented in rviz. I have first focused on the rendering of visualization_msgs/Markers: I had to replace the calls to Ogre with their OSG equivalents, and remove any dependency on rviz-specific classes. I guess things will become a bit more difficult on the “Interactive” part :-S

Below is a snapshot of UWSim listening to the output of the basic_controls tutorial. I have started a new experimental stack visualization_osg in our uji-ros-pkg repository.

OSG visualization of the markers in the basic_shapes tutorial


Control of UWSim vehicles with MATLAB through ipc_bridge

This post describes how to interface control algorithms developed in Matlab with UWSim, using the IPC-Bridge.

IPC-Bridge is a ROS stack developed by Nathan Michael (Penn Robotics) that allows communication between Matlab and ROS via IPC. The current version of IPC-Bridge includes built-in support for nav_msgs/Odometry messages, which is the type of message that UWSim uses for updating the position and velocity of underwater vehicles.

You will need to create a new launch file that will set up the bridge for Odometry messages between Matlab and a ROS network:

<launch>  
  <node pkg="ipc" name="central" type="central" output="screen" 
        args="-su"></node>
  <node pkg="ipc_nav_msgs" name="example_node" 
        type="nav_msgs_Odometry_subscriber" output="screen">
    <remap from="~topic" to="/vehiclePose"/>
    <param name="message" value="odometry"/>  
  </node>
</launch>

You will also need to configure the ROS interface for vehicles in UWSim accordingly. Just place this code inside the corresponding section of your scene XML (replace the vehicle name with yours):

<ROSOdomToPAT> 
  <topic>/vehiclePose</topic>
  <vehicleName>vehicle</vehicleName>
</ROSOdomToPAT>

Then, you just have to publish an Odometry message from Matlab. You can take as template one of the examples that come with ipc_bridge. It should look something like this (which sets a sinusoidal trajectory around position 0,0,-3.5):

% create a publisher that publishes a nav_msgs/Odometry message
pid_id=nav_msgs_Odometry('connect','publisher', ...
                         'example_module','odometry');

% create an empty nav_msgs/Odometry message structure
msg=nav_msgs_Odometry('empty');
msg.header.frame_id='world';
msg.child_frame_id='vehicle';

angle=0.0;
while (1)    
  % Set position
  msg.pose.pose.position.x = 0.2*sin(2*angle);
  msg.pose.pose.position.y = 0.2*sin(angle);
  msg.pose.pose.position.z = -3.5+0.2*sin(angle);
  % Set orientation
  msg.pose.pose.orientation.x = 1;
  msg.pose.pose.orientation.y = 0;
  msg.pose.pose.orientation.z = 0;
  msg.pose.pose.orientation.w = 0;
  % Twist linear
  msg.twist.twist.linear.x = 0.0;
  msg.twist.twist.linear.y = 0.0;
  msg.twist.twist.linear.z = 0.0;
  % Twist angular
  msg.twist.twist.angular.x = 0.0;
  msg.twist.twist.angular.y = 0.0;
  msg.twist.twist.angular.z = 0.0;

  %Send message
  nav_msgs_Odometry('send',pid_id,msg);
  angle=angle+0.0001;
end

rostweet now lets your robots share pictures

Cloud robotics has been getting more and more attention recently. See for example the Google initiative presented at I/O 2011, or the RoboEarth project funded by the European Commission. The idea is to let robots share information on the cloud, with the goal of building a global knowledge base and ultimately accelerating robotics development.

We recently presented rostweet, a ROS package that lets your ROS-based robots post text to a Twitter account. We have now extended it with support for posting pictures, using the recent post_with_media Twitter API. This enables the sharing of images among ROS-based robots through Twitter, which could have interesting applications in ‘social’ object recognition, learning, or simply communication among robots and between robots and humans.

This is how it works: running rostweet creates a ROS service that your node can call to post a tweet with some text (string) and an optional image (sensor_msgs/Image). In addition, incoming tweets are published on a topic as they arrive, also in the format of string + sensor_msgs/Image. Just create a Twitter account for your robot and start following other robots (or humans!). For now, rostweet only supports the image media type, although support for other types of media can easily be added.

Please check our first post for installation instructions, although there are some changes in this latest version:

  • Posting a tweet is now done through a service call. An example node called ‘post’ has been included in the package. Use it as rosrun rostweet post "<text>" [<path_to_picture>]
  • The rostweet_msgs/IncomingTweet message now also includes an image. To see just the text of incoming tweets, do rostopic echo /rostweet/incomingTweet/tweet