ROS 1.0.3 has been released! This is a patch update for users of Box Turtle. We recommend that all Box Turtle users upgrade to this new version. This update also includes new compatibility with the recent release of Ubuntu Lucid Lynx. Our debian packages are currently building and should be ready soon. rosinstall and SVN users can update now.
We appreciate all the ROS users who have contributed bug reports and patches that have made this release possible. The list of changes is below.
We've updated quite a few stacks in the past week. As you can tell from the dual 1.0.x and 1.1.x releases, we are moving forward with our C-Turtle development, though we are continuing to patch problems with Box Turtle as necessary.
The ROS community has grown an amazing amount this year. As the Robots Using ROS series has illustrated, there are all types of robots using ROS, from mobile manipulators, to autonomous cars, to small humanoids. As the types of robots have increased, so too has the variety of software you can use with ROS, whether it be hardware drivers, libraries like exploration, or even code for research papers. This diversity has allowed all types of developers, including researchers, software engineers, and students, to participate in this growing community.
Today we officially crossed the 1000 ROS package milestone. This is due in no small part to the many new ROS repositories that have come online this year. We are now tracking 25 separate ROS repositories that are providing open source code, including repositories from:
Several stacks have been updated, including the unstable ROS 1.1 series. Both joystick_drivers and laser_drivers have also started 1.1.x unstable development releases to test new features for C-Turtle.
We're very excited to announce that the mapping library from SRI International's Karto Robotics is now open source with an LGPL license. This mapping library contains a scan matcher, pose graph, loop detection, and occupancy grid construction -- all important building blocks for 2D navigation. When combined with Willow Garage's Sparse Pose Adjustment (SPA) for optimization (in the sba ROS package), it forms a complete stand-alone library for robust 2D mapping.
The Karto mapping library is being hosted on code.ros.org, and we've already integrated it with the ROS navigation stack. The Karto team recently benchmarked various SLAM systems on the RAWSEEDS dataset and found that the newest Karto 2.0 with SPA is slightly less precise than Karto 1.1, but more consistent and faster. In terms of maximum error, Karto 2.0 performed as well as a localization-based solution (MCL). A paper describing the SPA technique is due to be published later this year.
Vincent, Regis; Limketkai, Benson; and Eriksen, Michael. "Comparison of indoor robot localization techniques in the absence of GPS." In Proceedings of SPIE Volume 7664: Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XV, Defense, Security, and Sensing Symposium.
Karto SRI/Willow Garage Integration Team: Kurt Konolige, Benson Limketkai, Michael Eriksen, Regis Vincent, Brian Gerkey, Eitan Marder-Eppstein
ROS 1.1.2 has been released. This is an unstable release to test new features for ROS.
The major new feature in this release is the integration of roslisp. roslisp is a full-featured Common Lisp client library for ROS that is being developed by Bhaskara Marthi (Willow Garage) and Lorenz Mösenlechner (TUM). We are now including it in the ROS stack to make its capabilities more easily accessible, especially in shared install setups where ROS messages have already been built.
This update does not require ROS users to install Lisp. We have created a separate "roslisp_support" stack with a "roslisp_runtime" package that will trigger an install of SBCL when required. Current roslisp users will need to add a dependency on the roslisp_runtime package so that installation scripts work properly.
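For example, a ROS 1 manifest.xml declaring that dependency might look like the following (the package name and description here are hypothetical; only the roslisp_runtime dependency is the point):

```xml
<package>
  <description brief="my_lisp_node">Example roslisp-based node</description>
  <license>BSD</license>
  <!-- Pulls in SBCL via roslisp_runtime only when this package is installed -->
  <depend package="roslisp_runtime"/>
</package>
```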
Another major update in this release is improved performance for rospy message serialization. The new optimizations should yield a 2x or better speedup for many types of messages, thanks largely to James Bowman's patches.
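The flavor of this kind of optimization can be illustrated with plain Python struct packing (a sketch of the general technique, not the actual rospy patch): serializing a numeric array with a single struct.pack call produces the same bytes as packing each element individually, with far fewer Python-level calls.

```python
import struct

values = [1.5, 2.5, 3.5, 4.5]

# Naive approach: one pack call per element.
naive = b"".join(struct.pack("<f", v) for v in values)

# Batched approach: a single pack call for the whole array.
batched = struct.pack("<%df" % len(values), *values)

assert naive == batched  # identical wire format, one call instead of four
```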
As with our previous unstable releases, we ask that general users refrain from updating to this release unless they need to test integration for the upcoming C-turtle distribution. We do recommend that roslisp users test this new setup and provide feedback on this new roslisp core integration.
ROS 1.0.2 has been released! This is a patch update for users of Box Turtle. We recommend that all Box Turtle users upgrade to this new version. Our debian packages are currently building and should be ready soon. rosinstall and SVN users can update now.
We appreciate all the ROS users who have contributed bug reports and patches that have made this release possible. The list of changes is below.
visualization_common 1.1.0 and visualization 1.1.0 have been released. This is an unstable development release to try out features considered for inclusion in their 1.2 versions. visualization 1.0 will work with visualization_common 1.1, but visualization 1.1 requires visualization_common 1.1. The largest changes in these releases involve additions to the visualization_msgs/Marker API.
OpenCV 2.1 has been released. In addition to many improvements under the hood, OpenCV 2.1 adds the GrabCut (C. Rother, V. Kolmogorov, and A. Blake) image segmentation algorithm. The stereo libraries have also been updated with new and improved algorithms, including H. Hirschmuller's semi-global stereo matching algorithm.
Mac OS X users will be happy to know that OpenCV has been updated for Snow Leopard. You can now build it as a 64-bit library, and highgui has been updated with new Cocoa and QTKit backends (thanks Andre Cohen and Nicolas Butko). Windows users can also build 64-bit binaries using MSVC 2008.
There are numerous other improvements with this release. We encourage users to check the change list to find out more.
On an administrative note, OpenCV has migrated from SourceForge to code.ros.org to take advantage of faster servers. The ticket tracker has also been migrated.
The Minoru is an inexpensive stereo webcam that can now be used with ROS. Bob Mottram recently updated the v4l2stereo library so you can now use both the library and the Minoru camera easily with ROS, as the video below demonstrates. The v4l2stereo library also integrates well with OpenCV.
You can find instructions on the Sentience site on how to easily remove the commercial packaging around the Minoru sensor, as well as instructions on how to use it with ROS.
HERB (Home Exploring Robotic Butler) is a mobile manipulation platform built by Intel Research Pittsburgh, in collaboration with the Robotics Institute at Carnegie Mellon University. HERB is designed to be a "robotic butler" and has been demonstrated in a variety of real-world kitchen tasks, such as opening refrigerator and cabinet doors, finding and collecting coffee mugs, and throwing away trash. HERB is powered by a variety of open-source libraries, including several developed by CMU researchers, like OpenRAVE and GATMO.
OpenRAVE is a software platform for robotics that was designed specifically for the challenges related to motion planning. It was created in 2006 by Rosen Diankov, and in late 2008 he integrated it with ROS. The benefits of this integration can be seen on HERB.
HERB has a Barrett WAM arm, a pair of low-power onboard computers, Pointgrey Flea and Dragonfly cameras, a SICK LMS lidar, a rotating Hokuyo lidar, and a Logitech 9000 webcam, all of which sit on a Segway RMP200 base. HERB communicates with off-board PCs over a wireless network.
ROS is glue for this setup: ROS is used for the hardware drivers, process management, and communication on HERB. ROS' ability to distribute processes across computers is used to help perform computation off the robot.
OpenRAVE provides an environment on top of this that unifies the controls and sensors for doing motion-planning algorithms, including sending trajectories to the arm and hand. OpenRAVE implements Diankov et al.'s work on caging grasps, which enables HERB to perform tasks like opening and closing doors, drawers, and cabinets, and turning handles.
In addition to manipulating objects, HERB has to be able to keep track of people and other movable objects that exist in real-world environments. HERB uses the GATMO (Generalized Approach to Tracking Movable Objects) library to track these movable objects. GATMO was developed by Garratt Gallagher and is available from gatmo.org. The GATMO library includes packaging and installation instructions for ROS.
The collaboration between CMU and Intel Labs Pittsburgh has produced numerous other libraries that have found their way into ROS. Rosen Diankov started the cmu-ros-pkg repository, which houses many of these libraries, and he also wrote rosoct, an Octave client library for ROS. Another library of note is the chomp_motion_planner package, which was implemented by Mrinal Kalakrishnan based on the work of Ratliff et al.
We just released robot_model 1.1.0. This is an unstable development branch to prepare new features for the next ROS distribution. The most exciting new feature is the URDF to COLLADA conversion tool. For more details, check out the stack documentation and change list.
navigation 1.1.0 has been released. This is an unstable development release to try out features considered for inclusion in navigation 1.2. The 1.1.0 release contains a service call for navfn that allows plans to be made via ROS, a service call to move_base that clears unknown space in the costmap, and a fix to velocity computation in the base_local_planner that could allow robots to drive a bit faster, but needs testing to be sure it's safe. Unless you have an immediate need for features in the 1.1 series, we recommend that you stay on the 1.0 series until things stabilize with a 1.2 release.
Added a clear_unknown_space service to move_base that allows external users to clear unknown space around the robot in the costmap.
Added a make_plan service to navfn that allows the use of the planner via ROS.
Changed ROS_ASSERT to ROS_FATAL with a std::runtime_error when checking for legal configurations.
Costmap2DROS is now const-correct.
Fixed a bug in how velocities were being computed. This fix will make the robot a bit more aggressive in how it drives and is going into the 1.1 series for testing.
The Intelligent Autonomous Systems Group at TU München (TUM) built TUM-Rosie with the goal of developing a robotics system with a high degree of cognition. This goal is driving research in 3D perception, cognitive control, knowledge processing, and high-level planning. TUM is building their research on TUM-Rosie using ROS and has set up the open-source tum-ros-pkg repository to share their research, libraries, and hardware drivers. TUM has already released a variety of ROS packages and is in the process of releasing more.
TUM-Rosie is a mobile manipulator built on a Kuka mecanum-wheeled omnidrive base, with two Kuka LWR-4 arms and DLR-HIT hands. It has a variety of sensors for accomplishing perception tasks, including a SwissRanger 4000, FLIR thermal camera, Videre stereo camera, SVS-VISTEK eco274 RGB cameras, a tilting "2.5D" Hokuyo UTM-30LX lidar, and both front and rear Hokuyo URG-04LX lidars.
One of the new libraries that TUM is developing is the cloud_algos package for 3D perception of point cloud data. cloud_algos is being designed as an extension of the pcl (Point Cloud Library) package. The cloud_algos package consists of a set of point-cloud-processing algorithms, such as a rotational object estimator. The rotational object estimator enables a robot to create models for objects like pitchers and boxes from incomplete point cloud data. TUM has already released several packages for semantic mapping and cognitive perception.
TUM is also working on systems that combine knowledge reasoning with perception. The K-COPMAN (Knowledge-enabled Cognitive Perception for Manipulation) system in the knowledge stack generates symbolic representations of perceived objects. This symbolic representation allows a robot to make inferences about what is seen, like what items are missing from a breakfast table.
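As a toy illustration of that kind of inference (not K-COPMAN's actual API or representation), comparing a symbolic model of a complete place setting against the objects perception reported yields the missing items:

```python
# Hypothetical symbolic model of a complete breakfast place setting.
expected = {"plate", "cup", "knife", "spoon", "napkin"}

# Objects the perception system actually reported on the table.
perceived = {"plate", "cup", "knife"}

# The inference: whatever is expected but not perceived is missing.
missing = expected - perceived
print(sorted(missing))  # → ['napkin', 'spoon']
```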
In the field of knowledge processing and reasoning for personal robots, TUM developed the KnowRob system that can provide:
spatial knowledge about the world, e.g. the positions of obstacles
ontological knowledge about objects, their types, relations, and properties
common-sense knowledge, for instance, that objects inside a cupboard are not visible from outside unless the door is open
knowledge about the functions of objects like the main task a tool serves for or the sequence of actions required to operate a dishwasher
KnowRob is part of the tum-ros-pkg repository, and there is a wiki with documentation and tutorials.
At the high level, TUM is working on CRAM (Cognitive Robot Abstraction Machine), which provides a language for programming cognitive control systems. The goal of CRAM is to allow autonomous robots to infer decisions, rather than relying only on pre-programmed ones. Practically, the approach should enable robots to tackle the complete pick-and-place housework cycle: setting the table, clearing it, loading and unloading the dishwasher, and returning items to their storage locations. CRAM features showcased in this scenario include probabilistic inference of which items should be placed where on the table, which items are missing, where items can be found, which items can and need to be cleaned in the dishwasher, and so on. As robots become more capable, it will be much more difficult to explicitly program all of their decisions in advance, and the TUM researchers hope that CRAM will help drive AI-based robotics.
Researchers at TUM have also made a variety of contributions to the core ROS system, including many features for the roslisp client library. They are also maintaining research datasets for the community, including a kitchen dataset and a semantic database of 3d objects, and they have contributed to a variety of other open-source robotics systems, like YARP and Player/Stage.
Research on the TUM-Rosie robot has been enabled by the Cluster of Excellence CoTeSys (Cognition for Technical Systems).
For more information:
The Modlab at Penn designed the CKBot (Connector Kinetic roBot) module to be fast, small, and inexpensive. These qualities enable it to be used to explore the promise of modular robotics systems, including adaptability, reconfigurability, and fault tolerance. They've researched dynamic rolling gaits, which use a loop configuration to achieve speeds of up to 1.6 m/s, as well as bouncing gaits by attaching passive legs. They are also using the CKBots to research the difficult problem of configuration recognition, and, for the Terminator 2 fans, they have even demonstrated "Self re-Assembly after Explosion" (SAE).
More recently, Modlab has developed ROS packages that can be used when the CKBots are connected to a separate ROS system. They have also created an open source repository, modlab-ros-pkg, for CKBot ROS users. The CKBot modules only have a few PIC processors -- not enough to run ROS -- so an off-board system enables them to use algorithms that require more processing power. In one experiment, they used a camera to locate AR tags on the CKBot modules. The locations were stored in tf, which was used to calculate coordinate transforms between modules. They have also used rviz to display the estimated position of modules during SAE when AR tags were not in use.
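The coordinate bookkeeping tf performs for them can be sketched with 2D homogeneous transforms in plain Python (an illustration of the underlying math, not the tf API): given camera→moduleA and camera→moduleB transforms, the moduleA→moduleB transform is A⁻¹·B.

```python
import math

def make_tf(x, y, theta):
    """2D homogeneous transform: rotation by theta, translation (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def invert(t):
    """Invert a rigid 2D transform: transpose the rotation, rotate and negate the translation."""
    (r00, r01, x), (r10, r11, y), _ = t
    return [[r00, r10, -(r00 * x + r10 * y)],
            [r01, r11, -(r01 * x + r11 * y)],
            [0.0, 0.0, 1.0]]

# Camera sees module A at (1, 0) and module B at (2, 0), both unrotated.
cam_to_a = make_tf(1.0, 0.0, 0.0)
cam_to_b = make_tf(2.0, 0.0, 0.0)

# Transform from A's frame to B's frame: B sits 1 m ahead of A.
a_to_b = matmul(invert(cam_to_a), cam_to_b)
print(a_to_b[0][2], a_to_b[1][2])  # → 1.0 0.0
```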
One of the projects Modlab is currently working on is a "mini-PR2" made out of CKBot modules. The mini-PR2 will be kinematically similar to the Willow Garage PR2 and is powered by a separate laptop. You can see an early prototype of mini-PR2 opening an Odwalla fridge:
pr2_controllers 1.1.0 has been released. This is an unstable development release to try out features considered for inclusion in pr2_controllers 1.2. The 1.1.0 release contains some additional configuration options and a new joint trajectory controller with an action interface built in.
The projector controller now updates the projector current setting each time through the realtime loop. Previously, some lockout conditions caused the current to be set to zero, and it was never reset to the proper value.
pr2_controllers_msgs now autogenerates its action messages during the build step. Removed the autogenerated files from version control.
pr2_gripper_action now depends explicitly on pr2_mechanism_model.
Upped the tolerance for determining that a joint is stopped in the joint trajectory action to handle jittering in Gazebo.
Removed push_back from the realtime part of the calibration.
Made the stall velocity threshold configurable in pr2_gripper_action (#4073).
Created the joint trajectory action controller, which combines the joint spline trajectory controller and the joint trajectory action.
Upped the tolerance for determining if the controller is following a different goal in pr2_gripper_action (#4076).
The joint trajectory action controller now accepts goals with a timestamp of 0 (#3896).