August 2010 Archives

Name that DARPA Robot


DARPA is holding a contest to name the new robot for its ARM program. "The ARM Robot" has two Barrett WAM arms, BarrettHands, six-axis force-torque sensors at the wrists, and a pan-tilt head. For sensing, it has a color camera, a SwissRanger depth camera, a stereo camera, and a microphone.

The final software architecture and APIs have not been released yet, but the FAQ notes:

The software architecture is TBD, but is leaning toward a nodal software architecture using a tool such as Robotic Operating System (ROS).

The software track for the ARM program currently includes Carnegie Mellon University, HRL Laboratories, iRobot, NASA-Jet Propulsion Laboratory, SRI International and University of Southern California. It would certainly be a great boost for the ROS community to have more common platforms to develop and share the latest perception and manipulation techniques.

Below is a video from Dr. Motilal Agrawal of SRI (via Hizook) showing it in action. Dr. Agrawal and SRI are looking for Ph.D. and Master's students with experience in robotics, ROS, and OpenCV. Want a job?

RIND: ROS status INDicator



In addition to WowWee drivers and OpenCV tutorials, I Heart Robotics has just released a great Ubuntu panel tool called RIND, which stands for Robot/ROS Status Indicator. You can use it to manage your local roscore as well as get information on ROS nodes and topics. Check out the documentation or read the announcement for more information.

The CityFlyer project at the CCNY Robotics and Intelligent Systems Lab is using Ascending Technologies Pelican and Hummingbird quadrotor helicopters to do research in 3D mapping and navigation. The Ascending Technologies platform provides a 1.6 GHz Intel Atom processor, a 500-gram payload, GPS, and a barometric altimeter. The CityFlyer project adds several sensors, including a Hokuyo URG-04LX and an IMU. The Hokuyo URG has been modified to double as a laser height estimator. The CityFlyer project is able to combine data from these sensors to do indoor SLAM using GMapping.
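The height-estimator idea — redirecting a few scan beams toward the ground so their range readings approximate altitude — can be sketched in a few lines. This is a minimal illustration only: the beam indices, valid-range limits, and median filtering are assumptions, not CCNY's actual implementation.

```python
import statistics

def estimate_height(ranges, deflected_idx, min_range=0.02, max_range=5.0):
    """Estimate altitude from the subset of beams deflected toward the ground.

    ranges: full list of URG range readings (meters).
    deflected_idx: indices of beams redirected downward (hypothetical
    indices; the real mirror mount determines which beams these are).
    """
    readings = [ranges[i] for i in deflected_idx
                if min_range < ranges[i] < max_range]
    if not readings:
        return None  # no valid ground return in this scan
    return statistics.median(readings)  # median rejects outlier beams

# Fake scan: mostly horizontal returns, last ten beams deflected downward.
scan = [3.0] * 680
scan[670:680] = [1.2, 1.21, 1.19, 1.2, 4.9, 1.2, 1.2, 1.18, 1.22, 1.2]
print(estimate_height(scan, range(670, 680)))  # ~1.2 m despite one outlier
```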

The CityFlyer project has also created an RGB-D sensor by combining data from a SwissRanger 4000 and Logitech Webcam. They use this to build 3D maps for indoor environments using a 3D Multi-Volume Occupancy Grid (MVOG). Their MVOG technique is described in their RGB-D 2010 paper and more videos are here and here. Although the full sensor package exceeds the payload of the quadrotor, they anticipate that advances in RGB-D will make these techniques feasible for micro UAVs.
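The central MVOG idea — each 2D cell stores a list of vertical occupied volumes rather than a full 3D voxel column — can be illustrated with a toy column structure. The names and merge rule here are illustrative; the paper's representation is richer (it also tracks free volumes and densities).

```python
def insert_volume(column, bottom, top):
    """Insert an occupied vertical interval [bottom, top] into one MVOG-style
    column (a sorted list of non-overlapping intervals), merging overlaps.

    Simplified sketch of the multi-volume idea only.
    """
    merged = []
    for b, t in column:
        if t < bottom or b > top:      # disjoint: keep the existing volume
            merged.append((b, t))
        else:                          # overlap: grow the incoming volume
            bottom, top = min(bottom, b), max(top, t)
    merged.append((bottom, top))
    merged.sort()
    return merged

col = []
col = insert_volume(col, 0.0, 0.5)
col = insert_volume(col, 2.0, 2.5)   # a second, disjoint volume (overhang)
col = insert_volume(col, 0.4, 1.0)   # overlaps the first -> merged
print(col)  # [(0.0, 1.0), (2.0, 2.5)]
```

Storing only a handful of intervals per cell is what keeps the memory footprint small enough for on-board mapping.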

CCNY has released a variety of drivers, libraries and tools to support the ROS community. These include drivers and tools for the AscTec platform, libraries for dealing with aerially mounted laser rangefinders, a New College Dataset parser, and libraries for using AR tags with ROS.


CCNY has also developed a "Ground Station" application that acts as a virtual cockpit for visualizing telemetry data from an AscTec quadrotor. It is also able to overlay GPS data on an outdoor map to visualize the UAV's tracks. I Heart Robotics has a great writeup on Ground Station, and you can also check out the documentation on ROS.org.

The ccny-ros-pkg is an excellent resource for the ROS community with complete documentation on a variety of packages, including videos that demonstrate these packages in use.

Bag files for the video above can be downloaded here (elevator_2010-08*).

C Turtle Update 1


Robots Using ROS: Thecorpora's Qbo



Qbo is a personal, open-source robot being developed by Thecorpora. Francisco Paz started the Qbo project five years ago to address the need for a low-cost, open-source robot that lets ordinary consumers enter the world of robotics and artificial intelligence.

A couple of months ago, Thecorpora decided to switch their software development to ROS and have now achieved "99.9%" integration. You can watch the video below of Qbo's head servos being controlled by the ROS Wiimote drivers, as well as this video of the Wiimote controlling Qbo's wheels. Their use of the ROS joystick drivers means that any of the supported joysticks can be used with Qbo, including the PS3 joystick and generic Linux joysticks.

Qbo's many other sensors are also integrated with ROS, which means that they can be used with higher-level ROS libraries. This includes the four ultrasonic sensors as well as Qbo's stereo webcams. They have already integrated the stereo and odometry data with OpenCV in order to provide SLAM capabilities (described below).

It's really exciting to see an open-source robot building and extending upon ROS. From their latest status update, it sounds like things are getting close to done, including a nice GUI that lets even novice users interact with the robot.

Qbo SLAM algorithm:

The algorithm can be divided into three different parts:

The first task is to calculate the movement of the robot. To do that we use the driver for our robot that sends an Odometry message.

The second task is to detect natural features in the images and estimate their positions in a three dimensional space. The algorithm used to detect the features is the GoodFeaturesToTrackDetector function from OpenCV. Then we extract SURF descriptors of those features and match them with the BruteForceMatcher algorithm, also from OpenCV.

We also track the points matched with the sparse iterative version of the Lucas-Kanade optical flow in pyramids and avoid looking for new features in places where we are already tracking another feature.

This node takes synchronized image messages as input and sends a PointCloud message with the positions of the features, their covariances in the three coordinates, and the SURF descriptors of the features.

The third task is to implement an Extended Kalman Filter and a data association algorithm based on the Mahalanobis distance between the PointCloud seen from the robot and the PointCloud of the map. To do this, we read the Odometry and PointCloud messages, and we output an Odometry message and a PointCloud message with the position of the robot and the features included in the map.
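The data-association step described above can be sketched in plain Python. This is an illustrative simplification of what the Qbo team describes, not their code: it assumes diagonal covariances (per-axis variances), and the function names and the gating threshold are made up for the example.

```python
from math import sqrt

def mahalanobis(z, m, var):
    """Mahalanobis distance for a diagonal covariance (one variance per axis).
    The full EKF uses the complete innovation covariance; diagonal is a
    simplification for this sketch."""
    return sqrt(sum((zi - mi) ** 2 / vi for zi, mi, vi in zip(z, m, var)))

def associate(observations, landmarks, var, gate=3.0):
    """Match each observed 3D feature to its nearest map landmark, accepting
    the match only inside the validation gate; unmatched features would be
    added as new landmarks (returned here as None)."""
    matches = []
    for z in observations:
        dists = [(mahalanobis(z, m, var), j) for j, m in enumerate(landmarks)]
        d, j = min(dists) if dists else (float("inf"), None)
        matches.append(j if d < gate else None)
    return matches

var = (0.02, 0.02, 0.02)                 # combined obs+map variance per axis
obs = [(1.0, 0.0, 0.5), (9.0, 9.0, 9.0)]
lm = [(1.02, 0.01, 0.49)]
print(associate(obs, lm, var))  # [0, None]: one match, one new landmark
```

In the real system, the matched SURF descriptors would additionally constrain the association before the EKF update.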

Robots Using ROS: Lego NXT


Lego Mindstorms NXT is a low-cost programmable robotics kit that is used in education and by hobbyists throughout the world. One of the most visible NXT events is FIRST Lego League. The developers of foote-ros-pkg have developed a bridge that connects NXT with ROS, allowing NXT users to leverage all the ROS tools and capabilities.

The NXT-ROS software stack provides many useful tools to interface NXT robots with ROS. Currently NXT users can take robot models created with Lego Digital Designer, and automatically convert them into robot models compatible with ROS. The converted robot model can be visualized in rviz, and in the future we hope to add simulation capabilities in Gazebo, our 3D simulator. The bridge between NXT and ROS creates a ROS topic for each motor and sensor of the NXT robot.

Once a robot is connected to ROS, you can start running applications such as the base controller, wheel odometry, keyboard/joystick teleoperation, and even assisted teleoperation using the ROS navigation stack. The NXT-ROS software stack includes a number of example robot models for users to play with and to get a feel for using NXT with ROS.
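The "one topic per motor and sensor" pattern the bridge uses can be mimicked with a toy in-process dispatcher. This is a plain-Python stand-in to show the idea only; the real bridge publishes genuine ROS topics, and the device names here are invented.

```python
class TinyBridge:
    """Toy stand-in for the NXT-ROS bridge idea: one 'topic' per motor and
    sensor, with callbacks subscribed by name. No ROS required."""

    def __init__(self, devices):
        self.topics = {name: [] for name in devices}

    def subscribe(self, topic, callback):
        self.topics[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self.topics[topic]:
            cb(msg)

# Hypothetical NXT robot with two motors and an ultrasonic sensor.
bridge = TinyBridge(["motor_left", "motor_right", "ultrasonic"])
readings = []
bridge.subscribe("ultrasonic", readings.append)
bridge.publish("ultrasonic", {"range_m": 0.42})
print(readings)  # [{'range_m': 0.42}]
```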

This new NXT-ROS software stack provides NXT users access to the open-source ROS community. NXT users now have access to state of the art open source robotics libraries available on ros.org.

Please see the nxt page on the ROS wiki for documentation, demos, and more. The developers would like to thank the nxt-python project for support and development.

New in ROS C Turtle: SMACH


One of the new features in ROS C Turtle was a critical component of our recent "hackathons." When fetching a drink out of a refrigerator, for example, a robot has to perform numerous tasks such as grasping a handle, opening a door, and scanning for drinks. These tasks have to be carefully orchestrated to deal with unexpected conditions and errors. We've previously used complex task-planning systems to orchestrate these actions, but our developers and researchers needed something more rapid for prototyping robot behaviors.

One of our interns came up with an answer. SMACH ("State MACHine", pronounced "smash") is a task-specification and coordination architecture that was developed by Jonathan Bohren as part of his second internship here at Willow Garage. Jonathan came to us from the GRASP Lab at University of Pennsylvania and is now headed off to the Laboratory for Computational Sensing and Robotics (LCSR) at Johns Hopkins.  During his extended stay here, SMACH was used in a variety of PR2 projects.

SMACH was first used in the rewrite of our plugging and doors code, then further refined during our billiards, cart-pushing, and drink-fetching hackathons. In all of these projects, the ability to code these behaviors quickly was critical, as was the ability to create more robust behaviors for dealing with failure.

SMACH is a ROS-independent Python library, so it can be used with and without ROS infrastructure. It comes with important developer tools like a visualizer for the current SMACH plan and introspection tools to monitor the internal state and data flow. There are already many SMACH tutorials that can be found on the ROS wiki, and we hope to see SMACH used to produce many more cool robotics apps!
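The core pattern SMACH provides — states that return outcome strings, with transitions mapping (state, outcome) pairs to the next state — can be sketched in a few lines of plain Python. This is a mimic of the pattern, not the real SMACH API; real SMACH adds userdata passing, container nesting, preemption, and introspection. The drink-fetching states below are invented for the example.

```python
class StateMachine:
    """Minimal SMACH-flavoured state machine: each state is a callable that
    returns an outcome string; the machine maps (state, outcome) pairs to
    the next state until a terminal state is reached."""

    def __init__(self, transitions, start, terminal):
        self.transitions = transitions  # {(state, outcome): next_state}
        self.start, self.terminal = start, terminal

    def execute(self, states):
        current, trace = self.start, []
        while current not in self.terminal:
            outcome = states[current]()          # run the state
            trace.append((current, outcome))
            current = self.transitions[(current, outcome)]
        return current, trace

# Hypothetical slice of the drink-fetching behavior from the post.
states = {
    "GRASP_HANDLE": lambda: "succeeded",
    "OPEN_DOOR":    lambda: "succeeded",
    "SCAN_DRINKS":  lambda: "found",
}
sm = StateMachine(
    {("GRASP_HANDLE", "succeeded"): "OPEN_DOOR",
     ("OPEN_DOOR", "succeeded"): "SCAN_DRINKS",
     ("SCAN_DRINKS", "found"): "DONE"},
    start="GRASP_HANDLE", terminal={"DONE", "ABORTED"})
print(sm.execute(states)[0])  # DONE
```

Mapping failure outcomes (e.g. a slipped grasp) to recovery states is what makes this style robust without a full task planner.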

ROS Client Library for Lua


Tim Niemueller has announced a ROS client for Lua

Hello ROS users.

During the last weeks we have developed a Lua-based API for writing ROS nodes in the Lua programming language. It allows for communicating with other nodes and participating in the ROS universe. It has been developed at Intel Labs Pittsburgh as part of my research stay this year working with Dr. Siddhartha Srinivasa on the Personal Robotics project.

Some highlights of the implementation:

  • Completely written in Lua, no wrappers
  • Implements topic and service communication
  • Reads message specifications on the fly and generates appropriate data structures at run-time, avoiding offline code generation
  • Fully documented API
  • only about 2800 lines of code (ohcount)
  • Test scripts for all features and simple examples

The implementation benefits from the inherent single-threading in Lua, meaning that everything is processed in a single main loop. This is one of the major factors in its simplicity. No attempts have been made to incorporate Lua add-ons that would provide true multi-threading. The implementation does not have nearly the versatility of the Python or C++ APIs (and we do not aim for that), but it does provide a very simple way to interact with ROS directly from Lua, without a middle-man.
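The "read message specifications on the fly" feature from the list above is worth unpacking: instead of generating message classes offline, the client parses .msg files at run time and builds data structures from them. A minimal sketch of that idea (in Python rather than Lua, and handling only a simplified subset of the .msg format — comments and constants are treated minimally):

```python
def parse_msg_spec(spec_text):
    """Parse a simplified ROS .msg specification into (type, name) field
    pairs at run time -- the same idea roslua uses to avoid offline code
    generation."""
    fields = []
    for line in spec_text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments
        if not line or "=" in line:            # skip blanks and constants
            continue
        ftype, fname = line.split()
        fields.append((ftype, fname))
    return fields

def make_message(fields):
    """Build a default-initialized message dict from the parsed spec."""
    defaults = {"float64": 0.0, "float32": 0.0, "int32": 0, "string": ""}
    return {name: defaults.get(ftype) for ftype, name in fields}

spec = """# geometry_msgs/Point-like spec
float64 x
float64 y
float64 z
"""
fields = parse_msg_spec(spec)
print(make_message(fields))  # {'x': 0.0, 'y': 0.0, 'z': 0.0}
```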

This endeavor was undertaken to prepare for porting the Fawkes behavior engine, a framework for developing robot behavior that employs Lua as its scripting language, to ROS.

We would be delighted if the community would take a look at the implementation and provide some feedback. Once it has reached a certain stability and the feature set has been expanded to an acceptable coverage, we would like to propose it for inclusion in the experimental package tree.

You can find the source code at http://github.com/timn/roslua. Some documentation on how to get started is provided in the README file.

Regards,
Tim

Robots Using ROS: Mini-PR2


The folks at the ModLab/GRASP Lab at Penn recently got their PR2 and used the occasion to test out "Mini-PR2". They used $5000 worth of CKBot modules to replicate the degrees of freedom of the real PR2 -- all except the torso. They used 18 modules (14 U-Bar, 4 L7, 4 motor) to create Mini-PR2, and they also added a counterbalance on the shoulder to help balance the arm.

The CKBot modules, which have previously been featured here, enable their lab to try out new ideas quickly and cheaply. In this case, they can use the PR2 simulator to drive their real robot, and they've used an actual PR2 to puppet Mini-PR2 (see 0:49 in video). They are now working on using the Mini-PR2 to puppet the actual PR2.

The CKBot modules don't have the computation power to run ROS on their own, but they can communicate with another computer that translates between the two systems. Their current system listens to the joint_states topic on the PR2 and translates those messages into CKBot joint angles.
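That translation layer can be sketched as a small function: read a JointState-style message (parallel name/position lists) and emit per-module servo commands. The module IDs and the counts-per-radian scale below are made-up values for illustration; the real translator depends on the CKBot hardware protocol.

```python
import math

def joint_states_to_ckbot(msg, mapping, counts_per_rad=651.9):
    """Translate a JointState-style message into per-module CKBot servo
    commands. mapping: joint name -> hypothetical module ID."""
    pos = dict(zip(msg["name"], msg["position"]))  # pair names with positions
    return {module_id: int(round(pos[joint] * counts_per_rad))
            for joint, module_id in mapping.items() if joint in pos}

# Fake joint_states message with two PR2 arm joints.
msg = {"name": ["r_shoulder_pan_joint", "r_elbow_flex_joint"],
       "position": [0.5, -math.pi / 4]}
mapping = {"r_shoulder_pan_joint": 3, "r_elbow_flex_joint": 7}
print(joint_states_to_ckbot(msg, mapping))  # {3: 326, 7: -512}
```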

I Heart Robotics has a five-part (and ongoing) "Vision for Robotics" series that helps readers integrate OpenCV into their robotics applications. They also put together a useful review of ROS USB cam drivers, which might be a useful prerequisite.

I Heart Robotics also has a ROS repository that includes code for all the tutorials below. Fantastic!

Announcement from Trevor Jay of Brown's RLAB

Brown is pleased to announce our beta version of rosjs, a lightweight JavaScript binding for ROS.

rosjs is designed to enable users and developers to use the functionality of ROS through standard web browsers. Application developers can leverage all of the power of HTML to build engaging applications and interfaces for robots as quickly as possible without recompiling ROS nodes. Additionally, users can access and run ROS-based applications from standard browsers without the need for any plugins.

rosjs consists of a server and a pure JavaScript library. rosjs is not tied to any particular web server or framework; it even works when served locally. Thanks to websockets, latency is low enough for teleoperation or closed-loop control. For example, the following video shows a user teleoperating the PR2 via rosjs from Providence to Palo Alto:

rosjs is currently available for download from the brown-ros-pkg repository via:

svn co https://brown-ros-pkg.googlecode.com/svn/trunk/experimental/rosjs rosjs

and you can view preliminary documentation here:

http://code.google.com/p/brown-ros-pkg/wiki/rosjs

We are making rosjs available now for the ROS community to use and provide feedback. Please play, create, and break stuff; then tell us about it.

Thanks again to the kind people at Bosch and Willow for use of their PR2.

_Robot Learning and Autonomy @ Brown (RLAB)

We are happy to announce the release of a number of stacks that provide core functionality for grasping and manipulation tasks, as well as example applications on the PR2 robot. These include:

Additional details, tutorials and example applications can be found on the ROS Wiki. Good places to start checking out the available functionality are:

Complementing the manipulation stacks is a database of household objects containing information relevant for manipulation tasks, along with its ROS interface. Details can be found on the household_objects_database page.

Please note that these are "research stacks", under active development. While we have put a lot of effort into reviewing and documenting them, they are likely to be less stable than "core" ROS stacks which have reached 1.x releases. That said, we believe that they will provide useful functionality for building towards complex mobile manipulation applications.

Any feedback, bug reports or success stories of using these stacks will be greatly appreciated.

Best,
Your Merry ROS Manipulation Mob

Three Robots: One ROS Node


post by Trevor Jay of Brown to ros-users

Hi ROS-Users!

Recently, thanks to the very nice people at Bosch and their remote lab efforts, we were able to play around with an actual PR2. We wanted to share the following video of a single ROS node (svn co https://brown-ros-pkg.googlecode.com/svn/trunk/experimental/nolan nolan) running completely unmodified on three very different robots (including the PR2).

Each of the robots is using ar_recog, our ROS-compatible wrapper around ARToolKit, to localize AR tags. A simple PID controller then directs them to follow. As the Nao does not recognize Twist msgs, we have a simple ten-liner rebroadcasting them as Walk msgs, but even here the original control node is running unmodified.
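The control loop described above — steer to keep the tag centered, drive to keep it at a fixed apparent size, and rebroadcast the command for robots that speak a different message type — can be sketched as follows. The gains, target area, and field names are illustrative guesses, not Brown's actual node (which also uses integral/derivative terms per the PID description).

```python
def follow_controller(tag_x_px, tag_area, img_width=640,
                      k_turn=0.002, k_fwd=1e-5, target_area=9000):
    """Proportional controller in the spirit of the post: turn toward the
    tag's image position and drive to hold its apparent size constant."""
    angular = -k_turn * (tag_x_px - img_width / 2)   # center the tag
    linear = k_fwd * (target_area - tag_area)        # hold distance
    return {"linear": linear, "angular": angular}    # Twist-like command

def twist_to_walk(twist):
    """The 'ten-liner' idea: rebroadcast a Twist-like command as a
    Walk-like command for the Nao (field names are assumptions)."""
    return {"walk_speed": twist["linear"], "turn_speed": twist["angular"]}

# Tag seen right of center and too small -> turn right... and drive forward.
cmd = follow_controller(tag_x_px=400, tag_area=4000)
print(cmd, twist_to_walk(cmd))
```

Because the controller only consumes tag poses and emits a generic command, the same node runs unmodified on any robot whose base accepts that command — which is the portability point of the post.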

Anyone interested in the vision stack can check it out at the brown ros pkg repository.

Thanks everyone in the community who's helping to bring about this level of portability.

_RLAB (Robot Learning and Autonomy @ Brown)

ROS C Turtle Release



ROS C Turtle has been released!

ROS C Turtle is the second ROS distribution release. ROS Box Turtle was released March 2, 2010 and included stable releases of ROS and core libraries and tools like navigation, rviz, hardware drivers, and an image-processing pipeline.

ROS C Turtle builds on Box Turtle with across-the-board improvements to these core libraries and numerous bug fixes. These improvements include a new "nodelet" architecture that provides low-latency, zero-copy message passing within C++ nodes, official support for a Lisp client library, and an official firewire camera driver (thanks to Jack O'Quin). Numerous third-party libraries have been upgraded in this release, including Stage 3.2.2, Bullet 2.76, and Eigen 2.0.15, as well as newer versions of KDL and Gazebo. There are many, many other improvements listed in the change list.

This release includes new experimental libraries for 3D perception, manipulation, grasping, and visual odometry. We encourage early adopters to test out these libraries and provide feedback so that they can be stabilized for future ROS releases.

Since the release of Box Turtle, the ROS community has grown immensely. There are over a dozen new public, open-source repositories of ROS code, and ROS has been ported to a variety of different robot platforms, from mobile manipulators to autonomous boats. Commercial robotics software libraries like Urbi and Karto now have open-source offerings that are compatible with ROS, and the list of robot platforms that can be used with ROS continues to grow. We're excited at these new opportunities to collaborate within the community and hope that you all enjoy this C Turtle release.

ROS distribution releases occur on a six-month cycle. The successor to C Turtle, Diamondback, is expected in February of 2011.

Billy McCafferty of sharprobotica.com has put together a six-part series on design and testing concerns for developing ROS packages. This series tackles these challenges step-by-step, layer-by-layer:

If you're thinking about the right design patterns and architectures for ROS packages, this series is definitely worth a look.

Find this blog and more at planet.ros.org.


Please submit content to be reviewed by emailing ros-news@googlegroups.com.


About this Archive

This page is an archive of entries from August 2010 listed from newest to oldest.

July 2010 is the previous archive.

September 2010 is the next archive.
