January 2011 Archives

3Dturtle.jpg

Michael Ferguson is a prolific contributor to ROS. His entry in the ROS 3D Contest is "Improved AR Markers for Topological Navigation". AR markers are an inexpensive and effective way to find the position of objects in an image using ordinary cameras. Michael recognized the opportunity to combine these markers with the Kinect, which provides both camera and depth data, to turn them into markers in three dimensions. You can even use this to localize the robot itself by attaching markers to known locations in your map.
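Conceptually, turning a 2D marker detection into a 3D one comes down to back-projecting the marker's pixel through the pinhole camera model using the Kinect's depth reading. A minimal sketch, not Michael's actual code; the intrinsics below are typical uncalibrated Kinect defaults, assumed here for illustration:

```python
# Back-project a marker detected at pixel (u, v) with a depth reading (meters)
# into a 3-D point in the camera frame, using the pinhole camera model.
# fx, fy, cx, cy are assumed camera intrinsics, not calibrated values.

def pixel_to_3d(u, v, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Return the (x, y, z) camera-frame point for an image point plus depth."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

With the marker's 3D position (and, from its corners, its orientation) known, it can be published as a tf frame for the rest of the system to localize against.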

We encourage you to check out the many different robots that Michael is building, from the iRobot Create and Dynamixel AX-12-based Nelson to the up-and-coming Create + Kinect + tripod Trike. The software for the contest entry along with these robots can be found in albany-ros-pkg, which also contains a Neato XV-11 driver for ROS.

OpenNI Developer Challenge

opennidevchall.jpg

ROS 3D Contest not enough for you? PrimeSense is sponsoring a contest to develop a natural interaction interface for a Web browser, with $20K as the top prize. For more details, see the OpenNI Developers Challenge.

Photo essay on Penn's GRASP Lab

Penn_photo_essay.png

Evan Mann did a nice photo essay of Penn's GRASP Lab. Enjoy.

Photo Essay: GRASP Lab

For extra fun, don't miss the construction quadrotors on Colbert's Threatdown.

Thanks Ben Cohen!

3Dturtle.jpg

Patrick Bouffard's "Quadrotor Altitude Control and Obstacle Avoidance" was featured back in December, when he first made waves on the Internet by mounting a Kinect on a quadrotor and flying it around his lab. The Kinect was used to measure altitude as well as to avoid obstacles.

Patrick has updated his video for the ROS 3D contest. He has also released starmac-ros-pkg, which contains the software used in his Berkeley lab to get these quadrotors in the air. starmac-ros-pkg includes ROS drivers for Vicon motion capture systems as well as an abstraction of the AscTec autopilot driver. It's a great complement to ccny-ros-pkg, which provides AscTec quadrotor drivers, computer vision libraries, and other tools.

ROS 3D Entries: Anaglyph Viewer

3Dturtle.jpg

Colin Lea's Anaglyph Viewer entry in the ROS 3D Contest brings a bit of retro 3D to our entries. Colored glasses are an inexpensive way of viewing 3D content on a 2D screen, and seeing the Kinect's data in true 3D makes it far more immersive. For example, you can build more effective teleoperation cockpits that take advantage of your ability to perceive depth. Add more Kinect cameras and you can start to become fully immersed in a 3D world.
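For the curious, the red-cyan trick itself is simple: take the red channel from the left-eye view and the green/blue channels from the right-eye view, so the glasses route a different image to each eye. A toy sketch of the idea, not Colin's implementation:

```python
# Merge a left-eye and a right-eye RGB image into a red-cyan anaglyph:
# red comes from the left view, green and blue from the right view.

def anaglyph_pixel(left_rgb, right_rgb):
    """Combine one left-eye and one right-eye (r, g, b) pixel."""
    return (left_rgb[0], right_rgb[1], right_rgb[2])

def anaglyph_image(left, right):
    """Apply the per-pixel merge over two same-sized images (nested lists)."""
    return [[anaglyph_pixel(l, r) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]
```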

PS: Our skateboarding turtle says thanks!

ROS 3D Entries: Kinemmings

3Dturtle.jpg

The Kinemmings entry by Alberto Jose Ramirez Valadez, Jonathan Rafael Patino Lopez, and Marcel Stockli Contreras is a take on the classic Lemmings game. Now it's up to you and your body to guide the Kinemmings safely to their exit.

Kinemmings has the distinction of being the only game entry in the ROS 3D Contest. In fact, as far as we know, it may be the first game package in all of ROS. We appreciate it, as it means we can now tell our boss that we're "working on ROS".

3Dturtle.jpg

You have your Kinect and want to mount it on your robot, but now you're faced with a challenge: you need to precisely determine the Kinect's mounting point so that the data from it can be interpreted correctly, e.g. if you want to use it to run autonomous navigation.

The "Automatic Calibration of Extrinsic Parameters" entry from François Pomerleau, Francis Colas and Stéphane Magnenat of the Autonomous Systems Lab at ETHZ makes solving this problem easy for users and does much more. If you run their software with the Kinect mounted, it will output the tf transform between your base_link and the camera, making configuration easy.

They also released several lower-level libraries to help build other applications on top: libnabo for fast k-nearest-neighbor searches, and libpointmatcher, a modular ICP library. These are important for building tracking applications, as shown in the video, as well as for building SLAM and other systems.
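To illustrate what a k-nearest-neighbor query computes: libnabo's contribution is doing this quickly with kd-trees for the large point sets ICP needs, so this brute-force sketch only shows the query semantics, not libnabo's API.

```python
import math

def knn(points, query, k):
    """Return the k points nearest to `query` by Euclidean distance, closest first."""
    return sorted(points, key=lambda p: math.dist(p, query))[:k]
```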

ROS C Turtle Update

cturtle_poster.jpg

ROS C Turtle has been updated with new Debian packages. This update mainly adds compatibility with image_transport_plugins on Ubuntu Maverick.

Changes:

3Dturtle.jpg

Patrick Goebel is the creator of Pi Robot, a custom-built, Robotis-based hobby robot. Patrick has been a frequent contributor to the Trossen and ROS communities, including writing a detailed essay for hobbyists getting into ROS.

His entry for the ROS 3D Contest builds on Taylor Veltrop's teleop control, adapting it for the Pi Robot as well as adding a base controller and the ability to define new control gestures. Patrick has also contributed a serializer package for those wishing to use the Robotics Connection Serializer microcontroller in ROS. Pi Robot may be one of a kind, but, thanks to Patrick's contributions, you have the software you need to build your own.

Patrick will be giving the featured presentation at tonight's Homebrew Robotics Club meeting.

3Dturtle.jpg

Taylor Veltrop had the first ROS 3D Contest entry with his teleoperation control of a humanoid KHR/Roboard robot. He wasn't content to leave it at that: he has beefed up his teleoperation system with Wiimote and leg-based control, and he is also running it on an Aldebaran Nao.

One of the difficulties in using the skeleton-tracking libraries with the Kinect is that you do not get much information about the operator's hands. For those trying to use skeleton tracking to control a robot's arms, this creates a pick-up problem: you can move the arm to the location of an item you wish to grab, but you don't have the control you need over the angle of the hand or the opening and closing of the gripper to complete the task.

Taylor solves this by letting you hold a Wiimote in each hand. With these additional controls, the operator can seamlessly transmit the extra information about the correct hand position, and the Wiimote buttons can trigger additional operations, like opening and closing grippers.

Taylor also collaborated with Patrick Goebel to add leg controls for moving the robot. Placing one leg forwards or backwards moves the robot in that direction. Placing a leg to the side makes the robot turn.
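The leg-control mapping can be imagined roughly like this — a hedged sketch, not Taylor's code, with invented thresholds and speeds:

```python
# Map a tracked foot's offset from a neutral stance to drive commands:
# forward/backward offsets give linear velocity, sideways offsets give turning.
DEADBAND = 0.15  # meters; ignore small shifts while standing still

def leg_to_cmd(foot_x, foot_y, neutral_x=0.0, neutral_y=0.0):
    """Return (linear, angular) velocity for a foot offset in meters."""
    dx = foot_x - neutral_x   # positive is forward
    dy = foot_y - neutral_y   # positive is to the side
    linear = 0.3 if dx > DEADBAND else -0.3 if dx < -DEADBAND else 0.0
    angular = 0.5 if dy > DEADBAND else -0.5 if dy < -DEADBAND else 0.0
    return linear, angular
```

The deadband matters in practice: without it, the small sway of an operator standing "still" would keep the robot creeping.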

You can watch Taylor's new video above, where he puts the Nao teleop through its paces. If you have ever wanted to see a Nao wield a knife, play chess, or grab a tissue out of a box, check it out.

ROS 3D Entries: Nao Teleop Control

Halit Bener SUAY entered the ROS 3D Contest with this entry demonstrating teleoperation of an Aldebaran Nao using a Kinect. This is not the only entry to tackle teleoperation, but it adds its own unique twists. Most notably, there are pre-defined gestures that enable the operator to switch between different modes of control. One leg controls starting and stopping the robot. Another enables the operator to switch between controlling the body and the head. Your arms can either directly control the robot's arms or issue other commands, like directing the robot's gaze. All in all, it's a great demo of how we can go completely remoteless and still control a complex, walking robot like the Nao.
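The mode-switching pattern is worth noting for anyone building their own teleop: a recognized gesture doesn't drive the robot directly, it changes which subsystem the skeleton data controls. A sketch of that pattern — the gesture names and transitions here are invented, not Bener's actual vocabulary:

```python
# A tiny state machine: gestures toggle between idle, body control, and
# head control, so one skeleton stream can drive several subsystems.
class TeleopModes:
    def __init__(self):
        self.mode = "idle"

    def on_gesture(self, gesture):
        """Update the control mode for a recognized gesture; returns the mode."""
        if gesture == "leg_raise":                  # start/stop control
            self.mode = "body" if self.mode == "idle" else "idle"
        elif gesture == "arm_cross" and self.mode != "idle":
            self.mode = "head" if self.mode == "body" else "body"
        return self.mode
```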

Humanoid Robot Control and Interaction

ROS 3D Entries: RGBD SLAM

Credit: Nikolas Engelhard, Felix Endres, Juergen Hess, Juergen Sturm, Daniel Kuhner, Philipp Ruchti, and Wolfram Burgard

The University of Freiburg team has put together an impressive 6D-SLAM library as an entry in the ROS 3D Contest. By taking advantage of the additional 3D data that a Kinect provides, they've set a new benchmark for the state of the art in the field. It's also a great demo that we can all try ourselves: pick up your Kinect, move it around, and build 3D models of your world.

RGBD-6D-SLAM

Robots Using ROS: CSIRO's Bobcat

csiro_bobcat.jpg

CSIRO's Bobcat is an S185 skid-steer loader, complete with lift arms. This heavy-duty outdoor robot enables CSIRO's robots to interact with an environment, rather than just move through it. To do this, they have equipped the Bobcat with a variety of sensors, including two horizontal lasers, a spinning laser, a camera, two IMUs, GPS, wheel encoders, and more. They also plan on integrating stereo, Velodyne, multi-modal radar, hyperspectral, and other sensors.

CSIRO's current focus with the bobcat is shared and cooperative autonomy. With shared autonomy, a human tele-operator can intervene and provide corrections as the bobcat performs a task. With cooperative autonomy, the bobcat can leverage robots with other capabilities. This sort of coordination could enable a fleet of bobcats to autonomously excavate an area.

CSIRO is in the process of migrating the Bobcat to ROS. The Bobcat was originally developed using DDX (Dynamic Data eXchange), a third-generation middleware developed by CSIRO that provides features, such as shared-memory data exchange, that are complementary to ROS. They will continue using DDX for low-level realtime control, but sensor drivers and higher-level code are being migrated to ROS. They are also investigating adding DDX-like transports to ROS.

3Dturtle.jpg

We're now busy judging the eighteen awesome entries to the ROS 3D Contest. There's everything from teleoperation to games to libraries for registration and calibration. It's going to be tough choosing which entries get prizes.

You can go ahead and check out the entries yourself. In most cases, you should even be able to download and try them out on your own Kinect or PrimeSense device.

While we tally the results, we'll spotlight the entries here.

First off are Garratt Gallagher's entries. Garratt was our most prolific entrant, producing a total of five separate entries. Each is worth its own blog post, and many of them have already been featured here:

We're grateful that Garratt has taken the time not only to enter the contest, but also to go the extra mile to make sure that others can try out his libraries and build on his creative ideas. If you like what you see, you should consider helping out his Bilibot project, a low-cost Kinect + Create platform.

Garratt's newest entry is "Customizable Buttons". Using the Kinect, you can draw on a piece of paper to create your own music board. It's a lot of fun, as you'll see in the video:

Garratt's Entries:

ROS 3D Contest: Entries Due!

Thumbnail image for 3Dturtle.jpg

Today is the deadline for entries in the ROS 3D Contest. There are already a lot of great entries, and we're looking forward to the judging.

Bill Mania gave an introductory presentation on ROS at ChiPy, the Chicago Python users group. He also gave a demo of his RoboMagellan robot that he's bringing up on ROS. This is a good overview for those of you just getting into ROS, especially from a Python perspective.

Recorded by Carl Karsten

New repository: csiro-asl-ros-pkg

Announcement by Fred Pauling to ros-users

CSIRO's Autonomous Systems Lab is pleased to announce the public availability of two ROS stacks:

  1. csiro-asl-drivers is intended to contain hardware drivers; it currently contains a driver for the SICK LD-MRS 400001 laser scanner.

  2. csiro-asl-openrave-plugins is intended to contain plugins for OpenRAVE; it currently contains an odephysics plugin.

The csiro-asl-ros-pkg project page is located at https://launchpad.net/csiro-asl-ros-pkg

The code for these stacks is available from the trunk of our bazaar repository on launchpad:

bzr branch lp:csiro-asl-ros-pkg

Further announcements will be made as other software is released.

Regards,

Fred Pauling
CSIRO Autonomous Systems Lab

Update: note corrected date.

The January 26th meeting of the Homebrew Robotics Club will be "ROS for the Rest of Us." The meeting will be held at Google in Mountain View, CA.

The featured speaker is Patrick Goebel, who will present his Pi Robot platform (pi-robot-ros-pkg). Patrick also wrote a tutorial on visual object tracking using ROS.

There should also be several other quick ROS presentations, including Tony Pratkanis presenting his ROS + Neato platform, and some Willow Garage presentations on ROS + Arduino and ROS + Create + Kinect.

Meeting Details

ROS 3D Contest: 8 days to go

Friendly reminder: only 8 days left to enter the ROS 3D contest. Enter early, enter often!

We also have a special new prize: two PrimeSense Developer Kit 5.0 devices, courtesy of PrimeSense! These are much easier to use with robotics platforms as they don't have the additional power requirements of the Kinect. These will be special "judges prizes" given out to entries we find most appropriate for them.

There's a new openni_pointer sample in the ni package thanks to Kei Okada, for those of you who wish to track hand positions using tf.

There are also some new entries to the contest:

Humanoid Robot Control and Interaction

Quadrotor Altitude Control and Obstacle Avoidance

Aggressive Quadrotors now build towers

Credit: Quentin Lindsey, Daniel Mellinger, and Vijay Kumar from the GRASP Lab at the University of Pennsylvania

Previously: Aggressive Quadrotors 2, Aggressive Quadrotors 1

via I Heart Robotics

ROS C Turtle Update: Now with Erratic

cturtle_poster.jpg

A new update to C Turtle has been released. It contains the first release of the erratic_robot stack, which supports the Videre Erratic platform. This stack is maintained by Antons Rebguns/ua-ros-pkg and includes drivers, a robot description, and navigation configuration.

ROS C Turtle Update

cturtle_poster.jpg

A new release of Debian packages is out. We have updated our Debian-building infrastructure in preparation for ROS Diamondback, so please report any issues to ros-users.

We continue to strive to make the ROS development process more open, from REPs to exploring a ROS Foundation. One need identified by the community was making it easier to get involved in core ROS development: it hasn't been clear which features could be tackled by those looking to get into it. To facilitate that, we've set up a simple mechanism to expose open issues that are not actively being worked on to the community.

These new "Handoff Lists" link to feature enhancements that we think are good places to get started as a ROS hacker. We have only just started adding tickets to this list and will hopefully add more over time. If you'd like to contribute, these tickets are a good way to get familiar with parts of the code base and the development process.

ROS and PR2 in Education

We think that ROS and the PR2 are great tools for educators. Both platforms allow students to focus on building the relevant parts of a system while incorporating less topical components from the open source community. Students get started faster and complete more impressive projects. Even more importantly, students can take components built in ROS to their next course, research project or job without worrying about licensing.

We've started a wiki page to list courses using ROS or the PR2, and to discuss teaching-related issues. Here are some course examples that you can use for inspiration:

Short Courses

University (Undergraduate & Graduate) Courses

If you're teaching a course using ROS or the PR2, please post a link at ros.org/wiki/Courses. If you have advice on setting up labs, course computers, or any other teaching-related topic, post those too. By sharing material, we'll all create effective courses more quickly.

Announcement from Patrick Goebel to ros-users

Hello All,

I have released a ROS package for the Serializer microcontroller made by the Robotics Connection. I have been using and developing this package for several months along with at least one other ROS user, so I hope I have most of the egregious bugs fixed. Please let me know if you run into trouble with either the code or the wiki page:

The Serializer has both analog and digital sensor pins, a PWM servo controller, and a PID controller that works well with the ROS navigation stack.

The underlying driver is written in Python and provides some convenience functions for specific sensors, such as the Phidgets temperature, voltage, and current sensors, the Sharp GP2D12 IR sensor, and the Ping sonar sensor. It also includes two PID drive commands, Rotate(angle, speed) and TravelDistance(distance, speed), for controlling the drive motors. Most of the Serializer functions have been implemented, though a few have not been tested since I don't currently have some of the supported sensors. The functions that have not been tested are:

  • step (used with a bipolar stepper motor)
  • sweep (also used with a bipolar stepper motor)
  • srf04, srf08 and srf10 (used with the Devantech SRF04, SRF08 and SRF10 sonar sensors)
  • tpa81 (used with the Devantech TPA81 thermopile sensor)

The driver requires Python 2.6.5 or higher and PySerial 2.3 or higher. It has been tested on Ubuntu Linux 10.04.
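To give a feel for what a command like TravelDistance(distance, speed) does under the hood, here is a simplified sketch against a simulated 1-D robot, using only the proportional term of a PID loop; this is not the Serializer package's actual code.

```python
# Drive a simulated robot a given distance: a proportional controller on the
# position error, with the commanded velocity clamped to the requested speed.
def travel_distance(distance, speed, kp=1.0, dt=0.05, tol=0.005):
    """Return the final position after driving `distance` meters at up to `speed` m/s."""
    pos = 0.0
    for _ in range(10000):                       # safety bound on iterations
        error = distance - pos
        if abs(error) < tol:
            break
        v = max(-speed, min(speed, kp * error))  # P-term, clamped to max speed
        pos += v * dt                            # integrate the simulated motion
    return pos
```

A real controller works against encoder ticks rather than an idealized position, which is why the integral and derivative terms earn their keep.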

Sincerely,
Patrick Goebel
The Pi Robot Project

AVR and ROS

Announcement from Adam Stambler of Rutgers to ros-users

Hello Folks,

I am proud to announce a new tool for using Arduinos and AVR processors in ROS projects. avr_bridge allows AVR processors to directly publish or subscribe to ROS topics, which lets everything from Arduinos to custom robot boards be first-class ROS components. This package can be found in Rutgers' new rutgers-ros-pkg.

avr_bridge is meant to simplify the use of Arduino and AVR processors in a ROS-based robot by providing a partial ROS implementation in AVR C++. In hobbyist robotics, these microcontrollers are often used to read sensors and perform low-level motor control. Every time a robot needs to interface with an AVR board, a new communication system is written; typically they all use a USB-to-serial converter and either a custom binary or text-based protocol. avr_bridge replaces these custom protocols with an automatically generated ROS communication stack that allows the AVR processors to directly publish or subscribe to ROS topics.

avr_bridge has already been deployed on Rutgers' PIPER robot and in the communications layer for a SparkFun IMU driver. In the next few weeks, it will be deployed on our newest robot, the Rutgers IGVC entry, as the communication layer for all of our custom low-level hardware. By using avr_bridge to communicate with our PCs, we have cut down on redundant code and simplified the drivers by allowing the AVR processor to directly publish messages. It is our hope that by extending ROS to the 8-bit microcontroller level, we will see more open-source hardware that can be quickly integrated into cheap, custom robot platforms.
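avr_bridge generates its communication code from ROS message definitions, so the sketch below is not its actual wire format; it only illustrates the kind of framing any such bridge needs over a USB-serial link: a sync byte, a topic id, a payload length, the payload, and a checksum.

```python
import struct

SYNC = 0xFF  # marks the start of a frame on the serial line

def frame(topic_id, payload):
    """Pack one message (topic id + raw payload bytes) for the serial link."""
    header = struct.pack("<BBH", SYNC, topic_id, len(payload))
    checksum = sum(payload) & 0xFF
    return header + payload + bytes([checksum])

def unframe(packet):
    """Validate a frame and return (topic_id, payload); raises on corruption."""
    sync, topic_id, length = struct.unpack("<BBH", packet[:4])
    payload = packet[4:4 + length]
    if sync != SYNC or packet[4 + length] != (sum(payload) & 0xFF):
        raise ValueError("corrupt frame")
    return topic_id, payload
```

Generating both ends of this code from the .msg files is what saves each project from re-inventing its own protocol.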

Cheers,
Adam Stambler
Rutgers University

Inverse Dynamics and Dynamics Markers

Daniel Hennes from Maastricht University spent his internship at Willow Garage modeling the dynamics of robotic manipulators using statistical machine learning techniques. He also created a useful visualization utility for ROS/rviz users that enables users to intuitively visualize the joint motor torques of a robot. Please watch the video above for an overview or read the slides below (download pdf) for more technical details. The software is available as open source in the inverse_dynamics and dynamics_markers packages on ROS.org.

ompl-gui_path-small.jpg

Announcement from Mark Moll to robotics-worldwide

Dear colleagues,

The Kavraki Lab is pleased to announce the initial release of the Open Motion Planning Library (OMPL). OMPL is a lightweight, thread-safe, easy to use, and extensible library for sampling-based motion planning. The code is written in C++, includes Python bindings and is released under the BSD license.

Here are some of OMPL's features:

  • Implementations of many state-of-the-art sampling-based motion planning algorithms. For purely geometric planning, there are implementations of KPIECE, SBL, RRT, RRT Connect, EST, PRM, Lazy RRT, and others. For planning with differential constraints there are implementations of KPIECE and RRT. Addition of new planners poses very few constraints on the added code.
  • A flexible mechanism for constructing arbitrarily complex configuration spaces and control spaces from simpler ones.
  • A general method of defining goals: as states, as regions in configuration space, or implicitly.
  • Various sampling strategies and an easy way to add other ones.
  • Automatic selection of reasonable default parameters. Performance can be improved by tuning parameters, but solutions can be obtained without setting any parameters.
  • Support for planning with the Open Dynamics Engine, a popular physics simulator.
  • Tools for systematic, large-scale benchmarking.
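As a taste of what "sampling-based" means, here is a toy RRT in an empty 2-D unit square in plain Python. It only illustrates the idea (grow a tree toward random samples until the goal region is reached) and is in no way OMPL's API or a substitute for it.

```python
import math
import random

def rrt(start, goal, step=0.1, goal_tol=0.1, max_iters=5000, seed=0):
    """Grow a tree from `start` until a node lands within `goal_tol` of `goal`."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {start: None}
    for _ in range(max_iters):
        sample = (rng.random(), rng.random())          # uniform random sample
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        t = min(1.0, step / d) if d > 0 else 0.0       # step toward the sample
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) < goal_tol:            # goal region reached
            path, n = [], new
            while n is not None:                       # walk back to the root
                path.append(n)
                n = parent[n]
            return path[::-1]
    return None                                        # no path within budget
```

A real planner also collision-checks every extension against the environment, which is where most of the engineering (and OMPL's flexibility) lives.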

OMPL is available at http://ompl.kavrakilab.org.

OMPL is also integrated in ROS and will be available as a ROS package (see http://www.ros.org/wiki/ompl/unstable for documentation and download instructions). It will be fully integrated with the next major release of ROS (D-Turtle), which is scheduled for release in early 2011.

On top of the OMPL library, we have developed OMPL.app: a GUI for rigid body motion planning that allows users to load a variety of mesh formats that define a robot and its environment, define start and goal states, and play around with different planners. The OMPL.app code also comes with concrete command-line examples of motion planning in SE(2) and SE(3) (with and without differential constraints), using ODE for physics simulation, PQP for collision checking, and Assimp for reading meshes. OMPL.app is distributed under the Rice University software license (essentially, free for non-commercial use).

The Kavraki lab is fully committed to further developing OMPL for research and educational purposes. Please check out http://ompl.kavrakilab.org/education.html and contact us if you are interested in using OMPL in your class.

This project is supported in part by NSF CCLI grant #0920721 and a generous gift by Willow Garage.

Ed.: here's a video from a year and a half ago showing some of Ioan Sucan's work with OMPL and the PR2.

Find this blog and more at planet.ros.org.


About this Archive

This page is an archive of entries from January 2011 listed from newest to oldest.

December 2010 is the previous archive.

February 2011 is the next archive.
