Recently in robots Category

Husqvarna Research Platform


From Stefan Grufman via ros-users@

We would like to announce support for ROS in some of our products. We will be showing this at ICRA 2016 in Stockholm, May 16-20.

Husqvarna Group has been manufacturing and selling robotic lawn mowers for more than 20 years. These robots are pretty basic when it comes to sensors and intelligence, but we are of course researching how these products will change in the future. We have spent some time doing internal research, but in order for us to better work with you (the real researchers!) we have now adapted our robot (Automower 330X) to ROS by exposing an interface and implementing a driver for it (the driver will be available as open source soon). We really like the trend in robotics research towards robustness and long-term autonomy. This is an area where we think we can help boost the research by making our hardware available to researchers.

The idea is that we have a very robust & safe robot that will operate 24/7 in all weather conditions (except Scandinavian winter). It has a safety system (collision, lift and the loop around your area) and it will automatically return to the charging station when charging is needed. There is also plenty of space to include your own set of sensors as well as computational power, both inside the chassis and outside. We can provide mechanical drawings of mounts that you can print on an SLS/SLA machine.

So, the offer to you is to get access to this platform, which we call the Husqvarna Research Platform (HRP), and use it as an outdoor mobile robotics platform for your research. If you like, the safety system can be used to run multiple battery cycles without needing to handle docking/charging. This could for example be used when collecting data sets over long periods of time. The HRP also supports manual mode, in which case you have full control of the motors (through the "/cmd_vel" topic) and can do whatever you need. You can mount extra computing power (we usually use an Odroid XU4) and/or sensors of your choice.
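As a minimal sketch of what manual mode looks like in practice (this is not Husqvarna code; the topic name comes from the announcement, while the speeds and publish rate are illustrative):

```python
#!/usr/bin/env python
# Minimal sketch (not Husqvarna code): stream velocity commands to the HRP
# in manual mode. The topic comes from the post; speeds/rate are illustrative.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('hrp_teleop_sketch')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
rate = rospy.Rate(10)          # 10 Hz command stream
cmd = Twist()
cmd.linear.x = 0.3             # m/s, gentle forward motion
cmd.angular.z = 0.1            # rad/s, slow turn
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()
```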

The platform will be presented and demoed by Husqvarna as well as one of our research partners, Örebro University (AASS), during ICRA 2016. We will have a booth at the ICRA expo and would like to invite you all to come and talk with us there. During ICRA 2016 we will also collect your research ideas and hand out the mower shown at the demo for the best one!

Husqvarna Group information can be found here: http://www.husqvarnagroup.com/en

Information on our robotic products can be found here: http://www.husqvarna.com/uk/products/robotic-lawn-mowers/

Ridgeback, the mobility solution for Baxter

From Meghan Hennessey of Clearpath Robotics


Clearpath Robotics announced the newest member of its robot fleet: an omnidirectional development platform called Ridgeback. The mobile robot is designed to carry heavy payloads and easily integrate with a variety of manipulators and sensors. Ridgeback was unveiled as a mobile base for Rethink Robotics' Baxter research platform at ICRA 2015 in Seattle, Washington.

 

"Many of our customers have approached us looking for a way to use Baxter for mobile manipulation research - these customers inspired the concept of Ridgeback. The platform is designed so that Baxter can plug into Ridgeback and go," said Julian Ware, General Manager for Research Products at Clearpath Robotics. "Ridgeback includes all the ROS, visualization and simulation support needed to start doing interesting research right out of the box."

 

Ridgeback's rugged drivetrain and chassis are designed to move manipulators and other heavy payloads with ease. Omnidirectional wheels provide precise control for forward, lateral or twisting movements in constrained environments. Like other Clearpath robots, Ridgeback is ROS-ready and designed for rapid integration of sensors and payloads; specific consideration has been given to the integration of the Baxter research platform.

 

"Giving Baxter automated mobility opens up a world of new research possibilities," said Brian Benoit, senior product manager at Rethink Robotics. "Researchers can now use Baxter and Ridgeback for a wide range of applications where mobility and manipulation are required, including service robotics, tele-operated robotics, and human robot interaction."

 

Learn more about Ridgeback AGV at www.clearpathrobotics.com/ridgeback




Erle Robotics brain and vehicles

From Víctor Mayoral Vilches of Erle Robotics via ros-users@

Hi everyone,

I'd like to introduce Erle-Brain (https://erlerobotics.com/blog/product/erle-brain/), a Linux autopilot: a ROS-powered embedded computer that lets you build different kinds of drones and robots.

Using Erle-Brain we've built several vehicles (Erle-Copter, Erle-Plane, Erle-Rover, ...) displayed at http://wiki.ros.org/Robots, and we keep exploring new paths. The brain runs the APM autopilot software (in Linux), which connects with ROS through the mavros bridge, allowing the robots to be controlled simply by publishing to ROS topics.

This ROS package (https://github.com/erlerobot/ros_erle_takeoff_land) shows a simple example of how to autonomously take off and land a VTOL powered by Erle-Brain.

We are really excited to see what people can do with our Brain and vehicles, so we've decided to launch a program called dronEDU (dronedu.es) that offers discounts for educational and research purposes.
Feel free to get in touch with us if you are interested.
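For a flavor of what the mavros-based control described above looks like, here is a rough Python sketch of a takeoff-and-land sequence. It is not the code from the linked package; the service names assume a default /mavros namespace and an APM-style GUIDED mode.

```python
#!/usr/bin/env python
# Rough sketch of a takeoff-and-land sequence through mavros; not the code
# from the linked package. Service names assume a default /mavros namespace
# and an APM-style GUIDED mode.
import rospy
from mavros_msgs.srv import CommandBool, CommandTOL, SetMode

rospy.init_node('takeoff_land_sketch')
rospy.wait_for_service('/mavros/cmd/arming')
arm = rospy.ServiceProxy('/mavros/cmd/arming', CommandBool)
set_mode = rospy.ServiceProxy('/mavros/set_mode', SetMode)
takeoff = rospy.ServiceProxy('/mavros/cmd/takeoff', CommandTOL)
land = rospy.ServiceProxy('/mavros/cmd/land', CommandTOL)

set_mode(custom_mode='GUIDED')   # autopilot accepts position commands
arm(True)                        # arm the motors
takeoff(altitude=2.0)            # climb to roughly 2 m
rospy.sleep(10.0)                # hover briefly
land(altitude=0.0)               # descend and disarm
```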
From Sammy Omari of Skybotix AG and ETHZ ASL via ros-users@

We are happy to announce an Early Adopter Program for the visual-inertial sensor developed by the Autonomous Systems Lab, ETH Zurich and Skybotix AG.


The VI-Sensor is a light-weight, fully-calibrated and time-synchronized hardware platform for visual-inertial odometry applications, e.g. for UAV navigation in cluttered environments. It features a high-quality global shutter HDR stereoscopic camera and an industrial-grade inertial measurement system. A detailed spec-sheet can be found here.
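As a sketch of how one might consume the sensor's time-synchronized output from ROS, the node below pairs up stereo frames with message_filters; the topic names are assumptions, since the actual driver may name them differently.

```python
#!/usr/bin/env python
# Sketch: consume the VI-Sensor's time-synchronized output in ROS.
# Topic names are assumptions; the actual driver may differ.
import rospy
import message_filters
from sensor_msgs.msg import Image, Imu

def stereo_cb(left, right):
    # Hardware sync should give both frames identical timestamps.
    rospy.loginfo('stereo pair at %s', left.header.stamp)

def imu_cb(msg):
    pass  # fuse IMU samples here

rospy.init_node('vi_sensor_listener')
left_sub = message_filters.Subscriber('/cam0/image_raw', Image)
right_sub = message_filters.Subscriber('/cam1/image_raw', Image)
sync = message_filters.TimeSynchronizer([left_sub, right_sub], 10)
sync.registerCallback(stereo_cb)
rospy.Subscriber('/imu0', Imu, imu_cb)
rospy.spin()
```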


The Early Adopter package includes:

- VI-Sensor (factory calibrated: intrinsic, spatial & temporal inter-sensor)

- Linux driver (ROS enabled)

- SDK with example code for working with image and IMU data in OpenCV and/or ROS

- Ready for the stereo visual odometry framework viso2

- Power supply cable kit

- Access to the Wiki page 

- Driver and firmware updates

- VI-Sensor protection case

 


 

Price: EUR 3'900.00 (*)


The program addresses research groups around the world that are eager to get their hands on the sensor early.

If you are interested in receiving an Early Adopter quotation, please contact us via sales@skybotix.com indicating your invoice and shipping address.

Please submit your purchase order by May 31st. Purchase orders received after this date cannot be taken into account for the Early Adopter Program.


We are looking forward to hearing from you.


From Bert Willaert of Intermodalics.

Intermodalics is currently developing a depalletizing application for a client. The goal is to move an average of 2,000 crates per hour from standard pallets to a conveyor belt. Additional challenges include more than 10 different crate types in varying colors; the crates are not necessarily empty, and they are randomly stacked.

The application consists of a UR10 robot from Universal Robots, a 3D camera, an Intermodalics Intelligent Controller (IIC) and an active pallet lift. The application software running on the IIC makes extensive use of ROS and the Orocos toolchain. Orocos is a software framework for real-time, distributed robot and machine control; it is seamlessly integrated with ROS and has both industrial and academic users worldwide.

For finding the crates' position and orientation, Intermodalics developed a crate localizer that builds on the PCL library as well as a set of in-house point-cloud processing algorithms. The ROS visualization tool RViz proved absolutely invaluable during the development of this crate localizer.
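Intermodalics' localizer itself is proprietary, but as a toy illustration of the kind of point-cloud reasoning involved, the sketch below picks out the top surface of a crate stack by height from a depth camera cloud. The topic name, the z-up axis convention and the thresholds are all assumptions.

```python
#!/usr/bin/env python
# Toy sketch only; the actual Intermodalics localizer is proprietary and
# PCL-based. This shows the general flavor: pick crate-top candidates from a
# depth cloud by height. Topic, z-up convention and thresholds are assumptions.
import rospy
import numpy as np
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2

def cloud_cb(msg):
    pts = np.array(list(point_cloud2.read_points(
        msg, field_names=('x', 'y', 'z'), skip_nans=True)))
    if pts.size == 0:
        return
    top = pts[:, 2].max()                        # highest point in the stack
    layer = pts[np.abs(pts[:, 2] - top) < 0.02]  # ~2 cm slab = top surface
    rospy.loginfo('top-layer centroid: %s', layer.mean(axis=0))

rospy.init_node('crate_top_sketch')
rospy.Subscriber('/camera/depth/points', PointCloud2, cloud_cb)
rospy.spin()
```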


The use of the ROS-Industrial package for the UR robot allows both the motions and the application state machine to be simulated. This significantly facilitates the implementation of the whole application.

The integration of the UR controller and the IIC does not affect the inherent safety feature of the UR robot, which makes the robot stop if it encounters excessive forces. If such a stop occurs, the application can easily be restarted by a simple operator intervention.

HERE mapping cars run ROS


As reported at HERE Three Sixty, their global fleet of hundreds of mapping cars is running ROS!

HERE car

They carry laser range-finders, cameras, and GPS that are used to estimate the vehicle's position and gather 3-D pictures of the surrounding environment. That data gets shipped back to their headquarters for processing.

As HERE's Michael Prados put it, "The system of sensors and computers means the software that's needed is very like that which is used to create robots." So they decided to build their cars' software on ROS. The software runs on a headless server in the car's interior, with the driver interacting via a mobile application on a tablet that he or she can operate easily from the seat.

HERE car interior

"We chose the open source ROS because it was the best solution, hands-down," Michael concludes. "And now we're looking into the ways that we might give back to OSRF, and help its future success."

Read the whole story at HERE Three Sixty.

PR2 Support transferred to Clearpath Robotics

From Clearpath Robotics via ros-news@

Clearpath Robotics welcomes the PR2 robot and community to its growing family

 

(Menlo Park, CA and Kitchener, ON, Canada - January 15, 2014)  Willow Garage, the developer of PR2, announces the immediate transfer of support and services responsibilities to Clearpath Robotics, a leader in mobile robotics for research and development.  Willow Garage's development of PR2 along with the Robot Operating System (ROS) has produced the world's leading mobile manipulation platform for research and development.  Willow Garage will continue to sell its remaining stock of PR2 systems while Clearpath Robotics now becomes the sole provider of hardware and software support to current and future PR2 customers.  Interest in PR2 systems should continue to be directed to Willow Garage through its portal at www.willowgarage.com, while members of the PR2 community should direct correspondence to www.clearpathrobotics.com.

 

"Willow Garage is committed to continue to support customers of its PR2 personal robotics platform,"  said Scott Hassan, Founder and Chairman, Willow Garage.  "I am delighted that Clearpath Robotics will be fulfilling that commitment at least through 2016."

 

"The PR2, along with ROS, changed the pace of robotics research and created history," said Matt Rendall, CEO at Clearpath Robotics. "We've been a champion of ROS since the start, so we understand and value the PR2 community and their work. We're ecstatic to take on service responsibilities for this piece of history, and advance development within the community."

 

The PR2 is a compliant mobile manipulation platform built by Willow Garage. Released for production in 2010, the robot's safe, modular design spurred groundbreaking research in the fields of autonomy, mobile manipulation, and human-robot interaction. The standardized platform enables researchers to share their work and leverage the open source software community (ROS); today over 1000 software libraries exist for the 40 PR2s in use in over a dozen countries.

 

Clearpath Robotics has been a longstanding partner of Willow Garage: it was an early adopter of ROS, the first manufacturing partner for TurtleBot, and a founding sponsor of the annual ROS developers conference, ROSCon; Clearpath's CTO, Ryan Gariepy, is also a founding board member of the Open Source Robotics Foundation (OSRF).

 

In order to provide continued customer excellence for PR2 support, Clearpath Robotics is currently hiring Open Source Software Engineers. (http://www.jobscore.com/jobs/clearpathrobotics/open-source-support-engineer/bCzfUeyq8r44AXiGakhP3Q?ref=rss&sid=68).

Clearpath Robotics' Husky Goes Hydro


Clearpath Robotics announces Hydro Medusa Support


Husky software is now available for the new ROS Hydro Medusa Platform


(Kitchener, ON, Canada - October 17, 2013) Clearpath Robotics has launched Husky software for ROS' latest distribution, Hydro. The largest change in the package was the transition to the new ROS build system, catkin. Significant improvements to the Gazebo-based Husky simulator and basic autonomy are also included, completely free of charge. The package, and all other Husky ROS software, is available at www.github.com/husky.


"Along with the improvements we've made to our own libraries, upgrading to ROS Hydro makes available a significant set of other new features, including compatibility with the latest version of Gazebo and easy access to the alpha release of the MoveIt! manipulation library," said Ryan Gariepy, CTO at Clearpath Robotics. "The release of ROS Hydro validates the ability of the global robotics community to sustain a common software framework, while simultaneously adding exciting new functionality."


This Husky software release has maintained topic names and clearpath_base message types to ensure a smooth transfer from ROS versions as far back as ROS Electric. Users can continue to use rosbuild workspaces in Hydro. It is recommended to begin the migration to the catkin framework as rosbuild will be unavailable in the next ROS distribution (Indigo Igloo), currently slated for release in April 2014. Husky for Hydro can be used alongside previous versions of ROS on the same workstation. 


For those who don't yet own a Husky, a simulator package is available here (http://wiki.ros.org/husky_simulator) to enable dynamically-accurate Husky simulation on Ubuntu desktop with one command.


Clearpath will continue to publish detailed guides to the use of ROS software, starting with an introductory article which can be found here (http://www.clearpathrobotics.com/husky-for-ros-hydro/).


University of Costa Rica Explores Aerospace Research

Clearpath Robotics announces the use of their Husky in Costa Rica


Dr. Geovanni Martinez from the University of Costa Rica has developed a novel visual odometry algorithm for accurate and more efficient tracking of Mars rover navigation. Dr. Martinez is using Clearpath Robotics' Husky to test and validate the algorithm, which uses one-stage maximum-likelihood estimation rather than traditional two-stage algorithms.


"It's fantastic to witness breakthrough research of this nature, and to know that it is being validated and furthered because of our mobile robotic platform," said Matt Rendall, Chief Executive Officer at Clearpath Robotics.


Dr. Martinez' team is creating a real-time image acquisition system consisting of three IEEE-1394 cameras. The system is being developed under Ubuntu 12.04.2 LTS, ROS Fuerte and the programming language C. The image acquisition system corrects, in real time, the radial and tangential distortions due to the camera lens. With regard to the hardware, Dr. Martinez commented, "We like Husky A200 because the software for image acquisition, and driving the robot, was easy to implement using ROS. It saved us a lot of development time. Additionally, it is strong enough to be driven in extreme environments."
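The distortion-correction step mentioned above maps directly onto OpenCV. A minimal sketch, with placeholder calibration values standing in for the real per-camera calibration:

```python
# Sketch of the lens-distortion correction step with OpenCV. The intrinsic
# matrix and distortion coefficients below are placeholders; in practice they
# come from calibrating each IEEE-1394 camera.
import cv2
import numpy as np

K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])              # placeholder intrinsics
dist = np.array([-0.2, 0.05, 0.001, 0.001, 0.0])   # k1 k2 p1 p2 k3 (placeholder)

img = cv2.imread('frame.png')                      # one captured frame
undistorted = cv2.undistort(img, K, dist)          # radial + tangential fix
cv2.imwrite('frame_rect.png', undistorted)
```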


Using the algorithm, the rover's motion is estimated by maximizing the conditional probability of the frame-to-frame intensity differences at the observation points. The conditional probability is computed by expanding the intensity signal in a Taylor series and neglecting the nonlinear terms. This results in the well-known optical flow constraint. A linearized 3-D observation point position transformation is also used, which maps the 3-D position of an observation point before motion to its position after motion, given the rover's motion parameters. Perspective projection of the observation points into the image plane and zero-mean Gaussian stochastic intensity errors at the observation points are also assumed.
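In symbols (our notation, reconstructed from the description above, not Dr. Martinez's own):

```latex
% Our notation, reconstructed from the description above.
% d_i : frame-to-frame intensity difference at observation point i
% b   : rover motion parameters (3-D translation and rotation)
% The Taylor expansion yields the optical flow constraint
%   d_i \approx -\nabla I(\mathbf{x}_i)^{\top} \, \Delta\mathbf{x}_i(b),
% where \Delta\mathbf{x}_i(b) is the projected displacement of point i under
% motion b. With zero-mean Gaussian intensity errors, maximizing the
% conditional probability reduces to least squares:
\hat{b} = \arg\max_{b}\; p(d_1,\ldots,d_N \mid b)
        = \arg\min_{b}\; \sum_{i=1}^{N}
          \left( d_i + \nabla I(\mathbf{x}_i)^{\top} \Delta\mathbf{x}_i(b) \right)^{2}
```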


RoadNarrows is pleased to announce the compatibility of its Hekateros family of robotic manipulators with ROS, the Robot Operating System, making them easier than ever to use in research and light manufacturing. Developed by the Open Source Robotics Foundation, ROS provides a standardized framework allowing easy integration of robots, sensors, and computing platforms to solve complex problems through the combination of simple ROS-enabled components.


Hekateros manipulators are available for sale in 4DOF (articulated planar arm) and 5DOF (articulated planar arm + rotating base) configurations. The Hekateros platforms are ideal for applications and research in robotic control systems, visual-servoing, machine intelligence, artistic installations, and light manufacturing.


Standard Hekateros models have an impressive range of motion (see below) with a fully extended reach of approximately 1m and a payload capacity of nearly 1kg. Each arm can be special-ordered to meet custom length and loading requirements in order to suit the needs of almost any application. All versions of Hekateros come with a default end-effector based on the RoadNarrows Graboid gripper, with a built-in webcam for visual-servoing applications.


The wrist and rotating base both provide continuous rotation while passing power and data (USB, video, Dynamixel™, GPIO and I2C) to the processor in the base. Open mechanical and electrical interfaces allow for the integration of additional sensors and actuators beyond those included on the standard arm. Add-ons may be integrated directly into the base of the robot, at an equipment deck on top of the rotating base, or at the open end-effector interface.


Hekateros manipulators are network enabled, and onboard processing is powered by the 1GHz ARM processor of the Gumstix® Overo® FireSTORM COM. This allows for autonomous operation of Hekateros in isolation from a computing infrastructure, or the ability to connect Hekateros with a powerful computing array for computationally intensive tasks. Multiple manipulators may also be configured for simultaneous and cooperative operation, and can easily be integrated with a variety of networked platforms and sensors.


The Hekateros platform is built around the powerful Dynamixel™ actuators by ROBOTIS™, which provide many advanced features such as: continuous rotation through 360 degrees; high-resolution encoders; excellent torque, position, and velocity feedback and control; and an extensive low-level interface to monitor the servo state and health. On top of the Dynamixel firmware, RoadNarrows has built libraries and utilities that expose all of the features of Dynamixel servos through a clean and uniform C++ interface. Key features of the RoadNarrows Dynamixel library include virtual odometry for continuous rotation, a software PID for motion control in continuous rotation, and a command line utility (dynashell) for accessing and controlling a chain of Dynamixel servos. The Hekateros ROS packages bring the full power of Dynamixel actuators to ROS.


The Hekateros ROS interface exposes all functionality of the Hekateros manipulators as services, subscriptions and action servers. The control interface also publishes extensive state data on every aspect of the arm. The hekateros_control node conforms with ROS standards, such as the ROS industrial interface and MoveIt! motion planning suite.


In addition to the hekateros_control node, users of the Hekateros ROS interface are also provided with:

  • a graphical interface (hekateros_panel) that provides easy access to every feature of the control node,

  • numerous launch files including live demos, simulations, and integration with advanced motion planning libraries, and

  • extensive documentation in the wiki.


More detailed information about the Hekateros ROS interface is available on the wiki pages on GitHub.


The Hekateros platform was developed thanks in part to the support of the NSF SBIR program under grant number 1113964.


Please direct all inquiries to info@roadnarrows.com.


About RoadNarrows LLC


Based in Loveland, Colorado, RoadNarrows is a privately-held robotics and technologies company founded in 2002. RoadNarrows Research & Development develops intelligent peripheral components and accessories, including cameras, mobile sensor architectures, and open-source platform software, to give robotics researchers advanced time- and resource-saving tools. RoadNarrows' retail operation sells and provides technical support for some of the most popular robotic product lines used by the academic and research community world-wide.


For more information, visit: www.roadnarrows.com.




Links:


The Hekateros project main page: http://www.roadnarrows.com/Hekateros/


Hekateros for sale on the RoadNarrows Store: http://www.roadnarrows-store.com/hekateros-arm.html


Hekateros' range of motion on YouTube: https://www.youtube.com/watch?v=13lpd655wC4


The RoadNarrows Graboid Gripper for sale on the RoadNarrows store: http://www.roadnarrows-store.com/roadnarrows-graboid-series-d.html


The Gumstix® Overo® FireSTORM COM, used in Hekateros, for sale on the RoadNarrows store: http://www.roadnarrows-store.com/gumstix-overo-firestorm-com.html


The Gumstix® main page: https://www.gumstix.com/


Dynamixel™ servos by ROBOTIS™ for sale on the RoadNarrows Store: http://www.roadnarrows-store.com/manufacturers/robotis/dynamixel-servos.html


The ROBOTIS™ main page: http://www.robotis.com/


The MoveIt! main page: http://moveit.ros.org/


The Hekateros ROS interface documentation wiki pages on GitHub: https://github.com/roadnarrows-robotics/hekateros/wiki


NSF GRANT SUPPORT AND DISCLAIMER - The project described above is supported by Grant Number 1113964 from the National Science Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.



PAL Robotics is proud to announce the upcoming release of our newest robotic platform REEM-C, the first commercially available biped robot from PAL Robotics. It leverages our experience developing the REEM-A and REEM-B biped robots and the commercial service robot REEM (our first ROS-compatible robot).



REEM-C has been developed to meet the academic community's need for robust and versatile robotic research platforms. The platform enables research on walking, grasping, navigation, whole-body control, human-robot interaction, and more. It is integrated with ROS and Orocos (for real-time motion generation and control).


REEM-C is an adult-sized humanoid (165 cm) with 44 degrees of freedom, two i7 computers, force/torque sensors and range finders on each foot, a stereo camera, 4 microphones, and other devices that make it one of the best-equipped research platforms today. Optionally, a depth camera can be attached to the head. Software for walking, grasping, navigation and human-robot interaction has already been developed.


We would also like to announce promotional conditions for orders placed before August 31st.


For further information about REEM-C, please contact PAL Robotics at info@pal-robotics.com, or visit REEM-C's webpage. For information on promotion conditions and inquiries, please contact business@pal-robotics.com.

A New Robot Joins the ROS Community

Over the past few years ROS has grown incredibly fast. ROS support now exists
for a wide range of robots, such as manipulators, UAVs, surface vessels, ground
vehicles, humanoids, and many more. With Clearpath Robotics' introduction of
Grizzly, a whole new category of robots is added to ROS: the Robotics Utility
Vehicle.


Designed for the most aggressive of agriculture, mining and defense robotics
research programs, Grizzly is an ATV-sized robotic platform built to perform
like a tractor with the precision of an industrial robot. It can pull a plow, carry
a massive 600 kg payload, and mount a wide range of standard utility vehicle
accessories.

Grizzly is a ROS-native robot, allowing users to pull from a huge resource of
information and code, as well as cooperate with a fast growing community of
experts. Using ROS also allows code to be ported from one robot to another,
enabling you to take your lab research into the field quickly and easily.

Grizzly is aptly named. This bot is equipped with an extremely powerful drivetrain
delivering a maximum drawbar pull of 6300 N (1400 lbf). It can survive the
toughest tests, providing modularity while maintaining the rugged and robust
design, which has become a Clearpath trademark. With 26" all-terrain tires and
an oscillating front axle, Grizzly can conquer large obstacles with all four wheels
securely on the ground. It also offers top of the line control system performance.
Independent high power DC motors with individual closed loop control give fine
control even in the toughest terrain, while high resolution encoders and an array
of internal sensors provide detailed feedback on the robot's state.



ROS welcomes Grizzly to the community!

If you're looking for more information, check out the Clearpath Robotics Grizzly
website.

Robots Using ROS: BIRD MURI

Getting Small UAVs to Imitate Human Pilots Flying through Dense Forests

The Robotics Institute at CMU has been developing systems that learn from humans. Using a class of machine learning techniques called imitation learning, the group has developed AI software that enables a small, commercially available off-the-shelf AR.Drone to autonomously fly through dense trees for over 3.4 km in experimental runs. They are also developing methods for longer-range planning with such purely vision-guided UAVs. This technology has great potential for surveillance, search and rescue, and allowing UAVs to safely share airspace with manned aircraft.
Watch the flight in unstructured environments below.

ROS on Toyota's HSR


Cross Posted from the Open Source Robotics Foundation Blog

On the heels of the recent announcement that Rethink's Baxter was built on ROS, we heard today from our friends at Toyota that their new robot is also running ROS!

Toyota's Human Support Robot, or HSR, will provide assistance to older adults and people with disabilities. A one-armed mobile robot with a telescoping spine, the HSR is designed to operate in indoor environments around people. It can reach the floor, tabletops, and high counters, allowing it to do things like retrieve a dropped object or put something away in its rightful place. An exemplar of the next generation of robot manipulators, the arm is low-power and slow-moving, reducing the chance of accident or injury as it interacts with people.

HSR

And it runs ROS. Dr. Yasuhiro Ota, Manager of the Toyota Partner Robot Program, tells us that the HSR runs ROS Fuerte [http://ros.org/wiki/fuerte] and uses a number of ROS packages, including roscpp, rospy, rviz, tf, std_msgs, pcl, and opencv. As for why they chose to use ROS, Dr. Ota says, "ROS provides an excellent software developmental environment for robot system integration, and it is also comprised of a number of useful ready-to-use functions."

Rethink ROS

Cross Posted from the Open Source Robotics Foundation Blog

There's exciting news out of Boston today with the launch of Rethink Robotics's new robot. Rethink Robotics is developing a family of low cost and highly intelligent robots that can perform simple tasks in a manufacturing environment, increasing the productivity of the people around them. Rethink Robotics was founded by Rodney Brooks, former Director of the MIT Computer Science & Artificial Intelligence Laboratory, and co-Founder of iRobot Corporation.

Rethink's robots can be taken out of the box, taught a task by anyone, and start work in a few hours, eliminating the need for systems integration. They are safe to interact with people at close range and are easy to train and retrain on the fly. They are nothing like any existing industrial robots.

While all of this is very exciting for the robotics industry, and certainly for our friends at Rethink, what we personally find most exciting is the role played by ROS in today's news. Rethink's new Baxter robot is, in the words of CEO Scott Eckert, "built upon ROS." We had some hint from Rethink's (then Heartland's) support of ROSCon 2012 that they were doing something with ROS, but we were very pleasantly surprised today to hear that ROS is such a central part of Baxter.

Rethink's Baxter

As ROS edges closer to its five-year anniversary, this is a great milestone for the ROS community. Rethink is actively hiring for a Senior Developer Relations Engineer with expertise in ROS, and expects that individual to play an important role as part of the ROS community.

Congratulations to everyone at Rethink Robotics, and we are looking forward to their contribution to the ROS community.

romeo stack announcement


Announcement by Thomas Moulard to ros-users

Hello everyone,

I am pleased to announce the release of the romeo stack for ROS. This stack is a joint work with François Keith.

Romeo is a 143-cm humanoid robot designed by Aldebaran Robotics.

Its full description is available here:

http://projetromeo.com/romeo-documentation/index.html

The romeo stack contains the URDF model of this robot and its associated meshes.

This robot is AFAIK the first humanoid robot whose full description is freely available in ROS (i.e. kinematic chain, dynamic information and meshes). We would like to thank Aldebaran Robotics for authorizing us to publish these data.

The package is not yet complete (we are missing the hands, the eyes and the sensor positions, for instance) but we will be updating it as soon as possible. We will also provide SRDF and contact zone information (see the rcpdf stack).

Best,

Thomas Moulard
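For readers who grab the stack, a quick way to poke at the model is to parse the URDF. A minimal sketch, assuming the urdf_parser_py package and a local copy of the file (the path is hypothetical):

```python
# Quick sketch: inspect the Romeo model with the urdf_parser_py package.
# The file path is hypothetical; point it at the URDF from the romeo stack.
from urdf_parser_py.urdf import URDF

robot = URDF.from_xml_file('romeo.urdf')
print('robot:', robot.name)
for joint in robot.joints:
    print(joint.name, joint.type)   # walk the kinematic chain
```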



It looks like the TurtleBots at Clearpath Robotics are having some springtime fun. I wonder if there are any TurtleBot easter eggs you can find.

Guest post from Mikkel Rath Pedersen, Department of Mechanical and Manufacturing Engineering, Aalborg University

The autonomous industrial mobile manipulator "Little Helper" has been the focus of many research activities since the first robot was designed in 2008 at the Department of Mechanical and Manufacturing Engineering at Aalborg University, Denmark. The focus has always been on flexible automation, since this is paramount as production companies experience a shift from mass production to mass customization. An aim is to use existing industrial hardware and to incorporate these components into a fully functioning industrial mobile manipulator.

Since the original design, the robot has been rebuilt several times. At present, the department has two versions of the Little Helper, at its two campuses in Aalborg and Copenhagen. The two systems use the same hardware, with only minor differences in the construction and electrical system.

Both systems include the following components:

  • KUKA Light Weight Robot (LWR) arm (7DOF, integrated torque sensors in each joint)
  • Neobotix MP-L655 differential drive platform, equipped with
    • Two SICK S300 Professional laser scanners
    • Five ultrasonic sensors
    • Eight 12V batteries, yielding 152 Ah @ 24V total
  • Schunk WSG-50 electrical parallel gripper
  • Microsoft Kinect RGBD Camera
  • Onboard ROS computer (workstation on one, laptop on the other)

A recent focus has been on the implementation of ROS on the entire system, in order to make the transition from vendor-specific communication protocols to something more general. This required the use of some existing packages that were readily available on the ROS website, including the stacks for the Kinect camera (openni_camera and openni_tracker) and the Neobotix stacks (neo_driver, neo_common and neo_apps) that were recently made available by Neobotix. However, much work has also gone into creating ROS packages for communicating with the KUKA LWR (through the Fast Research Interface available with the robot arm) and the Schunk gripper.

The goals of some current and future research projects are:

  • modular architectures for mobile manipulators,
  • task-level programming using robot skills,
  • gesture-based instruction of mobile manipulators, and
  • mission planning and control

The Little Helper is involved in the EU FP7 projects TAPAS and GISA (ECHORD).

For more information see www.machinevision.dk or www.m-tech.aau.dk

Contacts:

  • PhD Student Mikkel Rath Pedersen, mrp@m-tech.aau.dk
  • PhD Student Carsten Høilund, ch@m-tech.aau.dk
  • Postdoc Simon Bøgh, sb@m-tech.aau.dk
  • Postdoc Mads Hvilshøj, mh@m-tech.aau.dk
  • Professor Ole Madsen, om@m-tech.aau.dk
  • Associate Professor Volker Krüger, vok@m-tech.aau.dk


Guest post from Simon Roder from University of Bern, Institute for Surgical Technology & Biomechanics

The goal of our research is the development of a precision approach for minimally invasive hearing aid implantations. Our approach centers around an image-guided surgical robot system capable of drilling a direct tunnel access (diameter 1.2 mm) from the outside of the skull, through the temporal bone, into the middle ear. The drill trajectory is planned using high-resolution cone beam computed tomography. The deviation between the planned and the actual drill trajectory must be less than 0.5 mm in order to avoid damaging sensitive nerves within the temporal bone.

To achieve such accuracy, our system consists of a specifically developed robotic manipulator with 5-DOF serial kinematics, guided by an optical tracking system with a tracking accuracy of 20 microns. The robot weighs around 5 kg and can thus be mounted directly to an OR table. It comprises a sensitive force-torque sensor at its tool tip and an electromyography (EMG) sensor integrated in the instrument tip. The surgeon controls the system functionalities by means of a graphical user interface, and the robot system itself through haptic force feedback.

The robot system, together with available patient data, is modeled using ROS to observe the robot's movements, its sensors and possible collisions in real time. The model is updated and visualized using RViz running on a dedicated client computer connected via CAN to the robot control system.


Press release from CoroWare

Kirkland, WA - January 31, 2012 - CoroWare, Inc. (COWI.OB), today announced a new upgrade offer for existing CoroBot® Classic and CoroBot Explorer unmanned ground vehicle (UGV) customers. These upgrades will help bring earlier CoroBot UGV models up to date, and will enable a new class of CoroBot applications based on ROS from Willow Garage.

The ROS software platform is rapidly becoming the standard for open robotics development, and has a large and active developer community. CoroWare's ROS Upgrade Program will help its customers migrate their existing CoroBot UGV platforms, which are based on Linux and Player software distributions, to the Robot Operating System (ROS), which has been deployed on unmanned ground vehicles, air vehicles, and surface vehicles around the world.

"Willow Garage is delivering the ROS software platform with which vendors, such as CoroWare, can provide affordable and open mobile robot platforms that robot scientists need for prototyping robotics applications," said Brian Gerkey of Willow Garage. "CoroWare's announcement today will help grow the community of robotics researchers and educators who are building applications based on ROS."

CoroWare's ROS Upgrade Program includes an initial assessment of the CoroBot that the customer purchased. For some customers, only software upgrades will be required, and these will be free of charge. For other customers who purchased older CoroBot models, hardware upgrades may be required and will be priced accordingly.

"CoroWare's ROS Upgrade Program will give our customers a greater choice of ROS-based applications and software modules to run on their existing CoroBot platforms", said Andrew Zager, product marketing engineer at CoroWare. "Because ROS is not limited to any robotics platform, we look forward to migrating any third party mobile robots and applications to ROS in the future."

CoroWare's ROS Upgrade Program for all CoroBot platforms is available now. Customers may get further details by visiting our website at robotics.coroware.com, sending e-mail inquiries to sales@coroware.com, or contacting us at 1-800-641-2676, option 1.

Raven II open-source surgical robots


The Raven II is helping the open-source community advance the state of the art in surgical robotics. In a joint venture between the University of Washington and UC Santa Cruz, the National Science Foundation funded the development of seven identical Raven II surgical robots. Each system has a two-armed surgical robot, a guiding video camera, and a surgeon-interface system built on top of ROS.

These surgical robots are linked via the Internet so researchers can easily share new surgical robotics research and developments. Five Raven II robots are being given to major medical facilities at Harvard University, Johns Hopkins University, the University of Nebraska, UC Berkeley, and UCLA.

According to Blake Hannaford at the University of Washington:

"These are the leading labs in the nation in the field of surgical robotics, and with everyone working on the same platform, we can more easily share new developments and innovations."

For more information, please see the UC Santa Cruz press release and this gizmag.com article.


Weeding in organic orchards is a tedious process done either mechanically or by weed burning. Researchers at the University of Southern Denmark and Aarhus University created the ASuBot (Aarhus and Southern Denmark University Robot), a self-driving tractor, to handle navigation around trees in organic orchards. Weeding is done using gas burners, which the ASuBot ensures do not damage the trees.

ASuBot is built on a Massey Ferguson 38-15 garden tractor outfitted with a SICK laser range finder and Topcon AES-25 steering. It is able to navigate autonomously without the use of GPS antennas, which would not work under shaded trees and would also make the robot more costly.

The FroboBox, the ASuBot's on-board computer, is a Linux computer running the FroboMind software on top of ROS. FroboMind provides a common, conceptual architecture for field robots and has already been integrated with five different platforms.

For more information about ASuBot and FroboMind, please see fieldrobot.dk.

TheCorpora has posted a new update on Qbo showing their open-source robot platform driving around on its own and tracking people. Head over to their blog to find out more about the various sensors and techniques they use to make Qbo more social.

Bilibot: New Video, Now with Rebate


The Bilibot developers have been busy with their Create+Kinect platform and have two new things to share. First, there's a new video (above) that shows off their "Developer Edition", including the brand-new arm. Second, they're offering a rebate of up to $350 if you release original, innovative applications for Bilibot to the rest of the community. It's a great incentive to put the platform in the hands of more developers and encourage collaboration in the community.

For more information, please visit Bilibot.com.


CoroWare has announced support for ROS on their CoroBot and Explorer mobile robots. They will be supporting ROS on both Ubuntu Linux and Windows 7 for Embedded Systems, and plan to start shipping with ROS in the second quarter of this year.

"CoroWare's research and education customers are asking for open robotic platforms that offer a freedom of choice for both hardware and software components," said Lloyd Spencer, President and CEO of CoroWare. "We believe that ROS will futher CoroWare's commitment to delivering affordable and flexible mobile robots that address the needs of our customers worldwide."

In order to get their users up and running on ROS, CoroWare will be hosting a "ROS Early Adopter Program" using their CoroCall HD videoconferencing system.

For more information, you can see CoroWare's press release, or you can visit robotics.coroware.com.

ROS on MAVs with MAVLink



Users of Micro Air Vehicles (MAVs) will be happy to hear that the MAVLink developers have released software for ROS compatibility. MAVLink is a lightweight message transport used by more than five MAV autopilots, and it also offers support for two ground control stations. This broad autopilot support allows ROS users to develop for multiple autopilot systems interchangeably. MAVLink also enables MAVs to be controlled from a distance: if you are out of WiFi range, MAVLink can be used with radio modems to retain control at ranges of up to 8 miles.

MAVLink was developed in the PIXHAWK project at ETH Zurich, where it is used as the main communication protocol for autonomous quadrotors with onboard computer vision. MAVLink can also be used indoors on high-rate control links in systems like the ETH Flying Machine Arena.
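As a taste of how lightweight the protocol is to work with, here is a small pymavlink sketch that reads attitude telemetry; the UDP endpoint is an example, not a required value.

```python
# Sketch: reading MAVLink telemetry in Python with pymavlink.
# The UDP endpoint is an example; use your radio modem or autopilot address.
from pymavlink import mavutil

conn = mavutil.mavlink_connection('udp:127.0.0.1:14550')
conn.wait_heartbeat()                      # block until an autopilot is heard
print('heartbeat from system', conn.target_system)
while True:
    msg = conn.recv_match(type='ATTITUDE', blocking=True)
    print('roll=%.3f pitch=%.3f yaw=%.3f' % (msg.roll, msg.pitch, msg.yaw))
```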

MAVLink is compatible with two ground control stations: QGroundControl and HK Ground Control Station. Ground control stations allow users to visualize the MAV's position in 3D and control its flight. Waypoints can be set directly in the 3D map to plan flights. You can customize the layout of QGroundControl to fit your needs, as shown in this video:

MAVLink is by now used by several mainstream autopilots.



Guest post from Urko Esnaola of Tecnalia

Tecnalia and Pukas have cooperated to integrate sensors in a high-performance surfboard to record data of relevant surfing parameters in real operation -- while surfing waves.

The aim of the project is to have information about "what's going on" in a high-performance surfboard while a surfer is riding it. This will help (i) surfboard manufacturers, who gain valuable information for fabricating optimal-performance surfboards, and (ii) the surfing community, who gain very complete information about their surfing technique.

Strain gauges have been included to record the flex and torsion of the surfboard in real operation. An XSens MTi-G, integrating gyroscopes, accelerometers, a compass and GPS, has been incorporated to record data about the surfboard's accelerations, speed and movements. Pressure sensors have been installed on the surfboard deck to record data about the surfer's foot positions. All the data is recorded on a flash memory stick through an IGEPv2 embedded computer.

After a surf session has finished, data is transmitted over WiFi to a PC. The software system to visualize and process the data has been developed in ROS.

Phase 1, board construction and electronics performance validation, has successfully finished. The exciting Phase 2 has started: data analysis to find the keys to the mechanical behavior of surfboards and to improve surfers' technique. Professional surfers Aritz Aranburu, Hodei Collazo, Kepa Acero and Mario Azurza have already tested the surfboard. Other professionals like Tiago Pires, Joan Duru, Tim Boal and Eneko Acero are waiting for their chance.

Pukas - Tecnalia Surfsens project from Pukas Surf on Vimeo.

Footage of the visualization software:

Robots Using ROS: CSIRO's Bobcat



CSIRO's Bobcat is an S185 skid-steer loader, complete with lift arms. This heavy-duty outdoor robot enables CSIRO's robots to interact with an environment, rather than just move through it. To this end, they have equipped the Bobcat with a variety of sensors, including two horizontal lasers, a spinning laser, a camera, two IMUs, GPS, wheel encoders, and more. They also plan on integrating stereo, Velodyne, multi-modal radar, hyperspectral, and other sensors.

CSIRO's current focus with the Bobcat is shared and cooperative autonomy. With shared autonomy, a human tele-operator can intervene and provide corrections as the Bobcat performs a task. With cooperative autonomy, the Bobcat can leverage robots with other capabilities. This sort of coordination could enable a fleet of Bobcats to autonomously excavate an area.

CSIRO is in the process of migrating the Bobcat to ROS. The Bobcat was originally developed using DDX (Dynamic Data eXchange). DDX is a third generation middleware developed by CSIRO and provides features, like shared memory data exchange, that are complementary to ROS. They will continue using DDX for low-level realtime control, but sensor drivers and higher level code are being migrated to ROS. They are also investigating adding DDX-like transports to ROS.

Neato XV-11 Driver for ROS, albany-ros-pkg


All,

I would like to announce the availability of a simple driver for the Neato Robotics XV-11 for ROS. The neato_robot stack contains a neato_driver (generic python based driver) and neato_node package. The neato_node subscribes to a standard cmd_vel (geometry_msgs/Twist) topic to control the base, and publishes laser scans from the robot, as well as odometry. The neato_slam package contains our current move_base launch and configuration files (still needs some work).

I've uploaded two videos thus far showing the Neato.

I also have to announce our repository, since we've never officially done that: albany-ros-pkg.googlecode.com

I hope to have documentation for this new stack on the ROS wiki later today/tonight.

Mike Ferguson
ILS Social Robotics Lab SUNY Albany
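To give a flavor of the interface Mike describes (this sketch is not part of the announcement), here is a node that reads the Neato's laser and creeps forward until something gets close. Topic names follow the announcement; the speed and stopping distance are illustrative.

```python
#!/usr/bin/env python
# Minimal sketch (not part of the announcement): creep forward with the
# Neato until the laser sees something close. Topic names follow the
# announcement; the speed and 0.5 m stopping distance are illustrative.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

def scan_cb(scan):
    valid = [r for r in scan.ranges if r > 0.0]  # drop invalid (zero) returns
    if not valid:
        return
    cmd = Twist()
    cmd.linear.x = 0.2 if min(valid) > 0.5 else 0.0  # stop within 0.5 m
    pub.publish(cmd)

rospy.init_node('neato_creep_sketch')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
rospy.Subscriber('scan', LaserScan, scan_cb)
rospy.spin()
```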

The RGB-D project, a joint research effort between Intel Labs Seattle and the University of Washington Department of Computer Science & Engineering, has lots of demo videos on their site showing the various ways in which they have been using the PrimeSense RGB-D sensors in their work. These demos include 3D modeling of indoor environments, object recognition, object modeling, and gesture-based interactions.

In the video above, the "Gambit" chess-playing robot uses the RGB-D sensor to monitor a physical chessboard and play against a human opponent. And yes, that is the ROS rviz visualizer in the background.

More Videos/RGB-D Project Page

Robotics Engineering Excellence (RE2, Inc.) is a research and development company that focuses on advanced mobile manipulation, including self-contained manipulators and payloads for mobile robot platforms. As a spin-out of Carnegie Mellon, they've developed plug-n-play modular manipulation technologies, a JAUS SDK, and unmanned ground vehicles (UGVs). They focus on the defense industry, and their clients include DARPA, the US Armed Forces (Army, Navy, Air Force), the Robotics Technology Consortium, and TSWG. RE2 has recently adopted ROS as a platform to architect and organize code.

RE2 has several projects using ROS, including interchangeable end-effectors and force/tactile feedback for manipulators. Their Small Robot Toolkit (SRT) is a plug-n-play robot arm with interchangeable end-effector tools, which can be used as a manipulator payload for mobile platforms. RE2 has also developed the capability to automatically change out end-effectors, which is being used with a modular recon manipulator for vehicle-borne IEDs. Bomb technicians can switch between various tools, like drills, saws, and scope cameras, to inspect vehicles remotely. RE2 is also working on a force and tactile sensing manipulator, which provides haptic feedback for an operator. This sort of feedback makes it easier to perform tasks like inserting a key into a lock, or controlling a drill.

RE2's manipulation technologies are also being used on mobile platforms. They are developing a Robotic Nursing Assistant (RNA) to help nurses with difficult tasks, such as helping a patient sit up, and transferring a patient to a gurney. The RNA uses a mobile hospital platform with dexterous manipulators to create a capable tool for nurses to use. RE2 is also working on an autonomous robotic door opening kit for unmanned ground vehicles.

RE2's expertise in manipulation made them a natural choice to be the systems integrator for the software track of the DARPA ARM program. The goal of this track is to autonomously grasp and manipulate known objects using a common hardware platform. Participants will have to complete various challenges with this platform, like writing with a pen, sorting objects on a table, opening a gym bag, inserting a key in a lock, throwing a ball, using duct tape, and opening a jar. There will also be an outreach track that will provide web-based access. This will enable a community of students, hobbyists, and corporate teams to test their own skills at these challenges.

RE2 had its own set of challenges: build a robust and capable hardware and software platform for these participants to use. The ARM robot is a two-arm manipulator with a sensor head. The hardware, valued at around half a million dollars, includes:

  • Manipulation
    • Two Barrett WAM arms (7-DOF with force-torque sensors)
    • Two Barrett Hands (three-finger, tactile sensors on tips and palm)
  • Sensor head
    • Swiss Ranger 4000 (176x144 at 54fps)
    • Bumblebee 2 (648x488 at 48fps)
    • Color camera (5MP, 45 deg FOV)
    • Stereo microphones (44kHz, 16-bit)
    • Pan-tilt neck (4-DOF, dual pan-tilt)

A future version of the robot will incorporate a mobile base.

The software platform on the ARM robot is built on top of ROS. ROS was selected by RE2 for its modularity and tools. The modularity was important as the DARPA ARM project features an outreach program that will be providing a simulator. Users can switch between using the simulated and real robot with no changes to their code. The ARM platform also takes advantage of core ROS tools like rostest for testing and rosbag for data logging.
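As an example of the rosbag workflow mentioned above, data logged on the robot can be replayed and inspected offline with the Python API; the bag name and topic here are illustrative.

```python
# Sketch of the rosbag workflow mentioned above: log on the robot
# ("rosbag record ..." from a shell), then inspect offline in Python.
# The bag name and topic are illustrative.
import rosbag

bag = rosbag.Bag('arm_run.bag')
for topic, msg, t in bag.read_messages(topics=['/joint_states']):
    print(t.to_sec(), topic)   # timestamp and topic of each logged message
bag.close()
```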

ROS has already proven itself on the similar CMU HERB robot, which has two Barrett arms and a mobile base. The various participants, including those in the outreach track, will be able to take advantage of the many ROS libraries for perception, grasping, and manipulation. This includes open-source frameworks like OpenRAVE, which was used on HERB for grasping and manipulation tasks.

ROS Flying at ETH Zurich and Skybotix


Skybotix put out a video of their CoaX helicopter running with ROS teleoperation:

You can read more at I Heart Robotics.

Roland Philippsen sent in a bunch of photos of the ROS flag flying proudly with many of the ETH Zurich Autonomous Systems Lab robots running ROS, which we covered back in October.

ETH Zurich/Autonomous Systems Lab ETH Zurich/Autonomous Systems Lab

ETH Zurich/Autonomous Systems Lab ETH Zurich/Autonomous Systems Lab

Pictured: Markus Achtelik, Andreas Breitenmoser, Gregory Hitz, Ming Liu, Roland Philippsen.

Additional credits: Cedric Pradalier, Ralf Kaestner, Stephane Magnenat, Prof. Roland Siegwart.

Robot links: Magnebike, Limnobotics, sFly.



Taylor Veltrop has announced veltrop-ros-pkg as well as tools for Roboard-based humanoids

I am pleased to announce the Veltrop ROS Repository!

If any of you out there are using small servo-based robots, especially humanoids, then check this out!

The Veltrop ROS Repository leverages ROS to get hobbyists and researchers quickly up and running with the Roboard operating a humanoid robot.

The Roboard is a small 1 GHz 486 platform with built-in PWM control and many IO ports.

Info on KHR style humanoid

The repository consists of a stack suitable for the Roboard, and another stack specialized for small joint based robots.

The hobby community seems to be reinventing the wheel each time someone combines an embedded PC with one of these humanoid robots. For beginners this is too daunting, and for others it is very time consuming. So I hope to alleviate this, and get some help back too.

Here's a summary of some of the features:

  • Pose the robot based on definitions in an XML file
  • Execute motions by running a series of timed poses (XML)
  • Stabilization via gyro data
  • Definition of a KHR style robot linkage for 3D virtual modeling and servo control (URDF)
  • Calibrate trim of robot with GUI
  • Calibrate gyro stabilization with GUI
  • Import poses and trim (not motions) from Kondo's Heart2Heart RCB files
  • Control robot remotely over network with keyboard
  • Control robot with PS3 controller over bluetooth
  • Support for HMC6343 compass/tilt sensor
  • Support for Kondo gyro sensors
  • Stereo video capture and processing into point cloud
  • CPU heavy tasks (such as stereo processing) can be executed on remote computer
  • Controls Kondo PWM servos

Here are some missing parts (maybe others would like to contribute here?)

  • Control Kondo serial servos
  • GUI for editing and running poses/motions
  • Tool to capture poses
  • More sophisticated motion scripting
  • GUI for calibration of A/D inputs

My next goals for this project are to incorporate navigation, and arm/gripper trajectory planning.

The documentation is here: http://taylor.veltrop.com/robotics/khrhumanoidv2.php?topic=veltrop-ros-pkg There's a lot of other relevant information to the robot throughout the site.

The repository is hosted on sourceforge: http://sourceforge.net/projects/veltrop-ros-pkg

I hope someone out there has a chance to try this out and contribute!

Taylor

Robots Using ROS: ASL@ETH Zurich's Family of Robots

The Autonomous Systems Lab (ASL) at ETH Zurich is interested in all kinds of robots, provided that they are autonomous and operate in the real world. From mobile robots to micro aerial vehicles to boats to space rovers, they have a huge family of robots, many of which are already using ROS.

As ASL is historically a mechanical lab, their focus has been on hardware rather than software. ROS provides them a large community of software to draw from so that they can maintain this focus. Similarly, they run their own open-source software hosting service, ASLforge, which promotes the sharing of ASL software with the rest of the robotics community. Integrating with ROS allows them to more easily share code between labs and contribute to the growing ROS community.

The list of robots that they already have integrated with ROS is impressive, especially in its diversity:

  • Rezero: Rezero is a ballbot, i.e. a robot that balances and drives on a single sphere.
  • Magnebike: Magnebike is a compact, magnetic-wheeled inspection robot. Magnebike is designed to work on both flat and curved surfaces so that it can work inside metal pipes with complex arrangements. A rotating Hokuyo scanner enables them to do research on localization in these complex 3D environments.
  • Robox: Robox is a mobile robot designed for tour guide applications.
  • Crab: Crab is a space rover designed for navigation in rough outdoor terrain.
  • sFly: The goal of the sFly project is to develop small micro helicopters capable of safely and autonomously navigating city-like environments. They currently have a family of AscTec quadrotors.
  • Limnobotics: The Limnobotics project has developed an autonomous boat that is designed to perform scientific measurements on Lake Zurich.
  • Hyraii: Hyraii is a hydrofoil-based sailboat.

That's not all! Stéphane Magnenat of ASL has contributed a bridge between ROS and the ASEBA framework. This has enabled integration of ROS with many more robots, including the marXbot, handbot, smartrob, and e-puck. ASL also has a Pioneer mobile robot using ROS, and their spinout, Skybotix, develops a coax helicopter that is integrated with ROS. Not all of ASL's robots are using ROS yet, but there is a chance that we will soon see ROS on their walking robot, autonomous car, and AUV.

ASL has created an ASLForge project to provide ROS drivers for Crab, and they will be working over the next several months to select more general and high-quality libraries to release to the ROS community.

ASL's family of robots is impressive, as is their commitment to ROS. They are single-handedly expanding the ROS community in a variety of new directions and we can't wait to see what's next.

Many thanks to Dr. Stéphane Magnenat and Dr. Cédric Pradalier for help putting together this post.

ROS interface for the Parrot AR.Drone

| No Comments | No TrackBacks

parrot_ardrone3.jpg

Nate Roney from the Mobile Robotics Lab at SIUE has announced drivers for the Parrot AR.Drone, as well as the siue-ros-pkg repository.

Greetings everyone,

I'd like to share a project I've been working on with the ROS community.

Some may be familiar with the Parrot AR.Drone: an inexpensive quadrotor helicopter that came out in September. My lab got one, but I was pretty disappointed that it didn't have ROS support out of the box. It does have potential, though, with 2 cameras and a full IMU, so it seemed like a worthwhile endeavor to create a ROS interface for it.

So, I would like to announce the first public release of the ROS interface for the AR.Drone. Currently, it allows control of the AR.Drone using a geometry_msgs/Twist message, and I'm working on getting the video feed, IMU data, and other relevant state information published as well. Unfortunately, the documentation on how the Drone transmits its state information is a bit sparse, so getting at the video (anyone with experience converting H.263 to a sensor_msgs/Image, get in touch!) and IMU data is taking more time than I'd hoped, but it's coming along. Keep an eye on the ardrone stack; it will be updated as new features are added.
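
For readers experimenting with the video pipeline in the meantime, the ROS side is the easy half: once a frame has been decoded into an OpenCV image, cv_bridge can publish it as a sensor_msgs/Image. Below is a minimal sketch under that assumption; the topic name is illustrative, and the H.263 decoding itself is omitted.

    #!/usr/bin/env python
    # Sketch only: publish an already-decoded video frame as a
    # sensor_msgs/Image. "frame" stands in for a decoded BGR image;
    # the topic name is illustrative, not part of ardrone_driver.
    import rospy
    import numpy as np
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    rospy.init_node('ardrone_video_sketch')
    pub = rospy.Publisher('ardrone/image_raw', Image, queue_size=1)
    bridge = CvBridge()

    frame = np.zeros((240, 320, 3), dtype=np.uint8)  # placeholder frame
    rate = rospy.Rate(15)
    while not rospy.is_shutdown():
        pub.publish(bridge.cv2_to_imgmsg(frame, encoding='bgr8'))
        rate.sleep()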

For now, anyone hoping to control their AR.Drone using ROS, this is the package for you! Either send a Twist from your own code, or use the included ardrone_teleop package for manual control.
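
A minimal sketch of the first option, assuming the driver subscribes to a geometry_msgs/Twist on 'cmd_vel' (an assumption; check the package documentation for the actual topic name):

    #!/usr/bin/env python
    # Sketch: nudge the AR.Drone forward while yawing left by publishing
    # a single Twist. The 'cmd_vel' topic name is an assumption.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('ardrone_twist_sketch')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rospy.sleep(1.0)  # give the connection a moment to establish

    cmd = Twist()
    cmd.linear.x = 0.1   # gentle forward translation
    cmd.angular.z = 0.5  # yaw to the left
    pub.publish(cmd)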

You can find the ardrone_driver and ardrone_teleop packages on the experimental-ardrone branch of siue-ros-pkg, a repository which until now has not had a proper public release. This repository represents the Mobile Robotics Lab at SIUE, and contains a few utility nodes I have developed for some of our past projects, with more packages staged for addition once we have time to document them properly for a formal release.

http://github.com/siue-cs/siue-ros-pkg

http://github.com/siue-cs/siue-ros-pkg/tree/experimental-ardrone

I'm hopeful that someone will find some of this useful. Feel free to contact me with any questions!

Cheers,
Nate Roney

ROS/ASEBA Bridge

| No Comments | No TrackBacks

marxbot-complete-detour.jpg

Stéphane Magnenat from the Autonomous Systems Lab at ETH Zurich has announced a ROS/ASEBA bridge

Dear list,

Thanks to your quick and precise answers, I have programmed a bridge between ASEBA and ROS:

http://github.com/stephanemagnenat/asebaros

This bridge makes it possible to load source code, inspect the network structure, read and write variables, and send and receive events from ROS.
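
As a hedged illustration of consuming ASEBA events from the ROS side, a plain subscriber is all that is needed; the topic name and message type below are assumptions for the sketch, not the bridge's documented interface (see the asebaros repository for that):

    #!/usr/bin/env python
    # Hypothetical sketch: topic name and message type are assumptions.
    import rospy
    from std_msgs.msg import String

    def on_event(msg):
        rospy.loginfo('ASEBA event: %s', msg.data)

    rospy.init_node('aseba_event_listener')
    rospy.Subscriber('aseba/events', String, on_event)
    rospy.spin()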

This brings ROS to the following platforms:

  • Mobots' marxbot, handbot and smartrob
  • e-puck

Kind regards,
Stéphane

The Humanoid Robots Lab at the University of Freiburg is using the Aldebaran Nao robot to do a variety of research, from climbing stairs, to imitating human motions, to footstep planning. One of their Naos, nicknamed "Osiris", has a special modification: a Hokuyo laser rangefinder head. This modification enables their research on localization for humanoid robots in complex environments.

Localization on humanoid robots is much more difficult than on wheeled platforms because of the robot's shaking motion while walking. Using techniques that will be outlined in an upcoming IROS paper [1], they are able to do 6D localization of the Nao's torso based on laser, odometry, IMU, and proprioception data. In the video above, you can see Osiris localizing itself while walking and climbing stairs.

The researchers at Uni Freiburg have been long-time contributors to ROS and run their own alufr-ros-pkg open source repository, which contains libraries for articulation models, 3d occupancy grids (OctoMap), and a Nao stack that builds on Brown's Nao driver to provide additional ROS integration.

Uni Freiburg hopes to build on their research with humanoids to work towards a full navigation stack for humanoids. This will include a footstep planning library, which they will be releasing in alufr-ros-pkg soon. Below are some screenshots of their 3D scans and footstep plans in rviz.

[1] "Humanoid Robot Localization in Complex Indoor Environments" by Armin Hornung, Kai M. Wurm, and Maren Bennewitz (to be presented at IROS 2010).

Previously: Robots Using ROS: Aldebaran Nao

osiris_plan_intro.png

osiris_3d_1.png

Robots Using ROS: Kitemas LV1

| No Comments | No TrackBacks

We first covered Takashi Ogura's (aka OTL) robot projects back in March when he got the ROS PS3 joystick driver working with an i-Sobot. He has many more fun projects that are too numerous to cover: White Bear Robot (Roomba + Navigation stack), Arduino board for the i-Sobot, Twitter control for humanoid robot, and an all-time classic, humanoid robot with iPhone 3GS head.

Along the way, OTL has been putting together tutorials and previews of ROS libraries for his Japanese audience on ros-robot.blogspot.com, such as a Japanese speech node, Twitter for ROS using OAuth, URDF tutorial, Euslisp demos, and many more.

Many of those tutorials and projects came together in the video above: Kitemas LV1. Kitemas LV1 is a fun drink ordering robot that lets you order a drink and then pours it for you. Judging from previous posts, it looks like Kitemas is using a Roomba with Hokuyo laser range finder for autonomous navigation, as well as a USB web camera. Drink selection can be done either through colored coasters or a Twitter API, and the robot can be driven manually with a PS3 joystick.

Here's a software diagram that shows the various ROS nodes working together:

OTL has also created otl-ros-pkg, so readers of his blog can get code samples for his various tutorials and even see code for robots like Kitemas above. You can watch a video with a more dressed up version of Kitemas LV1 here.

Robots Using ROS: PIXHAWK Helicopters

| No Comments | No TrackBacks

PIXHAWK is an open-source framework and middleware for micro air vehicles (MAVs) that focuses on computer vision. The framework is being developed by students at ETH Zurich, and they recently won second place in the EMAV 2009 Indoor Autonomy Competition. The PIXHAWK software runs on several MAVs, including the PIXHAWK Cheetah Quadrotor and the Pioneer Coax Helicopter. The Cheetah Quadrotor was demoed at ECCV 2010, demonstrating stable autonomous flight using onboard computer vision and some interaction using ball tracking. A parts list and assembly instructions for the Cheetah are available on the PIXHAWK web site.

The PIXHAWK middleware, MAVLink, runs on top of MIT's LCM middleware system and the PIXHAWK team has also integrated their system with ROS to provide access to tools like rviz. With rviz, PIXHAWK users can visualize a variety of 3D data from the MAVs, including pose estimates from the computer vision algorithms, as well as waypoints and IMU measurements. Other ROS processes can easily be interfaced with a PIXHAWK system.
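
As a rough illustration of that ROS-side glue, a node need only publish geometry_msgs messages in a known frame for rviz to display them. The sketch below uses invented topic and frame names rather than PIXHAWK's actual interface:

    #!/usr/bin/env python
    # Sketch only: republish a MAV pose estimate so rviz can display it.
    # Topic and frame names are invented for illustration.
    import rospy
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('mav_pose_sketch')
    pub = rospy.Publisher('mav/pose', PoseStamped, queue_size=10)

    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        pose = PoseStamped()
        pose.header.stamp = rospy.Time.now()
        pose.header.frame_id = 'map'
        pose.pose.position.z = 0.5   # stand-in value from the estimator
        pose.pose.orientation.w = 1.0
        pub.publish(pose)
        rate.sleep()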

pixhawk_rviz.png

The PIXHAWK team has also made their own open-source contributions to visualization tools for MAVs. Their QGroundControl mission planning tool provides a variety of visualizations, including real-time plotting of telemetry data. It was initially developed for PIXHAWK-based systems but is now open to the whole MAV community.

The rest of the PIXHAWK software, including the computer vision framework and flight controller software, is also available as open source. You can check out their winter 2010 roadmap, which includes the release of their ARTK hovering code base with ROS support.

The PIXHAWK team is also taking orders for a batch production run of their pxIMU Autopilot and Inertial Measurement Unit Board ($399). It provides a compact, integrated solution for those building their own quadrotors. The firmware is open source and compatible with the PIXHAWK software like QGroundControl.

We've previously featured Penn's AscTec quadrotors doing aggressive maneuvers; now you can see them out and about doing "Autonomous Multi-Floor Indoor Navigation with a Computationally Constrained MAV":

All of the computation is done onboard the 1.6 GHz Intel Atom processor, using ROS for interprocess communication.

Credit: Shaojie Shen, Nathan Michael, and Vijay Kumar

Update: the GRASP lab also has the quadrotors running through thrown hoops:

Robots Using ROS: Meka's Robots

| No Comments | No TrackBacks

meka_a2_h2_bimanual_manipulators.jpg

Above: Meka bimanual robot using Meka A2 compliant arm and H2 compliant hand

Meka builds a wide range of robot hardware targeted at mobile manipulation research in human environments. Meka's work was previously featured in the post on the mobile manipulator Cody from Georgia Tech, which uses Meka arms and torso.

Meka was started by Aaron Edsinger and Jeff Weber to capitalize on their experience building robots like Domo, which featured force-controlled arms, hands, and neck built out of series-elastic actuators. Meka's expertise with series-elastic actuators allows them to target their hardware at human-centered applications, where compact, lightweight, compliant, force-controlled hardware is desired. Georgia Tech's HRI robot Simon, which uses Meka torso, head, arms, and hands, has proportions similar to a 5'7" female.

meka_base01.jpg

Meka initially built robot hands and arms, but is now transitioning into building all the components you need for a mobile manipulation platform. As Meka began to make this transition, they also started to transition to ROS. As a small startup company, they didn't have the resources to design and build the software drivers and libraries for a more complete mobile manipulation platform. They were also transitioning from a single real-time computer to using multiple computers, and they needed a middleware platform that would help them utilize this increased power.

One of Meka's new hardware products is the B1 Omni Base, which is getting close to completion. The B1 is based on the Nomadic XR4000 design and uses Holomni's powered casters. It is also integrated with the M3 realtime system and will have velocity, pose, and operational-space control available. The base houses a RTAI Ubuntu computer and can have up to two additional computers.

Meka is also designing two sensor heads that will be 100% integrated with ROS. The more fully-featured of the two will have five cameras, including a Videre stereo pair, as well as a laser rangefinder, a microphone array, and an IMU. The tilting action of the head will enable the robot to use the laser rangefinder as a 3D sensor, in addition to the stereo.

The Meka software system consists of the Meka M3 control system coupled with ROS and other open-source libraries like Orocos' KDL. M3 is used to manage the realtime system and provide low-level GUI tools. ROS is used to provide visualizations and higher-level APIs to the hardware, such as motion planners that incorporate obstacle avoidance. ROS is also being used to integrate the two sensor heads that Meka has in development, as well as provide a larger set of hardware drivers so that customers can more easily integrate new hardware.

ROS is fully available with Meka's robots starting with last month's M3 v1.1 release. For lots of photos and video of Meka's hardware in action, see this Hizook post.

skybotix.jpg

Skybotix is offering their CoaX helicopter complete with basic ROS setup so customers can use ROS right out of the box.

The CoaX helicopter is a micro UAV targeted at the research and educational markets. The small 320 g helicopter includes an IMU, a downward-looking sonar and three optional sideward-looking sonars, a pressure sensor, a color camera, and Bluetooth, XBee, or WiFi communication. In addition to two DSPs (dsPIC33), the CoaX has an optional Gumstix Overo computer that can run ROS. You can see more of the specs on their hardware wiki page.

Skybotix fully supports open source with the CoaX. The CoaX API, including low-level firmware and controller, is available open source under a GNU LGPL license. Their Gumstix Overo setup comes with a basic ROS installation. They include a ROS publisher for the CoaX state, a demo application for transmitting video data, and a GUI for visualizing both. Although the CoaX comes with minimal additional ROS libraries, there is a growing community of micro-UAV developers using ROS, including the micro-UAV-focused ccny-ros-pkg repository.

The CoaX was developed in collaboration with ETH Zurich. The Skybotix Youtube channel has videos of ETH Zurich student projects. Skybotix recently released a speed module for the CoaX based on an optical sensor, which enables indoor speed control as well as indoor hovering (video).

Name that DARPA Robot

| No Comments | No TrackBacks

DARPA_ARMS.jpg

DARPA is having a contest to name its new robot for the ARM program. "The ARM Robot" has two Barrett WAM arms, BarrettHands, six-axis force-torque sensors at the wrists, and a pan-tilt head. For sensors, it has a color camera, a SwissRanger depth camera, a stereo camera, and a microphone.

The final software architecture and APIs have not been released yet, but the FAQ notes:

The software architecture is TBD, but is leaning toward a nodal software architecture using a tool such as Robotic Operating System (ROS).

The software track for the ARM program currently includes Carnegie Mellon University, HRL Laboratories, iRobot, NASA-Jet Propulsion Laboratory, SRI International and University of Southern California. It would certainly be a great boost for the ROS community to have more common platforms to develop and share the latest perception and manipulation techniques.

Below is a video from Dr. Motilal Agrawal of SRI (via Hizook) showing it in action. Dr. Agrawal and SRI are looking for Ph.D/Masters students with experience in robotics, ROS, and OpenCV. Want a job?

The CityFlyer project at the CCNY Robotics and Intelligent Systems Lab is using Ascending Technologies Pelican and Hummingbird quadrotor helicopters to do research in 3D mapping and navigation. The Ascending Technologies platform provides a 1.6 GHz Intel Atom processor, a 500-gram payload, GPS, and a barometric altimeter. The CityFlyer project adds several sensors, including a Hokuyo URG-04LX and an IMU. The Hokuyo URG has been modified to double as a laser height estimator. The CityFlyer project is able to combine data from these sensors to do indoor SLAM using GMapping.

The CityFlyer project has also created an RGB-D sensor by combining data from a SwissRanger 4000 and Logitech Webcam. They use this to build 3D maps for indoor environments using a 3D Multi-Volume Occupancy Grid (MVOG). Their MVOG technique is described in their RGB-D 2010 paper and more videos are here and here. Although the full sensor package exceeds the payload of the quadrotor, they anticipate that advances in RGB-D will make these techniques feasible for micro UAVs.

CCNY has released a variety of drivers, libraries and tools to support the ROS community. These include drivers and tools for the AscTec platform, libraries for dealing with aerially mounted laser rangefinders, a New College Dataset parser, and libraries for using AR tags with ROS.

ground station screenshot

CCNY has also developed a "Ground Station" application that acts as a virtual cockpit for visualizing telemetry data from an AscTec quadrotor. It is also able to overlay GPS data on an outdoor map to visualize the UAV's tracks. I Heart Robotics has a great writeup on Ground Station, and you can also check out the documentation on ROS.org.

The ccny-ros-pkg is an excellent resource for the ROS community with complete documentation on a variety of packages, including videos that demonstrate these packages in use.

Bag files for the video above can be downloaded here (elevator_2010-08*).

Robots Using ROS: Thecorpora's Qbo

| No Comments | No TrackBacks

Qbos.jpg

Qbo is a personal, open-source robot being developed by Thecorpora. Francisco Paz started the Qbo project five years ago to address the need for a low-cost, open-source robot that lets ordinary consumers enter the world of robotics and artificial intelligence.

A couple months ago, Thecorpora decided to switch their software development to ROS and have now achieved "99.9%" integration. You can watch the video below of Qbo's head servos being controlled by the ROS Wiimote drivers, as well as this video of the Wiimote controlling Qbo's wheels. Their use of the ROS joystick drivers means that any of the supported joysticks can be used with Qbo, including the PS3 joystick and generic Linux joysticks.

Qbo's many other sensors are also integrated with ROS, which means that they can be used with higher-level ROS libraries. This includes the four ultrasonic sensors as well as Qbo's stereo webcams. They have already integrated the stereo and odometry data with OpenCV in order to provide SLAM capabilities (described below).

It's really exciting to see an open-source robot building and extending upon ROS. From their latest status update, it sounds like things are getting close to done, including a nice GUI that lets even novice users interact with the robot.

Qbo SLAM algorithm:

The algorithm can be divided into three different parts:

The first task is to calculate the movement of the robot. To do that, we use our robot's driver, which sends an Odometry message.

The second task is to detect natural features in the images and estimate their positions in a three dimensional space. The algorithm used to detect the features is the GoodFeaturesToTrackDetector function from OpenCV. Then we extract SURF descriptors of those features and match them with the BruteForceMatcher algorithm, also from OpenCV.

We also track the points matched with the sparse iterative version of the Lucas-Kanade optical flow in pyramids and avoid looking for new features in places where we are already tracking another feature.

This node takes in synchronized image messages and sends a PointCloud message with the positions of the features, their covariances in the three coordinates, and the SURF descriptor of each feature.

The third task is to implement an Extended Kalman Filter and a data association algorithm based on the Mahalanobis distance between the PointCloud seen from the robot and the PointCloud of the map. To do that, we read the Odometry and PointCloud messages, and we output an Odometry message with the position of the robot and a PointCloud message with the features included in the map.
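
For readers who want to experiment with the feature-detection half of this pipeline, here is a rough sketch using OpenCV's Python bindings. The EKF, message assembly, and synchronization are omitted, and ORB stands in for SURF (SURF is patent-encumbered and only ships with opencv-contrib), so this approximates the approach rather than reproducing Thecorpora's actual code.

    # Sketch: corner detection, descriptor computation, and pyramidal
    # Lucas-Kanade tracking between two frames, approximating the steps
    # described above. frame0.png/frame1.png are placeholder inputs.
    import cv2
    import numpy as np

    prev_gray = cv2.imread('frame0.png', cv2.IMREAD_GRAYSCALE)
    gray = cv2.imread('frame1.png', cv2.IMREAD_GRAYSCALE)

    # 1. Detect corners (GoodFeaturesToTrackDetector in the C++ API).
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)

    # 2. Compute descriptors at those corners; a BruteForceMatcher
    #    (cv2.BFMatcher) would then match them against the map features.
    orb = cv2.ORB_create()
    kps = [cv2.KeyPoint(float(x), float(y), 20.0) for [[x, y]] in corners]
    kps, desc = orb.compute(prev_gray, kps)

    # 3. Track the points into the next frame with the sparse pyramidal
    #    Lucas-Kanade optical flow.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                     corners, None)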

Robots Using ROS: Lego NXT

| No Comments | No TrackBacks

Lego Mindstorms NXT is a low-cost programmable robotics kit that is used in education and by hobbyists throughout the world. One of the most visible NXT events is First Lego League. The developers of foote-ros-pkg have developed a bridge that connects NXT with ROS, allowing NXT users to leverage all the ROS tools and capabilities.

The NXT-ROS software stack provides many useful tools to interface NXT robots with ROS. Currently NXT users can take robot models created with Lego Digital Designer, and automatically convert them into robot models compatible with ROS. The converted robot model can be visualized in rviz, and in the future we hope to add simulation capabilities in Gazebo, our 3D simulator. The bridge between NXT and ROS creates a ROS topic for each motor and sensor of the NXT robot.

Once a robot is connected to ROS, you can start running applications such as the base controller, wheel odometry, keyboard/joystick teleoperation, and even assisted teleoperation using the ROS navigation stack. The NXT-ROS software stack includes a number of example robot models for users to play with and to get a feel for using NXT with ROS.
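
As a hedged illustration of what programming against those topics can look like, the sketch below watches motor positions via a JointState topic. That such a topic exists under this name is an assumption; the real per-motor and per-sensor topic layout is documented on the nxt wiki page.

    #!/usr/bin/env python
    # Hedged sketch: log each motor's position as joint states arrive.
    # The 'joint_states' topic name is an assumption for this sketch.
    import rospy
    from sensor_msgs.msg import JointState

    def on_joints(msg):
        for name, pos in zip(msg.name, msg.position):
            rospy.loginfo('%s is at %.2f rad', name, pos)

    rospy.init_node('nxt_joint_watcher')
    rospy.Subscriber('joint_states', JointState, on_joints)
    rospy.spin()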

This new NXT-ROS software stack provides NXT users access to the open-source ROS community. NXT users now have access to state of the art open source robotics libraries available on ros.org.

Please see the nxt page on the ROS wiki for documentation, demos, and more. The developers would like to thank the nxt-python project for support and development.

Robots Using ROS: Mini-PR2

| No Comments | No TrackBacks

The folks at the ModLab/GRASP Lab at Penn recently got their PR2 and used the occasion to test out "Mini-PR2". They used $5000 worth of CKBot modules to replicate the degrees of freedom of the real PR2 -- all except the torso. They used 18 modules (14 U-Bar, 4 L7, 4 motor) to create Mini-PR2, and they also added a counter-balance on the shoulder to help balance the arm.

The CKBot modules, which have previously been featured here, enable their lab to try out new ideas quickly and cheaply. In this case, they can use the PR2 simulator to drive their real robot, and they've used an actual PR2 to puppet Mini-PR2 (see 0:49 in video). They are now working on using the Mini-PR2 to puppet the actual PR2.

The CKBot modules don't have the computation power to run ROS on their own, but they can communicate with another computer that translates between the two systems. Their current system listens to the joint_states topic on the PR2 and translates those messages into CKBot joint angles.
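
A minimal sketch of that translation layer is below; the joint-to-module mapping and the send_to_ckbot() call are hypothetical placeholders, but the joint_states subscription is the standard ROS mechanism the post describes.

    #!/usr/bin/env python
    # Sketch: listen to the PR2's joint_states and forward a few joints
    # to CKBot modules. The name map and send_to_ckbot() are placeholders.
    import rospy
    from sensor_msgs.msg import JointState

    JOINT_TO_MODULE = {  # hypothetical PR2-joint -> CKBot-module mapping
        'r_shoulder_pan_joint': 3,
        'r_elbow_flex_joint': 7,
    }

    def send_to_ckbot(module_id, angle):
        pass  # placeholder for the serial/CAN write to the module

    def on_joint_states(msg):
        for name, pos in zip(msg.name, msg.position):
            if name in JOINT_TO_MODULE:
                send_to_ckbot(JOINT_TO_MODULE[name], pos)

    rospy.init_node('mini_pr2_bridge_sketch')
    rospy.Subscriber('joint_states', JointState, on_joint_states)
    rospy.spin()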

sr-ros-interface_release1.png

You can now use Shadow Robot hardware with ROS! Engineers at Shadow Robot have been busy building a ROS stack and have now reached their first release. This initial release includes an interface to both simulated and real hardware, which means that, whether or not you have a Shadow Dextrous Hand of your own, you can use your ROS software to see the Shadow hand move inside of ROS tools like rviz.

To get started, you should check the Shadow Robot ROS FAQ or checkout the Launchpad site.

Robots Using ROS: Robotino

| No Comments | No TrackBacks

Robotino_Imagefoto.JPG

Robotino is a commercially available mobile robot from Festo Didactic. It's used for both education and research, including competitions like RoboCup. It features an omnidirectional base, bump sensors, infrared distance sensors, and a color VGA camera. The design of Robotino is modular, and it can easily be equipped with a variety of accessories, including sensors like laser scanners, gyroscopes, and the Northstar indoor positioning system.

REC has been supportive of the Openrobotino community, which provides open-source software for use with the Robotino, and now, they are providing official ROS drivers in the robotino_drivers stack. Their current ROS integration already supports the ROS navigation stack, and you can watch the video below that shows the Robotino being controlled inside of rviz.

We're very excited to see commercially available robot hardware platforms being supported with official ROS drivers. There are over a thousand Robotino systems around the world, and we hope that these drivers will help connect the Robotino and ROS communities.

Robots Using ROS: Penn Quadrotors

| No Comments | No TrackBacks

ROS has taken to the air! In a video that's quickly making the rounds on the Internet, you can see quadrotors from Penn's GRASP Lab performing all sorts of "aggressive" acrobatic stunts, from flying through narrow windows to landing on vertical perches. The entire system uses a mix of high-level ROS software for modularization and communication, as well as low-level microcontroller code.

The goal of this project was to fly a quadrotor precisely along aggressive trajectories. The basic components of the system are the quadrotor, a control laptop, and the Vicon motion capture system. The onboard microcontroller runs an attitude control loop at 1 kHz. The control laptop runs the higher-level position control loop. The control computer communicates with the quadrotor via an XBee link.

Communication between the different programs on the control computer is done through ROS. A motion-capture node sends pose messages to a central controller, which in turn outputs control messages to code that sends the commands to the quadrotor. Experimentation was performed in a 3D simulator using a quadrotor model that contains a very accurate description of the dynamics of the actual quadrotor. The simulator communicates through ROS in the same way the hardware does, allowing switching between experiments in simulation and on the actual quadrotor with minimal overhead. ROS made it easy to modularize the code and write programs for each aspect of the entire problem independently.
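
The sketch below mimics that message flow with a toy height-only proportional controller; the topic names and gain are invented for illustration and bear no relation to the GRASP Lab's actual controllers.

    #!/usr/bin/env python
    # Sketch: consume motion-capture poses, emit control messages.
    # Topic names and the toy P-controller are illustrative stand-ins.
    import rospy
    from geometry_msgs.msg import PoseStamped, Twist

    rospy.init_node('position_controller_sketch')
    cmd_pub = rospy.Publisher('quad/cmd', Twist, queue_size=1)
    goal_z = 1.0  # desired hover height in meters

    def on_pose(msg):
        cmd = Twist()
        # toy proportional control on height only
        cmd.linear.z = 0.8 * (goal_z - msg.pose.position.z)
        cmd_pub.publish(cmd)

    rospy.Subscriber('vicon/quad/pose', PoseStamped, on_pose)
    rospy.spin()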

Daniel Mellinger's Quadrotor Page
GRASP Lab

Thanks to Daniel Mellinger of Penn for helping to put together this post.

The Media and Machines Lab at Washington University in St. Louis has integrated several of their robots with ROS, including an iRobot B21r and several Videre ERRATICs. They are also maintaining wu-ros-pkg, which is a repository of research projects, drivers, and utilities related to these robots.

Wash U.'s B21r, known as Lewis, is best known for being a mobile robot photographer. Lewis is currently being used for HRI research, and they are also reimplementing the photographer functionality in ROS. Lewis is fully integrated with ROS, including sensor data from 48 sonar sensors, 56 bump sensors, 2 webcams, and a Hokuyo laser rangefinder. There is also a Directed Perception PTU-46 pan-tilt unit on which the webcams are mounted (driver).

The B21r community will be happy to know that Wash U. has deeply integrated this platform with ROS. They have created an urdf model, complete with meshes for visualizing in rviz, and they have also integrated the B21r with the ROS navigation stack. They are also providing an rwi stack, which includes their rflex driver. The rflex driver is capable of driving other iRobot/RWI robot platforms, including the B18, ATRV, and Magellan Pro.

Wash U. has also integrated their four Videre ERRATICs with ROS. They've named these robots Blood, Sweat, Toil, and Tears, and have equipped them with Hokuyo laser rangefinders and webcams. The ERRATICs enable them to explore research in multi-robot coordination and control. They're also developing on iRobot Creates using drivers from brown-ros-pkg.

The research at the Media and Machines Lab has led to several interfaces and visualizations for using robots. This includes RIDE (Robot Interactive Display Environment), which takes cues from Real Time Strategy (RTS) video games to provide an interface for easily controlling multiple robots simultaneously. They have also developed a visualization for mapping sensor data over time for search tasks and a 3D interface for binocular robots. RIDE is available in the ride stack, and much of their other research will soon be released in wu-ros-pkg.

Robots Using ROS: CMU/Intel's HERB

| No Comments | No TrackBacks

herb-lowres.jpg

HERB (Home Exploring Robotic Butler) is a mobile manipulation platform built by Intel Research Pittsburgh, in collaboration with the Robotics Institute at Carnegie Mellon University. HERB is designed to be a "robotic butler" and has been demonstrated in a variety of real-world kitchen tasks, such as opening refrigerator and cabinet doors, finding and collecting coffee mugs, and throwing away trash. HERB is powered by a variety of open-source libraries, including several developed by CMU researchers, like OpenRAVE and GATMO.

OpenRAVE is a software platform for robotics that was designed specifically for the challenges related to motion planning. It was created in 2006 by Rosen Diankov, and in late 2008 he integrated it with ROS. The benefits of this integration can be seen on HERB.

HERB has a Barrett WAM arm, a pair of low-power onboard computers, Point Grey Flea and Dragonfly cameras, a SICK LMS lidar, a rotating Hokuyo lidar, and a Logitech 9000 webcam, all of which sit on a Segway RMP200 base. HERB communicates with off-board PCs over a wireless network.

ROS is glue for this setup: ROS is used for the hardware drivers, process management, and communication on HERB. ROS' ability to distribute processes across computers is used to help perform computation off the robot.

OpenRAVE provides an environment on top of this that unifies the controls and sensors for motion-planning algorithms, including sending trajectories to the arm and hand. OpenRAVE implements Diankov et al.'s work on caging grasps, which enables HERB to perform tasks like opening and closing doors, drawers, and cabinets, and turning handles.

In addition to manipulating objects, HERB has to be able to keep track of people and other movable objects that exist in real-world environments. HERB uses the GATMO (Generalized Approach to Tracking Movable Objects) library to track these movable objects. GATMO was developed by Garratt Gallagher and is available from gatmo.org. The GATMO library includes packaging and installation instructions for ROS.

The collaboration between CMU and Intel Labs Pittsburgh has produced numerous other libraries that have found their way into ROS. Rosen Diankov started the cmu-ros-pkg repository, which houses many of these libraries, and he also wrote rosoct, an Octave client library for ROS. Another library of note is the chomp_motion_planner package, which was implemented by Mrinal Kalakrishnan based on the work of Ratliff et al.

You can find more videos of HERB in action at the Personal Robotics Intel site. For more on how HERB uses ROS, OpenRAVE, and GATMO, you can read "HERB: a home exploring robot butler".

Robots Using ROS: TUM-Rosie

| No Comments | No TrackBacks

The Intelligent Autonomous Systems Group at TU München (TUM) built TUM-Rosie with the goal of developing a robotics system with a high degree of cognition. This goal is driving research in 3D perception, cognitive control, knowledge processing, and high-level planning. TUM is building their research on TUM-Rosie using ROS and has set up the open-source tum-ros-pkg repository to share their research, libraries, and hardware drivers. TUM has already released a variety of ROS packages and is in the process of releasing more.

tum-ias-robot-illustration.jpg

TUM-Rosie is a mobile manipulator built on a Kuka mecanum-wheeled omnidrive base, with two Kuka LWR-4 arms and DLR-HIT hands. It has a variety of sensors for accomplishing perception tasks, including a SwissRanger 4000, FLIR thermal camera, Videre stereo camera, SVS-VISTEK eco274 RGB cameras, a tilting "2.5D" Hokuyo UTM-30LX lidar, and both front and rear Hokuyo URG-04LX lidars.

One of the new libraries that TUM is developing is the cloud_algos package for 3D perception of point cloud data. cloud_algos is being designed as an extension of the pcl (Point Cloud Library) package. The cloud_algos package consists of a set of point-cloud-processing algorithms, such as a rotational object estimator. The rotational object estimator enables a robot to create models for objects like pitchers and boxes from incomplete point cloud data. TUM has already released several packages for semantic mapping and cognitive perception.

tum-ias-cloud.png

TUM is also working on systems that combine knowledge reasoning with perception. The K-COPMAN (Knowledge-enabled Cognitive Perception for Manipulation) system in the knowledge stack generates symbolic representations of perceived objects. This symbolic representation allows a robot to make inferences about what is seen, like what items are missing from a breakfast table.

In the field of knowledge processing and reasoning for personal robots, TUM developed the KnowRob system that can provide:

  • spatial knowledge about the world, e.g. the positions of obstacles
  • ontological knowledge about objects, their types, relations, and properties
  • common-sense knowledge, for instance, that objects inside a cupboard are not visible from outside unless the door is open
  • knowledge about the functions of objects like the main task a tool serves for or the sequence of actions required to operate a dishwasher

KnowRob is part of the tum-ros-pkg repository, and there is a wiki with documentation and tutorials.

tum-knowrob.png

tum-ias-robot.jpg

At the high level, TUM is working on CRAM (Cognitive Robot Abstraction Machine), which provides a language for programming cognitive control systems. The goal of CRAM is to allow autonomous robots to infer decisions, rather than just having pre-programmed decisions. Practically, the approach will enable tackling of the complete pick-and-place housework cycle, which includes setting the table, cleaning the table, loading the dishwasher, unloading it, and returning the items to their storage locations. CRAM features showcased in this scenario include the probabilistic inference of what items should be placed where on the table, what items are missing, where items can be found, which items can and need to be cleaned in the dishwasher, etc. As robots become more capable, it will be much more difficult to explicitly program all of their decisions in advance, and the TUM researchers hope that CRAM will help drive AI-based robotics.

Researchers at TUM have also made a variety of contributions to the core ROS system, including many features for the roslisp client library. They are also maintaining research datasets for the community, including a kitchen dataset and a semantic database of 3d objects, and they have contributed to a variety of other open-source robotics systems, like YARP and Player/Stage.

Research on the TUM-Rosie robot has been enabled by the Cluster of Excellence CoTeSys (Cognition for Technical Systems). For more information:

Robots Using ROS: Modlab's CKBots

| No Comments | No TrackBacks

ckbot.jpg

The Modlab at Penn designed the CKBot (Connector Kinetic roBot) module to be fast, small, and inexpensive. These qualities enable it to be used to explore the promise of modular robotics systems, including adaptability, reconfigurability, and fault tolerance. They've researched dynamic rolling gaits, which use a loop configuration to achieve speeds of up to 1.6 m/s, as well as bouncing gaits by attaching passive legs. They are also using the CKBots to research the difficult problem of configuration recognition, and, for the Terminator 2 fans, they have even demonstrated "Self re-Assembly after Explosion" (SAE).

More recently, Modlab has developed ROS packages that can be used when the CKBots are connected to a separate ROS system. They have also created an open source repository, modlab-ros-pkg, for CKBot ROS users. The CKBot modules only have a few PIC processors -- not enough to run ROS -- so an off-board system enables them to use algorithms that require more processing power. In one experiment, they used a camera to locate AR tags on the CKBot modules. The locations were stored in tf, which was used to calculate coordinate transforms between modules. They have also used rviz to display the estimated position of modules during SAE when AR tags were not in use.
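
For readers unfamiliar with tf, the payoff of storing the AR-tag poses there is that any module-to-module transform becomes a one-line lookup, along these lines (the frame names are invented for illustration):

    #!/usr/bin/env python
    # Sketch: query the transform between two CKBot modules once their
    # AR-tag poses have been published to tf. Frame names are illustrative.
    import rospy
    import tf

    rospy.init_node('module_transform_sketch')
    listener = tf.TransformListener()
    listener.waitForTransform('module_1', 'module_2', rospy.Time(0),
                              rospy.Duration(4.0))
    trans, rot = listener.lookupTransform('module_1', 'module_2',
                                          rospy.Time(0))
    rospy.loginfo('module_2 relative to module_1: %s', str(trans))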

One of the projects Modlab is currently working on is a "mini-PR2" made out of CKBot modules. The mini-PR2 will be kinematically similar to the Willow Garage PR2 and is powered by a separate laptop. You can see an early prototype of mini-PR2 opening an Odwalla fridge:

CKBots trace their ancestry back to Professor Mark Yim's work on the PolyBot system at PARC. The PolyBot system had an impressive range of demonstrations, including fence and stair climbing, tricycle riding, and even transforming from a loop, to a snake, to a spider.

Modlab does a variety of other modular robotics research, and has even demonstrated a quick-change end effector for the PR2.

Rovio driver for ROS

| No Comments | No TrackBacks

rovio.jpg

I Heart Robotics has released a rovio stack for ROS, which contains a controller, a joystick teleop node, and associated launch files for the WowWee Rovio. There are also instructions and configuration for using the probe package from brown-ros-pkg to connect to Rovio's camera.

You can download the rovio stack from iheart-ros-pkg:

http://github.com/IHeartRobotics/iheart-ros-pkg

As the announcement notes, this is still a work in progress, but this release should help other Rovio hackers participate in adding new capabilities.

Marvin is an autonomous car from Austin Robot Technology and the Department of Computer Science at The University of Texas at Austin. The modified 1999 Isuzu VehiCross competed in the 2007 DARPA Urban Challenge and was able to complete many of the difficult tasks presented to the vehicles, including merging, U-turns, intersections, and parking.

The team members for Marvin have a long history of contributing to open-source robotics software, including the Player project. Recently, Marvin team members have been porting their software to ROS. As part of this effort, they have setup the utexas-art-ros-pkg open-source code repository, which provides drivers and higher-level libraries for autonomous vehicles.

Like many Urban Challenge vehicles, Marvin has a Velodyne HDL lidar and Applanix Position and Orientation System for Land Vehicles (POS-LV). Drivers for both of these are available in the utexas-art-ros-pkg applanix package and velodyne stack, respectively. The velodyne stack also includes libraries for detecting obstacles and drive-able terrain, as well as tools for visualizing in rviz.

The Marvin team has also released an art_vehicle stack that provides the libraries that make Marvin go, including their navigation system. You can try it out with their simulator built on Stage.

marvin.JPG

Professor Peter Stone's group in the Department of Computer Science has been using Marvin to do multiagent research. You can learn about the algorithms used in the Urban Challenge in their paper, "Multiagent Interactions in Urban Driving". More recently, they have been doing research in "autonomous intersection management". This research is investigating a multiagent framework that can handle intersections for autonomous vehicles safely and efficiently. As you can see in the video above, these intersections for autonomous vehicles can handle far more vehicles than intersections designed for human-driven vehicles. For more information, you can watch a longer clip and read Kurt Dresner and Peter Stone's paper, "A Multiagent Approach to Autonomous Intersection Management".

Many people have contributed to the development of Marvin in the past. Current software development, including porting to ROS, is being led by Jack O'Quin and Dr. Michael Quinlan under the supervision of Professor Peter Stone.

Robots Using ROS: Bosch RTC's Robot

| No Comments | No TrackBacks

Bosch's Research and Technology Center (RTC) has a Segway-RMP based robot that they have been using with ROS for the past year to do exploration, 3D mapping, and telepresence research. They recently released version 0.1 of their exploration stack in the bosch-ros-pkg repository, which integrates with the ROS navigation stack to provide 2D-exploration capabilities. You can use the bosch_demos stack to try this capability in simulation.

segway_rtc.640w.jpg

The RTC robot uses:

  • 1 Mac Mini
  • 2 SICK scanners
  • 1 Nikon D90
  • 1 SCHUNK/Amtec Powercube pan-tilt unit
  • 1 touch screen monitor
  • 1 Logitech webcam
  • 1 Bosch gyro
  • 1 Bosch 3-axis accelerometer

Like most research robots, it's frequently reconfigured: they added an additional Mac mini, Flea camera, and Videre stereo camera for some recent work with visual localization.

Bosch RTC has been releasing drivers and libraries in the bosch-ros-pkg repository. They will be presenting their approach for mapping and texture reconstruction at ICRA 2010 and hope to release the code for that as well. This approach constructs a 3D environment using the laser data, fits a surface to the resulting model, and then maps camera data onto the surfaces.

Researchers at Bosch RTC were early contributors to ROS, which is remarkable as bosch-ros-pkg is the first time Bosch has ever contributed to an open source project. They have also been involved with the ros-pkg repository to improve the SLAM capabilities that are included with ROS Box Turtle, and they have been providing improvements to a visual odometry library that is currently in the works.

ele.jpg

The Healthcare Robotics Lab focuses on robotic manipulation and human-robot interaction to research improvements in healthcare. Researchers at HRL have been using ROS on EL-E and Cody, two of their assistive robots. They have also been publishing their source code at gt-ros-pkg.

HRL first started using ROS on EL-E for their work on Physical, Perceptual, and Semantic (PPS) tags (paper). EL-E has a variety of sensors and a Katana arm mounted on a Videre ERRATIC mobile robot base. The video below shows off many of EL-E's capabilities, including a laser pointer interface -- people select objects in the real world for the robot to interact with using a laser pointer.

HRL does much of their research work in Python, so you will find Python-friendly wrappers for much of EL-E's hardware, including the Hokuyo UTM laser rangefinder, Thing Magic M5e RFID antenna, and Zenither linear actuator. You can also get CAD diagrams and source code for building your own tilting Hokuyo 3D scanner.

HRL also has a new robot, Cody, which you can see in the video below:

Update: you can read more on Cody at Hizook.

The end effector and controller are described in the paper, "Pulling Open Novel Doors and Drawers with Equilibrium Point Control" (Humanoids 2009). They've also published the CAD models of the end effector and the source code can be found in the 2009_humanoids_epc_pull ROS package.

Whether it's providing open source drivers for commonly used hardware, CAD models of their experimental hardware, or source code to accompany their papers, HRL has embraced openness with their research. For more information:

Robots Using ROS: JSK's Kawada HRP-2V

| No Comments | No TrackBacks

HRP-2V.640w.jpg

The Kawada HRP-2V is a variant of the HRP-2 "Promet" robot. It uses the torso, arms, and sensor head of the HRP-2, but it is mounted to an omni-directional mobile base instead of the usual humanoid legs. The JSK Lab at Tokyo University uses this platform for hardware and software research.

In May of 2009 at the ICRA conference, the HRP-2V was quickly integrated with the ROS navigation stack as a collaboration between JSK and Willow Garage. Previously, JSK had spent two weeks at Willow Garage integrating their software with ROS and the PR2. ICRA 2009 was held in Kobe, Japan, and Willow Garage had a booth. With laptops and the HRP-2V setup next to the booth, JSK and Willow Garage went to work getting the navigation stack on the HRP-2V. By the end of the conference, the HRP-2V was building maps and navigating the exhibition hall.

prairiedog_cups.640w.jpg

Like the Aldebaran Nao, the "Prairie Dog" platform from the Correll Lab at the University of Colorado is an example of the ROS community building on each other's results, and the best part is that you can build your own.

Prairie Dog is an integrated teaching and research platform built on top of an iRobot Create. It's used in the Multi-Robot Systems course at the University of Colorado, which teaches core topics like locomotion, kinematics, sensing, and localization, as well as multi-robot issues like coordination. The source code for Prairie Dog, including mapping and localization libraries, is available as part of the prairiedog-ros-pkg ROS repository.

Prairie Dog uses a variety of off-the-shelf robot hardware components: an iRobot Create base, a 4-DOF CrustCrawler AX-12 arm, a Hokuyo URG-04LX laser rangefinder, a Hagisonic Stargazer indoor positioning system, and a Logitech QuickCam 3000. The Correll Lab was able to build on top of existing ROS software packages, such as brown-ros-pkg's irobot_create and robotis packages, plus contribute their own in prairiedog-ros-pkg. Prairie Dog is also integrated with the OpenRAVE motion planning environment.

Starting in the Fall of 2010, RoadNarrows Robotics will be offering a Prairie Dog kit, which will give you all the off-the-shelf components, plus the extra nuts and bolts. Pricing hasn't been announced yet, but the basic parts, including a netbook, will probably run about $3500.

For more information, please see:

IMG_5654.JPG

Photo: Prairie Dogs busy creating maps for kids and parents

cob3_irex_640w.jpg

The Care-O-bot 3 is a mobile manipulation robot designed by Fraunhofer IPA that is available both as a commercial robotic butler and as a platform for research. The Care-O-bot software has recently been integrated with ROS and, in just a short period of time, already supports everything from low-level device drivers to simulation inside of Gazebo.

The robot has two sides: a manipulation side and an interaction side. The manipulation side has a SCHUNK Lightweight Arm 3 with SDH gripper for grasping objects in the environment. The interaction side has a touchscreen tray that serves as both input and "output". People can use the touchscreen to select tasks, such as placing drink orders, and the tray can deliver objects to people, like their selected beverage.

The goals of the Care-O-bot research program are to:

  • provide a common open source repository for the hardware platform
  • provide simulation models of hardware components
  • provide remote access to the Care-O-bot 3 hardware platform

Those first two goals are supported by the care-o-bot open source repository for ROS, which features libraries for drivers, simulation, and basic applications. You can easily download the source code and perform a variety of tasks in simulation, such as driving the base and moving the arm. These support the third goal of providing remote access to physical Care-O-bot hardware via their web portal.

cob3_tech_specs.640w.jpg

For sensing, the Care-O-bot uses two SICK S300 laser scanners, a Hokuyo URG-04LX laser scanner, two Pike F-145 firewire cameras for stereo, and SwissRanger SR3000/SR4000 cameras. The cob_driver stack provides ROS software integration for these sensors.

The Care-O-bot runs on a CAN interface with a SCHUNK LWA3 arm, an SDH gripper, and a tray mounted on a PRL 100 for interacting with its environment. It also has SCHUNK PW 90 and PW 70 pan/tilt units, which give it the ability to bow through its foam outer shell. The CAN interface is supported through several Care-O-bot ROS packages, including cob_generic_can and cob_canopen_motor, as well as wrappers for libntcan and libpcan. The SCHUNK components are also supported by various packages in the cob_driver stack.

The video below shows the Care-O-bot in action. NOTE: as the Care-O-bot source code is still being integrated with ROS, the capabilities you see in the video are not part of the ROS repository.

Stanford_Junior.640w.jpg

Junior is the Stanford Racing Team's autonomous car that most famously finished a close second in the DARPA Urban Challenge. It successfully navigated a difficult urban environment that required obeying traffic rules, parking, passing, and many other challenges of real-world driving.

Those of you familiar with Junior are probably saying, "Junior doesn't use ROS! It uses IPC!"

That's mostly true, but researchers have recently started using ROS-based perception libraries in Junior's obstacle classification system.

From the very start, one of the goals of ROS was to keep libraries small and separable so that you could use as little, or as much, as you want. In the case of the tiny i-Sobot, a developer was able to use just ROS's PS3 joystick driver. When frameworks get too large, they become much more difficult to integrate with other systems.

In the case of Junior, Alex Teichman was able to bring his image descriptor library for ROS onto Junior. He has been using this library, along with ROS point cloud libraries, to develop Junior's obstacle classification system. Other developers on the team will also be allowed to choose ROS for their programs where appropriate.

You can find out more about Alex's image descriptor library at ros.org/wiki/descriptors_2d.

Robots Using ROS: Aldebaran Nao

| No Comments | No TrackBacks

The Aldebaran Nao is a commercially available, 60 cm tall humanoid robot targeted at research labs and classrooms. The Nao is small, but it packs a lot into its tiny frame: four microphones, two VGA cameras, touch sensors on the head, infrared sensors, and more. The use of the Nao with ROS has demonstrated how quickly open-source code can enable a community to come together around a common hardware platform.

The first Nao driver for ROS was released by Brown University's RLAB in November of 2009. This initial release included head control, text-to-speech, basic navigation, and access to the forehead camera. Just a couple of days later, the University of Freiburg's Humanoid Robot Lab used Brown's Nao driver to develop new capabilities, including torso odometry and joystick-based tele-operation. Development didn't stop there: in December, the Humanoid Robot Lab put together a complete ROS stack for the Nao that added IMU state, a URDF robot model, visualization of the robot state in rviz, and more.

The Nao SDK already comes with built-in support for the open-source OpenCV library. It will be exciting to see what additional capabilities the Nao will gain now that it can be connected to the hundreds of different ROS packages that are freely available.

Brown is also using open source and ROS as part of their research process:

Publishing our ROS code as well as research papers is now an integral part of disseminating our work. ROS provides the best means forward for enabling robotics researchers to share their results and more rapidly advance the state-of-the-art.

-- Chad Jenkins, Professor, Brown University

The University of Freiburg's Nao stack is available on alufr-ros-pkg. Brown's Nao drivers are available on brown-ros-pkg, along with drivers for the iRobot Create and a GStreamer-based webcam driver.

Robots Using ROS: i-Sobot

| No Comments | No TrackBacks

ROS is starting to gain traction in Japan thanks to some dedicated early adopters and community-based translation efforts. Last year, the ROS Navigation stack was ported to Tokyo University's Kawada HRP2-V robot, and now it's finding use with hobby robots as well.

ROS libraries are designed to be small and easily broken apart. In this case, a small use of ROS has led to the claim of "smallest humanoid robot controlled by ROS." As the video explains, ROS isn't running on the robot. The i-Sobot is hooked up to an Arduino, which talks to a PC, which uses the ROS PS3 joystick driver. We're always thrilled to see code being reused, whether it's something as big as the ROS navigation stack, or something as small as a PS3 joystick driver.

The video and demo was put together by "Ogutti", who has been maintaining a Japanese blog on ROS at ros-robot.blogspot.com/. Most recently, he has been blogging about using the Care-O-bot 3 simulation libraries.

In addition to Ogutti's Japanese ROS blog, you can go to ros.org/wiki/ja to follow the progress of the Japanese translation efforts for the ROS documentation.

Robots Using ROS: STAIR 1

| No Comments | No TrackBacks

stair_april2007_small.jpg

With so many open-source repositories offering ROS libraries, we'd like to highlight the many different robots that ROS is being used on. It's only fitting that we start where ROS started, with STAIR 1: the STanford Artificial Intelligence Robot. Morgan Quigley created the Switchyard framework for this mobile manipulation platform, and it was the lessons learned from building software to address the challenges of mobile manipulation robots that gave birth to ROS.

Solving problems in the mobile manipulation space is too large a task for any one group. It requires multiple teams tackling separate challenges, like perception, navigation, vision, and grasping. STAIR 1 is a research robot built to address these challenges: a Neuronics Katana arm, a Segway base, and an ever-changing array of sensors, including a custom laser-line scanner, a Hokuyo laser rangefinder, an Axis PTZ camera, and more. The experience of developing for this platform in a research environment provided many lessons for ROS: small components, simple reconfiguration, lightweight coupling, easy debugging, and scalability.

STAIR 1 has tackled a variety of research challenges, from accepting verbal commands to locate staplers, to opening doors, to operating elevators. You can watch the video of STAIR 1 operating an elevator below, and you can watch more videos and learn more about the STAIR program at stair.stanford.edu. You can also read Morgan's slides on ROS and STAIR from an IROS 2009 workshop.

In addition to the many contributions made to the core, open-source ROS system, you can also find STAIR-specific libraries at sail-ros-pkg.sourceforge.net/, including the code used for elevator operation.

The science of robotics has suffered from the inability of researchers to replicate each other's results. Replicating results begins with being able to run demonstrations in different laboratories, often on different hardware. The JSK Lab at the University of Tokyo and Willow Garage have recently had some success in this area.

In March, Professors Inaba and Okada and four students visited Willow Garage to create demos on PR2 robots combining their infrastructure with ROS. At ICRA in May, Ken Conley from Willow Garage worked with the JSK team to bring ROS and those same demonstrations up on an HRP-2V robot from Kawada Industries. The HRP-2V combines the torso of an HRP-2 walking humanoid with an omni-directional wheeled base, producing a platform that is similar in structure to the PR2, but with a different sensor configuration, different kinematics, and so on.

On both occasions, the combined team was able to complete their work in under a week, demonstrating that replicating results in robotics is possible at a relatively low cost.

Find this blog and more at planet.ros.org.

