March 2014 Archives

Call for Participation: ROS Kong 2014

| No Comments | No TrackBacks

We're pleased to announce that registration is now open for ROS Kong 2014, an international ROS users group meeting, to be held on June 6th at the University of Hong Kong, immediately following ICRA.


This one-day event, our first official ROS meeting in Asia, will complement ROSCon 2014, which will happen later this year (stay tuned for updates on that event).

Register for ROS Kong 2014 today: Early registration ends April 30, 2014.

ROS Kong 2014 will feature:

* Invited speakers: Learn about the latest improvements to and applications of ROS software from some of the luminaries in our community.
* Lightning talks: One of our most popular events, lightning talks are back-to-back 3-minute presentations that are scheduled on-site. Bring your project pitch and share it with the world!
* Birds-of-a-Feather (BoF) meetings: Get together with folks who share your specific interest, whether it's ROS in embedded systems, ROS in space, ROS for products, or anything else that will draw a crowd.

To keep us all together, coffee breaks and lunch will be catered on-site. There will also be a hosted reception (with food and drink) at a classic Hong Kong venue at the end of the day. Throughout the day, there will be lots of time to meet other ROS users both from Asia and around the world.

If you have any questions or are interested in sponsoring the event, please contact us at

Sincerely,
Your ROS Kong 2014 Organizing Committee
Tully Foote, Brian Gerkey, Wyatt Newman, Daniel Stonier

From Bert Willaert of Intermodalics.

Intermodalics is currently developing a depalletizing application for a client. The goal is to move an average of 2,000 crates per hour from standard pallets to a conveyor belt. Additional challenges: more than 10 different crate types can occur, in varying colors; the crates are not necessarily empty; and they are randomly stacked.

The application consists of a UR10 robot from Universal Robots, a 3D camera, an Intermodalics Intelligent Controller (IIC) and an active pallet lift. The application software running on the IIC makes extensive use of ROS and the OROCOS toolchain. OROCOS is a software framework for real-time, distributed robot and machine control that is seamlessly integrated with ROS and has both industrial and academic users worldwide.

To find the crates' position and orientation, Intermodalics developed a crate localizer that builds on the Point Cloud Library (PCL) as well as a set of in-house point-cloud processing algorithms. The ROS visualization tool RViz proved absolutely invaluable during the realization of this crate localizer.
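The post doesn't detail the localizer, but one common first step when picking from a pallet is to split the point cloud at the pallet plane and take the centroid of what remains. Here is a toy sketch of that idea in plain Python with synthetic points and invented names; the real system builds on PCL in C++ and is far more sophisticated:

```python
# Toy illustration (not Intermodalics' actual algorithm): locate a crate
# in a synthetic point cloud by discarding points at pallet height and
# taking the centroid of the remaining points.

def locate_crate(points, pallet_height=0.0, tolerance=0.02):
    """Return the (x, y, z) centroid of points above the pallet plane,
    or None if no crate points remain."""
    crate_pts = [p for p in points if p[2] > pallet_height + tolerance]
    if not crate_pts:
        return None
    n = len(crate_pts)
    return tuple(sum(p[i] for p in crate_pts) / n for i in range(3))

# Synthetic scene: a flat pallet at z = 0 and a crate whose top surface
# sits at z = 0.3, centered near (0.475, 0.475).
pallet = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
crate = [(0.4 + dx * 0.05, 0.4 + dy * 0.05, 0.3)
         for dx in range(4) for dy in range(4)]

print(locate_crate(pallet + crate))
```

In the real application this step would be preceded by plane segmentation and followed by model fitting to recover the crate's orientation as well.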

The use of the ROS-Industrial package for the UR robot allows both the motions and the application state machine to be simulated. This significantly facilitates the implementation of the whole application.

The integration of the UR controller and the IIC does not affect the inherent safety feature of the UR robot, which makes the robot stop if it encounters excessive forces. If such a stop occurs, the application can easily be restarted with a simple operator intervention.
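As a rough illustration of why a simulatable application state machine helps, here is a hypothetical depalletizing state machine in Python. The states, events, and the safety-stop/operator-reset transition are our invention for the sketch, not the actual Intermodalics design:

```python
# Hypothetical depalletizing state machine. Unknown events leave the
# state unchanged; a force-limit stop requires an operator reset.
TRANSITIONS = {
    ("IDLE", "start"): "LOCALIZE_CRATE",
    ("LOCALIZE_CRATE", "crate_found"): "PICK",
    ("LOCALIZE_CRATE", "pallet_empty"): "IDLE",
    ("PICK", "picked"): "PLACE_ON_CONVEYOR",
    ("PICK", "force_limit"): "SAFETY_STOP",
    ("PLACE_ON_CONVEYOR", "placed"): "LOCALIZE_CRATE",
    ("SAFETY_STOP", "operator_reset"): "LOCALIZE_CRATE",
}

def run(events, state="IDLE"):
    """Feed a sequence of events through the machine; return the trace."""
    trace = [state]
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)
        trace.append(state)
    return trace
```

Because the machine is pure data, the same transition table can be exercised against a simulated robot before ever touching hardware, which is the benefit the post describes.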

Announcing a ROS Japan Users Group Meetup

From Daiki via ros-users@

Content: Explanation of the concept of ROS
Organizer: ROS Japan User's Group and Mamezou Inc.
Number of participants: 30
Venue: 2-7-1 Nishi-Shinjuku, Shinjuku City, Tokyo
Date: April 12, 2014, 13:30-18:30
Twitter hashtag: #rosjp

Scheduled to be held every month.

ROS Indigo Igloo buildfarm running

We're pleased to announce that the ROS build farm for Indigo Igloo is now available. It already includes over 180 packages for Ubuntu 13.10 (Saucy Salamander) and Ubuntu 14.04 (Trusty Tahr), and we expect that number to continue to grow rapidly. Installation instructions already exist for Ubuntu, using Debian packages or compiling from source [1], and you can see the status of Indigo packages on this page:

If you are a maintainer, please look at which packages have been released and consider releasing yours as soon as your upstream dependencies have been satisfied. If you are blocked on another package being released, please contact the maintainer. And if you cannot reach the maintainer, please email (join if you aren't a member already).

If you are planning to release into Indigo please read the information provided in the migration guide [2] and refer to the bloom tutorials [3] for doing the release. Please also contribute to the migration guide for updates relating to your package.

After you release your packages, the build farm will keep you notified of the status of the individual jobs. Please pay attention to the automated emails from the build farm: failing jobs block downstream packages from releasing and waste our build resources.

Job Opening at Clearpath Robotics

From Ryan Gariepy via ros-users@

Position:   Multi-Robot Autonomy Engineer
Location:   Kitchener, Ontario
Experience: 1-5 Years Relevant Work Experience
Education:  Graduate Degree in Related Field

About Us

Clearpath Robotics Inc. specializes in the design and manufacture of
unmanned vehicle systems, software, and components for academic and
industrial research and development.  Our clients range from small
local businesses to some of the best known technical institutions on
the planet.  Based in Kitchener-Waterloo, Clearpath Robotics employs
highly talented people who live and breathe robotics.  We believe that
work must have a high "cool" factor, and we're looking for people who
share in our passion to create remarkable products and change the world.

About the Job

We require robust implementations of the latest multi-agent control
and planning algorithms that can function within the constraints of an
unstructured environment, real-world motion dynamics and sensing
constraints. We've been building robots for a while, and our clients
are now asking for more than one of our robots to work together.
You will stay on top of recent developments in multi-agent control and
planning. You will continually evaluate how these algorithms will
benefit our current customers and product offering. Additionally, you
will have to figure out methods to organically incorporate multi-agent
autonomy into the autonomy features currently offered on our robots.
This includes appropriately interfacing with advanced control and
perception algorithms. Finally, you will be field testing these
algorithms to ensure robustness on the field and in real applications.
You will be spending warm summer days driving robots around outside
(cold winters too; this is Canada after all).

Your primary responsibilities will be:
*    Multi-agent controller design and optimization for autonomous
vehicles with varying dynamics
*    Multi-agent simulation development
*    Algorithm prototyping and implementation

Additional tasks may include:
*    Developing & carrying out system test plans
*    General software development & testing
*    Mentoring and assisting with supervision of interns
*    Explaining our newest shiny toys to the sales & marketing team

About You

You want to work for a small company that thinks big and dreams huge.
You are driven, view work as more than just a job, and are never
satisfied with a project left half-done.  You want to be surrounded by
people like you; creative, fun-loving, and passionate about their
work.  You are motivated by making an impact on your workplace and you
thrive on challenging and rewarding problems.   Oh, and you have some
form of higher education with the common sense to back it up.

Required Technical Skills:
*    Graduate degree in engineering or a related field, with applicable background
*    Practical knowledge of multi-agent planning and control based in a (primarily) centralized framework
*    Working knowledge of decentralized decision making and/or swarm robotics
*    Strong software development skills (C, C++, Python preferred)
*    Proficiency with Linux
*    Hands-on experience with autonomous systems

Desired Soft Skills:

*    Ability to efficiently and clearly communicate ideas, including
to those who may have a limited theoretical background in the area
*    Comfortable with abrupt changes to project deadlines, job
responsibilities and the local gravity field

Bonus points for:

*    ROS, MATLAB, LabVIEW, Gazebo, or Player experience
*    Multi-agent networking or mesh network experience
*    Understanding of sensors and their error models, particularly
laser rangefinders, GPS systems, and vision systems
*    Experience with the control of skid-steer and differential drive
ground vehicles
*    Ability to perform general hands-on troubleshooting of
electromechanical systems
*    Exposure to SLAM and vehicle control methodologies
*    Ability to diagnose broken robots by their sounds and smells

What Now?

Apply through our online job portal using this link: Please submit a
cover letter along with your resume. Instructions for sending
supporting documentation (testimonials, conference papers, journal
articles, source code, portfolio media, references, or other
indications of exceptional past work) will be provided in the
confirmation email sent by our system upon receiving your application.
Please include "Multi-Robot Autonomy Engineer" in the subject of any
further communications. If your skills don't fit this job description
but you're still interested in working with us, please apply to our
"General Robotics Enthusiast" position. No recruiters or form cover
letters, please. They do not please our mechanical masters.

New Package nav2d

From Sebastian Kasperski via ros-users@

Hello ROS users,
I would like to share a set of ROS packages that provide nodes for autonomous exploration and map building for mobile robots moving in planar environments. More information and some help can be found in the ROS wiki:
The source is available via GitHub:
It contains ROS nodes for obstacle avoidance, basic path planning and graph-based multi-robot mapping using the OpenKarto library. Autonomous exploration is done via plugins that implement different cooperation strategies; additional strategies should be implementable with little overhead.
These nodes have been used on a team of Pioneer robots, but other platforms should also work. A set of ROS launch files is included to test the nodes in a simulation with Stage. Please feel free to try it and post issues on GitHub.
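nav2d itself is a set of C++ ROS nodes; as a rough sketch of the kind of basic grid path planning it provides, here is a minimal breadth-first planner over an occupancy grid. This is our own toy code for illustration, not taken from the package:

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid (0 = free,
    1 = obstacle). Returns a shortest 4-connected path as a list of
    (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}            # visited set doubling as backpointers
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:            # reconstruct path by walking backpointers
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None
```

A real planner like nav2d's also accounts for robot footprint and costmaps; BFS here stands in only for the shortest-path core of the idea.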

Software Engineer at Exciting 3D Mapping Startup


From Ryan Thompson via ros-users@

Quanergy is a Silicon-Valley-based startup developing smart sensing solutions for real-time 3D mapping and object detection, tracking, and classification. We're a small company run by engineers, dedicated to building next-generation LiDAR technology for autonomous vehicles and advanced driver assistance systems. By joining our team at this point, you'll play a key role in the development of our company, not just our software. We're looking for someone extremely bright, driven, a great communicator and explainer, and just as passionate about the future of transportation and perception as we are!

Job Description:

The Software Engineer at Quanergy will be responsible for designing, developing, and maintaining our map data structure and access system and for parallelizing localization on a GPU, all based on point cloud data generated by our next-generation LiDAR sensors. The engineer will work closely with co-workers to test and optimize code for real-time application on the embedded CPU and GPU, keep current with the latest research and advances in the field, help shape the direction of the software side of the company, and contribute to the sensor integration, mapping, and perception efforts of the software team.


Requirements:

  • B.S., M.S., or Ph.D. in Computer Science, Electrical Engineering, or a related field

  • Fluency in C++ and Linux

  • CUDA (or OpenGL) expertise

  • Strong mathematical foundation

  • Willingness and ability to tackle problems outside his/her areas of expertise

  • Academic or professional experience in at least one of: Robotics, Parallel Programming, Real-Time Embedded Systems, Game Development


Preferred:

  • ROS and/or PCL familiarity

  • Experience with optimization for real-time computing

  • Able to start immediately


Quanergy offers very competitive Silicon Valley salaries and equity.


Email for more information. To apply, email with a resumé and cover letter, or apply on Stack Overflow:

HERE mapping cars run ROS


As reported at HERE Three Sixty, their global fleet of hundreds of mapping cars is running ROS!

HERE car

They carry laser range-finders, cameras, and GPS receivers that are used to estimate the vehicle's position and gather 3-D pictures of the surrounding environment. That data gets shipped back to their headquarters for processing.

As HERE's Michael Prados put it, "The system of sensors and computers means the software that's needed is very like that which is used to create robots." So they decided to build their cars' software on ROS. The software runs on a headless server in the car's interior, with the driver interacting via a mobile application on a tablet that he or she can operate easily from the seat.

HERE car interior

"We chose the open source ROS because it was the best solution, hands-down," Michael concludes. "And now we're looking into the ways that we might give back to OSRF, and help its future success."

Read the whole story at HERE Three Sixty.

New Package: catkin_lint

From Timo Röhling via ros-users@

I have created a tool to check catkin packages for common build
configuration errors. I announced it to the ROS Buildsystem SIG a while
ago, and I think it is ready for public scrutiny:

PyPI Package:
Ubuntu PPA:

It runs a static analysis with a simplified CMake parser. Among the
checks are order constraints of macros, missing dependencies, missing
files, installation of targets and headers, and a few other things. The
checks are inspired by the catkin manual and issues I encountered in my
daily work routine.

Give it a try and feel free to post any issues on Github.

New Package: ROS Glass Tools

From Adam Taylor via ros-users@

We would like to announce ros_glass_tools, an open source project that aims to provide easy voice control, topic monitoring, and background alerts for robot systems running ROS using Google Glass. It communicates with ROS using the rosbridge_suite.
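rosbridge exchanges JSON operations over a websocket, so a Glass client only needs to build messages like the following. Here is a minimal sketch of the two core rosbridge v2 protocol operations; the topic names are made-up examples, not ros_glass_tools' actual topics:

```python
import json

def subscribe(topic, msg_type):
    """Build a rosbridge v2 'subscribe' operation as a JSON string."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

def publish(topic, msg):
    """Build a rosbridge v2 'publish' operation as a JSON string."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# e.g. have a Glass client listen for alert strings and issue a voice
# command back to the robot (example topic names):
sub = subscribe("/glass_alerts", "std_msgs/String")
pub = publish("/voice_command", {"data": "take photo"})
```

In the real tools, these strings would be sent over a websocket connection to a running rosbridge server rather than just constructed locally.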

More information about the tools can be found at the following links.


Crossposted from

Albert II is famous for being the first monkey in space, in June 1949. Laika is equally renowned for being the first animal to orbit the Earth, in 1957. On Sunday, March 16th, at 4:41am (unless inclement weather intervenes), ROS will celebrate its own celestial milestone when it is launched into space aboard a SpaceX rocket as part of a resupply mission to the International Space Station (ISS).

Albert II

In conjunction with NASA's Robot Rocket Rally March 14-16 at the Kennedy Space Center in Florida, SpaceX's third mission will include a set of robotic legs for the Robonaut 2 (R2) humanoid torso that is currently aboard the ISS. Once those legs are attached to R2, ROS will officially be running in space.

For the last few years, the NASA/GM team at the Johnson Space Center has been using ROS for R2 development here on Earth. We first heard about that at ROSCon 2012 in Stephen Hart's keynote presentation, where he described how they combine ROS and OROCOS RTT to achieve flexible, real-time control of R2. Following the launch this weekend, that open source software will be running on the R2 that's on ISS.

Robonaut 2 legs
Robonaut 2 simulation

The R2 team also uses the open source Gazebo simulator to simulate R2 when they're doing development and testing. They've released their models of R2 and ISS as open source for the community to work with. We recently integrated those models into an immersive teleoperation Gazebo demonstration that we'll be running at the Robot Rocket Rally this weekend. Drop by our booth and find out what it's like to "be" Robonaut 2!

ROS has already powered robots in the air, on the ground, on and under the water, and on every continent, but we at OSRF couldn't be more excited about ROS journeying to outer space.

From Kel Guerin at Johns Hopkins University


At the Laboratory for Computational Sensing and Robotics at Johns Hopkins University, we have utilized the extensive visualization tools available in ROS to create an immersive virtual reality environment (IVRE) for interacting with robots. The versatile plug-in system for the RViz visualization package has allowed us to create virtual user interfaces, information displays, and interactive objects that co-exist with other resources in the RViz environment. Additionally, the excellent Oculus Rift RViz plugin gave us the perfect starting point for using RViz as a VR environment. This provides an excellent test-bed for virtually teleoperating and teleprogramming our robots. Finally, the flexibility of ROS lets us deploy IVRE on several robots in our lab, including industrial systems and surgical robots. For more information on the tools we used, check out the Oculus RViz plugin and the RViz plugin API.

New Package: Announcing ROS/DDS proxies

From Ronny Hartanto of DFKI GmbH via ros-user@

Hi Everyone,

We are happy to announce the ros_dds_proxies:

Recently there was some discussion about using DDS as a communication layer in ROS. This package contains our implementation of DDS middleware as the transport for a multi-robot system. We have been successfully using this implementation in our project (IMPERA). In our experiments, all messages were successfully delivered to all robots, even with communication outages of about 15 minutes.
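We haven't seen the package internals, but the outage tolerance described above can be pictured as a proxy that buffers outbound messages while the link is down and flushes them on reconnect. Here is a toy in-memory model of that idea; all names are invented and no real ROS or DDS is involved:

```python
# Toy model of an outage-tolerant proxy: messages sent while the link
# is down accumulate in a backlog and are delivered, in order, once
# the link comes back up.

class Link:
    """Stand-in for a network link; 'delivered' records what got through."""
    def __init__(self):
        self.up = True
        self.delivered = []

class Proxy:
    def __init__(self, link):
        self.link = link
        self.backlog = []

    def send(self, msg):
        self.backlog.append(msg)
        self.flush()

    def flush(self):
        """Deliver the backlog if the link is up; otherwise keep buffering."""
        if self.link.up:
            self.link.delivered.extend(self.backlog)
            self.backlog.clear()
```

A real DDS deployment gets similar behavior from durability and reliability QoS settings rather than hand-rolled buffering; the sketch only shows why long outages need not lose messages.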

Any comments or improvements are welcome.

From Nick Weldin at Middlesex University

Middlesex University London is running an Introduction to ROS summer school in London, June 14th-18th. It will be a practical, hands-on class with 10 TurtleBot 2 robots and a Baxter Research Robot. More details are available at
From Angel Merino Sastre & Simon Vogl via ros-users@

Hi all,

We are happy to announce the sentis-tof-m100 ros package:

This package provides support for the Bluetechnix Sentis ToF M100 camera
based on the software API that is provided with the camera, along with
a detailed installation how-to and a ready-to-use launch file with a
visualization example based on rviz.

Any comments/suggestions are welcome.

The Shadow Robot Company is excited to announce the next module in the RoNeX range - the RoNeX SPI Module!

This module allows the connection of multiple SPI (Serial Peripheral Interface) devices to ROS. It provides 4 discrete SPI ports, plus 6 analogue inputs and 6 general-purpose digital I/O lines. The digital I/O lines can be used as additional SPI chip-select lines, allowing the module to interface with up to 10 SPI devices in total.
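To illustrate the 4-plus-6 addressing described above, here is a hypothetical helper that maps a device index to either a dedicated SPI port or a repurposed digital I/O chip-select line. The naming scheme is ours for the sketch, not Shadow's actual API:

```python
# Hypothetical chip-select mapping for a module with 4 dedicated SPI
# ports and 6 digital I/O lines usable as extra chip selects.

def chip_select(device):
    """Map a device index 0-9 to a (kind, line) pair on the module."""
    if not 0 <= device <= 9:
        raise ValueError("module supports at most 10 SPI devices")
    if device < 4:
        return ("spi_port", device)       # dedicated SPI ports 0-3
    return ("digital_io", device - 4)     # digital I/O lines 0-5
```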

RoNeX makes data from these SPI devices directly accessible via ROS topics, and of course a single RoNeX stack can comprise a mix of GIO Modules and SPI Modules that best suits your project.

The RoNeX ROS wiki page can be found here and source here. More details on RoNeX here.



About this Archive

This page is an archive of entries from March 2014 listed from newest to oldest.

February 2014 is the previous archive.

April 2014 is the next archive.
