Recently in misc Category

Announcing ROSComponents online store

From Román Navarro via ros-users@

I'd like to announce a new online store for robots, sensors and components supported by ROS: https://www.roscomponents.com

Why ROS-Components?

In recent years, ROS has become the standard in Service and Research Robotics, and it's making great advances in the Industry.

Most of the robots and components on the market support ROS, though it is sometimes difficult to find out which are really supported, which ROS version they use, and how to get them. One of our main purposes is to make this easier and simpler for the customer by linking the products with their ROS controllers, explaining how to install and configure them, and showing where to find useful information about them.

All the products in the site are supported by ROS, either available directly in the ROS distribution or through their source code. The ROS community has a new meeting point in ROS Components!

ROS as standard

At ROS-Components we strongly believe that ROS is, and will remain, the standard in robotics for many more years. Therefore we want to encourage roboticists to use it (if you are not already doing so) as well as manufacturers to support it.

Supporting ROS and its Community

As you know, the ROS core is currently maintained by the Open Source Robotics Foundation (OSRF), an independent non-profit R&D company leading the development and maintenance of ROS versions and hosting all the necessary infrastructure.

At ROS Components we try to encourage the use of ROS as well as its maintenance and growth. Therefore we are going to donate part of the proceeds of every sale to the OSRF. So, every time you buy at ROS Components, you'll be contributing to ROS maintenance and development.

Volvo ROAR project

From Per-Lage Götvall, PM ROAR

The basic reason for forming this project was the question: "How do we make autonomous machines work together on a common task?" E.g., when using an autonomous wheel loader to load gravel onto a truck, who decides on their relative positions: the truck, the loader, or a supervising system?

As a first approach, we decided to do this within the frame of the Volvo Group Academic Preferred Partner (APP) network, involving students and researchers from Chalmers and Mälardalen universities in Sweden and Penn State University in Pennsylvania, US, and the Swedish waste management company Renova. We all agreed that using ROS was a must, on the one hand to coordinate the three universities, and also to use the development made within the frame of ROS (e.g. Gazebo, RViz, MoveIt!, drivers, etc.). Thanks to great engagement from the researchers and students, and of course the ROS components, we managed to make this (and a lot more, not shown in the video).

More than 30,000 Questions on ROS Answers

We've reached another milestone for ROS Answers, 30,000 questions asked!


The 30,000th question was asked Friday by @Mani, who regularly helps answer others' questions as well.

To see the many contributors to the site, please view the list of users.

Congratulations to the answers.ros.org community for making the site the thriving resource that it is today. Keep up the fantastic work, and keep the questions--and answers--coming.

While you're on the site, if you've asked a question that isn't marked as answered, please consider revising it with more details or added clarity. Likewise, consider trying to answer one question each time you visit.

Gaitech Educational Portal for ROS

From Anis Koubaa via ros-users@

Gaitech International Ltd http://www.gaitech.hk/ is happy to announce the release of its educational portal, Gaitech EDU.

Website: http://edu.gaitech.hk/

Gaitech EDU provides a comprehensive educational framework on Robot Operating System (ROS) through a series of tutorials and online videos.

Gaitech EDU is an educational website on robotics, and in particular on the Robot Operating System (ROS). The objective is to provide easy-to-follow educational content that helps in mastering the concepts of ROS and promotes its use for developing robotics software. Gaitech strives to contribute to the development of ROS and provides its customers and ROS users with technical support and an open education framework for learning ROS.

The Gaitech Education website is NOT meant to be a substitute for the ROS wiki documentation, but a complementary website more oriented toward providing education and teaching material.

As the primary objective of Gaitech EDU is to promote ROS education, the tutorials were designed with teaching objectives in mind. Each tutorial starts with learning outcomes that the learner is expected to achieve by the end of the tutorial. Then, the tutorial is provided in textual format and/or as video illustrations. Finally, a series of review questions is proposed so that students can self-evaluate their understanding of the concepts presented. The tutorials can also be used as additional teaching resources in robotics courses using ROS.

More details can be found in the FAQs on the website http://edu.gaitech.hk/.

In addition, Gaitech provides the Gaitech EDU Forum http://forum.gaitech.hk/ where users and customers may ask questions and post comments about the educational content. A mailing list http://lists.gaitech.coins-lab.org/listinfo.cgi/gaitech_edu_users-gaitech.coins-lab.org is also available to stay tuned for any updates to the educational content.

Enjoy using and sharing Gaitech EDU portal. We will be happy to receive your comments about the Gaitech EDU Portal.

Qt Creator IDE Plug-in for ROS

From Paul Hvass @ROS-Industrial

The ROS Qt Creator Plug-in is developed specifically for ROS to increase a developer's efficiency by simplifying tasks and creating a centralized location for ROS tools. Since it is built on top of the Qt Creator platform, users have access to all of its existing features, such as syntax highlighting, editors (C++, Python, etc.), code completion, version control (Git, Subversion, etc.), debuggers (GDB, CDB, LLDB, etc.), and much more.

The ROS Qt Creator Plug-in provides the following capabilities:

  • Import/Create Catkin Workspaces
  • Create Catkin Packages
  • Custom Build and Run Configuration

    • catkin_make (Debug, Release, Release with Debug Info, Minimum Size Release)
    • roslaunch
    • rosrun
    • sourcing workspace
      Note: The Qt Creator Plug-in supports multiple configurations to enable quick switching between configurations, and everything is saved
  • Integrated Tabbed Terminal

  • Templates
    • Industrial Robot Support Package
    • Basic Launch File
    • Basic URDF File
    • Basic Node File
      Note: Users may create custom templates.

Check out the two videos. The first is a short overview of Qt Creator and its default capabilities. The second is an overview of the ROS Qt Creator Plug-in developed by Levi Armstrong of Southwest Research Institute. It concludes with an invitation for others to begin using the plug-in for ROS development.


From Mark Siliman

Robotic startup Tend.ai, which just came out of stealth mode today, has successfully built the world's first fully automated 3D printing system controlled by cloud robots.

In the video, one robot fully automates ten 3D printers. The prints are boxed and pushed down a conveyor belt. Any 3D printer can be used; Tend.ai's artificial intelligence reads the printers' displays via OCR and pushes their buttons just like a human would.

Tend.ai allows you to train, control, and monitor most collaborative robots from any device (e.g. your mobile phone) without any technical expertise. It automatically monitors the state of all machines and runs them optimally.

Tend.ai utilizes ROS in the cloud to control, train and monitor suites of robots from any device. Thanks to cloud computing, standard webcams (< $100) can be used for the vision system.

Tend.ai can tend most machines without any modification or networking.

ROS Platforms Survey

We are interested in knowing which hardware platforms are the favorites for running ROS, so we'd like to ask for a few minutes of your time to fill out the following survey. We'll share the results once it's closed. Thanks for your collaboration!

Click here to take the survey.

From Anis Koubaa via ros-users@

I am happy to announce that the call for chapters for the Springer Book on Robot Operating System (ROS), Volume 2, is now open.

The book will be published by Springer. 

We look forward to receiving your contributions to make this book successful and useful for the ROS community.

For Volume 1, we accepted 27 chapters ranging from beginner to advanced level, including tutorials, case studies, and research papers. Volume 1 is expected to be released by February 2016. After negotiation with Springer, the authors benefited from a discount of around 80% on hardcopies as an incentive for their contribution, in addition to having their work published.

The call for chapters website (see above) presents in detail the scope of the book, the different categories of chapters, topics of interest, and submission procedure. There are also Book Chapter Editing Guidelines that authors need to comply with. 

In this volume, we intend to place a special focus on unmanned aerial vehicles using ROS. Papers that present the design of a new drone and its integration with ROS, simulation environments for unmanned aerial vehicles with ROS and SITL, ground-station-to-drone communication protocols (e.g. MAVLink, MAVROS, etc.), control of unmanned aerial vehicles, best practices for working with drones, etc. are particularly sought.

In a nutshell, abstracts must be submitted by February 15, 2016 to register chapters and to identify in advance any possible similarities in chapter contents. Full chapter submissions are due on April 20, 2016.
Submissions and the review process will be handled through EasyChair. A link will be provided soon.

Each chapter will be reviewed by at least three expert reviewers, at least one of whom should be a ROS user and/or developer.

Want to be a reviewer for some chapters?
We are looking for ROS community users to provide reviews and feedback on the proposals and chapters submitted for the book. If you are interested in participating in the review process, please consider filling in the reviewer interest form.

We look forward to receiving your contribution for a successful ROS reference!
euRobotics Technology Transfer Award

From Mirko Bordignon via ros-users@

Individuals and teams from industry and academia are invited to submit an application for the upcoming euRobotics Technology Transfer Award, which will be part of the European Robotics Forum, to be held in Ljubljana, 21-23 March 2016 (http://www.erf2016.eu/).

 

Detailed information on the application procedure is available at http://www.erf2016.eu/index.php/techtransfer-award/

In case of questions you can contact Martin Hägele at martin.haegele@ipa.fraunhofer.de

New ROS computing platform Snickerdoodle

From Ryan Cousins via ros-users@

We've been working on a new ROS/robotics development platform for a while and recently launched a pre-order campaign:


It's specifically targeted at robotics and UAV applications and we're supporting ROS running on top of Snappy Ubuntu.

I would be interested in any feedback the group might have, as well as to hear from anyone who might be interested in participating in the Alpha program.

Call for videos for an 8 Years of ROS montage


Please help us make another great ROS Montage for the upcoming 8th anniversary of ROS. To show off the great variety of things people are doing with ROS we need your videos to share with the community.


Please submit your videos to be considered for inclusion in the 8 Years of ROS montage before November 1st.


Submit your videos here!

 

As an example here is the montage we put together 3 years ago to celebrate 5 years. 

New sig for Shadow Robot

From Ugo Cupcic via ros-users@

I just created a SIG to synchronize discussions around our robots and software: http://wiki.ros.org/sig/shadow_robot

The overall aim is to facilitate discussion with our user base. For example, we're in the process of refactoring our repositories right now to allow for an easier release process and a more modular approach. During that process we'd love to have more feedback from our end users, but it is quite hard for us to reach them. We also think it would be a good place to discuss different uses of our hardware and software, share exciting demos or tutorials, discuss new features, etc.

If you are using our software (the Shadow hand simulation or real Hardware, the Cyberglove package, etc...), I hope you'll be joining that list!

From Florian Lier via ros-users@

Please find the call for the first CITEC Month of Open Research below. We offer stipends to (PhD) students from all over the world.

We currently offer two ROS-enabled projects:


The CITEC Month of Open Research (MORe) is an international program that offers students stipends to contribute to open source or open data projects related to the research areas of cognitive interaction technology. MORe is organized by the Excellence Cluster Cognitive Interaction Technology (CITEC) of Bielefeld University, Germany. We fund students from abroad who would like to participate in exciting projects proposed by experienced mentors of CITEC research groups. Participants can receive a stipend of up to 1,500 EUR. We accept applications from English-speaking students from all over the world!

  • The application period started: July 27, 2015
  • Deadline for student applications: August 21, 2015
  • Notification of acceptance: August 28, 2015

Available projects: https://cit-ec.de/en/more/projects

For more information, please visit:

  • https://cit-ec.de/en/more
  • https://cit-ec.de/en/more/faq

Get involved!

Kind regards, The MORe organizing team

ROS installation for OS X

From Mike Purvis via ros-users@

Greg Brill and I have been working on establishing a script that can more or less automatically set up a ROS desktop_full install on Mavericks and Yosemite; we'd be glad for a few brave souls to give it a try:


Some key ways in which this differs from the official instructions are that it:
  • uses brewed python instead of system python,
  • uses catkin_tools to build in parallel instead of catkin_make_isolated,
  • requires very minimal and well-documented sudo use, and
  • tries to automatically detect some problematic configurations and provide appropriate prompts/suggestions.
The desktop_full build itself is about 30 minutes; the remainder of the time is spent fetching and building dependencies (especially VTK and Gazebo 5). Total time on most systems should be < 1h.

Our intent is to set up some form of CI that can periodically re-run this setup on a vanilla machine and thus keep it from regressing; however, that is still to come-- this overall procedure is long enough that it's not a good fit for Travis CI.

Bugs and pull requests welcome,

New manipulation SIG

From Ugo Cupcic via ros-users@

As discussed with a few people, I created a new SIG for manipulation / grasping: http://wiki.ros.org/sig/manipulation.

The idea is to create a group of people interested in these subjects to discuss common problems, focus some of the development effort, etc.

Here's a short list of topics that could be of interest (this is of course non exhaustive):
 - common messages and interfaces
 - tactile sensors 
 - planners
 - generic nodes / base class
 - new interesting hardware

Proposed: CAD to ROS Focused Technical Project


Cross posted from ROS-I

The ROS-Industrial Consortium is tackling a topic that is of interest to the whole ROS community: conversion of CAD data to ROS-interpretable file types (e.g. URDF, SRDF). This work will be conducted over the next three years by the TU Delft Robotics Institute to help make ROS even more convenient to use.

Real-Time ROS for Embedded Systems

From Yigit Gunay via ros-users@

We are developing a lightweight implementation of the ROS middleware on the STM32F4Discovery board for interfacing embedded and general-purpose software. Currently, we can run multiple ROS nodes concurrently on the STM32, and we can send ROS messages between a PC and the STM32 over Ethernet (only UDPROS).

Please take a look at our repository on GitHub if you are interested in our real-time ROS development: https://github.com/bosch-ros-pkg/stm32.

I would appreciate your comments. Thanks for your attention!
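As a rough illustration of the kind of framing a UDP-based transport like this involves, here is a toy Python sketch exchanging a length-prefixed message over the loopback interface. This is not the actual UDPROS wire format; the header layout below is invented purely for illustration.

```python
import socket
import struct

# Toy sketch of a length-prefixed message over UDP, loosely inspired by
# UDPROS-style transports. NOT the real UDPROS wire format: the
# (length, sequence) header here is invented for illustration only.

def pack_message(seq: int, payload: bytes) -> bytes:
    # 4-byte little-endian payload length, 4-byte sequence number, payload
    return struct.pack("<II", len(payload), seq) + payload

def unpack_message(datagram: bytes):
    length, seq = struct.unpack_from("<II", datagram)
    payload = datagram[8:8 + length]
    return seq, payload

# Loopback demonstration: one socket plays the embedded board, one the PC.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))  # let the OS pick a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(pack_message(1, b"hello from stm32"), rx.getsockname())
seq, payload = unpack_message(rx.recv(1024))
print(seq, payload.decode())  # -> 1 hello from stm32
tx.close()
rx.close()
```

The length prefix is what lets the receiver recover the payload boundary inside a datagram; a real implementation additionally negotiates a connection header and message type, which is omitted here.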

Gazebo survey

Last year we sent out a survey asking what the community would like to see from simulation. http://gazebosim.org/blog/survey_2014

We didn't get to everything on the list, but we did keep busy. A few highlights from the past year are:

1. Improved documentation and tutorials
2. Easier installation: Gazebo will be in Ubuntu Vivid
3. Windows support: Still working on an installer
4. Aerodynamics and subsurface water simulation

We'd like to get your feedback again. The following is another survey with 13 short questions. 

http://goo.gl/forms/thCpcy6Hsh

Thanks for your time and support,

ROS and rospy on Talk Python To Me Podcast

Episode 7 of Talk Python To Me features Dirk Thomas talking about the use of Python in ROS and rospy. The episode description is: 

Programming is fun. Robots are fun. Programming robots is awesome! This episode Michael speaks with Dirk Thomas from the ROS (Robot Operating System) project. You will learn how to use ROS and ROSPy to program robots.

We discuss how to use ROS from some of the largest and most complex robots built (including one on the International Space Station!) all the way down to basic robots controlled via micro-controllers such as arduinos.  

You can listen to the podcast or download it from: http://www.talkpythontome.com/episodes/show/7/robot-operating-system-ros-and-rospy
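For readers curious what the publish/subscribe model discussed in the episode looks like, here is a plain-Python sketch of the topic/callback pattern that rospy exposes. No ROS is required to run it; the Bus class and its method names are invented for illustration and are not part of the rospy API.

```python
from collections import defaultdict

# Plain-Python sketch of the publish/subscribe model used by rospy.
# The Bus class below is a hypothetical stand-in, not the rospy API.

class Bus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # rospy.Subscriber similarly registers a callback per topic name
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # rospy's Publisher.publish delivers the message to each subscriber
        for callback in self._subscribers[topic]:
            callback(message)

bus = Bus()
received = []
bus.subscribe("/chatter", received.append)  # cf. rospy.Subscriber("/chatter", String, cb)
bus.publish("/chatter", "hello world")      # cf. pub.publish(String("hello world"))
print(received)  # -> ['hello world']
```

The real rospy adds node registration, message types, and network transport on top of this pattern, but the callback-per-topic shape is the same.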

Robotics Fast Track now accepting applications

Cross posted from www.osrfoundation.org

We're excited to announce that OSRF and BIT Systems are seeking innovative and revolutionary robotics projects for the Robotics Fast Track (RFT) effort, sponsored by the Defense Advanced Research Projects Agency (DARPA).

The goals of Robotics Fast Track are:

  1. Enable rapid, cost-effective development of new robotics capabilities designed to respond to, and even anticipate, quickly evolving needs in space, maritime, ground, and air operations. RFT will focus on the development of groundbreaking robotic hardware and software by funding novel approaches as well as creative adaptations of existing technologies.
  2. Achieve breakthrough capabilities in less time and at a fraction of the cost typical of government-supported robotic development processes by engaging highly agile organizations and individuals who traditionally have not worked with the U.S. government.

Learn more and apply at rft.osrfoundation.org!

ROS Cheatsheet updated for Indigo Igloo

From Aaron Blasdel via ros-users@

The good old ROS CheatSheet has just been released for Indigo. If you know anyone just starting out in ROS please send this on to them.

I recently performed some much needed cleanup, reformatting, and content addition for the CheatSheet. Most notably the GUI tools section has been greatly improved and includes information on the RQT toolset.

Further, it now comes in two flavors: New and Improved Catkin Flavor and Original Extra Crispy Rosbuild. Many thanks to Kei Okada of the JSK lab for adding this dual-build functionality and for his edits for Hydro!

If you find any errors or glaring omissions, please create an issue so we can discuss them, or a pull request to fix them, on the ros/cheatsheet repo.

I hope this is helpful!

Clearpath offers ROS consulting service

Reposted from OSRF Blog

Our friends at Clearpath Robotics announced today that they're offering ROS consulting services for enterprise R&D projects. And they've committed to giving part of the proceeds to OSRF, to support the continued development and support of ROS!

This service is something that we've heard requested many times, especially from our industry users, and we're excited that Clearpath is going to offer it. If you're looking for help or advice in using ROS on a current or upcoming project, get in touch with Clearpath.

From Meghan Hennessey

Clearpath Robotics, a leader in unmanned vehicle robotics, has combined resources with Christie®, one of the most innovative visual technologies companies in the world, to create a three-dimensional video game using robots. The pairing of Clearpath and Christie bridges two technologies, from unrelated fields, to create an interactive experience in a way that has never been done before. 

 

The project was produced during Clearpath's "hack week," where team members experiment and innovate with new technology and ideas. Computer graphics were displayed on the floor using Christie's 3D projection mapping equipment to create a digital arena, while robots dueled with laser beams. Clearpath was inspired by a project from MIT; however, they wanted to create a version that uses open source software and runs as a completely interactive program.

 

"Teaming up with Christie allowed us to experiment with the latest 3D projection mapping technology in combination with our Jackal robots and open source software. This was our recipe for an augmented reality video game," said Ryan Gariepy, Co-Founder & Chief Technology Officer at Clearpath Robotics. "Combining both of our technologies resulted in a one-of-a-kind experience that was fun to work on and even more fun to play with."

 

Augmented reality is a term used to describe the superimposing of computer imagery onto the real world.

Utilizing Christie's overhead 3D projectors, the Clearpath team created an overlay under their Jackal unmanned ground vehicles to display weapons, recharging shields, hitpoints, and sound effects for a two player (or human vs. A.I.) game.

 

For this project, Christie provided four Christie HD14K-M 14,000-lumen 3DLP® projectors and two cameras. The projectors use Christie AutoCal™ software and have Christie Twist™ software embedded right in. Christie rigged the four projectors in a 2 x 2 configuration on the warehouse ceiling. The cameras captured what was happening on the floor and sent that information to the Christie AutoCal™ software, which then automatically aligned and blended the four projectors into one giant, seamless 30-foot projection-mapped digital canvas. The Christie hardware and software, in conjunction with two of Clearpath's Jackal robots and a computer system, allowed the augmented reality experience to take place.

 

For more details and a video of the project visit http://www.clearpathrobotics.com/blog/hack-week-augmented-reality.

ROS-I 3-yrs. Montage Video

From Paul Hvass, posted on ROS-Industrial blog


Thanks to those in the ROS-I community who contributed to the ROS-I 3 yrs. Montage video! We would like to acknowledge:

  • Calibration of camera to robot: SwRI
  • Denso VS060 path planning using ROS-Industrial Cartesian Planner: TORK
  • Cartesian Planner plug-in for MoveIt!: BioRobotics Institute at Scuola Superiore Sant'Anna/MicroBio Robotics Institute at the Italian Institute of Technology/SwRI/GSoC
  • Process Simulate to ROS bridge: Siemens
  • Path planner optimization and planning request adapter plug-in for MoveIt!: IDEXX/RIC-Americas
  • Block pick and place: Technolution
  • Palletizing unknown products: Alten Mechatronics
  • Plastic crate depalletizing with lightweight robot: Intermodalics
  • Pick and place with obstacle avoidance: Deere and Co.
  • Factory-in-a-day, EU FP7 Factory of the Future 2013 Programme (FP7-2013-NMP-ICT-FoF)
  • Robotic 3D scanning: Institute Maupertuis
  • ROS-I training class pick and place exercise: RIC-Americas
  • ROS-Industrial Consortium Robotic Routing FTP, Testing at CNRC: RIC-Americas
  • ROS-Industrial Consortium Robotic Blending FTP Milestone 2 Update: RIC-Americas
  • 8-DOF microscope positioning for TEM: Alten Mechatronics
  • Multiscale teleoperation: UT Austin Nuclear and Applied Robotics Group
  • Mobile robotic 3D scanning: UT Austin Nuclear and Applied Robotics Group
  • Rob@Work3 logistics: Fraunhofer IPA
  • Euler automated warehousing: SwRI
  • PRACE dual-arm robot: Fraunhofer IPA
  • YouBots pick and place multiple arm cooperation: NIST
  • Dual arm robot coordinated motion: Fraunhofer IPA/Yaskawa Motoman Robotics
  • BMDA3 dual arm robot: Fraunhofer IPA/Yaskawa Motoman Robotics/SwRI
  • Rangar TT: Blue Force Robotics

New release of ROS build farm

A new version of the new Docker-based ROS build farm has been released.
It now covers all jobs necessary for ROS Indigo and Jade (which we plan to deploy to jenkins.ros.org in the near future).

If you are already using the new ROS build farm you should consider updating to the latest Python package "ros_buildfarm" 0.2.0.
The documentation has also been updated since the previous release to cover all new configuration options as well as the build files for doc jobs.
All relevant information is referenced from the ROS wiki page http://wiki.ros.org/buildfarm.

We will continue to run our test installation covering ROS Indigo as well as Jade until it gets deployed on jenkins.ros.org.
You can find Jenkins here: http://54.183.26.131:8080
and the apt repository as well as the generated documentation here: http://54.183.65.232/

If you are trying to deploy a custom ROS buildfarm and have any issues doing so please let us know.
You can do this either via the build farm mailing list (https://groups.google.com/forum/#!forum/ros-sig-buildfarm) or in the issues tracker of the relevant GitHub repository.
Even if you didn't run into any problems when setting up a custom build farm, please consider sending us a brief message so that we know it works for others.

Call for testing new version of RViz

From William Woodall via ros-users@

We have new versions of rviz in both Indigo (1.11.7 up from 1.11.4) and Hydro (1.10.19 up from 1.10.18) in the shadow fixed repository, and I am looking for some help testing these new versions out.

You can see a summary of the changes here:


The shadow fixed repository, for those who do not know, is the staging repository which we use for testing before making new versions public. You can think of it as a place for release candidates. See http://wiki.ros.org/ShadowRepository for information about the process and how you can try out packages from it.

If you use rviz on Indigo or Hydro and have some spare cycles I would appreciate you testing out the new versions either from shadow fixed or by building it from source. Any issues you might find, please file them on the rviz issue tracker.

I'll keep the "release candidates" in the shadow fixed repository for about a week unless we run into problems.

Thanks!

P.S. The link to the Indigo changes above may take some time to catch up; if you don't see 1.11.7, look here instead: https://github.com/ros-visualization/rviz/blob/11fcdadbbcc4c9d38a0bd4d580be6f0b49cbbc47/CHANGELOG.rst

ROS Answered [beta] Announcement

From David Lu!! via ros-users@

Like Jonathan, I too started a little hack over the holiday. 

Going back to the last ROS Metrics report, I'd been wondering how many questions on ROS Answers actually get answered. This site aggregates all the information by topic. 

The site's not perfect, but it gets the point across. I know the pages load slowly, and it turns out that YAML isn't the best database format for 8 MB of data.

Feedback and contributions welcome!

-David!!

P.S. Friendly reminder to add yourself and your institution to ROS Map (http://metrorobots.com/rosmap.html)

Art using ROS

Two recent art installations have leveraged ROS. 

In Portland, Shelley Jordan and Kurt Rohde created

(Lost) in the Woods

For more information see this review
 

In Paris, Diller Scofidio + Renfro created 

Musings on a Glass Box

For more information see the announcement from the Cartier Foundation


Below is their online video

ROS MAV SIG Call for Participation


ROS Aerial Vehicle Users,


We'd like to invite you to participate in an effort to develop a standard set of messages for communicating between robotics components on Micro Air Vehicles (MAVs). At the IROS workshop on MAVs (proceedings) this fall, it was identified that the MAV community has many different implementations of the same capabilities. They are often closely related and almost compatible, but rarely is it easy to switch between implementations, or to use different implementations together. From that discussion it was proposed to work toward a common way to communicate and enable the MAV community to collaborate most effectively.

To make this happen we have set up a mailing list and wiki pages as a place to coordinate this effort (MAV SIG, mailing list). If you are interested in this topic, we ask that you join, listen, and participate so that we can get as broad a spectrum of voices as possible.


We have chosen the ROS SIG format as it has proven effective at developing standard messages which are used by many people every day. ROS SIG's are relatively unstructured and allow adaptation for differences in each community and process.


We plan to use the ROS .msg format to formalize the messages, since it is a relatively compact way to express messages and has representations in many languages. The most important part of the process will not be the actual .msg files that come out, but the datatypes, which people can rely on being isomorphic when transitioning between systems.


Having common datatypes will allow for better modularity and interoperability. As an example from the ROS ecosystem, there are 10+ different laser scanner drivers and 18+ different camera drivers (ROS sensors). Because these drivers all use a standard set of messages, a user can switch which sensor they are using on their system, or deploy systems with different sensors, and the rest of the system will continue to operate without modification. There are more complicated examples, such as the navigation stack, which has a standard set of messages for sending commands and providing feedback. This same interface has been used for differential-drive, holonomic, free-flying, and even walking robots.
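The driver-interchangeability argument above can be sketched in a few lines of plain Python. The LaserScan dataclass below is a simplified stand-in for ROS's sensor_msgs/LaserScan, and the two "driver" functions are hypothetical, not real ROS drivers:

```python
from dataclasses import dataclass
from typing import List

# Sketch of why shared message types enable driver swapping. LaserScan is a
# simplified stand-in for sensor_msgs/LaserScan; both "drivers" are fakes.

@dataclass
class LaserScan:
    angle_min: float
    angle_max: float
    ranges: List[float]  # measured distances in meters

def fake_hokuyo_driver() -> LaserScan:
    return LaserScan(angle_min=-2.1, angle_max=2.1, ranges=[1.0, 1.2, 0.9])

def fake_sick_driver() -> LaserScan:
    return LaserScan(angle_min=-1.6, angle_max=1.6, ranges=[2.0, 2.2])

def closest_obstacle(scan: LaserScan) -> float:
    # Downstream code depends only on the message type, not on the driver.
    return min(scan.ranges)

# Either driver can feed the same consumer without modification.
for driver in (fake_hokuyo_driver, fake_sick_driver):
    print(closest_obstacle(driver()))  # -> 0.9 then 2.0
```

Swapping sensors then means swapping one producer function; every consumer of the shared type keeps working, which is exactly the property the SIG aims for in MAV messages.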


There are already dozens of MAV related ROS packages released and we hope that developing these standard messages can help coordinate the efforts of the many contributors already working on aerial vehicles in ROS.


If you would like to know more, please check out the SIG (LINK). If you're at all interested, please join the process. We've started a thread here to kick it off.


Tully Foote (OSRF), Lorenz Meier (ETHZ / CVG, PX4), Markus Achtelik (ETHZ / ASL).

Groovy Galapagos EOL Complete

As we have now released Indigo and are looking forward to Jade, it is time to retire Groovy.

Groovy was first officially released at the end of 2012, but work toward the release had started in early 2012.[1] During its life cycle, Groovy almost doubled the number of packages released, reaching a maximum of 900.

Reviewing the history of the rosdistro repository, which contains the release metadata, reveals that there were 2,912 commits from 127 contributors over the history of the Groovy release. This represents the maintainers making releases and does not count the many more contributors to the source code of the individual packages. There were commits on 612 different days over the 794 days tracked in this repository, meaning that on average Groovy packages were released more than 5 days per week. For a quick visualization of the activity on the repository, we've put together a rendering of commits to the groovy subdirectory. (These statistics only count catkin-based releases, not the 178 rosbuild packages indexed separately.)

ROS Groovy Galapagos Rosdistro Git Activity from OSRF on Vimeo.




As you may have already noticed, last week we disabled all the Groovy jobs on the farm. We have kept them there for reference but do not intend to re-enable them. Along those same lines, we can accept pull requests to keep source builds working on Groovy (such as when a repository is relocated to a new host), but cannot accept pull requests for new Groovy releases.

As always, we'd like to pay tribute to the hundreds of people who put in the time to make Groovy happen. It would not have happened without your efforts.

ROSCon 2015 Location and Date Survey

We had a great time at ROSCon 2014! (If you missed it, we've posted videos of all the presentations at http://roscon.ros.org/2014/program/.)

Although it's still a long way off, we need to look forward to when and where to hold the next instance. To help facilitate that process, we'd like the community's feedback on what times and locations would best fit their schedules. Please take a minute to let us know where you would be able to join us for our next event.


There is a place for your name and email, but it's not required. 

ROS Development Survey

From Ryan Gariepy of Clearpath Robotics via ros-users@

Clearpath Robotics, an early adopter of ROS, is working with the Open
Source Robotics Foundation (OSRF) to determine how the worldwide ROS
development community can best be supported. This may be via support
services, resources, or tools offered by the OSRF or community
members. Now is your opportunity to let us know what you need and how
Clearpath and OSRF can work together to best support you.
Please take a moment to complete this short survey:

http://fluidsurveys.com/surveys/clearpathrobotics-B/ros-development-survey-final/

Virtual machines with ROS Indigo pre-installed

From Nootrix via ros-users@

Hi there,

Just wanted to let you know that we have issued two virtual machines with ROS Indigo Igloo pre-installed: one 64-bit and the other 32-bit.
http://nootrix.com/2014/09/ros-indigo-virtual-machine/

Enjoy,
Eddy


ROS Dependency Analysis Graph

From Ben Arvey via @ros-users

Hello, my name is Ben Arvey and I've been developing a set of analysis tools for ROS under the direction of Dr. Bill Smart. Our lab is giving a talk at ROSCon concerning our research, of which this is one aspect.


I'm looking for some preliminary feedback from developers. Any information about what you need in an analysis tool would be very helpful!

Here's the web app (Chrome works best):
http://http404error.github.io/roseco/graph.html?id=ros.json

Here's a page with some basic documentation info and suggestions for feedback:

ROS Art

from the Shadow Robot Company

Over at Queen Mary University of London, they run a postgraduate course in Media and Arts Technology, and one of the students there, Ed Burton, created this innovative performance using ROS and the RoNeX.

This is still a work in progress, but here's a teaser of Ed's work, using ROS for a beautiful project:



And here is a quick peek at his RoNeX installation:



Making Juice at the High Tech Systems Fair 2014


From Jesse Scholtes, Eindhoven, the Netherlands

In December 2013, TMC and YASKAWA Benelux set out to make a technology demonstrator.

YASKAWA sees a shift in robotics from welding, handling, and painting to new application areas. Some recent developments of YASKAWA are 'milking robots' and 'slaughter robots'. And, as we are all aware, universities and research institutions are working hard on the introduction of robots in (health) care and other areas where human interaction is present and essential.

A new era of smart robotics is becoming a reality. The development of these new robots poses new technological problems that need different solutions. It raises questions: how can a robot be programmed to deal with changing environments? How can an intuitive and user-friendly interface be created? How can flexible mechanics be designed to handle different objects just like humans do? And, first and foremost, how can smarter safety systems be created so that robots can safely operate among people?

The demonstrator sets out to accomplish a number of things:

  • Use a 'higher' software environment like ROS;
  • Integrate vision to make the robot aware of its surroundings;
  • Implement a flexible gripper to make it possible to perform different tasks;
  • Use a modern user interface like a tablet or smartphone to control the robot;
  • Make the robot perform some tasks that are expected from service robotics.

After a brainstorm, we came up with the idea that the robot should pick, slice, and squeeze an orange to make fresh orange juice; in addition, it should serve the glass of juice to someone in the audience.
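The pick/slice/squeeze/serve sequence is a natural fit for a small state machine. The team's actual states and transitions were not published, so the names in this sketch are illustrative only:

```python
# Hedged sketch: the demo's task sequence as a minimal cyclic state machine.
# The real implementation is unpublished; these state names are invented.
ORDER = ["IDLE", "PICK", "SLICE", "SQUEEZE", "SERVE"]

class JuiceStateMachine:
    def __init__(self):
        self.state = "IDLE"
        self.log = []          # history of visited states

    def step(self):
        """Advance to the next task; wrap back to IDLE after serving."""
        nxt = ORDER[(ORDER.index(self.state) + 1) % len(ORDER)]
        self.state = nxt
        self.log.append(nxt)
        return nxt

sm = JuiceStateMachine()
for _ in range(4):
    sm.step()
print(sm.log)  # -> ['PICK', 'SLICE', 'SQUEEZE', 'SERVE']
```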

For 3½ months, a team of 12 people shared our Monday evenings and a lot of enthusiasm to build this orange crusher, or as we call her nowadays: (Juicy) Lucy. The team consisted of people with different backgrounds: mechatronics, robotics, embedded software, mechanical design, and electronics. Our deadline was the High Tech Systems Fair on the 7th and 8th of May.

ROS enabled us to quickly prototype and realize our demonstrator. Using ROS, we combined several pieces of hardware:

  • An Android tablet, for which we designed two different apps: a user app to choose the amount of orange juice, and an engineering app that allowed us to control the robot gripper, displayed ROS INFO messages, and provided control over the state machine.

  • A laptop and a mini PC; the mini PC was used to perform the image processing.

  • A webcam, whose images were used to dynamically extract the X, Y coordinates of the oranges.

  • An Arduino board, used to control the gripper, which was equipped with 4 stepper motors, 4 end switches, and a sonar sensor (the sonar was used to measure the height of the orange).
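As a rough illustration of the webcam step, here is a hedged sketch of extracting an orange's (x, y) position by color thresholding and taking the centroid of matching pixels. The team's actual vision pipeline was not published, and the threshold values here are invented:

```python
# Hedged sketch: locate an orange in an RGB frame by color threshold + centroid.
# Threshold values are illustrative, not from the YASKAWA/TMC demo.
def is_orange(r, g, b):
    return r > 180 and 60 < g < 160 and b < 80

def orange_centroid(image):
    """image: rows of (r, g, b) tuples; returns (x, y) in pixels, or None."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if is_orange(r, g, b):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A 3x3 test frame with an orange blob in the right column:
frame = [
    [(0, 0, 0), (0, 0, 0), (230, 120, 30)],
    [(0, 0, 0), (0, 0, 0), (240, 130, 40)],
    [(0, 0, 0), (0, 0, 0), (0, 0, 0)],
]
print(orange_centroid(frame))  # -> (2.0, 0.5)
```

In practice this kind of thresholding would be done in HSV space with OpenCV, but the centroid idea is the same.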

An overview of the system can be seen here.

yasaka_system_overview.png

In the end, we managed to deliver the first version of our demonstrator, which serves as a platform for future enhancements and added complexity. The fruits of our labor can be viewed in the following video.

Making Juice at the High Tech Systems Fair 2014 from YASK_dem on Vimeo.

At this point, the HTS fair is the only fair where we have presented our demonstrator. At the moment, we are brainstorming on a follow-up to the project.

Indigo Development Update

We're closing in quickly on the Indigo release. Today our last package was released to fill out ros desktop and desktop-full. There are a few packages which need to be fixed to make the release ready for testing. In preparation for the upcoming release, we already have 430 packages released and building on the buildfarm. We expect many more to be released by the final release, planned for later this month.

Also, as a reminder, coming up in June are ICRA and ROSKong. If you are presenting a paper in which you used ROS and think other ROS users would be interested, we would like to feature it on the ROS News blog; please email us at ros-news@googlegroups.com.

And one last reminder: the Indigo Igloo T-shirt is available for only 19 more hours. We've had a great response at the end of the campaign and want to make sure that you don't miss your opportunity.

NooTriX posted Hydro VM Image

From Nootrix via ros-users:

We finally managed to make a virtual machine with Hydro. As usual, it's freely available for download at:
http://nootrix.com/downloads/#RosVM

Best,
NooTriX Team

ROS user survey: the results are in


The results are in from the January 2014 ROS user survey. Thanks to everyone who participated!

We had a total of 336 responses. We'll walk through the questions, one at a time:

In general, for what do you use ROS?

for-what-do-you-use-ros.png

Not surprisingly, the lion's share of ROS users consider themselves to be doing research. That's where we started, and we expect to continue to see high participation in the research community. But we also see about 1/3 of respondents classifying themselves in education and 1/3 in product development, with a smaller share of self-identified hobbyists. Those are all areas for future growth in ROS usage.

What about ROS convinced you to use it?

what-about-ros.png

Interestingly, the top response here is the communications system. When we set out to build ROS, we started with the communications system, because we believe that robotics problems are most naturally solved by developing distributed systems, and further that developing those systems is hard, requiring solid, easy to use tools. It looks like our users appreciate the effort that's been put into ROS middleware.

Also near the top are what we can call the "healthy open source project" benefits: friendly licensing, helpful community, and playing nicely with related open source projects.

How do you primarily use ROS?

how-use-ros.png

Most users are working with a single robot, but a substantial number of people are working with multiple robots, which was outside the initial design of ROS. Multi-robot support definitely needs improvement, but clearly people are already getting something out of ROS in multi-robot environments.

With what type(s) of hardware do you use ROS?

which-hardware.png

At least in part because most robots in the world (or at least in research labs) are basically cameras and/or lasers on wheels, we see most of our users working on those platforms. But we also see a fair number of people working with arms and hands, and we expect that the number of legged systems will grow in the future.

Have you shared and/or released your own ROS packages?

released-pkgs.png

Here we see a familiar pattern in open source development: most users don't share their code with the community. That's OK with us, because we know that not everybody is in a position to share their code (for example, commercial users who are building ROS-based products). But if you can share code, please do!

Which ROS packages are most important to you?

which-ros-pkgs.png

Here, we have some clear winners. Visualization is important: rviz is a critical piece of infrastructure in our community, and the rqt library of visualization components is also heavily used. Also highly ranked are planning libraries (navigation and MoveIt!), perception libraries (PCL and OpenCV), coordinate transform management (tf), and simulation (Gazebo). Interestingly, we see the OpenNI driver in the top ten, perhaps reflecting the long-standing connection between ROS and Kinect-like devices, dating back to the ROS 3D Contest.

Where should future ROS development focus?

future-development.png

Less clarity here; basically we should do more of everything.

What is your top priority for future ROS development?

The free-form answers we received in response to this question are challenging to quantify. At a high-level, here's a qualitative distillation of common themes, in no particular order:

  • more / better documentation
  • more / better / more up-to-date tutorials
  • improved usability
  • greater stability, less frequent releases
  • better multi-master / multi-robot support
  • consolidation of related parts into coherent wholes
  • better / more mature middleware
  • better / more attentive maintenance of core libraries and tools
  • add features and fix bugs in rqt
  • get to "production quality"
  • IDE support
  • real time support

Would you be willing to anonymously report usage statistics?

system-monitor-1.png

About 1/2 of respondents are willing to install a plugin to roscore that would track and anonymously report usage statistics, which would let us automatically collect data on which packages, nodes, launch files, etc. are most heavily used. Any volunteers to write that plugin?

Mathworks Releases TurtleBot Matlab API and Demo

turtlebot_matlab_control.png

Following the release of their ROS support, The MathWorks has put together a demonstration of how to use the interface to control a TurtleBot in simulation or on a real robot.

For more information see the submission on the Matlab Central File Exchange: http://www.mathworks.com/matlabcentral/fileexchange/44853-use-matlab-ros-io-package-to-interact-with-the-turtlebot-simulator-in-gazebo

From Clearpath Robotics:

Thalmic Labs and Clearpath Robotics have joined forces to prove gesture controlled robots are possible. Thalmic Labs, developers of Myo Gesture Control, released the Myo alpha developer unit to Clearpath Robotics for testing. Clearpath has successfully integrated the Myo armband with their Husky Unmanned Ground Vehicle to start, stop and drive the vehicle using simple arm movements.

 

"There are a lot of interesting applications for using the Myo for robot control and our team is very excited to have the opportunity to work with the Alpha dev unit," said Ryan Gariepy, Chief Technical Officer at Clearpath Robotics. "We've been eagerly following Thalmic's progress and we've got a dozen different robots here we could do some more tests with."

 

Clearpath Robotics used the Robot Operating System (ROS) for most of the integration work. The Husky software package exposes a standard Twist interface, so the team was required to convert the Myo data into that format to create compatibility. The team did so by using their experimental cross-platform serialization server in socket mode.

 

For Myo integration and development, Clearpath Robotics added standard Windows Socket code into the provided Thalmic example code, and then determined the proper mapping from the Myo data to the desired robot velocity using timeouts and velocity limits. Further details on Myo integration cannot be released at this time.
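The mapping described above (Myo data to a Twist-style velocity command, with velocity limits and timeouts) might look roughly like this. Clearpath's actual mapping is unpublished, so the scale factors, the pitch/roll convention, and the limits here are all assumptions:

```python
# Hedged sketch: gesture/orientation data -> Twist-style velocity command,
# with velocity limits and a deadman timeout. Constants are invented.
import time

MAX_LIN, MAX_ANG = 1.0, 2.0    # velocity limits [m/s, rad/s]
TIMEOUT = 0.5                  # stop if no fresh Myo data for this long [s]

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def myo_to_twist(pitch, roll, last_update, now):
    """pitch/roll in radians; returns (linear_x, angular_z)."""
    if now - last_update > TIMEOUT:        # stale data -> stop the robot
        return 0.0, 0.0
    linear = clamp(-pitch * 1.5, -MAX_LIN, MAX_LIN)    # arm down = forward
    angular = clamp(-roll * 2.0, -MAX_ANG, MAX_ANG)    # arm roll = turn
    return linear, angular

now = time.time()
print(myo_to_twist(-0.5, 0.25, now, now))        # -> (0.75, -0.5)
print(myo_to_twist(-0.5, 0.25, now - 1.0, now))  # -> (0.0, 0.0), stale data
```

The resulting pair would be published as the `linear.x` and `angular.z` fields of a geometry_msgs/Twist on the Husky's command topic.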

Gazebo feature survey

From Nate Koenig via ros-users@

Hello all,

We understand that simulation is an important component within ROS. We are working on Gazebo's road map for the rest of this year, and could use your input.

Below is a link to a survey that lets you rank new simulation features. Please take a few moments to complete the survey, and help us guide the direction of Gazebo for 2014.


Cheers,
-nate

ROS Usage Survey

Please help us by taking a couple of minutes to fill out this ROS usage survey:


It's just a handful of questions and can be completed very quickly.

We're doing this survey to help us understand how people are using ROS
and which parts of ROS are most valuable to the community. The survey
results, which we'll share with everybody, will help us to prioritize
our development efforts.

ROS Development Update

I wanted to highlight some recent changes that I hope people will find useful. Following on from the recent new website for ROS, http://www.ros.org/ (which I highly recommend you check out if you haven't already), we've been doing some more housekeeping to make things easier to use across our various websites.

We have updated the CSS styles for the wiki, improving the look and feel of things like buttons and font spacing. We have also added some new information to package pages; for example, on http://wiki.ros.org/roscpp_tutorials you will notice the new badges for "Released", "Continuous Integration", and "Documented". These should give users more information at a glance for packages which are documented on the wiki. Additionally, there is now a "Jenkins Jobs" link in the "Package Links" box on the right-hand side. If you click it, it will expand to list all of the build farm jobs related to the package and their status. We will add more information to the package pages as we can; suggestions and pull requests are welcome.

We have also just launched the http://status.ros.org site. This site gives you an overview of the status of our services as well as some realtime metrics. This site is hosted externally, so we can communicate outages and progress on repairs even when our other infrastructure is down. We would encourage you to follow @rosorg or add our RSS feed/signup for email notifications on the http://status.ros.org site directly.

We're also actively working on preparing for ROS Indigo Igloo. A major part of this preparation has been preparing for Python 3. If you'd like more information on that, there's a thread on ros-release@code.ros.org (http://lists.ros.org/lurker/message/20131231.003813.311b5072.en.html), and we have recently updated REP 3 for Indigo: http://www.ros.org/reps/rep-0003.html#indigo-igloo-may-2014 Also, in preparation for turning on the Indigo buildfarm, we have removed all Fuerte jobs from the farm. Fuerte packages will continue to be available; however, it will not be possible to build new ones.

While I'm on the topic, I'd like to encourage all maintainers to join the ros-release mailing list to stay up to date on release-specific information and discussions.

I'd also like to encourage everyone to send announcements and updates on their projects to the ros-users mailing list, or submit them to ros-news@googlegroups.com for posting on the ROS Blog. And if you're blogging about ROS-related content, consider submitting your blog to http://planet.ros.org/, where you can get an RSS feed of ROS-related activity. One of the strengths of ROS is its large user community; sharing project updates and announcements is a great way to contribute to it.

2013 was ROS's strongest year yet, with more and more people releasing packages against both Groovy and Hydro. The packages available for these distros have grown to more than 750 and 850 respectively.

Thank you to everyone who's contributed already. To everyone else, I encourage you to start by making a small contribution, such as answering a question on http://answers.ros.org or updating or extending a wiki page.

Happy New Year!

MoveIt! Survey Results

From Sachin Chitta on ros-users@

Thank you for your responses to the MoveIt! survey. We had a fantastic response with 105 total respondents by the deadline. There are 65 different robots on which MoveIt! is being used now according to the survey (listed below), with multiple instances of the most popular robots using MoveIt!

A compiled summary of the MoveIt! survey results is available online: http://moveit.ros.org/data/surveys/MoveIt!-2013-Survey.pdf (also available on the MoveIt! wiki). 

Best Regards,
Sachin Chitta


Robots Using MoveIt!

Compiled list of robots running MoveIt! based on survey responses (please point out any duplicates). 
The list is in alphabetical order and figures in brackets indicate the number of respondents 
who reported using MoveIt! with that particular robot.

Total Number of different robots: 65

ABB IRB2400
ABB IRB6640
Aldebaran Nao (2)
Aldebaran Romeo
Arbotix PhantomX Pincher
Barrett WAM
Boston Dynamics Atlas (7)
BioRob Arm
Cerberus
CKBot
ClamArm
CloPeMa Robot
Comau NM45
Cyton Veta
Demining robot
Denso robot (vs060)
DIY Mobile Manipulator
DLR-HIT Hand
Dr. Robot
Fanuc m10ia
Fraunhofer Care-O-bot
Fraunhofer Rob@Work
HDT arm with Base for RCTA project Rescuer 
Hiro (Nextage)
Hoap3
HRP-4 (simulation) (3)
HRP2
Hubo
iarm ABB
iCub
IRB2400
Kawada Hiro
Kinova Jaco (3)
Korus Homemate robot
KUKA LBR (3)
Kuka Leightweight Arm (7)
KUKA LWR4
KUKA OmniRob
KUKA youBot (2)
Lego NXT
Lyncmotion servo erector set
Meka M3 Robot (2) 
Motoman SIA10d (2) 
Motoman SIA20 (2) 
Motoman SIA5 
Neuronics Katana (2) 
PAL Robotics REEM (2) 
PAL Robotics REEM-C 
Pi Robot
Pioneer P3AT
Pisa Velvet Gripper
Willow Garage PR2 (16) 
Rethink Robotics Baxter (8) 
Robonaut
Robonaut2
Schunk 7DOF
Schunk Dextreous Hand 
Schunk LWA (3)
Schunk Powerball
Shadow Robot Arm and Hand 
Summit XL-Terabot
TUM Rosie
Universal Robots UR10 (2) 
Universal robot UR5 (7) 
X-WAM

A new www.ros.org


When we started work on ROS, like most young open source projects, our greatest need was to recruit early adopters and fellow developers. So we targeted that audience: we built a wiki, filled it with documentation, tutorials, and code examples, and made the wiki the landing page at www.ros.org.

Well, times have changed. Now, six years into the project, we have a broader audience to consider. We want to reach teachers who are considering using ROS in their classrooms, managers who want to use ROS in a new product, journalists who are writing stories about ROS, and many, many others.

So, in celebration (just a bit late) of ROS's sixth birthday, we're pleased to present a new www.ros.org.

ros-org-screenshot1.jpg

ros-org-screenshot2.jpg

After all, a grown-up ROS deserves a grown-up website. Don't worry: the wiki is still there, as are all the other ROS sites on which we depend.

Btw, like most things we do, the website itself is at GitHub. If you run into a problem or have an idea for improving the site, open an issue and we'll have a look.

MoveIt Feedback Survey Posted

From Sachin Chitta of SRI:

We are polling the community to get technical feedback about MoveIt!. It should take less than 5 minutes to fill out the survey. The form will stay open until Friday, Nov 1, 2013, 11:59 PM PST.
A compiled summary of the survey results will be made available to the community through the MoveIt! wiki (moveit.ros.org). We appreciate any feedback you can provide to help us improve MoveIt!

ROS Hydromedusa Tshirt Campaign Successful


We have passed the 150 threshold for our Teespring campaign, and the Hydromedusa t-shirts will be ordered. If you haven't ordered yours yet, you can still do so for 13 more days before the campaign ends.

ROS Industrial releases a 1 Year Montage Video

ROS Industrial has been going for a full year now. Here is a compilation of ROS-Industrial application videos from the first year of the ROS-I repository. See http://ROSindustrial.org and http://consortium.ROSindustrial.org for more info:

10,000 Questions Asked on ROS Answers


This weekend, @Amal made history by asking the 10,000th question on answers.ros.org.

For posterity, here's a screenshot from Saturday night: 10001questions.png

The success of answers.ros.org is thanks to its many contributors. Answers.ros.org has been running for a little bit over 2 years now and in that time, the community has answered 7283 questions, 73% of the questions asked. That's an average of 10 questions per day for the last two years (including weekends and holidays). Traffic has steadily grown, and recently, users have posted closer to 30 questions per day.

There are now 4399 registered users, 388 of whom have earned over 100 Karma, and 60 of whom have amassed 1000 Karma!

@lorenz @tfoote @dornhege and @joq deserve special recognition as each of them has earned over 10,000 Karma. Accumulating a Karma stash of this size requires such actions as their answers being upvoted one thousand times.

Congratulations to the answers.ros.org community for making the site the thriving resource that it is today. Keep up the fantastic work, and keep the questions--and answers--coming.

Robotics Internships at Bosch


Announcement by Benjamin Pitzer (Bosch) to ros-users

Dear ROS Users,

The Bosch Research and Technology Center in Palo Alto, CA is looking for highly motivated robotics researchers and developers interested in contributing to ROS and being part of the PR2 Beta Program as part of our internship program.

We currently have the following openings:

Robotics Research Intern
Details: http://www.bosch.us/content/language1/html/9938.htm

Robotics Software Engineering Intern
Details: http://www.bosch.us/content/language1/html/10002.htm

Robotics Hardware Development Intern
Details: http://www.bosch.us/content/language1/html/10009.htm

Please use the "Email Resume" link on the job description page to apply.

Best regards / Mit freundlichen Grüßen,

Benjamin Pitzer

Robert Bosch LLC

Taylor Veltrop's announcement to ros-users

Hi Everyone,

A year ago I released my first video of Kinect robotics when I loosely controlled a KHR (mini humanoid).

Now I have "completed" the robot avatar project. A treadmill, HMD, Wii remotes, Kinect, and NAO have all been integrated together using ROS to create a fully immersive experience. I really feel like my "self" is in the place of the robot while using this.

Here is a video demonstrating it, I use the interface to brush my cat remotely.

Actually it looks like the project is not really complete after all... Something I realized when filming this is that I need to add 2-way audio...

Hope you enjoy the video! Happy new year!

Taylor

Jingle Bells from Uni Freiburg


HumanoidsFreiburg:

Our NAO humanoid plays Jingle Bells for Christmas on a glockenspiel / xylophone. The robot can read a single-track song derived from MIDI and plays it on the instrument. Implementation by Stefan Band and Jonas Delleske.
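Reading a single-track song derived from MIDI comes down to mapping MIDI note numbers onto the instrument's bars. The Freiburg implementation itself is not published; this sketch only shows the standard MIDI note-number convention (note 69 is A4 at 440 Hz):

```python
# Sketch: MIDI note number -> pitch name, as a glockenspiel player would need
# to match notes to bars. This is the standard MIDI convention, not the
# Freiburg team's actual code.
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_name(note):
    octave = note // 12 - 1       # MIDI numbering: note 60 is C4 (middle C)
    return f"{NAMES[note % 12]}{octave}"

jingle_bells = [64, 64, 64, 64, 64, 64, 64, 67, 60, 62, 64]  # opening phrase
print([midi_to_name(n) for n in jingle_bells])
# -> ['E4', 'E4', 'E4', 'E4', 'E4', 'E4', 'E4', 'G4', 'C4', 'D4', 'E4']
```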

Merry Christmas from the Humanoid Robots Lab at the University of Freiburg!

Robotic Open Platform


roboticopenplatform.org:

Robotic Open Platform (ROP) aims to make hardware designs of robots available under an Open Hardware license to the entire robotics community. It provides CAD drawings, electrical schematics, and the documentation required to build your own robot. In the near future, standard electromechanical interfaces between the various robot components will be presented, making it possible to combine hardware components from various groups into one robot. By making the robots modular, users are encouraged to develop their own components that can be shared with the community.

In software, the Robot Operating System (ROS) is nowadays acknowledged as a standard software platform and is used by numerous (research) institutions. This open source software is available to everyone, and by sharing knowledge with the community there is no need to 'reinvent the wheel', drastically speeding up development. Similarly, Robotic Open Platform (ROP) functions as a platform to make hardware designs available to all research groups within the community.

Announcement by Patrick Goebel of Pi Robot to ros-users

Hello,

I have posted a couple of new ROS tutorials for those getting started with either speech recognition or controlling a pan & tilt head:

Please let me know if you run into any bugs.

--patrick

Qbo Robot Visits Willow Garage


Qbo_PR2_IMG_0797.JPG

Thecorpora has made a great commitment to open source with Qbo, a robot that brings great robotics technology to everyday consumers. To work on integration ideas, Qbo and his friend, Francisco Paz, stopped by Willow Garage to meet the team and Willow Garage robots, such as the PR2. You can check out photos of Qbo hanging out with PR2 and other robots at Thecorpora's blog.

Also on the way from Thecorpora is a new Android phone application for Qbo that provides telepresence; hear, see, and communicate with Qbo as if you were in the same room. Imagine getting Qbo to go where you want or direct it using Google's speech recognition software. This new app makes Qbo a telepresence in any room. For details, see their blog post and video.

Related:

rosinstall 0.5.17 now available


Version 0.5.17 of rosinstall has been released. You can update using the commands below. This update contains the new experimental rosws tool, updated --distro and --dev options for the roslocate tool, and numerous bug fixes. Please try it out and provide feedback on the new rosws tool and new roslocate with distro specific options.

Update commands

sudo pip install -U rosinstall

or

sudo easy_install -U rosinstall

PowerBoard_splash1-690x282.png

The good folks at turtlebot.eu have released EU-compatible designs for the TurtleBot powerboard as well as metric versions of the TurtleBot trays. They've also adapted the design for consumer Roombas for those that cannot purchase a Create in Europe.

For more information and to download the designs, please see the turtlebot.eu post

Cloud computing with Qbo


Thecorpora's Qbo showed off some cloud skills at the Campus Party in Valencia: the Qbo there was able to learn to recognize Tux, the Linux penguin, using a cloud-based object recognition system. Cloud-based recognition systems enable robots to seamlessly access, and collaboratively update, knowledge about the world. During the live demo in Valencia, an engineer in Madrid taught the image of Tux to the system, which was then accessed by the Qbo in Valencia. For more information on this demo and Qbo, you can check out the Qbo blog.

IHR TurtleBot

I Heart Robotics/Engineering has been cranking out TurtleBot accessories as well as some DIY instructions so that you can get the most out of your TurtleBot hardware -- whether it be new capabilities or a little bit of flair.

Only a few days left to get your 15% discount from Clearpath Robotics for your own TurtleBot.

TurtleBot.com has launched! This new site provides access to TurtleBot information and also gives you new ways to access TurtleBot hardware. You can now order parts or assembled kits from several licensed vendors, or take advantage of the open-source hardware designs to build your own robot from scratch.

For more information, you can check out the announcement on WillowGarage.com or head over to the new TurtleBot.com site.

Congrats to the GRASP Lab's PhillieBot for throwing out the first pitch at a Phillies game! PhillieBot is the creation of Professor Vijay Kumar, Jordan Brindza, Jamie Gewirtz, and Christian Moore. It features a Barrett arm on a Segway base, and it runs ROS. They made several modifications to the Barrett arm to get it up to pitching speeds, though the Phillies requested that they limit the pitch to a mild 30-40 mph.

From awesome quadrotors to modular robots to the PR2 Beta Program, the GRASP Lab is doing some impressive work with ROS.

More information:

Congrats to the NimbRo@Home team (University of Bonn) on their victory at the RoboCup German Open. During the competition, their Cosero and Dynamaid robots worked together to prepare breakfast. They demonstrated many difficult mobile manipulation tasks, like opening a refrigerator and retrieving orange juice from it, pouring milk into a cereal bowl, fetching a spoon, and recognizing a pointing gesture. They were also able to deal with unknown environments.

The competition was a great demonstration of ROS software used to solve difficult challenges. ROS, PCL, and OpenRAVE were popular components in the competition -- five out of the eight robots used ROS-related software. The Nimbro@Home robots use ROS for communication as part of their four-layer modular control architecture, which is described in their 2011 paper.

The CCNY Robotics Lab was the first to bring us Kinect drivers for ROS, so it's not surprising that they have been working on some awesome Kinect demos.

In the above video, they show some of the latest results of their 6D pose estimation. Simply by moving the Kinect around an office, they are able to register multiple scans together and create a 3D model of the scene. Their code works with no extra sensors: they simply move the Kinect around freehand.

The work was done by Ivan Dryanovski, Bill Morris, Ravi Kaushik, and Dr. Jizhong Xiao. They are using custom RGB-D feature descriptors for the scan registration, with OpenCV, PCL, and ROS under the hood. They are working on releasing and documenting their code. In the meantime, you can check out the rest of the cool software available in ccny-ros-pkg.

SLAM with Kinect on a Quadrotor

| No Comments | No TrackBacks

MIT's Robust Robotics Group, University of Washington, and Intel Labs Seattle teamed up to produce this demonstration of 3D map construction with a Kinect on a Quadrotor. Their demonstration combines onboard visual odometry for local control and offboard SLAM for map reconstruction. The visual odometry enables the quadrotor to navigate indoors where GPS is not available. SLAM is implemented using RGBD-SLAM.

More information

3D visual SLAM with mobile robots


A set of enterprising University of Waterloo undergrads have combined mobile robotics and 3D visual SLAM to produce 3D color maps. They mounted a Kinect 3D sensor on a Clearpath Husky A200 and used it to map cluttered industrial and office environments. The video shows off the impressive progress and capabilities of their "iC2020" module.

The iC2020 module was created by Sean Anderson, Kirk Mactavish, Daryl Tiong, and Aditya Sharma as part of their fourth-year design project at the University of Waterloo. They formed their group with the goal of using PrimeSense technology to create globally consistent, dense 3D color maps.

Under the hood they use ROS, OpenCV, GPUSURF, and TORO to tackle the various challenges of motion estimation, mapping, and loop closure in noisy environments. Their software allows real-time viewing of the 3D environment as it is built. ROS is supported out-of-the-box on the Clearpath Husky, and Sean Anderson noted that "ROS was crucial to the project's success" due to its ease of use and flexibility.

Their source code is available under a Creative Commons-NC-SA license at the ic2020 project on Google Code.

Implementation details:

  • Optical Flow using Shi-Tomasi Corners
  • Visual Odometry using Shi-Tomasi and GPU SURF
    • Features undergo RANSAC to find inliers (in green)
    • Least Squares is used across all inliers to solve for rotation and translation
  • Loop closure detection using a dynamic feature library
  • Global Network Optimization for loop closure

More information: iC 20/20
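For the curious, the "Least Squares across all inliers" step in the list above has a neat closed-form solution via SVD (the Kabsch algorithm). Here's a minimal numpy sketch of the idea -- a generic illustration, not the iC2020 code itself; the function name and toy data are made up:

```python
import numpy as np

def align_inliers(src, dst):
    """Least-squares rigid alignment (Kabsch): find R, t minimizing
    ||R @ src_i + t - dst_i|| over matched inlier points.
    src, dst: (N, 3) arrays of corresponding 3D points."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Toy check: recover a known 90-degree yaw plus a translation.
rng = np.random.default_rng(0)
pts = rng.random((20, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
R_est, t_est = align_inliers(pts, pts @ R_true.T + t_true)
```

In practice the RANSAC step matters as much as the alignment: least squares is only reliable once the outlier matches have been rejected.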

Hands-free vacuuming by OTL


OTL has been a frequent contributor of great Roomba hacks, and this one is no exception. This time he's used a Kinect and a Roomba Bluetooth connector to take back control of the vacuum. You can find out more in his blog post (Japanese). His blog is a great Japanese-language resource for getting into ROS.

See also:

Buttons Redux


Get your Axel F on with this redux of Garratt Gallagher's prize-winning Customizable Buttons.

He also has the PR2 moving with the Kinect. We can only hope that he combines these two videos together...

The Chair of Automation Technology at Chemnitz University of Technology shows just how versatile a Kinect on a quadrotor can be. Their entry, "Autonomous corridor flight of a UAV using the Kinect sensor", uses the Kinect to find the ceiling, walls, and floor of a corridor. Once the quadrotor knows the geometric structure of the corridor, it can happily fly down the middle to get where it needs to go.

Their demo is built on an AscTec Pelican with a stripped-down Kinect. To handle the rest of the autonomous flight needs, they use an ADNS 3080 optical flow sensor for position and velocity control, and an SRF10 sonar sensor for altitude control. Sample-consensus algorithms from PCL are used to convert the 3D point cloud data into the estimated positions of these surfaces. Remarkably, they managed to make all of this run on an Atom processor.
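If you're wondering what a sample-consensus surface fit actually does, here's a toy numpy sketch of RANSAC plane fitting -- the idea behind PCL's sample-consensus models, not the actual PCL or Chemnitz code; the function name and synthetic data are made up:

```python
import numpy as np

def ransac_plane(pts, iters=200, thresh=0.02, seed=42):
    """Toy RANSAC plane fit: repeatedly fit a plane to 3 random points and
    keep the candidate with the most inliers (points within `thresh`)."""
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = None, None
    for _ in range(iters):
        a, b, c = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(b - a, c - a)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                               # degenerate (collinear) sample
        n = n / norm
        inliers = np.abs((pts - a) @ n) < thresh   # point-to-plane distance test
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, a)
    return best_plane, best_inliers

# Synthetic "floor" (z near 0) plus a handful of off-plane outliers.
rng = np.random.default_rng(0)
floor = np.column_stack([rng.random((100, 2)), rng.normal(0.0, 0.005, 100)])
outliers = rng.random((10, 3)) + 1.0
(normal, _), inliers = ransac_plane(np.vstack([floor, outliers]))
print(inliers.sum())
```

Running the same kind of fit for the floor, ceiling, and each wall is what turns a raw corridor point cloud into a handful of planes the controller can steer between.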

Chris Burbridge and Lorenzo Riano from the University of Ulster Intelligent Systems Research Centre used the Kinect to turn their robot into a mobile 3D person scanner. A Kinect is great for collecting 3D data, but sticking it on wheels is even better because you can collect data from multiple points of view and construct full 3D models.

Their demo uses the Kinect at both the skeleton tracking and 3D point cloud levels. The OpenNI skeleton tracker is used to identify the position of the person in the room, and then the 3D point cloud data is used to build the full 3D scan. Once all of the point clouds are collected, they use PCL to create a unified 3D model.

The UU robot is a custom MetraLabs Scitos G5 mobile robot with a Kinect mounted at the end of a Schunk 7 DOF manipulator, but their code should be adaptable to other robot platforms.

ROS on FreeBSD Presentation

Post by René Ladan to the ros-users list

Hi,

last December I gave a talk about using ROS on FreeBSD.  The sheets are
available at ftp://rene-ladan.nl/pub/ros-freebsd.pdf .  Note that the
USB problem mentioned on sheet 12 is fixed for FreeBSD 8 and newer :)

Regards,
René

ROS 3D Entries: Teleop Kinect Cleanup


"Teleop Kinect Cleanup", the ROS 3D Contest entry by Zoltan-Csaba Marton and Dejan Pangercic of TUM, is a couple of demos rolled into one. Using their entry, you can point at an object on a table and then, in the virtual rviz display, move that object somewhere else like a Jedi. You start with a world that looks like your own, but by the time you're done, you've rearranged a new virtual world to your liking.

That's not all. They've also figured out how to make this useful for giving commands to a robot. After you move around a cup in your virtual world to your liking, a command to move the cup can be passed to a robot. Thus, once you've re-arranged your virtual world, it becomes the job of the robot to make the real world look like your virtual world.

If you want to see their robots in action, you can check out this video of TUM's Rosie and PR2 making pancakes together.

Michael Ferguson is a prolific contributor to ROS. His entry into the ROS 3D Contest is "Improved AR Markers for Topological Navigation". AR markers are a cheap and effective way to find the position of objects in an image using inexpensive cameras. Michael recognized the opportunity to combine these markers with the Kinect, which has both camera and depth data, to turn them into markers in three dimensions. You can even use this to find the position of the robot by attaching the markers to known locations in your map.
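To give a rough idea of how depth lifts a 2D marker detection into 3D: once you know the marker's pixel coordinates and its depth, pinhole back-projection gives you the 3D point. A minimal sketch -- the function and the intrinsics below are hypothetical stand-ins, not Michael's actual code:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Lift a pixel (u, v) with a depth reading (meters) into a 3D point
    in the camera frame, using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics in the ballpark of a Kinect RGB camera,
# with a marker detected at pixel (400, 300), 1.5 m away:
x, y, z = backproject(400, 300, 1.5, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(x, y, z)
```

With the marker's full 3D pose known, localizing the robot against markers at known map positions becomes a straightforward transform lookup.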

We encourage you to check out the many different robots that Michael is building, from the iRobot Create and Dynamixel AX-12-based Nelson to the up-and-coming Create + Kinect + tripod Trike. The software for the contest entry along with these robots can be found in albany-ros-pkg, which also contains a Neato XV-11 driver for ROS.

Photo essay on Penn's GRASP Lab



Evan Mann did a nice photo essay of Penn's GRASP Lab. Enjoy.

Photo Essay: GRASP Lab

For extra fun, don't miss the construction quadrotors on Colbert's Threatdown.

Thanks Ben Cohen!

Patrick Bouffard's "Quadrotor Altitude Control and Obstacle Avoidance" was featured back in December, when he first made waves on the Internet by mounting a Kinect to a quadrotor and flying it around his lab. The Kinect was used to detect the altitude as well as avoid obstacles.

Patrick has updated his video for the ROS 3D contest. He has also released starmac-ros-pkg, which contains the software used in his Berkeley lab to get these quadrotors in the air. starmac-ros-pkg includes ROS drivers for Vicon motion capture systems as well as an abstraction of the AscTec autopilot driver. It's a great complement to ccny-ros-pkg, which provides AscTec quadrotor drivers, computer vision libraries, and other tools.

ROS 3D Entries: Anaglyph Viewer


Colin Lea's Anaglyph Viewer entry into the ROS 3D Contest brings a bit of 3D retro to our entries. Colored glasses are an inexpensive way of seeing 3D content on a 2D screen, and viewing the Kinect's data in 3D makes it far more immersive. For example, you can build more effective teleoperation cockpits that take advantage of your ability to see depth. Add more Kinect cameras and you can start becoming fully immersed in a 3D world.

PS: Our skateboarding turtle says thanks!

ROS 3D Entries: Kinemmings


The Kinemmings entry by Alberto Jose Ramirez Valadez, Jonathan Rafael Patino Lopez, and Marcel Stockli Contreras is a take on the classic Lemmings game. Now, it's up to you and your body to guide the Kinemmings safely to their exit.

Kinemmings has the distinction of being the only game entry into the ROS 3D contest. In fact, as far as we know, it may be the first game package in all of ROS. We appreciate it as it means we can now tell our boss that we're "working on ROS".

You have your Kinect and want to mount it on your robot, but now you're faced with a challenge: you need to precisely determine the mounting point of the Kinect so that the data from it can be interpreted correctly, e.g. if you want to use it for autonomous navigation.

The "Automatic Calibration of Extrinsic Parameters" entry from François Pomerleau, Francis Colas and Stéphane Magnenat of the Autonomous Systems Lab at ETHZ makes solving this problem easy for users and does much more. If you run their software with the Kinect mounted, it will output the tf transform between your base_link and the camera, making configuration easy.

They also released several lower-level libraries to help build other applications on top: libnabo for fast k-nearest-neighbor searches, and libpointmatcher, a modular ICP library. These are important for building tracking applications, as shown in the video, as well as SLAM and other systems.
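To make the role of a k-nearest-neighbor library concrete, here's the query it answers, sketched as brute force in numpy -- libnabo's contribution is computing the same answer much faster with kd-trees; the function name and example cloud here are made up:

```python
import numpy as np

def knn_brute(points, query, k):
    """Brute-force k-nearest-neighbor search: return the indices of the k
    points closest to `query`. A kd-tree library like libnabo returns the
    same answer far faster on large point clouds."""
    d2 = ((points - query) ** 2).sum(axis=1)   # squared Euclidean distances
    return np.argsort(d2)[:k]

cloud = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 2.0, 0.0],
                  [5.0, 5.0, 5.0]])
print(knn_brute(cloud, np.array([0.9, 0.1, 0.0]), k=2))  # → [1 0]
```

ICP spends most of its time on exactly this query -- matching each point to its nearest neighbor in the other cloud -- which is why a fast KNN library underpins a fast ICP library.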

Patrick Goebel is the creator of Pi Robot, a custom-built, Robotis-based hobby robot. Patrick has been a frequent contributor to the Trossen and ROS communities, including writing a detailed essay for hobbyists getting into ROS.

His entry for the ROS 3D Contest builds on Taylor Veltrop's teleop control, adapting it for the Pi Robot and adding a base controller and the ability to define new control gestures. Patrick has also contributed a serializer package for those wishing to use the Robotis Serializer microcontroller with ROS. Pi Robot may be one of a kind, but, thanks to Patrick's contributions, you have the software you need to build your own.

Patrick will be giving the featured presentation at tonight's Homebrew Robotics Club meeting.

Taylor Veltrop had the first ROS 3D Contest entry with his teleoperation control of a humanoid KHR/Roboard robot. He wasn't content to leave it at that: he beefed up his teleoperation system with Wiimote and leg-based control. He is also running it on an Aldebaran Nao.

One of the difficulties in using the skeleton tracking libraries with the Kinect is that you do not get much information about the hands of the operator. For those trying to use skeleton tracking to control a robot's arms, this creates a pickup problem: you can get the arm to the location where you wish to grab an item, but you don't have the control you need over the angle of the hand and the opening and closing of the gripper to complete the task.

Taylor solves this by enabling you to use a Wiimote in each hand. With these additional controls, the operator can seamlessly transmit the correct hand position, and use the buttons on the Wiimotes to perform additional operations, like opening and closing grippers.

Taylor also collaborated with Patrick Goebel to add in leg controls for moving a robot. Placing one leg forward or backward moves the robot in that direction. Placing a leg to the side makes the robot turn.

You can watch Taylor's new video above, where he puts the Nao teleop through its paces. If you have ever wanted to see a Nao wield a knife, play chess, or grab a tissue out of a box, check it out.

ROS 3D Entries: Nao Teleop Control


Halit Bener Suay entered the ROS 3D Contest with this entry that demonstrates teleoperation of an Aldebaran Nao using a Kinect. This is not the only entry to tackle teleoperation, but it adds its own unique twists. Most notably, there are pre-defined gestures that enable the operator to switch between different modes of control. One leg controls starting and stopping the robot. Another enables the operator to switch between controlling the body and the head. Your arms can either directly control the robot's arms or issue other commands, like directing the robot's gaze. All in all, it's a great demo of how we can go completely remoteless and still control a complex, walking robot like the Nao.

Humanoid Robot Control and Interaction

ROS 3D Entries: RGBD SLAM


Credit: Nikolas Engelhard, Felix Endres, Juergen Hess, Juergen Sturm, Daniel Kuhner, Philipp Ruchti, and Wolfram Burgard

The University of Freiburg team has put together an impressive 6D-SLAM library for entry into the ROS 3D Contest. By taking advantage of the additional 3D data that a Kinect provides, they've set a new benchmark for the state of the art in the field. It's also a great demo that we can all try ourselves: pick up your Kinect, move it around, and build 3D models of your world.

RGBD-6D-SLAM

We're now busy judging the eighteen awesome entries to the ROS 3D Contest. There's everything from teleoperation to games to libraries for registration and calibration. It's going to be tough choosing which ones get prizes.

You can go ahead and check out the entries yourself. In most cases, you should even be able to download and try them out on your own Kinect or PrimeSense device.

While we tally the results, we'll spotlight the entries here.

First off are Garratt Gallagher's entries. Garratt was our most prolific entrant and produced a total of five separate entries. Each is worth its own blog post, and many of them have already been featured here:

We're grateful that Garratt has taken the time not only to enter the contest, but to go the extra mile to make sure that others can try out his libraries and build on his creative ideas. If you like what you see, you should consider helping out his Bilibot project, which is a low-cost Kinect + Create platform.

Garratt's newest entry is "Customizable Buttons". Using the Kinect, you can draw on a piece of paper to create your own music board. It's a lot of fun, as you'll see in the video:

Garratt's Entries:

Bill Mania gave an introductory presentation on ROS at ChiPy, the Chicago Python users group. He also gave a demo of his RoboMagellan robot that he's bringing up on ROS. This is a good overview for those of you just getting into ROS, especially from a Python perspective.

Recorded by Carl Karsten

ROS and PR2 in Education


We think that ROS and the PR2 are great tools for educators. Both platforms allow students to focus on building the relevant parts of a system while incorporating less topical components from the open source community. Students get started faster and complete more impressive projects. Even more importantly, students can take components built in ROS to their next course, research project or job without worrying about licensing.

We've started a wiki page to list courses using ROS or the PR2, and to discuss teaching-related issues. Here are some course examples that you can use for inspiration:

Short Courses

University (Undergraduate & Graduate) Courses

If you're teaching a course using ROS or the PR2, please post a link at ros.org/wiki/Courses. If you have advice on setting up labs, course computers, or any other teaching-related topic, post those too. By sharing material, we'll all create effective courses more quickly.

Taylor Veltrop has made the first entry to our ROS 3D Contest. He uses the Kinect and NITE to put a Kondo-style humanoid through pushups, waves, and other arm-control gestures. Great work! We look forward to seeing more entries.

Hi everyone!

Please take a look at my entry in the Kinect/RGB-D contest! I'm really happy with how it's turned out so far.

It's a small humanoid hobby robot by Kondo with a Roboard running ROS. The arms are controlled master/slave style over the network by a Kinect.

Entry: Humanoid Teleoperation

Taylor Veltrop

You can watch an interview with Taylor about this project over at Robot Dreams.

In the works: ScaViSLAM


For Kinect/OpenNI users and VSLAM researchers, we're working on integrating Hauke Strasdat's ScaViSLAM framework into ROS. ScaViSLAM is a general and scalable framework for visual SLAM and should enable exciting applications like constructing 3D models of environments, creating 3D models of objects, augmented reality, and autonomous navigation.

We hope to release the ScaViSLAM library in Spring of 2011.

Happy Holidays from Lewis at Wash-U


Turtles Using ROS: Willie



Meet Willie, Ben Cohen's Reeves's turtle of 16 years. Willie's research interests include autonomous navigation using search-based motion planning, and AUVs.

Photos

Find this blog and more at planet.ros.org.


Please submit content to be reviewed by emailing ros-news@googlegroups.com.
