February 2011 Archives


Guest post from Urko Esnaola of Tecnalia

Tecnalia and Pukas have cooperated to integrate sensors in a high-performance surfboard to record data of relevant surfing parameters in real operation -- while surfing waves.

The aim of the project is to learn "what's going on" in a high-performance surfboard while a surfer is riding it. This will help: (i) surfboard manufacturers, who gain valuable information for building optimal-performance surfboards; and (ii) the surfing community, who gain detailed information about their surfing technique.

Strain gauges have been included to record the flex and torsion of the surfboard in real operation. An Xsens MTi-G, integrating gyroscopes, accelerometers, a compass and GPS, has been incorporated to record the surfboard's accelerations, speed and movements. Pressure sensors have been installed on the surfboard deck to record the position of the surfer's feet. All the data is recorded to a flash memory stick through an IGEPv2 embedded computer.

After a surf session finishes, the data is transmitted over Wi-Fi to a PC. The software to visualize and process the data was developed in ROS.
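As a rough illustration of this record-then-transfer pipeline, here is a minimal Python sketch of how timestamped sensor samples might be logged on board and parsed back on the PC. The field names and the CSV layout are hypothetical; the actual Surfsens log format is not described in the post.

```python
import csv
import io
import time

# Hypothetical field layout -- the real Surfsens log format is not public.
FIELDS = ["t", "strain_nose", "strain_tail", "accel_x", "accel_y", "accel_z",
          "pressure_front", "pressure_rear"]

def log_sample(writer, sample):
    """Append one timestamped sensor sample to the on-board log."""
    writer.writerow(["%.3f" % time.time()] +
                    ["%.4f" % sample[k] for k in FIELDS[1:]])

def load_log(text):
    """Parse a transferred log back into per-sample dicts for visualization."""
    reader = csv.reader(io.StringIO(text))
    return [dict(zip(FIELDS, map(float, row))) for row in reader]
```

On the PC side, a ROS node could then replay the loaded samples as messages for the visualization tools.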

Phase 1, covering board construction and validation of the electronics, has finished successfully. The exciting Phase 2 has now begun: analyzing the data to uncover the keys to the mechanical behavior of surfboards and to improve surfers' technique. Professional surfers Aritz Aranburu, Hodei Collazo, Kepa Acero and Mario Azurza have already tested the surfboard; other professionals, including Tiago Pires, Joan Duru, Tim Boal and Eneko Acero, are waiting for their chance.

Pukas - Tecnalia Surfsens project from Pukas Surf on Vimeo.

Footage of the visualization software:

ROS Diamondback Release Candidate 3

| No Comments | No TrackBacks

The third candidate of ROS Diamondback is now available! We intend for this to be our final release candidate, so please report any critical issues that you find. We will be focusing on improving source-based install methods between now and the final release.

This update comes with PCL 0.10, which has the stable API that will be used in PCL 1.0. The ROS 3D contest demonstrated many useful applications of PCL with robotics, and we're excited to have this stable release for Diamondback. We encourage users to provide feedback on tutorials and example applications to help guide PCL to its 1.0 milestone, which we are targeting for the ROS Electric Emy release in August.

Thank you to the many users who have submitted patches and bug reports to help improve these release candidates, including (but not limited to): andrewstraw, mdesnoyer, timn, ryohei, snorri, soetens, nbutko, nevion, willylambert, stevenbellens, isucan, gbiggs, atr, rene, bouffard, dthomas, joq, and lorenz.

ROS Diamondback RC3 Installation Instructions

Major updates:

ROS, meet Arduino


Guest post from James Bowman of Willow Garage

We recently wanted to hook up an analog gyro (the Analog Devices ADXRS614) to ROS, and decided to use an Arduino to handle the conversion.

Arduino interfaces a gyro to ROS

The Arduino runs a tiny loop that reads the analog values from its six analog lines, and writes these values to the USB serial connection.  Meanwhile, on the robot, a small ROS Python node listens to the USB reports and publishes them as a ROS topic.

That's all there is to it: the whole thing takes under 40 lines of code.
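For illustration, here is a minimal Python sketch of the parsing the robot-side node would do on each serial report. The space-separated ASCII wire format and the helper name are assumptions; the post does not specify the actual format.

```python
def parse_analog_report(line):
    """Parse one line from the Arduino's serial loop into six integers.

    Assumes a simple space-separated ASCII format such as
    b"512 498 601 0 1023 300\n" -- the actual wire format used in the
    post is not specified.
    """
    values = [int(tok) for tok in line.split()]
    if len(values) != 6:
        raise ValueError("expected 6 analog values, got %d" % len(values))
    if not all(0 <= v <= 1023 for v in values):  # 10-bit ADC range
        raise ValueError("analog value out of 10-bit range")
    return values
```

A rospy node would then read lines from the serial port in a loop and publish the parsed values on a topic.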

This is probably the simplest possible way to use an Arduino with ROS.  There are quite a few more sophisticated projects:

  • pmad  - controls an Arduino's I/Os using ROS service calls
  • avr_bridge - automates generation of ROS-Arduino message transport
  • arduino - our own package: an Arduino as an Ethernet-connected ROS node

ROS Diamondback Release Candidate 2


The second candidate of ROS Diamondback is now available! We invite members of the community to test this second release candidate and identify any potential integration issues. We anticipate doing one more release candidate before the official Diamondback release, which is slated for the end of this month.

In addition to several bug fixes, this release also includes stacks from ccny-ros-pkg, including drivers for AscTec quadrotors. We appreciate the efforts of the CCNY Robotics Lab in putting together these releases.

ROS Diamondback RC2 Installation Instructions

New stacks:

Updates

rospy on Android


Announcement from Prof. Dr. Matthias Kranz of TUM

The team of the Distributed Multimodal Information Processing Group of Technische Universität München (TUM) is pleased to announce that we ported rospy to run on Android-based mobile devices.

Python for Android, on top of the Scripting Layer for Android (SL4A), serves as the basis for our rospy port. We extended the scripting layer and added new support for ctypes and other requirements. Now rospy, roslib and std_msgs are working against a roscore directly on your mobile phone. To configure a roscore on a standard computer to cooperate with the one on the Android device, you simply scan a QR code on the computer's screen to auto-configure the smartphone. Basic support for OpenCV and the image topics is also included. You are welcome to extend the current state of our work.
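As a sketch of what that auto-configuration step amounts to, assuming the QR code simply encodes the ROS master URI (the actual TUM payload format is not documented in the post):

```python
from urllib.parse import urlparse

def master_from_qr(payload):
    """Extract a ROS master host/port from a scanned QR payload.

    Assumes the QR code encodes a master URI such as
    "http://desktop.local:11311" -- this format is an assumption,
    not the documented payload of the TUM project.
    """
    parsed = urlparse(payload)
    if parsed.scheme != "http" or not parsed.hostname:
        raise ValueError("not a ROS master URI: %r" % payload)
    # 11311 is the conventional default roscore port.
    return parsed.hostname, parsed.port or 11311
```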

You will need a current version of the Scripting Layer (v3) and the newer Python for Android with support for importing custom modules. You can use any Android device able to run SL4A; in general, any recent Android-powered device should work. Running ROS will not harm your phone, and no root access is needed.

You can find our code, basic documentation and a video in our repository and on ROS.org.

A small video showcasing how to control a ROS-based cognitive intelligent environment via an Android-based smartphone is available here.

Links:


Congrats to Meka on the official announcement of their M1 Mobile Manipulation platform! The M1 integrates many existing Meka hardware products into a unified mobile manipulation platform. This includes:

  • S3 Sensor Head with Kinect-compatible interface and 5MP Ethernet camera
  • Two A2 Compliant Manipulators with 6-axis force-torque sensors at the wrist
  • Two G2 Compliant Grippers
  • B1 Omni Base with Prismatic Lift and Computation Backpack

Meka M1 Mobile Manipulator from Meka Robotics on Vimeo.

As we've previously featured, Meka supports integration with ROS on their various robot hardware products. They are looking to take that a step further with the new M1 platform. They will start "pushing deeper on ROS integration", which means integrating the M1 with many higher-level ROS capabilities that the community has built. Users will still be able to take advantage of the great real-time capabilities that their M3 software system provides.

The M1 joins the growing ROS community of mobile manipulation platforms: Care-O-bot 3, PR2, and DARPA ARM robot. We're excited that the users of these various hardware platforms will be able to easily collaborate and push new bleeding-edge capabilities in ROS.

The M1 is inspired by Georgia Tech's Cody, which is also built using Meka components. The M1 will cost $340k for the standard model, but the modular design enables Meka to develop customized solutions as well.

ROS Day @ ISR Coimbra Presentation Slides



Message from Gonçalo Cabrita to ros-users

Hi everyone!

The ROS at ISR event is now over. We had a full amphitheater so I hope we'll be having a bunch of new ROS users very soon!

Once more, thank you to everybody who helped and contributed to the presentation, which is now available for download in PDF format on the event webpage. I will also be posting a version of the presentation with notes soon.

Event page: ROS Day at ISR Coimbra

If anyone is interested in the Keynote presentation (for Mac ofc) send me an email!

Gonçalo Cabrita
ISR University of Coimbra
Portugal

Neato XV-11 Laser Driver


Announcement from Eric Perko of Case Western to ros-users

Hello folks,

I'm happy to announce a ROS driver for the Neato XV-11 laser scanner. ROS has had a driver for the Neato itself for some time now, but that was only useful for going through the Neato's onboard computer. Well, now you can just yank that XV-11 laser scanner out, strap it to your iRobot Create, and feed it right into the rest of ROS! The XV-11 scanner gives 360 pings at 1-degree increments at a rate of 5 Hz and is useful from ~6 cm to ~5 m.

Chad and I have written up a number of tutorials to help people get started with this low-cost scanner, including how to remove the laser from the vacuum and wire it to USB, how to get it up and running with our ROS driver, and how to interpret the raw bytes if you want to parse the data yourself on a microcontroller.
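To give a flavor of what parsing the data yourself involves, here is a hedged Python sketch that converts one revolution of XV-11 ranges into Cartesian points, using the figures quoted above (360 one-degree pings, useful from roughly 6 cm to 5 m). The range-list input format is an assumption for illustration, not the driver's actual API.

```python
import math

# XV-11 characteristics from the announcement: 360 readings per revolution,
# one per degree, valid from roughly 0.06 m to 5.0 m.
MIN_RANGE_M, MAX_RANGE_M = 0.06, 5.0

def scan_to_points(ranges_m):
    """Convert one full revolution (360 ranges in meters, where the index
    is the bearing in degrees) to (x, y) points, dropping invalid pings."""
    points = []
    for deg, r in enumerate(ranges_m):
        if MIN_RANGE_M <= r <= MAX_RANGE_M:
            theta = math.radians(deg)
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```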

Let us know if you have any problems/questions/comments.

Eric Perko, Chad Rockey
CWRU Mobile Robotics Lab

Announcing ROS Answers



There's a brand-new answers.ros.org site to help you ask questions about ROS and have them answered by a community of ROS experts. It's like Stack Overflow, but all the questions are about ROS, and it's powered by the open-source AskBot platform.

Melonee and Tully have been hard at work on this the past couple of weeks and explained their motivation in their announcement to ros-users:

Over the last couple of months we've seen an increase in the volume of email on ros-users, and no one likes too many emails. So we put together answers.ros.org to provide an alternative forum for asking and answering questions. If you really love getting all that email, you can still have it by signing up for answers.ros.org and setting up your profile to receive all traffic like before. However, if you prefer lower volume, you can filter based on tags (whitelist or blacklist). This does not mean we're doing away with ros-users; we would like to use this mailing list more for announcements and general discussion rather than individual troubleshooting. We would like to encourage everyone to sign up at answers.ros.org. Go ask questions and provide answers as usual.

Some of you may be wondering, "Why don't you just use Stack Overflow?"

The ROS Answers site, we hope, will provide a better experience for users. All of the tags will be related to ROS -- you can even subscribe to questions about a particular package/stack. We also hope that we will be able to use the open-source AskBot platform to have tight integration with the ROS wiki. In the future, look for wiki macros and other navigation tweaks to help you find the information you need more effectively.

Also, you may be wondering "When should one use answers.ros.org vs. the ros-users mailing list?" Well, there's an answer for that.

Lastly:

If you have questions or feedback about the site, use "answers.ros.org" as the tag so that we get your feedback.

Hands-free vacuuming by OTL


OTL has been a frequent contributor of great Roomba hacks, and this one is no exception. This time he's used a Kinect and a Roomba bluetooth connector to take back control of the vacuum. You can find out more in his blog post (Japanese). His blog is a great Japanese-language resource for getting into ROS.

See also:

Michael Ferguson, prolific contributor to albany-ros-pkg and vanadium-ros-pkg, has put together his own low-cost mobile manipulator using a Kinect, ArbotiX RoboController, Dynamixel servos, and a custom diff-drive base.

Maxwell is my latest attempt at a low-cost, human-scale mobile manipulator using an ArbotiX and ROS. The design guidelines were pretty straightforward: it needed an arm that could manipulate things on a table top, a Kinect as the primary sensor on the head, and a mobile base that kept all that stuff upright. Additionally, I wanted the robot to be easy to transport and/or ship.

You can find out more at Show Us Your Sensors as well as the Trossen Robotics Forums.

ROS Diamondback Release Candidate 1


The first release candidate of ROS Diamondback is now available! Thanks to the help from the community, we've been able to make numerous improvements to our Diamondback Beta releases and are now ready to put out our first release candidate.

We have also been putting together new robot-centric API and installation guides, which we hope will better organize the various ROS libraries for new ROS users. We have set up some example pages and welcome others in the community to do the same:

Stack contributors: You are now welcome to start releasing against Diamondback directly. We had disabled some stacks in Diamondback due to some integration issues that have now been resolved. You can view the build status to check whether or not your stack is currently included.

Known issues:

  • Integration is still on-going for OS X and other Non-Ubuntu platforms.
  • Final Diamondback release is awaiting PCL 1.0.

ROS C Turtle Update


A new C Turtle update has been released. This is a minor update, mainly to provide compatibility with Diamondback.

Updates:

Orocos Toolchain ROS 0.2.1 released


Announcement by Ruben Smits to ros-users

The Orocos Toolchain ROS stack developers are pleased to announce a new release of the orocos_toolchain_ros stack.

Debian packages are already available for unstable; packages for diamondback and cturtle (lucid and maverick only) will be available soon.

Most important changes:

  • We moved development to Git: the stack is available here (only the http protocol is supported). Since we use git submodules, do not forget to pass --recursive when cloning, or run git submodule init; git submodule update after cloning.

  • Integration of RTT/Properties and ROS/Parameters: a new RTT service, available in the rtt_ros_param package, allows RTT users to store component properties on the ROS parameter server or to refresh them from values on the parameter server. The integration also works seamlessly for complex types, as long as an RTT/Typekit for the type exists. Check here for more details.

  • The RTT typekit and transportkit generation for messages now creates a single library/typekit/transportkit for each package instead of one for each message type.

  • RTT and OCL are now native ROS packages, which allows us to use rosmake directly on the RTT and OCL source.

  • OCL offers an orocreate-pkg tool to create a new Orocos package (which uses rosbuild behind the scenes if available), including template code for components, typekits, plugins and services.

  • The OCL/deployer can now be used with roslaunch; each deployed process becomes a single node containing the different deployed components.

  • The OCL/deployer now uses rospack in its import functionality to find new component libraries, services, plugins, typekits and transportkits.

What's still on the roadmap:

  • Nicer solution for sequence types in the rosparam integration
  • Integration of the RTT/logging with the ROS/logging
  • Integration of RTT/Operations and ROS/Services (currently on hold because its feasibility is still uncertain)

More information:

-- Your friendly Orocos-ROS integration team.

ROS Diamondback Beta 2 Release


Our second beta release of ROS Diamondback is now available (installation instructions). Thank you to the many users who reported issues with the first beta release. We also appreciate the users who have updated the Diamondback instructions for the various platforms ROS runs on. A special thank you goes to Nicholas Butko, who provided numerous patches to improve OS X compatibility.

In addition to OS X updates, this release continues our rollout of Eigen 2/3 compatibility and also cleans up various stack dependencies and installation variants. There are still several fixes and updates we wish to deploy before we reach release candidate status.

Install ROS Diamondback Beta

Buttons Redux


Get your Axel F on with this redux of Garratt Gallagher's prize-winning Customizable Buttons.

He also has PR2 moving with the Kinect. We can now only hope that he combines these two videos together...

February 16, 2011: ROS Day @ ISR Coimbra



Lino Marques, Gonçalo Cabrita, Pedro Sousa and David Portugal are hosting a "ROS Day" at ISR University of Coimbra on February 16th. The afternoon event will introduce ROS to robot software developers and algorithm researchers from ISR and DEEC.

ROS 3D Contest: The Results!


We were absolutely thrilled with the eighteen entries to the ROS 3D Contest. The community really impressed us with the creativity and technical prowess of these entries, and choosing the prizes was a difficult process. In fact, it was so difficult that we bent the rules and created two new prizes: 4th Overall and 2nd Place Most Useful.

For the Overall prizes, we selected the entries that both amazed us and embraced the spirit of the contest: inspiring and providing building blocks for future Kinect hackers. We've been able to try many of these entries ourselves thanks to the great code and documentation, and we hope that others will as well. It was hard to pick a favorite, but we kept coming back to Garratt Gallagher's Customizable Buttons. We tried it on our own desks, and it just puts a smile on your face: you draw a button wherever you please and press it. It made us feel like we were in a cartoon world where we could bend the rules of the universe. Garratt was also the most prolific -- with his six different entries, you can deconstruct the various components that he was able to assemble to produce very different results.

For second place, we selected Quadrotor Altitude and Obstacle Avoidance by the STARMAC project at Berkeley. We loved seeing what a Quadrotor with an Atom processor could do, and they went the extra mile to make sure others with quadrotors had a good starting point. For third place, Taylor Veltrop's Humanoid Teleoperation entry just kept getting better and better. He recently used his library to win best performance in the Robot Athlete Cup 2011 competition by doing robo-ikebana. We added a fourth place for Person Tracking and Reconstruction from a Mobile Base with a 7 DOF Manipulator by Chris Burbridge & Lorenzo Riano. Using a robot to turn a Kinect into a 3D scanner holds many possibilities.

For Most Useful, we were wowed. First prize went to RGBD-6D-SLAM from the University of Freiburg. SLAM with a Kinect holds the potential to unlock many applications, from creating 3D maps and 3D models to cheap autonomous navigation and much, much more. Normally "bleeding edge" means no one else can run it, but, in their case, they produced a 6D-SLAM solution that we were able to download and use in our own offices. There is much to improve, but the potential is huge. For second place, we chose ETH Zurich's Automatic Calibration of Extrinsic Parameters. Anyone mounting a Kinect on their robot should take a look at using their library.

Finally, our PrimeSense Dev Kit 5.0 Awards (thanks PrimeSense) go to Michael Ferguson and the Chemnitz University of Technology. We are confident, from their entries, that they will be able to put them to good use.

Thanks everyone!

Overall:

1st Place ($3000): Customizable Buttons, Garratt Gallagher
2nd Place ($2000): Quadrotor Altitude and Obstacle Avoidance, Patrick Bouffard
3rd Place ($1000): Humanoid Teleoperation, Taylor Veltrop
4th Place ($500): Person Tracking and Reconstruction from a Mobile Base with a 7 DOF Manipulator, Chris Burbridge & Lorenzo Riano

Most Useful:

1st Place ($2000): RGBD-6D-SLAM, Felix Endres, Juergen Hess, Nikolas Engelhard, Juergen Sturm, Daniel Kuhner, Philipp Ruchti, and Wolfram Burgard
2nd Place ($1000): Automatic Calibration of Extrinsic Parameters, François Pomerleau, Francis Colas and Stéphane Magnenat

PrimeSense Dev Kit 5.0 Awards:

The Chair of Automation Technology at Chemnitz University of Technology shows just how versatile a Kinect on a quadrotor can be. Their entry, "Autonomous corridor flight of a UAV using the Kinect sensor", uses the Kinect to find the ceiling, walls, and floor of a corridor. Once the quadrotor knows the geometric structure of the corridor, it can happily fly down the middle to get where it needs to go.

Their demo is built on an AscTec Pelican with a stripped-down Kinect. To handle the rest of the autonomous flight needs, they use an ADNS 3080 optical flow sensor for position and velocity control, and an SRF10 sonar sensor for altitude control. Sample-consensus algorithms from PCL convert the 3D point cloud data into estimated positions of these surfaces. Remarkably, they managed to make all of this run on an Atom processor.
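To give a feel for the sample-consensus step, here is a toy RANSAC plane fit in Python, in the spirit of PCL's segmentation (PCL's implementation is far more capable and optimized; this is only a sketch of the idea):

```python
import random

def fit_plane_ransac(points, iterations=200, tol=0.02, seed=0):
    """Toy RANSAC plane fit: repeatedly pick 3 points, form the plane
    normal via a cross product, and keep the plane with most inliers.
    Returns (unit normal, point on plane), or None if every sample
    was degenerate."""
    rng = random.Random(seed)

    def sub(a, b): return tuple(ai - bi for ai, bi in zip(a, b))
    def dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0])

    best = (0, None)
    for _ in range(iterations):
        p0, p1, p2 = rng.sample(points, 3)
        n = cross(sub(p1, p0), sub(p2, p0))
        norm = dot(n, n) ** 0.5
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n = tuple(c / norm for c in n)
        # Count points whose distance to the plane is within tolerance.
        inliers = sum(1 for p in points if abs(dot(n, sub(p, p0))) < tol)
        if inliers > best[0]:
            best = (inliers, (n, p0))
    return best[1]
```

A quadrotor corridor-follower would run fits like this repeatedly to track the floor, ceiling, and wall planes.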

We had a great time at the January 26th meeting of the Homebrew Robotics Club, which was held at Google. There were many quick show-and-tell demos of various hobby robots followed by the featured presentation, "ROS for the Rest of Us", given by Patrick Goebel. Patrick is the creator of Pi Robot and maintainer of pi-robot-ros-pkg. Patrick writes regularly about ROS on his website, including a tutorial on visual object tracking using ROS.

There were several ROS-related presentations during the show-and-tell. Tony Pratkanis presented his ROS + Neato platform, Melonee Wise and Tully Foote introduced the "Turtlebot" platform (iRobot Create + Kinect), and James Bowman demoed an Arduino board running a full ROS node.

ROS for the Rest of Us

Show and Tell

Chris Burbridge and Lorenzo Riano from the University of Ulster Intelligent Systems Research Centre used the Kinect to turn their robot into a mobile 3D person scanner. A Kinect is great for collecting 3D data, but sticking it on wheels is even better because you can collect data from multiple points of view and construct full 3D models.

Their demo uses the Kinect at both the skeleton tracking and 3D point cloud level. The OpenNI skeleton tracker is used to identify the position of the person in the room, and then the 3D point cloud data is used to start building the full 3D scan. Once all of the point clouds are collected, they use PCL to create a unified 3D model.
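The merging step boils down to transforming each captured cloud into a common model frame using the known sensor pose, then concatenating. Here is a deliberately simplified 2D Python sketch of that idea; a real pipeline such as theirs works in 3D with PCL and typically refines the alignment (e.g. with ICP):

```python
import math

def merge_clouds(views):
    """Merge point clouds captured from different viewpoints into one
    model frame. `views` is a list of (pose, points) pairs, where pose
    is (x, y, yaw) of the sensor in the model frame and points are
    (px, py) in the sensor frame. This 2D version only illustrates the
    rigid-transform-and-concatenate idea."""
    merged = []
    for (x, y, yaw), points in views:
        c, s = math.cos(yaw), math.sin(yaw)
        for px, py in points:
            # Rotate into the model frame, then translate by the pose.
            merged.append((x + c * px - s * py, y + s * px + c * py))
    return merged
```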

The UU robot is a custom MetraLabs Scitos G5 mobile robot with a Kinect mounted at the end of a Schunk 7 DOF manipulator, but their code should be adaptable to other robot platforms.

ROS Diamondback Beta Release



A beta release of ROS Diamondback is now available (installation instructions). We encourage users to try this beta release out and report any issues. We also encourage users who were using "unstable" to switch to Diamondback at this time as they are equivalent. We expect a very short beta period before our first release candidate.

The release of Diamondback is currently delayed by the integration of Eigen 3. Eigen 3 has become an important library in the robotics community, but it was originally incompatible with Eigen 2. The Eigen developers have been very responsive in introducing backwards-compatibility modes, and we've determined that it would be better to integrate with the updated library than to attempt custom workarounds.

We still anticipate releasing Diamondback this month, but we need your help to test it early and often, especially our Eigen integration. We also ask our OS X/Arch/Fedora/Gentoo/OpenSUSE/Slackware community to help test the integration and installation instructions for those platforms.

Install ROS Diamondback Beta

ROS on FreeBSD Presentation

Post by René Ladan to the ros-users list

Hi,

Last December I gave a talk about using ROS on FreeBSD. The slides are
available at ftp://rene-ladan.nl/pub/ros-freebsd.pdf . Note that the
USB problem mentioned on slide 12 is fixed for FreeBSD 8 and newer :)

Regards,
René

ROS 3D Entries: Teleop Kinect Cleanup


The "Teleop Kinect Cleanup" entry by Zoltan-Csaba Marton and Dejan Pangercic of TUM packs a couple of demos into one ROS 3D Contest entry. Using their entry, you can point at an object on a table and then, in the virtual rviz display, move that object somewhere else like a Jedi. You start with a world that looks like your own, but by the time you're done, you've rearranged a new virtual world to your liking.

That's not all. They've also figured out how to make this useful for giving commands to a robot. After you move around a cup in your virtual world to your liking, a command to move the cup can be passed to a robot. Thus, once you've re-arranged your virtual world, it becomes the job of the robot to make the real world look like your virtual world.
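In the simplest terms, the command sent to the robot is the difference between an object's rearranged virtual pose and its current real pose. A purely illustrative sketch; the entry's actual command interface is not described in the post:

```python
def cleanup_command(real_pose, virtual_pose):
    """Given an object's pose in the real world and its rearranged pose
    in the virtual world, return the displacement the robot must apply.
    Poses here are bare (x, y, z) positions; a full system would also
    handle orientation. The function name and interface are hypothetical."""
    return tuple(v - r for r, v in zip(real_pose, virtual_pose))
```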

If you want to see their robots in action, you can check out this video of TUM's Rosie and PR2 making pancakes together.

Find this blog and more at planet.ros.org.


Please submit content to be reviewed by emailing ros-news@googlegroups.com.

About this Archive

This page is an archive of entries from February 2011 listed from newest to oldest.

January 2011 is the previous archive.

March 2011 is the next archive.

Find recent content on the main index or look in the archives to find all content.