May 2015 Archives

Ridgeback: the mobility solution for Baxter

From Meghan Hennessey of Clearpath Robotics

[Image: Ridgeback carrying Baxter]

Clearpath Robotics announced the newest member of its robot fleet: an omnidirectional development platform called Ridgeback. The mobile robot is designed to carry heavy payloads and easily integrate with a variety of manipulators and sensors. Ridgeback was unveiled as a mobile base for Rethink Robotics' Baxter research platform at ICRA 2015 in Seattle, Washington.

 

"Many of our customers have approached us looking for a way to use Baxter for mobile manipulation research - these customers inspired the concept of Ridgeback. The platform is designed so that Baxter can plug into Ridgeback and go," said Julian Ware, General Manager for Research Products at Clearpath Robotics. "Ridgeback includes all the ROS, visualization and simulation support needed to start doing interesting research right out of the box."

 

Ridgeback's rugged drivetrain and chassis are designed to move manipulators and other heavy payloads with ease. Omnidirectional wheels provide precision control for forward, lateral, or twisting movements in constrained environments. Like other Clearpath robots, Ridgeback is ROS-ready and designed for rapid integration of sensors and payloads; specific consideration has been given to integration of the Baxter research platform.
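
As a rough illustration of what that ROS-readiness typically means (a sketch, not Clearpath's documented interface): an omnidirectional base is usually driven by publishing geometry_msgs/Twist messages whose forward, lateral, and rotational components are all independent. The /cmd_vel topic name below is an assumption.

    #!/usr/bin/env python
    # Sketch: drive an omnidirectional base by publishing Twist commands.
    # The /cmd_vel topic name is an assumption, not Ridgeback's documented interface.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('omni_drive_demo')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)  # stream commands at 10 Hz

    cmd = Twist()
    cmd.linear.x = 0.2   # forward [m/s]
    cmd.linear.y = 0.1   # lateral (strafe) [m/s], only meaningful on omni bases
    cmd.angular.z = 0.3  # rotation about the vertical axis [rad/s]

    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()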

 

"Giving Baxter automated mobility opens up a world of new research possibilities," said Brian Benoit, senior product manager at Rethink Robotics. "Researchers can now use Baxter and Ridgeback for a wide range of applications where mobility and manipulation are required, including service robotics, tele-operated robotics, and human robot interaction."

 

Learn more about the Ridgeback AGV at www.clearpathrobotics.com/ridgeback




Skeleton tracker for the ASUS Xtion Pro Live

From Alessio Levratti via ros-users@

I developed a new skeleton-tracking node for the ASUS Xtion Pro Live by modifying openni2_tracker.
The main differences are:
  • The node publishes a new message (user_IDs) containing the ID of the tracked user
  • The node publishes the video stream captured by the Xtion
  • The node publishes the Point Cloud captured by the Xtion

The package can be downloaded here: https://github.com/Chaos84/skeleton_tracker.git
Just type:
    $ git clone https://github.com/Chaos84/skeleton_tracker.git
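
A minimal consumer of the tracker's output might look like the sketch below; the topic names are guesses, and the user_IDs message is a package-specific type, so check the repository for the actual names and types.

    #!/usr/bin/env python
    # Sketch of a consumer for the tracker's outputs; topic names are assumptions.
    # The user_IDs message is a package-specific type -- see the repository.
    import rospy
    from sensor_msgs.msg import Image, PointCloud2

    def on_image(msg):
        rospy.loginfo("image %dx%d", msg.width, msg.height)

    def on_cloud(msg):
        rospy.loginfo("point cloud with %d points", msg.width * msg.height)

    rospy.init_node('skeleton_tracker_listener')
    rospy.Subscriber('/camera/rgb/image_raw', Image, on_image)       # hypothetical topic
    rospy.Subscriber('/camera/depth/points', PointCloud2, on_cloud)  # hypothetical topic
    rospy.spin()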

The Lily Camera developed using ROS


From Henry Bradlow




"Throw your Lily in the air like you just don't care" is a popular comment on YouTube for the new Lily Camera. The fully autonomous flying camera went viral on May 12th when it was announced by Lily Robotics. It focuses on filming you, so you can focus on your activity.


Many people in the robotics and film communities have predicted that a product like the Lily Camera would enter the market, given the recent popularity of using drones for filming. While several companies have attempted to develop similar products, no company has achieved the capacity or flexibility of the Lily Camera, and no company has earned such enthusiastic attention.


What distinguishes the Lily Camera is its ease of use. The user throws the camera into the air, and the Lily Camera automatically follows the user, capturing shots that are unmatched by any other device. The engineers at Lily Robotics integrate strategies from robotics, computer vision, and signal processing to ensure that Lily always knows its own location and that of the user it is filming. To achieve this situational awareness, the Robot Operating System (ROS) is heavily utilized.


In Lily Camera prototypes, ROS was used for passing messages between the tracking device and the Lily Camera. According to Rowland O'Flaherty, Lily Robotics' lead controls engineer, "Based on the sheer nature of how ROS is structured, it is seamless to pass messages between different devices. Sometimes you even forget that there are separate devices communicating with each other."
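
That cross-device transparency is a standard ROS pattern rather than anything Lily-specific: nodes on different machines simply point at the same ROS master and publish or subscribe as if everything were local. A minimal sketch, with made-up topic and host names:

    #!/usr/bin/env python
    # Sketch of cross-device messaging in ROS (not Lily Robotics' actual code).
    # Run this on the tracker device with ROS_MASTER_URI pointing at the camera's
    # master (e.g. export ROS_MASTER_URI=http://camera-host:11311); a subscriber on
    # the camera then sees /tracker/position as if it were a local topic.
    import rospy
    from geometry_msgs.msg import PointStamped

    rospy.init_node('tracker_position_publisher')
    pub = rospy.Publisher('/tracker/position', PointStamped, queue_size=10)
    rate = rospy.Rate(20)  # 20 Hz position updates

    while not rospy.is_shutdown():
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'tracker'  # position fields left at zero in this sketch
        pub.publish(msg)
        rate.sleep()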


ROS is also leveraged for the testing and development of the Lily Camera. The engineers at Lily Robotics use RViz, ROS's 3D visualization tool, to simulate the movements of the camera for development, to visualize live test flights for real-time analysis, and to replay test flights for debugging and examination. Rowland added, "We can either run a simulated flight or a real flight with a flip of a switch thanks to ROS. Both the simulator and the real robot run the same code (ROS nodes), which rapidly increases the development cycle." 

ROS Jade Turtle Release

We're happy to announce the official release of ROS Jade Turtle [1]! 

[Image: ROS Jade Turtle logo]

Jade Turtle is the ninth release of ROS and is primarily targeted at the Ubuntu distributions Trusty, Utopic, and Vivid. Our current count of packages is 520; you can compare the packages available in Indigo vs. Jade here:


You can install Jade by following the Jade installation instructions here:

http://wiki.ros.org/ROS/Installation

I'm also happy to officially announce the name of the next release of ROS as Kinetic Kame, which we'll refer to as just "kinetic"!

We've also just finished deploying the new prerelease.ros.org website and updated the documentation [3] on the new way to do prereleases for Jade and Indigo:


We're also aware of, and working on, an issue that affects rosbuild on newer Ubuntu releases. The issue is a blocker on those releases, but not on Trusty. We hope to have a solution in the next few weeks. Please follow this issue if you are interested:


Finally, I want to thank all of the people who helped make this release and get it started strong and on time by the release date (which is also World Turtle Day [2])! Thanks also to Tully, who put together a video representation of the roughly 40 contributors across around 500 packages:


Thanks to everyone, and enjoy ROS Jade!

ROS and rospy on Talk Python To Me Podcast

Episode 7 of Talk Python To Me features Dirk Thomas talking about the use of Python in ROS and rospy. The episode description is: 

Programming is fun. Robots are fun. Programming robots is awesome! In this episode, Michael speaks with Dirk Thomas from the ROS (Robot Operating System) project. You will learn how to use ROS and rospy to program robots.

We discuss how ROS is used on everything from some of the largest and most complex robots ever built (including one on the International Space Station!) all the way down to basic robots controlled via microcontrollers such as Arduinos.

You can listen to the podcast or download it from: http://www.talkpythontome.com/episodes/show/7/robot-operating-system-ros-and-rospy
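
For a taste of what the episode covers, the canonical rospy publisher is only a few lines. This is the standard talker pattern from the ROS tutorials, not code discussed in the episode:

    #!/usr/bin/env python
    # Standard rospy talker, in the spirit of the ROS beginner tutorials.
    import rospy
    from std_msgs.msg import String

    rospy.init_node('talker')
    pub = rospy.Publisher('chatter', String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz

    while not rospy.is_shutdown():
        pub.publish(String(data='hello from rospy'))
        rate.sleep()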

Robotics Fast Track now accepting applications

Cross-posted from www.osrfoundation.org

We're excited to announce that OSRF and BIT Systems are seeking innovative and revolutionary robotics projects for the Robotics Fast Track (RFT) effort, sponsored by the Defense Advanced Research Projects Agency (DARPA).

The goals of Robotics Fast Track are:

  1. Enable rapid, cost-effective development of new robotics capabilities designed to respond to, and even anticipate, quickly evolving needs in space, maritime, ground, and air operations. RFT will focus on the development of groundbreaking robotic hardware and software by funding novel approaches as well as creative adaptations of existing technologies.
  2. Achieve breakthrough capabilities in less time and at a fraction of the cost typical of government-supported robotic development processes by engaging highly agile organizations and individuals who traditionally have not worked with the U.S. government.

Learn more and apply at rft.osrfoundation.org!

Call for Proposals: ROSCon 2015


ROSCon 2015


-------------------------------------------------
ROSCon 2015

October 3rd-4th, 2015, Hamburg, Germany
Immediately following IROS
-------------------------------------------------

Important Dates
-------------------------------------------------

Call for Proposals -- May 15th, 2015
Proposal submission deadline -- July 7th, 2015
Proposal acceptance notification -- July 14th, 2015


ROSCon

-------------------------------------------------

ROSCon 2015 is a chance for ROS developers of all levels, beginner to expert, to spend an extraordinary two days learning from and networking with the ROS community. Get tips and tricks from experts and meet and share ideas with fellow developers from around the globe.

ROSCon is a developers' conference, in the model of PyCon and BoostCon. Following the success of the inaugural ROSCon in St. Paul, Minnesota, the second edition in Stuttgart, Germany, and last year's event in Chicago, Illinois, this year's ROSCon will be held in Hamburg, Germany. As in previous years, the two-day program will comprise technical talks and tutorials that will introduce you to new tools and libraries, as well as teach you more about the ones you already know. The bulk of the program will be 30-40 minute presentations (some may be longer or shorter). To submit a proposal please read the Call for Proposals.

If you don't want to make a formal presentation, you should still bring your new project or idea to ROSCon! There will be sessions of Lightning Talks, which are 5-minute mini-talks that are scheduled just-in-time at the conference. There will also be open space for Birds-of-a-Feather (BoF) meetings, impromptu hacking sessions, and informal presentations.

If you are looking for information on past ROSCons, see their separate websites; past programs, slides, and videos of the presentations are available there: ROSCon 2012 Program, ROSCon 2013 Program, and ROSCon 2014 Program.

As more information becomes available, this year's program will be filled out here.


Submission Guidelines

------------------------------------------------

Presentations and tutorials on all topics related to ROS are invited. Examples include introducing attendees to a ROS package or library, exploring how to use tools, manipulating sensor data, and applications for robots.

Proposals will be reviewed by a program committee that will evaluate fit, impact, and balance.

We cannot offer sessions that are not proposed! If there is a topic on which you would like to present, please propose it. If you have an idea for an important topic that you do not want to present yourself, please post it to ros-users@lists.ros.org.


Topic areas

------------------------------------------------

All ROS-related work is invited. Topics of interest include:

  • Best practices
  • Useful packages and stacks
  • Robot-specific development
  • ROS Enhancement Proposals (REPs)
  • Safety and security
  • ROS in embedded systems
  • Product development & commercialization
  • Research and education
  • Enterprise deployment
  • Community organization and direction
  • Testing, quality, and documentation
  • Robotics competitions and collaborations

Proposal submission

------------------------------------------------

A session proposal should include:

  • Title
  • Recommended duration: Short (~20 minutes) or Long (~45 minutes)
  • Summary, 100 words max (to be used in advertising the session)
  • Description (for review purposes): outline, goals (what will the audience learn?), pointers to packages to be discussed (500 words maximum)

To submit a proposal please visit: http://roscon.ros.org/review


Further Info

------------------------------------------------

The event website is http://roscon.ros.org. You can contact the organizing committee at roscon-2015-oc@osrfoundation.org.

RSS 2015 Workshop on Robot Simulation


A workshop on Realistic, Rapid, and Repeatable Robot Simulation (R4SIM) will be held at the Robotics Science and Systems conference in Rome, Italy.

The R4SIM workshop is motivated by the need for robotics simulators that

  1. lower the barriers to entering robotics research,
  2. provide a means to realistically and comprehensively simulate systems in conditions, or at scales, that would be unfeasible or impossible to test experimentally, and
  3. enable efficient and reliable transition to and from hardware experiments.

Check out the workshop, call for papers, and important dates at http://r4sim.com/.

The full CFP is located here: http://www.r4sim.com/R4SIM-Final.pdf

ROS Cheatsheet updated for Indigo Igloo

From Aaron Blasdel via ros-users@

The good old ROS CheatSheet has just been released for Indigo. If you know anyone just starting out in ROS, please send this on to them.

I recently performed some much-needed cleanup, reformatting, and content addition for the CheatSheet. Most notably, the GUI tools section has been greatly improved and now includes information on the RQT toolset.

Further, it now comes in two flavors: New and Improved Catkin and Original Extra Crispy Rosbuild. Many thanks to Kei Okada of the JSK lab for adding this dual-build functionality and for his edits for Hydro!

If you find any errors or glaring omissions, please create an issue on the ros/cheatsheet repo so we can discuss them, or a pull request to fix them.

I hope this is helpful!

Middlesex University intro to ROS summer school

From Nick Weldin

Middlesex University is running a one-week Intro to ROS summer school 6-10 July in London, UK. It will be a practical, hands-on class with 10 TurtleBot robots and a Baxter Research Robot. More details are available at http://www.mdx.ac.uk/courses/summer-school/courses/introduction-to-robot-operating-system

ROS Modbus packages for Cognex In-Sight and Siemens S7 PLC

From Wagdi Ben yaala via ros-users@

We just published three packages for interfacing your ROS workstation, using Modbus TCP communication, with industrial components like the well-known In-Sight camera from Cognex and the Siemens S7 PLC.

You'll find links and a quick tutorial for all three packages here:
http://www.generationrobots.com/blog/en/2015/04/cognex-siemens-plc-modbus-pkg/

Modbus package: http://www.generationrobots.com/en/content/87-modbus-package
Cognex In-Sight Modbus package: http://www.generationrobots.com/en/content/88-modbus-cognex-in-sight
Siemens S7 PLC Modbus package: http://www.generationrobots.com/en/content/89-plc-siemens-modbus-ros-package

Here is also the link to the ROS wiki:
http://wiki.ros.org/modbus
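
As a rough sketch of the general idea (this is not the API of the packages above): a node can poll holding registers over Modbus TCP with a library such as pymodbus and republish them on a ROS topic. The PLC address, register range, and topic name below are invented.

    #!/usr/bin/env python
    # Sketch only: bridge Modbus TCP holding registers to a ROS topic.
    # Not the API of the packages above; PLC address, registers, and topic are invented.
    import rospy
    from std_msgs.msg import Int32MultiArray
    from pymodbus.client.sync import ModbusTcpClient

    rospy.init_node('modbus_bridge_demo')
    pub = rospy.Publisher('/plc/registers', Int32MultiArray, queue_size=1)
    client = ModbusTcpClient('192.168.0.10', port=502)  # hypothetical PLC address

    rate = rospy.Rate(2)  # poll at 2 Hz
    while not rospy.is_shutdown():
        rr = client.read_holding_registers(0, 8, unit=1)  # 8 registers from address 0
        if rr is not None and hasattr(rr, 'registers'):
            pub.publish(Int32MultiArray(data=list(rr.registers)))
        rate.sleep()
    client.close()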

New driver for DepthSense DS325 3D Camera

From Walter Lucetti via ros-users@

If someone is searching for a driver for the DepthSense DS325 RGB-D time-of-flight camera, I'm glad to say that the first working version is available on GitHub:

https://github.com/Myzhar/ros_depthsense_camera

At this stage the driver correctly publishes the RGB video stream, a simple XYZ point cloud, and an RGB XYZ point cloud.

One of the strengths of the driver is that it uses neither the OpenCV nor the PCL libraries.
It publishes only sensor_msgs::PointCloud2 and sensor_msgs::Image messages.
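
A consumer of the driver's output therefore needs only the standard message types. A minimal listener might look like the following sketch; the topic name is a guess, so check the repository for the real one.

    #!/usr/bin/env python
    # Sketch of consuming the driver's output with standard types only (no PCL).
    # The topic name is a guess; check the ros_depthsense_camera repository.
    import rospy
    from sensor_msgs.msg import PointCloud2
    from sensor_msgs import point_cloud2

    def on_cloud(msg):
        # Iterate XYZ points using the plain sensor_msgs helpers.
        pts = point_cloud2.read_points(msg, field_names=('x', 'y', 'z'), skip_nans=True)
        rospy.loginfo("received cloud with %d valid points", sum(1 for _ in pts))

    rospy.init_node('depthsense_listener')
    rospy.Subscriber('/depthsense/points', PointCloud2, on_cloud)  # hypothetical topic
    rospy.spin()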

The next step will be porting the heaviest functions, such as the RGB-to-XYZ mapping, to CUDA, and using the built-in accelerometer to compensate for the robot's attitude.
I will also write code to simulate a 2D laser scanner to provide full information... if requested.

All this with an eye toward using the full potential of the NVIDIA Jetson TK1...

Any comments or bug reports will be really appreciated.

MIT RACECAR Course using ROS

From Michael Boulet via ros-users@

We would like to announce the recent completion of the Rapid Autonomous Complex-Environment Competing Ackermann-steering Robots (RACECAR) class. RACECAR is a new MIT Independent Activities Period (IAP) course focused on demonstrating high-speed vehicle autonomy in MIT's basement hallways (tunnels). The MIT News Office published an overview article with video at: https://newsoffice.mit.edu/2015/students-autonomous-robots-race-mit-tunnels-0406 . The course website is: http://racecar.mit.edu/ .

 

Instructors provided student teams with a model car outfitted with sensors, embedded processing, and a ROS-based software infrastructure. The base platform is a Traxxas 1:10-scale radio-controlled (RC) brushless motor rally car that is capable of reaching 40+ mph speeds. The sensor suite consists of a Hokuyo 10m scanning lidar, Pixhawk's PX4Flow optical flow camera, a Point Grey imaging camera, and SparkFun's Razor inertial measurement unit. Control and autonomy algorithms are processed on-board with an embedded NVIDIA Jetson TK1 development kit running the Ubuntu Linux operating system with "The Grinch" custom kernel. The TK1's pulse width modulation (PWM) output signals drive the motor electronic speed controller and steering servomotor, bypassing the RC receiver.

 

The system uses the Robot Operating System (ROS) framework to facilitate rapid development. Existing ROS drivers (urg_node, razor_imu_9dof, pointgrey_camera_driver, and px4flow_node) receive data from the sensors. The model car's throttle and steering signals are commanded with a new ROS driver interface to the kernel's sysfs-based PWM subsystem. Students develop software and visualize data through a wireless network connection to a virtual machine running on their personal laptops.
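
To give a flavor of that last piece (a sketch under assumed paths and scaling, not the course's actual driver): a node could map steering commands onto the TK1's sysfs PWM interface by writing duty-cycle values to the exported channel files.

    #!/usr/bin/env python
    # Sketch of a sysfs-PWM steering node (not the RACECAR course's driver).
    # The sysfs path, pulse widths, and steering limit are assumptions, and the
    # PWM channel is assumed to be already exported with its period configured.
    import rospy
    from ackermann_msgs.msg import AckermannDriveStamped

    PWM_DIR = '/sys/class/pwm/pwmchip0/pwm0'  # hypothetical exported channel
    CENTER_NS = 1500000                        # 1.5 ms pulse = servo neutral
    RANGE_NS = 500000                          # +/- 0.5 ms swing
    MAX_STEER = 0.34                           # assumed steering limit [rad]

    def on_cmd(msg):
        norm = max(-1.0, min(1.0, msg.drive.steering_angle / MAX_STEER))
        with open(PWM_DIR + '/duty_cycle', 'w') as f:
            f.write(str(int(CENTER_NS + norm * RANGE_NS)))

    rospy.init_node('sysfs_pwm_steering_demo')
    rospy.Subscriber('/ackermann_cmd', AckermannDriveStamped, on_cmd)  # hypothetical topic
    rospy.spin()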

 

Given the hardware platform and basic teleoperation software stack, teams of 4-5 students prototype autonomy algorithms over an intense two-week period. Students are invited to explore a variety of navigation approaches, from reactive to map-based control. At the end of the class, the teams' solutions are tested in a timed race around a closed-circuit course in MIT's tunnels. In January 2015, three of four teams reached the finish line, with the winning team's average speed exceeding 7 mph.

 

We would like to thank the many contributors to ROS and, in particular, Austin Hendrix for hosting the armhf binaries at the time.

 

RACECAR Instructors

ROS Summer School in Aachen (August)

From Stephan Kallweit 

Oops, we're doing it again! After the very successful ROS Summer School 2014, where we had more than 45 participants from all over the world, we are working again on mobile autonomous robots from the 10th until the 21st of August. The official DAAD (German Academic Exchange Service) Summer School will provide the right starter kit by using our robotic hardware and - of course - ROS. We first start with a few days of introductory courses before we tackle the main tasks of mobile robotics, i.e. perception, localization, and navigation.

A highlight is a competition, like an urban challenge, at the end of the summer school: participants form different teams whose task is to design a typical mobile robotics application, such as indoor/outdoor exploration. They all use the same hardware, powered by their newly learned ROS skills.

The ROS Summer School also includes some leisure activities, such as trips to interesting sights in the region. Last but not least, we have a farewell barbecue at the end.

For more info and some impressions of the ROS Summer School 2014, please check:


Here you will find a video.

ROS support for the Pepper robot

From Vincent Rabaud via ros-users@

On behalf of Aldebaran and SoftBank Robotics, I am pleased to announce official ROS support for the Pepper robot. A local bridge with its NAOqi software is provided for all its sensors, as well as an accurate URDF and meshes. Please find more instructions and tutorials on the ROS wiki page at http://wiki.ros.org/Robots/Pepper

Other good news: Aldebaran is now also providing an official C++ bridge with its NAOqi OS. It is fully open source, under the Apache 2.0 license, with maintainership shared with the community.

As usual, let's discuss all that on the SIG.

Enjoy!
The Aldebaran team

Erle Robotics brain and vehicles

From Víctor Mayoral Vilches of Erle Robotics via ros-users@

Hi everyone,

I'd like to introduce the Erle-Brain (https://erlerobotics.com/blog/product/erle-brain/) Linux autopilot, a ROS-powered embedded computer that lets you build different kinds of drones and robots.

Using Erle-Brain we've built several vehicles (Erle-Copter, Erle-Plane, Erle-Rover, ...), displayed at http://wiki.ros.org/Robots, and we keep exploring new paths. The brain runs the APM autopilot software (in Linux), which connects to ROS through the mavros bridge, allowing the robots to be controlled simply by publishing to ROS topics.

This ROS package (https://github.com/erlerobot/ros_erle_takeoff_land) shows a simple example of how to autonomously take off and land a VTOL vehicle powered by Erle-Brain.
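
In the same spirit (this is a generic mavros sketch, not the contents of ros_erle_takeoff_land), the usual mavros services can be called from Python to set the flight mode, arm, take off, and land; the mode string and altitude below are placeholders.

    #!/usr/bin/env python
    # Generic mavros sketch (not the ros_erle_takeoff_land package itself).
    # Service names follow the usual mavros conventions; values are placeholders.
    import rospy
    from mavros_msgs.srv import CommandBool, CommandTOL, SetMode

    rospy.init_node('takeoff_land_demo')
    rospy.wait_for_service('/mavros/cmd/arming')

    set_mode = rospy.ServiceProxy('/mavros/set_mode', SetMode)
    arm = rospy.ServiceProxy('/mavros/cmd/arming', CommandBool)
    takeoff = rospy.ServiceProxy('/mavros/cmd/takeoff', CommandTOL)
    land = rospy.ServiceProxy('/mavros/cmd/land', CommandTOL)

    set_mode(custom_mode='GUIDED')  # placeholder flight mode
    arm(value=True)                 # arm the motors
    takeoff(min_pitch=0, yaw=0, latitude=0, longitude=0, altitude=2.0)
    rospy.sleep(10.0)               # hover briefly
    land(min_pitch=0, yaw=0, latitude=0, longitude=0, altitude=0)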

We are really excited to see what people can do with our brain and vehicles, so we've decided to launch dronEDU (dronedu.es), a program that offers discounts for educational and research purposes.
Feel free to get in touch with us if you are interested.

Jade Beta

We're happy to announce we're now in the Jade Beta! We're a few days behind schedule, but thanks to the hard work of all our contributors we've now got all of desktop-full released into Jade and available on packages.ros.org.

Even though we're a bit behind schedule, we would like to try to keep the original release date of May 23rd (also World Turtle Day [1] :D). That gives us just under 23 days until the release. We'll keep that date unless we run into a showstopper within desktop-full.

So between now and then I would encourage everyone who is able to:
  • Install `ros-jade-desktop-full` on Ubuntu and test out packages you regularly use.
    • Testing on other platforms is also appreciated!
  • Try out any documentation that you can, including tutorials, package wiki pages, and generated code docs.
  • Continue releasing packages and fill out the gaps between Jade and Indigo where possible.
If you find any issues while testing, please locate the issue tracker (usually on the corresponding wiki page for the package, e.g. wiki.ros.org/rviz) and report the issue there.

Auditing documentation is more challenging simply because there is so much of it, and searching on the wiki does not always make it easy to find pages with distribution-specific content. So to help with this, I've done some special searches locally on the wiki's web server and compiled a list of pages which _may_ need to be updated for Jade:


So if you have time, please look at that list, and do a spot check on any pages that you use or have used in the past. Many of the core documentation pages are absent from that list because I've compiled them separately in a GitHub issue here:


Finally, if you are trying to release a package for Jade and the dependencies are not there yet, please contact the maintainers or ask for help on ros-release@lists.ros.org.

Thanks again to everyone who helped get the Jade beta out (mostly) on time.

Cheers,

P.S. Only ros-jade-desktop is available on armhf right now; we're waiting on an updated set of gazebo5 debs, and then we'll have desktop-full on armhf as well. Also, armhf is Trusty-only right now.

P.P.S. If you are testing gazebo-ros integration, we are aware of an issue with the launch files and are tracking it here:


A workaround is to install `libgazebo5-dev` manually. We hope to have a proper fix out soon.

Clearpath offers ROS consulting service

Reposted from OSRF Blog

Our friends at Clearpath Robotics announced today that they're offering ROS consulting services for enterprise R&D projects. And they've committed to giving part of the proceeds to OSRF, to support the continued development and support of ROS!

This service is something that we've heard requested many times, especially from our industry users, and we're excited that Clearpath is going to offer it. If you're looking for help or advice in using ROS on a current or upcoming project, get in touch with Clearpath.

Find this blog and more at planet.ros.org.

