- Gradle version -> 2.2
- More (and easier) methods of generating message artifacts.
- Build rosjava debs on the build farm without special workarounds
- Android Studio 1.x support
- Android interactions/pairing now stable with tutorials.
- Lots of other minor fixes and updates.
- Partially Catkinized - each Gradle super project is a catkin package
- You can now do entire workspace builds and CI with one command
- ROS Gradle plugins - take a lot of the repetition out of the build.gradle files
- Debs - you no longer need to build every stack to build your own sources
- A Maven repo - you don't even need ROS to access or build with the rosjava jars; just point to our Maven repo on GitHub
- Messages - each package now compiles into its own jar (no superblob)
- Android Studio/Gradle - uses Google's new Android build environment
- IDE/command line/CI are now all compatible
- AARs - takes advantage of the new .aar format for Android libraries
- Partially Catkinized - can do entire workspace builds on these too
- With .aars we can really scale up now
- A Maven repo - just point to this instead of having to build everything
- You don't need to build any sources to build your single application anymore!
- http://wiki.ros.org/rosjava - general/ros/catkin information
- http://wiki.ros.org/android
- https://github.com/rosjava/
- rosjava_core - more Java-oriented documentation
Crosspost from WillowGarage.com
The folks from Oddwerx came to Willow Garage for a visit recently. For those who aren't already familiar with Oddwerx, it is a very cool initiative to turn an iPhone or Android phone into an autonomous robot. In the first video you can see how Ted Larson, Bob Allen and Brandon Blodget from Oddwerx took a PS3 game controller and plumbed it together to send its ROS messages to an Oddwerx robot running its own ROS node for controlling the motors and legs. They took advantage of the existing ROS packages which support interacting with the PS3 joystick, which is in use on many robots including the PR2.
Fashion was front and center in this second video when the Oddwerx robot "grew" purple hair. In addition to responding to PS3 ROS messages, the robot was programmed to send audio/video to enable teleoperation. Since ROS employs publish/subscribe, multiple subscribers can simply listen in on the live video feed.
Oddwerx is now a Kickstarter project.  If you share their vision to turn smartphones into mobile robotic ROS platforms, then support this effort on Kickstarter.
Announcement by Chad Rockey (maintainer of laser_drivers) to ROS users
Hi ROS Community,
I've been working on a driver that connects the sensors in Android devices to the ROS environment. At this time, it only publishes sensor_msgs/NavSatFix messages, but I will soon introduce sensor_msgs/Imu and sensor_msgs/Image to publish data from accelerometers, gyroscopes, magnetometers, and front/rear cameras.
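As a rough illustration of how that data can be consumed on the ROS side, here is a minimal rosjava-style subscriber sketch. The topic name android/fix is an assumption made for illustration, and the API shown follows later rosjava releases, so the driver's actual topic names and the library details may differ.

```java
import org.ros.message.MessageListener;
import org.ros.namespace.GraphName;
import org.ros.node.AbstractNodeMain;
import org.ros.node.ConnectedNode;
import org.ros.node.topic.Subscriber;

/**
 * Sketch of a node that listens to GPS fixes published by an Android device.
 * The topic name "android/fix" is an assumption for illustration only.
 */
public class FixListener extends AbstractNodeMain {

  @Override
  public GraphName getDefaultNodeName() {
    return GraphName.of("fix_listener");
  }

  @Override
  public void onStart(final ConnectedNode connectedNode) {
    Subscriber<sensor_msgs.NavSatFix> subscriber =
        connectedNode.newSubscriber("android/fix", sensor_msgs.NavSatFix._TYPE);
    subscriber.addMessageListener(new MessageListener<sensor_msgs.NavSatFix>() {
      @Override
      public void onNewMessage(sensor_msgs.NavSatFix fix) {
        // Log each incoming fix; a real node would feed this to navigation.
        connectedNode.getLog().info(
            "lat: " + fix.getLatitude() + " lon: " + fix.getLongitude());
      }
    });
  }
}
```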
To get more information and to install, please see the following:
- http://www.ros.org/wiki/androidsensorsdriver/
- https://market.android.com/details?id=org.ros.android.sensors_driver
To file bugs, request features, view source, or contribute UI, translation, or other improvements, please see the Google Code project:
http://code.google.com/p/android-sensors-driver/
I hope everyone finds this useful and I look forward to hearing your feedback and seeing cool uses for Android devices in robotics.
Thanks,
- Chad Rockey
Announcement by Daniel Stonier of Yujin to ros-users
Hi all,
Just a quick bump for those who might be interested in zero-configuration on Android. I've got a rough working implementation with a couple of demo apps. The underlying jmdns still has a few rough edges, but I am currently working with the developer to fix these. Since we're actively looking at issues now, it is probably an appropriate time to hear from others who might have an interest in seeing feature x or y implemented.
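For those curious about the underlying mechanism, the sketch below shows the general shape of advertising and browsing for a ROS master with the plain jmdns API. The service type string _ros-master._tcp.local. and the port are assumptions made for illustration; they are not necessarily the conventions used by the demo apps.

```java
import java.io.IOException;
import java.net.InetAddress;
import javax.jmdns.JmDNS;
import javax.jmdns.ServiceEvent;
import javax.jmdns.ServiceInfo;
import javax.jmdns.ServiceListener;

/**
 * Rough sketch of zero-configuration master discovery using plain jmdns.
 * The "_ros-master._tcp.local." service type is an assumed convention.
 */
public class MasterDiscovery {

  public static void main(String[] args) throws IOException, InterruptedException {
    JmDNS jmdns = JmDNS.create(InetAddress.getLocalHost());

    // Advertise a ROS master running on this machine (default port 11311).
    ServiceInfo info = ServiceInfo.create(
        "_ros-master._tcp.local.", "my_ros_master", 11311, "ROS master");
    jmdns.registerService(info);

    // Browse for masters advertised by other machines on the local network.
    jmdns.addServiceListener("_ros-master._tcp.local.", new ServiceListener() {
      @Override
      public void serviceAdded(ServiceEvent event) {
        // Ask for resolution so serviceResolved() fires with full details.
        event.getDNS().requestServiceInfo(event.getType(), event.getName());
      }

      @Override
      public void serviceRemoved(ServiceEvent event) {
        System.out.println("Master gone: " + event.getName());
      }

      @Override
      public void serviceResolved(ServiceEvent event) {
        System.out.println("Found master: " + event.getName()
            + " on port " + event.getInfo().getPort());
      }
    });

    Thread.sleep(30000);  // Browse for a while, then clean up.
    jmdns.unregisterAllServices();
    jmdns.close();
  }
}
```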
There is a review page on the ROS wiki; links to docs, code, and demos can be found there.
If you have any comments, please add them there.
Regards,
Daniel Stonier
Yesterday at Google I/O, developers at Google and Willow Garage announced a new rosjava library that is the first pure-Java implementation of ROS. This new library was developed at Google with the goal of enabling advanced Android apps for robotics.
The library, tools, and hardware that come with Android devices are well-suited for robotics. Smartphones and tablets are sophisticated computation devices with useful sensors and great user-interaction capabilities. Android devices can also be extended with additional sensors and actuators thanks to the Open Accessory and Android @ Home APIs that were announced at Google I/O.
The new rosjava is currently an alpha release and is still under active development, so there will be changes to the API moving forward. For early adopters, there are Android tutorials to help you send sensor data to and receive data from a robot.
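To give a flavor of what a rosjava node looks like, below is a minimal publisher sketch in the style of the rosjava tutorials. Since the library is an alpha and the API is still changing, treat the class and method names here (AbstractNodeMain, ConnectedNode, CancellableLoop) as illustrative of later releases rather than definitive.

```java
import org.ros.concurrent.CancellableLoop;
import org.ros.namespace.GraphName;
import org.ros.node.AbstractNodeMain;
import org.ros.node.ConnectedNode;
import org.ros.node.topic.Publisher;

/**
 * Minimal publisher in the style of the rosjava tutorials: publishes a
 * std_msgs/String on "chatter" once per second.
 */
public class Talker extends AbstractNodeMain {

  @Override
  public GraphName getDefaultNodeName() {
    return GraphName.of("android_talker");
  }

  @Override
  public void onStart(ConnectedNode connectedNode) {
    final Publisher<std_msgs.String> publisher =
        connectedNode.newPublisher("chatter", std_msgs.String._TYPE);

    connectedNode.executeCancellableLoop(new CancellableLoop() {
      private int count;

      @Override
      protected void loop() throws InterruptedException {
        // Build, fill, and publish a new message each iteration.
        std_msgs.String msg = publisher.newMessage();
        msg.setData("Hello from Android " + count++);
        publisher.publish(msg);
        Thread.sleep(1000);
      }
    });
  }
}
```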
This announcement was part of a broader talk on Cloud Robotics, which was given by Ryan Hickman and Damon Kohler of Google, as well as Ken Conley and Brian Gerkey of Willow Garage. This talk discusses the many possibilities of harnessing the cloud for robotics applications, from providing capabilities like object recognition and voice services, to reducing the cost of robotics hardware, to enabling the development of user interfaces in the cloud that connect to robots remotely. With the new rosjava library, ROS developers can now take advantage of the Android platform to connect more easily to cloud services.
Announcement from Prof. Dr. Matthias Kranz of TUM
The team of the Distributed Multimodal Information Processing Group of Technische Universität München (TUM) is pleased to announce that we ported rospy to run on Android-based mobile devices.
Python for Android on top of the Scripting Layer for Android (SL4A) serves as the basis for our rospy project. We extended the scripting layer, added new support for ctypes and other requirements. Now rospy, roslib, and the std_msgs are working and running with a roscore directly on your mobile phone. To configure a roscore on a standard computer to cooperate with the roscore on the Android device, you simply scan a QR code on the computer's screen to autoconfigure the smartphone. Basic support for OpenCV and the image topics is also included. You are welcome to extend the current state of our work.
You will need a current version of the Scripting Layer for Android (v3) and the newer Python for Android with the ability to import custom modules. You can use any Android device able to run SL4A; in general, that means any recent Android-powered device. Running ROS will not harm your phone at all, and no root access is needed.
You can find our code, basic documentation and a video in our repository and on ROS.org.
A small video showcasing how to control a ROS-based cognitive intelligent environment via an Android-based smartphone is available here.
Robot View, now on the Android Market, lets you create panoramas and upload them to the web with your cellphone. Here's a panorama taken at Stanford, and here's another taken in the WG offices.
This isn't just a cool new app for your phone. It's a preview of what's coming for OpenCV and ROS. The panoramic stitching engine used in Robot View will soon be part of OpenCV. You'll also be able to use it in your own Android apps, because OpenCV now runs on Android.
And, of course, OpenCV and ROS are part of the same happy family, so, yes, it will soon be coming to a robot near you.
Find this blog and more at planet.ros.org.