Only released in EOL distros.

Package Summary

ROS node that acts as a client for a Virtual Reality Peripheral Network (VRPN) server and publishes a TF frame and a TransformStamped message for each tracked body. Tested with the OptiTrack motion capture system from NaturalPoint.

Demonstration Video

Hardware Setup

  • There are a number of devices for which VRPN servers are available. See the complete list on the VRPN website.

  • We have tested this code with the OptiTrack system from NaturalPoint, using the VRPN server implemented in the Tracking Tools software.

  • Currently, this code only publishes the position and orientation of the tracked objects (not their velocities, accelerations, etc.).
  • Some of the instructions below might be specific to the Tracking Tools software.

Installation

  • Use the install_vrpn.sh script in the package to download, compile, and install VRPN.
  • Please consult the VRPN website if you run into any trouble.

Running the code

  1. Run the node from the command line to track an object named Trackable1:
    $ rosrun ros_vrpn_client ros_vrpn_client __name:=Trackable1 _vrpn_server_ip:=192.168.2.110
  2. Visualize in rviz:
    $ roslaunch ros_vrpn_client demo.launch
  3. Check the rate at which the object is being tracked (see the subscriber sketch after this list):
    $ rostopic hz /Trackable1/pose
    • We can track objects at 100 Hz.
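
To consume the pose data from your own node rather than just inspecting it with rostopic, the minimal sketch below subscribes to the pose topic. It assumes the topic carries geometry_msgs/TransformStamped (as stated in the package summary) and that the client node was started with the name Trackable1; adjust both for your setup.

    #!/usr/bin/env python
    # Minimal subscriber sketch for the pose topic published by ros_vrpn_client.
    # Assumes the topic type is geometry_msgs/TransformStamped and that the
    # client node was started with the name Trackable1, so the topic is
    # /Trackable1/pose.
    import rospy
    from geometry_msgs.msg import TransformStamped

    def pose_cb(msg):
        t = msg.transform.translation
        q = msg.transform.rotation
        rospy.loginfo('position (%.3f, %.3f, %.3f)  orientation (%.3f, %.3f, %.3f, %.3f)',
                      t.x, t.y, t.z, q.x, q.y, q.z, q.w)

    if __name__ == '__main__':
        rospy.init_node('trackable_pose_listener')
        rospy.Subscriber('/Trackable1/pose', TransformStamped, pose_cb)
        rospy.spin()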

TF coordinate frames

  1. /optitrak
    • The world frame that we use.
    • The X axis is along the X axis of the calibration pattern.
    • The Z axis points vertically up.
  2. Every tracked object has a coordinate frame whose TF name is the name of the ROS node (given from the launch file or command line); see the TF lookup sketch after this list.
    • Hitting "Reset To Current Orientation" in the Tracking Tools software (Trackable properties) aligns the object's coordinate frame with the /optitrak frame.

Nodes

ros_vrpn_client

  • Run a new instance of this node for each tracked object.
  • The name of this node should match the trackable name published via VRPN; it also becomes the name of the TF coordinate frame for this object (see the sketch below).
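
As a sketch of how a downstream node might handle several tracked objects at once, the example below subscribes to the pose topic of each running client instance. Trackable1 and Trackable2 are hypothetical names used only for illustration; the assumption that each instance publishes geometry_msgs/TransformStamped on <name>/pose follows from the package summary.

    #!/usr/bin/env python
    # Sketch of a consumer for several tracked objects. Assumes one
    # ros_vrpn_client instance per object, each named after its trackable
    # (Trackable1 and Trackable2 are hypothetical names), so each instance
    # publishes geometry_msgs/TransformStamped on <name>/pose.
    import rospy
    from geometry_msgs.msg import TransformStamped

    TRACKABLES = ['Trackable1', 'Trackable2']  # replace with your trackable names

    def make_callback(name):
        def callback(msg):
            t = msg.transform.translation
            rospy.loginfo('%s at (%.3f, %.3f, %.3f)', name, t.x, t.y, t.z)
        return callback

    if __name__ == '__main__':
        rospy.init_node('multi_trackable_listener')
        for name in TRACKABLES:
            rospy.Subscriber('/%s/pose' % name, TransformStamped, make_callback(name))
        rospy.spin()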
