WidowX 250 (Robotic Arm)

In this tutorial, we will show you how to integrate and remotely control the WidowX 250 robotic arm.

Mounting and wiring the arm

Mounting the arm is particularly easy. If you have bought the arm with the modified support plate designed for our robot, you can use screws and nuts to connect the arm to the rover's mounting plate.

If you have the original support plate, you can get the model for 3D printing here (addons section):

3D-printed parts

Use the modified Battery <-> MEB cable, included in the set, to connect the battery to the power socket located on the arm.

Last but not least, connect the arm's U2D2 driver to the rover's computer through the miniUSB socket located on the mounting plate.

Integrating the arm with the system

There are a couple of files that will need to be modified on the rover's system. We will show you how to do this using nano, a command-line text editor, but if you have your own preferred method of editing files, feel free to use it.

We need to make sure the U2D2 device is available at a fixed path on the rover's system. To do this, you can add a rule to udev. Open a new .rules file with nano:

sudo nano /etc/udev/rules.d/u2d2.rules

and paste the following rule:

SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6014", ENV{ID_MM_DEVICE_IGNORE}="1", ATTR{device/latency_timer}="1", SYMLINK+="ttyDXL"

To save the file, type Ctrl+O and Enter to confirm. Type Ctrl+X to exit nano.

For the rule to take effect, reboot the system or just type:

sudo udevadm control --reload-rules && sudo udevadm trigger

The device should now be available under the /dev/ttyDXL path. You can check it by typing:

ls -l /dev/ttyDXL

To integrate the arm, you will need to build some additional ROS packages. Start by creating a local catkin workspace, if you don't have one yet:

mkdir -p ~/ros_ws/src
cd ~/ros_ws
catkin config --extend /opt/ros/melodic

The package sources for the arm are available on GitHub in the interbotix_ros_arms repository. Clone the repository to your source space:

cd ~/ros_ws/src
git clone https://github.com/Interbotix/interbotix_ros_arms.git

On the rover, you will only need the driver node for the arm (the interbotix_sdk package) and the URDF description (the interbotix_descriptions package). To speed up the build, move these two packages directly into your source space and remove the rest of the repository:

mv interbotix_ros_arms/interbotix_descriptions ./
mv interbotix_ros_arms/interbotix_sdk ./
rm -rf interbotix_ros_arms

Now, use rosdep to install any dependent packages:

cd ~/ros_ws
rosdep update
rosdep install --from-paths src -iry

and build the workspace:

catkin build -j 1

Once the packages have been built, you can edit the environment setup file to point to your result space. Open the file in nano:

sudo nano /etc/ros/setup.bash

Comment out the first line by adding a # sign and add the source command for your workspace. The first two lines should look essentially like this:

# source /opt/ros/melodic/setup.bash
source /home/pi/ros_ws/devel/setup.bash

Now, to add the arm's driver to the rover's launch file, open the robot.launch file:

sudo nano /etc/ros/robot.launch

and paste these lines somewhere between the <launch> tags:

<include file="$(find interbotix_sdk)/launch/arm_run.launch">
  <arg name="port"                        value="/dev/ttyDXL"/>
  <arg name="robot_name"                  value="wx250"/>
  <arg name="use_default_rviz"            value="false"/>
  <arg name="use_world_frame"             value="false"/>
  <arg name="use_moveit"                  value="true"/>
  <arg name="arm_operating_mode"          value="position"/>
  <arg name="arm_profile_velocity"        value="131"/>
  <arg name="arm_profile_acceleration"    value="15"/>
  <arg name="gripper_operating_mode"      value="position"/>
  <arg name="use_time_based_profile"      value="false"/>
</include>

You can learn more about the driver's parameters and functionalities at the interbotix_sdk README page.

You can also edit the robot's URDF file to connect the arm's base link to the rover's model. To do this, open the robot.urdf.xacro file:

sudo nano /etc/ros/urdf/robot.urdf.xacro

and paste these lines somewhere between the <robot> tags:

<link name="wx250/base_link"/>

<joint name="arm_joint" type="fixed">
  <origin xyz="0.043 0 -0.001"/>
  <parent link="base_link"/>
  <child link="wx250/base_link"/>
</joint>

To learn more about what the files under /etc/ros are used for and how they relate to each other, visit the Adding additional functionality to the rover section of the ROS Development guide.

That's it! On the next boot, the arm driver node will start together with all the other nodes. You can manually restart the running nodes by typing:

sudo systemctl restart leo

Controlling the arm

Now that you have the driver running, you should see the new ROS topics and services under the /wx250 namespace. For a full description of the ROS API, visit the interbotix_sdk README page. You can test some of the features with the rostopic and rosservice command-line tools:

Retrieve the information about the arm:

rosservice call /wx250/get_robot_info

Publish a position command to the elbow joint:

rostopic pub /wx250/single_joint/command interbotix_sdk/SingleCommand "{joint_name: elbow, cmd: -0.5}"

Turn off the torque on all joints:

rosservice call /wx250/torque_joints_off
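
You can also send the same commands from a Python script with rospy. Below is a minimal sketch; the topic and message type come from the commands above, while the assumption that the torque service uses the std_srvs/Empty type should be verified against the interbotix_sdk README:

#!/usr/bin/env python
# Minimal sketch: command a single joint, then disable torque on all joints.
# Assumption: the torque_joints_off service uses the std_srvs/Empty type.
import rospy
from interbotix_sdk.msg import SingleCommand
from std_srvs.srv import Empty

rospy.init_node("wx250_example")

# Publisher for single-joint position commands
pub = rospy.Publisher("/wx250/single_joint/command", SingleCommand, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect

# Move the elbow joint to -0.5 rad
pub.publish(SingleCommand(joint_name="elbow", cmd=-0.5))
rospy.sleep(2.0)  # wait for the motion to finish

# Turn off the torque on all joints
rospy.wait_for_service("/wx250/torque_joints_off")
torque_off = rospy.ServiceProxy("/wx250/torque_joints_off", Empty)
torque_off()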

The interbotix_ros_arms repository contains some packages that will let you control the arm in different ways. To use them on your computer, you will need to have ROS installed:

Install ROS on your computer

and properly configured to communicate with the nodes running on the rover. For this, you can visit the Connecting other computer to ROS network section of the ROS Development tutorial:

ROS Development

First, install some prerequisites:

sudo apt install python-catkin-tools
sudo -H pip install modern_robotics

The modern_robotics python package is required to run the joystick control example.

and create a catkin workspace:

mkdir -p ~/ros_ws/src
cd ~/ros_ws
catkin config --extend /opt/ros/melodic

Clone the interbotix_ros_arms and leo_description repositories into the source space:

cd ~/ros_ws/src
git clone https://github.com/Interbotix/interbotix_ros_arms.git -b melodic
git clone https://github.com/LeoRover/leo_description.git

Install dependencies using the rosdep tool:

cd ~/ros_ws
rosdep update
rosdep install --from-paths src -iy

and build the workspace:

catkin build

Now, source the devel space to make the new packages visible in your shell environment:

source ~/ros_ws/devel/setup.bash

You will have to do this in every terminal session in which you want to use the packages, so it is convenient to add this line to your ~/.bashrc file.

Visualizing the model

  1. Open RViz by typing rviz in the terminal.

  2. Choose base_link as the Fixed Frame.

  3. On the Displays panel click on Add and choose RobotModel.

  4. For the Robot Description parameter, choose robot_description.

  5. Add another RobotModel display, but for the Robot Description parameter choose wx250/robot_description.

The effect should look similar to this:

Planning the trajectory with MoveIt

The MoveIt motion planning framework will allow us to plan and execute a collision-free trajectory to the destination pose of the end-effector. To use it, first make sure you have the use_moveit parameter for the arm driver set to true:

/etc/ros/robot.launch
<arg name="use_moveit" value="true"/>

On your computer, type:

roslaunch interbotix_moveit interbotix_moveit.launch robot_name:=wx250 rviz_frame:=wx250/base_link

The MoveIt GUI should appear:

On the MotionPlanning panel, click on the Planning tab, choose interbotix_arm for the Planning Group and <current> for the Start State.

There are some predefined poses which you can choose for the Goal State, such as home, sleep or upright. To set the pose manually, navigate to the Displays panel -> MotionPlanning -> Planning Request and check Query Goal State. You should now be able to manually set the end-effector pose for the goal state.

When the goal state is set, click on the Plan button to plan the trajectory (the simulated trajectory visualization should appear) and Execute to send the trajectory to the driver.

If you want to use the MoveIt capabilities in a Python script or a C++ program, please look at the interbotix_moveit_interface example.
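
As an illustration, here is a minimal moveit_commander sketch that plans and executes a motion to the predefined home pose. The robot_description and ns values are assumptions based on the /wx250 namespace used throughout this tutorial:

#!/usr/bin/env python
# Minimal sketch: move the arm to the predefined "home" pose with moveit_commander.
# Assumptions: move_group runs under the /wx250 namespace and uses wx250/robot_description.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("wx250_moveit_example")

group = moveit_commander.MoveGroupCommander(
    "interbotix_arm",                             # planning group used in the MoveIt GUI
    robot_description="wx250/robot_description",  # assumption: description under the robot namespace
    ns="wx250")                                   # assumption: move_group namespace

group.set_named_target("home")  # one of the predefined poses (home, sleep, upright)
group.go(wait=True)             # plan and execute the trajectory
group.stop()

moveit_commander.roscpp_shutdown()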

Using joystick to control the arm

The interbotix_joy_control example package provides the capability to control the movement of the arm (utilizing inverse kinematics) with a PS3, PS4, or Xbox 360 joystick.

To use the package with the arm connected to your rover:

  1. Change the parameters for the driver node. The joy control node uses the pwm mode for the gripper and is better suited to working with the Time-Based Profile. Here are the settings that work well:

    /etc/ros/robot.launch
    <arg name="arm_operating_mode"          value="position"/>
    <arg name="arm_profile_velocity"        value="200"/>
    <arg name="arm_profile_acceleration"    value="200"/>
    <arg name="gripper_operating_mode"      value="pwm"/>
    <arg name="use_time_based_profile"      value="true"/>
  2. Connect the joystick to your computer. You can find the instructions in the package's README file.

  3. Start the joy_control.launch file:

    roslaunch interbotix_joy_control joy_control.launch robot_name:=wx250 controller:=ps3 arm_run:=false

    Change controller to either ps3, ps4 or xbox360 depending on the joystick you have connected.

Using the Python API

Aside from the driver, the interbotix_sdk package also provides a Python API for manipulating the arm. It is designed to work mainly with the position mode for the arm, the pwm mode for the gripper, and the Time-Based Profile. For a start, you can set the same parameters for the driver as in the previous example.

There are some example scripts that demonstrate the use of the API in the interbotix_examples/python_demos directory:

cd ~/ros_ws/src/interbotix_ros_arms/interbotix_examples/python_demos

The bartender.py demo performs some pick, pour and place operations. To run it, first open the file in a text editor and search for this line:

arm = InterbotixRobot(robot_name="wx250s", mrd=mrd)

Change wx250s to wx250 and then type in the terminal:

python bartender.py

Make sure that you are not running any other script that takes control of the arm simultaneously (e.g. the joy control node).

If everything went right, you should see the arm in action.

You can check the other files in the directory for more examples. To view the available functions in the API and their documentation, take a look at the robot_manipulation.py file.
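
As a starting point for your own scripts, a minimal sequence could look like the sketch below. The imports follow the pattern used in the demo scripts; the go_to_home_pose and go_to_sleep_pose method names are assumptions, so verify them against robot_manipulation.py:

#!/usr/bin/env python
# Minimal sketch of the interbotix_sdk Python API.
# Method names are assumptions - verify them against robot_manipulation.py.
from interbotix_sdk.robot_manipulation import InterbotixRobot
from interbotix_descriptions import interbotix_mr_descriptions as mrd

def main():
    arm = InterbotixRobot(robot_name="wx250", mrd=mrd)
    arm.go_to_home_pose()   # move to the predefined "home" pose
    arm.go_to_sleep_pose()  # return to the "sleep" pose before exiting

if __name__ == "__main__":
    main()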
