Autonomy Payload

Autonomy Payload Image


Autonomy Payload and ROS

The Rover Robotics Autonomy Payload comes with all the sensors needed to operate autonomously, but the software for autonomy is left up to the user. Luckily ROS provides a lot of help with this.

Despite what its name suggests, ROS (Robot Operating System) is not a full operating system. ROS is mainly a collection of software packages meant to help roboticists. These packages range from low-level hardware drivers all the way up to high-level autonomy.

How ROS communication works

ROS packages communicate with each other over TCP, which means they can be located on different computers connected over a network. Even when located on the same computer they still use TCP. When a ROS package is run it creates what's called a ROS node. This node then registers with the ROS master in order to communicate with other ROS nodes. The named channels that nodes exchange messages over are called ROS topics.
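This model can be sketched with the standard ROS tutorial packages (the package, node, and topic names below come from ros_tutorials, not from the Autonomy Payload):

```
roscore &                          # start the ROS master that nodes register with
rosrun rospy_tutorials talker &    # a node publishing on the /chatter topic
rosrun rospy_tutorials listener &  # a node subscribing to /chatter
rostopic list                      # lists the active topics, including /chatter
```

Because the connections are TCP, the talker and listener could just as well run on two different machines pointed at the same master.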

Useful ROS Commands

  • rostopic list
    • Run on any computer that can reach the ROS master, this command lists all active ROS topics.
  • rostopic info /topic_name
    • This command shows the message type of the given topic and the ROS nodes publishing and subscribing to it.
  • rostopic echo /topic_name
    • This command prints the data currently being sent over the given topic.

To get you started, Rover Robotics Autonomy Payloads come with ROS pre-installed and a workspace (~/catkin_ws) already created.


USB Cameras

The autonomy payload uses 4 USB cameras. To access the cameras, three things must be true:

  1. The cameras must be plugged in
  2. Two /dev/videoX files must exist for each camera (e.g. /dev/video0, /dev/video1)
  3. The ROS usb_cam node must be running

To run usb_cam once, the following command can be used with the appropriate parameters:

rosrun usb_cam usb_cam_node 
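For example, usb_cam's private parameters can be set directly on the command line (the device path, resolution, and pixel format below are illustrative; adjust them to match your camera):

```
# Run a single usb_cam node against one V4L2 device
rosrun usb_cam usb_cam_node _video_device:=/dev/video0 \
    _image_width:=640 _image_height:=480 _pixel_format:=yuyv
```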

To run multiple ROS packages at a time, ROS uses launch files and the roslaunch command. The autonomy payload is set up to run one launch file on boot. This launch file is specified in a bash script located in the home directory; that script is run by /etc/rc.local, which is one common way that Linux computers run programs on boot. It is recommended to run usb_cam from this launch file for any camera you wish to come up on boot.
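As a sketch, such a launch file could contain one usb_cam node per camera (the node names, device paths, and parameter values below are illustrative, not the payload's actual configuration):

```xml
<launch>
  <!-- One usb_cam node per camera; point each at the camera's first /dev/videoX file -->
  <node name="camera_front" pkg="usb_cam" type="usb_cam_node">
    <param name="video_device" value="/dev/video0" />
    <param name="pixel_format" value="yuyv" />
  </node>
  <node name="camera_rear" pkg="usb_cam" type="usb_cam_node">
    <param name="video_device" value="/dev/video2" />
    <param name="pixel_format" value="yuyv" />
  </node>
</launch>
```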

There are 3 options for accessing the camera feeds once the above three steps are working:

  1. InOrbit
  2. ROS web_video_server
  3. rqt_image_view (ROS install required)

Option 1 (easiest, least robust) - Access through InOrbit

InOrbit is a cloud-based fleet management platform for robotics. They are a startup, so their feature set is changing very rapidly. This document will be updated monthly to reflect their newest features. Last Updated: 9/29/2018

To set up cameras in InOrbit, navigate to Robot Settings → Navigation and select up to two ROS camera topics from the drop-down menus.

Inorbit Camera Settings

Once camera topics are configured, return to the dashboard and click on the icon in the top right-hand corner of the Localization tile.

Inorbit Camera Settings

You should now see a live camera feed along with controls for driving the robot.

Inorbit Teleop View

Currently it can be a little hard to drive the robot with this interface due to the low resolution of the image; try option 2 for higher-resolution video.

Likewise, if you find that you cannot view the image feeds through InOrbit at all, try option 2.

Option 2 (medium) - Web Video Server

Type the IP address of the robot into a browser, followed by a colon and port 8080 (e.g. http://<robot_ip>:8080).

ROS web video server image selection

You should see a list of available images. Select one to view a live feed like the one below. You can lower the quality and resolution of the image to reduce latency by editing the URL; instructions for this are on the ROS web_video_server wiki page.
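For example, web_video_server's stream URL accepts quality and size parameters (the topic name and values below are illustrative):

```
http://<robot_ip>:8080/stream?topic=/usb_cam/image_raw&quality=20&width=320&height=240
```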

ROS web_video_server Image View

Option 3 - rqt_image_view

If you are using a computer with Ubuntu and ROS installed, you can view the image feeds with rqt_image_view. Set ROS_MASTER_URI, along with ROS_HOSTNAME or ROS_IP, correctly and then use the command
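Assuming a desktop ROS install, the invocation looks something like this (the addresses are placeholders for your robot's and your computer's IPs):

```
export ROS_MASTER_URI=http://<robot_ip>:11311   # point at the robot's ROS master
export ROS_IP=<your_computer_ip>                # address other nodes use to reach this machine
rosrun rqt_image_view rqt_image_view            # opens a GUI listing available image topics
```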



If all three of the above options have failed, check to see if the USB cameras are enumerating by issuing an ‘ls /dev’ command via the terminal while SSH’ed into the robot. You should see video0, video1, video2, video3, video4, video5, video6, and video7. Each camera enumerates as 2 different files under Linux: the first is for accessing the YUYV or MJPEG image from the camera, and the second is for accessing the H264 image.

If you do not see all 8 video files listed above, the computer thinks that one of the USB cameras is unplugged. Ensure that all cameras are plugged in and try again.
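As a quick sanity check, a small helper along these lines counts the video device nodes (a sketch: it simply counts names beginning with `video` in a directory, /dev by default):

```shell
# Count device nodes named video* in a directory (default /dev).
# Four cameras x two device files each = 8 expected.
count_video_nodes() {
    ls "${1:-/dev}" | grep -c '^video'
}
```

With all four cameras detected, `count_video_nodes` should print 8.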

If you do see all 8 video files listed above, run the command ‘rostopic list’. You should see topics for each running camera node.

Please contact us with any further questions.


Velodyne Lidar

Ensure that the Velodyne Puck Lite has both power and Ethernet plugged in. You will be able to tell that the lidar is on when you hear a slight whirring noise from it as it spins its lasers.

Follow this tutorial to test out the Velodyne lidar

Once complete, the Velodyne launch file (VLP16_points.launch) can be added to the bash script in the home directory so that it is started on boot.
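For reference, VLP16_points.launch ships with the velodyne_pointcloud package and can also be started manually:

```
roslaunch velodyne_pointcloud VLP16_points.launch
```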

Scissor Lift

On our GitHub page there is a Python library called liblac that can be used to control the scissor lift.

There is a folder in the home directory with examples of how to use liblac.