Care-O-bot 4 Programming
Architecture overview
tbd image
API (topics, services, actions)
The following tables show the programming interfaces to the components of Care-O-bot 4. The components are grouped in the same way as in the diagnostics GUI.
1. Actuators
1.1. Base
interface name | type | message | description
--- | --- | --- | ---
/base/twist_mux/command_navigation | subscriber | geometry_msgs/Twist | Velocity streaming interface for moving the base with smoothing active. Intended for navigation purposes.
/base/twist_mux/command_safe | subscriber | geometry_msgs/Twist | Velocity streaming interface for moving the base with collision checking and smoothing active. Intended for teleoperation purposes.
/base/twist_controller/command | subscriber | geometry_msgs/Twist | Low-level velocity streaming interface for moving the base. Intended for control purposes.
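The base is moved by continuously streaming geometry_msgs/Twist messages to one of the topics above. The following minimal sketch uses the teleoperation topic /base/twist_mux/command_safe; the velocity values and the publishing rate are only example choices.

```python
#!/usr/bin/env python
# Sketch: streaming velocity commands to the base via the safe teleop topic.
# Topic name taken from the table above; velocity values are example choices.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('base_velocity_example')
pub = rospy.Publisher('/base/twist_mux/command_safe', Twist, queue_size=1)

twist = Twist()
twist.linear.x = 0.1   # m/s forward (example value)
twist.angular.z = 0.0  # rad/s

rate = rospy.Rate(20)  # streaming interfaces expect a continuous command stream
while not rospy.is_shutdown():
    pub.publish(twist)
    rate.sleep()
```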
1.2. Torso
interface name | type | message | description
--- | --- | --- | ---
/torso/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the torso.
1.3. Arms
interface name | type | message | description
--- | --- | --- | ---
/arm_left/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the left arm.
/arm_right/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the right arm.
1.4. Grippers
interface name | type | message | description
--- | --- | --- | ---
/gripper_left/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the left gripper.
/gripper_right/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the right gripper.
1.5. Sensorring
interface name | type | message | description
--- | --- | --- | ---
/sensorring/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the sensorring.
1.6. Head
interface name | type | message | description
--- | --- | --- | ---
/head/joint_trajectory_controller/follow_joint_trajectory | action | control_msgs/FollowJointTrajectoryAction | Trajectory interface to move the head.
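All joint actuators (torso, arms, grippers, sensorring, head) expose the same control_msgs/FollowJointTrajectory action interface, so one client pattern covers sections 1.2 to 1.6. The sketch below uses the torso controller and assumes that the controller's joint names are available on the parameter server under "<controller namespace>/joints", as is the case for a standard ros_control joint_trajectory_controller; the target positions and motion time are example values only.

```python
#!/usr/bin/env python
# Sketch: sending a trajectory goal to one of the follow_joint_trajectory
# action servers listed above. The controller namespace and the target
# positions are assumptions -- adapt them to your setup.
import rospy
import actionlib
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

rospy.init_node('trajectory_example')

# Any controller from sections 1.2-1.6 can be used here.
controller_ns = '/torso/joint_trajectory_controller'

client = actionlib.SimpleActionClient(controller_ns + '/follow_joint_trajectory',
                                      FollowJointTrajectoryAction)
client.wait_for_server()

goal = FollowJointTrajectoryGoal()
# A ros_control joint_trajectory_controller exposes its joint names on the
# parameter server under <controller namespace>/joints.
goal.trajectory.joint_names = rospy.get_param(controller_ns + '/joints')

point = JointTrajectoryPoint()
point.positions = [0.0] * len(goal.trajectory.joint_names)  # example target: all joints at zero
point.time_from_start = rospy.Duration(3.0)                 # example motion time
goal.trajectory.points.append(point)

client.send_goal(goal)
client.wait_for_result()
```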
2. Sensors
2.1. Laser scanners
interface name | type | message | description
--- | --- | --- | ---
/base_laser_front/scan | publisher | sensor_msgs/LaserScan | Laser scan data from the front base laser.
/base_laser_left/scan | publisher | sensor_msgs/LaserScan | Laser scan data from the left base laser.
/base_laser_right/scan | publisher | sensor_msgs/LaserScan | Laser scan data from the right base laser.
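A scan topic can be read with a plain sensor_msgs/LaserScan subscriber. The sketch below simply logs the number of range readings and the closest measured range of the front scanner.

```python
#!/usr/bin/env python
# Sketch: reading scan data from one of the base laser scanners.
import rospy
from sensor_msgs.msg import LaserScan

def scan_cb(msg):
    # msg.ranges holds one distance per beam (may contain inf for no return)
    rospy.loginfo("front laser: %d ranges, closest range %.2f m",
                  len(msg.ranges), min(msg.ranges))

rospy.init_node('laser_example')
rospy.Subscriber('/base_laser_front/scan', LaserScan, scan_cb)
rospy.spin()
```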
2.2. Cameras
interface name | type | message | description
--- | --- | --- | ---
/torso_cam3d_left/rgb/image_raw | publisher | sensor_msgs/Image | Color image data from the left torso camera.
/torso_cam3d_left/depth_registered/points | publisher | sensor_msgs/PointCloud2 | RGBD data from the left torso camera.
/torso_cam3d_right/rgb/image_raw | publisher | sensor_msgs/Image | Color image data from the right torso camera.
/torso_cam3d_right/depth_registered/points | publisher | sensor_msgs/PointCloud2 | RGBD data from the right torso camera.
/torso_cam3d_down/rgb/image_raw | publisher | sensor_msgs/Image | Color image data from the downward-facing torso camera.
/torso_cam3d_down/depth_registered/points | publisher | sensor_msgs/PointCloud2 | RGBD data from the downward-facing torso camera.
/sensorring_cam3d_front/depth/points | publisher | sensor_msgs/PointCloud2 | Depth data from the front sensorring camera.
/sensorring_cam3d_back/rgb/image_raw | publisher | sensor_msgs/Image | Color image data from the back sensorring camera.
/sensorring_cam3d_back/depth_registered/points | publisher | sensor_msgs/PointCloud2 | RGBD data from the back sensorring camera.
/head_cam3d/rgb/image_raw | publisher | sensor_msgs/Image | Color image data from the head camera.
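Images and point clouds are consumed with standard sensor_msgs subscribers. The sketch below listens to the left torso camera and only logs basic properties of the received messages; for actual image processing you would typically convert the sensor_msgs/Image with cv_bridge.

```python
#!/usr/bin/env python
# Sketch: subscribing to a color image and a registered point cloud
# from the left torso camera (topic names from the table above).
import rospy
from sensor_msgs.msg import Image, PointCloud2

def image_cb(msg):
    rospy.loginfo("image: %dx%d, encoding %s", msg.width, msg.height, msg.encoding)

def cloud_cb(msg):
    rospy.loginfo("point cloud: %d points", msg.width * msg.height)

rospy.init_node('camera_example')
rospy.Subscriber('/torso_cam3d_left/rgb/image_raw', Image, image_cb)
rospy.Subscriber('/torso_cam3d_left/depth_registered/points', PointCloud2, cloud_cb)
rospy.spin()
```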
3. Safety
interface name | type | message | description
--- | --- | --- | ---
/emergency_stop_state | publisher | cob_msgs/EmergencyStopState | Emergency stop state, covering both the laser-field stop and the emergency stop buttons.
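A minimal monitor for the emergency stop state could look as follows; it is a sketch that assumes the message type cob_msgs/EmergencyStopState from the cob_msgs package and simply logs every received message.

```python
#!/usr/bin/env python
# Sketch: monitoring the emergency stop state.
# Assumption: the topic uses cob_msgs/EmergencyStopState.
import rospy
from cob_msgs.msg import EmergencyStopState

def em_cb(msg):
    # Log the full message (laser stop, button stop, overall state).
    rospy.loginfo("emergency stop state:\n%s", msg)

rospy.init_node('em_stop_example')
rospy.Subscriber('/emergency_stop_state', EmergencyStopState, em_cb)
rospy.spin()
```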
4. IO
interface name | type | message | description
--- | --- | --- | ---
/joy | publisher | sensor_msgs/Joy | Input commands from a joystick.
/sound/say | action | cob_sound/SayAction | Action for text-to-speech output.
/light_base/set_light | action | cob_light/SetLightModeAction | Action to set the base lights.
/light_torso/set_light | action | cob_light/SetLightModeAction | Action to set the torso lights.
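As an example for the IO interfaces, the sketch below triggers a text-to-speech output via the /sound/say action. The action type cob_sound/SayAction and its goal field "text" are assumptions based on the cob_sound package.

```python
#!/usr/bin/env python
# Sketch: triggering text-to-speech via the /sound/say action.
# Assumption: the action type is cob_sound/SayAction with a goal field 'text'.
import rospy
import actionlib
from cob_sound.msg import SayAction, SayGoal

rospy.init_node('say_example')
client = actionlib.SimpleActionClient('/sound/say', SayAction)
client.wait_for_server()

goal = SayGoal()
goal.text = "Hello, I am Care-O-bot"  # example sentence
client.send_goal(goal)
client.wait_for_result()
```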
5. Other
interface name | type | message | description
--- | --- | --- | ---
/power_state | publisher | cob_msgs/PowerState | Information about the battery status (e.g. remaining capacity).
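The battery status can be monitored with a simple subscriber. The message type cob_msgs/PowerState is an assumption based on the cob_msgs package, so the sketch only logs the full message instead of individual fields.

```python
#!/usr/bin/env python
# Sketch: reading the battery status.
# Assumption: the topic uses cob_msgs/PowerState.
import rospy
from cob_msgs.msg import PowerState

def power_cb(msg):
    rospy.loginfo("power state:\n%s", msg)

rospy.init_node('power_state_example')
rospy.Subscriber('/power_state', PowerState, power_cb)
rospy.spin()
```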
Workspace setup
The setup on Care-O-bot is organized so that multiple people can develop on the robot without interfering with each other. Please avoid 'group accounts'; we recommend that each user develops on their own account (even for demos we recommend creating separate accounts). This setup prevents one user from breaking the setup of another.
On each robot there is a user called 'robot' which keeps a running version of all bringup-related packages. The robot account should not be used for development or demos; it only serves as an underlay for each user's individual account.
1. Filesystem
The PCs within the robot share a common NFS file system which holds all home directories. This means that code modifications in the home directory are immediately available on all PCs, so code only needs to be built on a single PC. We recommend logging in via ssh to the base PC (cob4-X-b1) to check out, modify and build code.
2. catkin workspaces
As mentioned above, we recommend using the robot account as an underlay to your user account. The user accounts on the robot are already set up this way by default: the workspaces are chained so that /opt/ros/indigo forms the base, the robot account's workspace overlays it, and your user workspace overlays both (you can verify this in the ROS_PACKAGE_PATH). There is no need for a normal developer to create overlays of the bringup packages.
3. remote bringup
In order to start nodes on the various robot PCs, we use the machine tags of the roslaunch syntax. This allows the user to start a launch file on any robot PC; the nodes are then automatically deployed to their assigned PCs.
This is already done for the bringup layer, and we recommend creating similar launch files for your additional nodes.
Collaboration and coding guidelines
We are using GitHub as our main collaboration tool; it provides source code hosting, issue tracking and pull requests. We've compiled some best practices on how to use GitHub with our repositories.
Please be aware of our coding guidelines when extending or modifying our code base.