This project demonstrates teleoperation of a Baxter robot (enabling the Baxter robot to follow human arm motions) using human pose estimation.
- System: Ubuntu 18.04
- Sensor: Kinect v2
- Robot: Baxter pkg
Follow the instructions provided in the link below to set up Kinect v2:
Install the kinect2_tracker package for human pose tracking:
Note: Use NiTE-Linux-x64-2.2 instead of NiTE-Linux-x64-2.0.
Set up the Baxter robot by following the instructions below:
Clone and set up the kinect_based_arm_tracking package:
- Source Code: kinect_based_arm_tracking GitHub Repository
Check the README.md in each subfolder of src/ for details.
To run the demo, open multiple terminals and execute the following commands:
./baxter.sh
rosrun baxter_tools enable_robot.py -e  # enable the robot at startup
roslaunch kinect2_tracker tracker.launch
Note: It may take some time for the tracker to model the person. You may need to retry this step several times until the person's "tf" frames are successfully published.
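The wait-and-retry step above can be sketched as a small polling helper. This is a minimal illustration, not part of the package: the predicate you pass in is hypothetical, and in practice it would query the tf tree (e.g. via `tf.TransformListener.frameExists`) for the person's frames.

```python
import time

def wait_for(predicate, timeout=10.0, interval=0.5):
    """Poll `predicate` until it returns True or `timeout` seconds elapse.

    Returns True as soon as the predicate succeeds, False on timeout.
    In this demo the predicate would check whether the tracker has
    started publishing the person's tf frames (hypothetical check).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False
```

If `wait_for` returns False, restart the tracker launch file and step in front of the sensor again.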
cd ~/ros_ws/src/kinect_based_arm_tracking/scripts
python tf_listen_v8_puber.py
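The publisher script listens to the person's tf frames; the core geometry behind turning tracked keypoints into arm angles can be sketched as below. This is a hedged illustration of the kind of computation such a node performs, assuming 3-D keypoint positions for shoulder, elbow, and wrist; it is not the actual code of `tf_listen_v8_puber.py`.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (radians) between segments b->a and b->c.

    E.g. with a = shoulder, b = elbow, c = wrist, this gives the
    elbow flexion angle from three 3-D keypoints.
    """
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
```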
cd ~/ros_ws/src/kinect_based_arm_tracking/scripts
python tf_listen_v8_controller.py
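The controller script maps the published human arm angles onto Baxter joint commands. One detail any such controller needs is clamping commands to the robot's joint limits; a minimal sketch follows. The limit values shown are approximate (check the Baxter SDK specifications for exact ranges), and the function is an assumption about the controller's behavior, not the script's actual code.

```python
# Approximate Baxter left-arm joint limits in radians; exact values
# are in the Baxter SDK hardware specifications.
JOINT_LIMITS = {
    "left_s0": (-1.70, 1.70),
    "left_s1": (-2.14, 1.04),
    "left_e0": (-3.05, 3.05),
    "left_e1": (-0.05, 2.61),
}

def clamp_command(cmd):
    """Clip each joint angle in `cmd` to its allowed range.

    `cmd` is a dict of joint name -> target angle, the same shape
    expected by the Baxter SDK's joint position command interface.
    """
    return {j: min(max(v, JOINT_LIMITS[j][0]), JOINT_LIMITS[j][1])
            for j, v in cmd.items()}
```

A controller would clamp every command before sending it, so that a noisy pose estimate can never drive a joint past its mechanical limit.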