ROS Summer School Materials

 

2023-Q3-AI 9. Robot Operating System (ROS)

 

9.1. Video / Materials

Video: https://youtube.com/live/GcECaGVu2Uw?feature=share

Jamboard: https://jamboard.google.com/d/1sDvr47MHUQGh1haXVy4CX-aU3o-7d5wUMksZopvdFjs/edit?usp=sharing

 

9.2. Implement Control of Turtlesim bot

  1. Implement control of the Turtlesim bot so that it listens on the topic /driveRectangle and, when triggered, drives a rectangle and returns to its starting position

  2. Implement control of the Turtlesim bot as an action server /driveRectangleAction that gives feedback every time the robot turns and when it finishes the action

Submit the code as a ZIP and screenshots of the best results.
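The rectangle drive in task 1 can be separated into a pure command generator that a rospy node would then play back as geometry_msgs/Twist messages on /turtle1/cmd_vel. This is a minimal sketch, not the assignment's required implementation; the side lengths, speeds, and the idea of (linear, angular, duration) tuples are all illustrative assumptions:

```python
import math

# Hedged sketch of the rectangle-driving logic for the turtlesim task.
# A rospy node would subscribe to /driveRectangle and, on each message,
# publish these (linear_x, angular_z, duration_s) commands as
# geometry_msgs/Twist on /turtle1/cmd_vel, sleeping for each duration.
# All numeric defaults here are illustrative assumptions.

def rectangle_commands(width=2.0, height=1.0, speed=1.0, turn_speed=1.0):
    """Return (linear_x, angular_z, duration_s) tuples that drive a
    width x height rectangle and end at the starting pose."""
    quarter_turn = (math.pi / 2) / turn_speed  # time for a 90-degree turn
    commands = []
    for side in (width, height, width, height):
        commands.append((speed, 0.0, side / speed))       # drive one side
        commands.append((0.0, turn_speed, quarter_turn))  # turn 90 deg left
    return commands
```

After four side-plus-turn pairs the turtle has turned a full 360 degrees and is back at its start pose, which matches the "returns to starting position" requirement; the same command list could also drive the /driveRectangleAction server in task 2, publishing feedback after each turn command.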

 

10.1. Video / Materials

Video: https://youtube.com/live/G0OVcx7C_xM?feature=share

Jamboard: https://jamboard.google.com/d/1p2xqBMTT6hdtrw9ZH3vD0BXKnVjNfh6TWFrOXGMNY8Y/edit?usp=sharing

 

10.2. Implement ROS node to control Baxter’s head

Implement a ROS node to control Baxter’s head:

  1. With keyboard buttons left and right

  2. Add eye animations from either of these packages: https://github.com/runbren/baxter-eyes/tree/master or https://github.com/cwru-robotics/baxter_facial_animation

Test on a real Baxter robot or in the Gazebo simulator.

Submit the code as a ZIP and screenshots of the best results.
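The keyboard part of this task reduces to mapping keypresses to a new pan angle. A minimal, ROS-free sketch of that mapping is below; in a real node the keypresses would be read with termios/tty and the result passed to baxter_interface.Head().set_pan(angle). The step size and the +/-1.57 rad pan limit are assumptions, not Baxter specifications:

```python
# Hedged sketch of the keyboard-to-pan mapping for the Baxter head task.
# A rospy node would read single keypresses (e.g. via termios/tty) and
# call baxter_interface.Head().set_pan(angle) with the returned value.
# PAN_LIMIT and PAN_STEP are illustrative assumptions.

PAN_LIMIT = 1.57   # assumed head pan limit in radians
PAN_STEP = 0.1     # radians moved per keypress

def next_pan(current, key):
    """Return the new pan angle after a 'left'/'right' keypress,
    clamped to the assumed pan limits; other keys leave it unchanged."""
    if key == "left":
        current += PAN_STEP
    elif key == "right":
        current -= PAN_STEP
    return max(-PAN_LIMIT, min(PAN_LIMIT, current))
```

Keeping the mapping pure like this makes it easy to unit-test without a robot, while the eye animations from the linked packages can run as a separate node alongside it.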

 

2023-Q3-AI 13. Baxter Head

 

13.1. Video / Materials

Video: https://youtube.com/live/X30jmGTYfXY?feature=share

Jamboard: https://jamboard.google.com/d/1vb2imRCfqC9p7OI9WArv-P0cLu6RlHpoO61uOaa8x0w/edit?usp=sharing

 

13.2. Instructions for testing on the real robot

  1. cd ~/ros_ws_py3/src

  2. git clone https://github.com/evaldsurtans/course-baxter-head-2023-q3.git

  3. rename the cloned folder: mv ./course-baxter-head-2023-q3 ./baxter_head

  4. cd ~/ros_ws_py3

  5. catkin_make

  6. make sure that the IP addresses in ~/ros_ws_py3/baxter.sh and ~/ros_ws_py2/baxter.sh are correct for Baxter and the local computer

  7. launch in one of the terminals: ~/ros_ws_py2/baxter.sh, then rosrun baxter_interface joint_trajectory_action_server.py

  8. launch in another terminal: ~/ros_ws_py2/baxter.sh, then rosrun baxter_tools camera_control.py -l to list cameras; configure the camera resolution and open the cameras - head_camera and one of the hand cameras must be open

  9. launch in another terminal: ~/ros_ws_py3/baxter.sh, then python3 ~/ros_ws_py3/src/baxter_head/scripts/main.py

  10. afterwards you should be able to operate Baxter
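Step 6 can be sanity-checked before launching anything. A minimal sketch, assuming the standard Rethink baxter.sh template with `baxter_hostname` and `your_ip` variables (those names are an assumption here):

```shell
# Hedged sketch: print the host/IP settings from a baxter.sh file so they
# can be verified before launching. The variable names baxter_hostname and
# your_ip are assumed from the standard Rethink baxter.sh template.
check_baxter_ips() {
    grep -E '^(baxter_hostname|your_ip)=' "$1"
}
```

Usage: `check_baxter_ips ~/ros_ws_py2/baxter.sh` (and likewise for ~/ros_ws_py3/baxter.sh); if the printed addresses do not match the robot and your machine, fix them before step 7.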

Simulator: roslaunch baxter_gazebo baxter_world.launch

RViz 3D robot control: roslaunch baxter_t_config demo_baxter.launch

Both need rosrun baxter_interface joint_trajectory_action_server.py running in the background.

Baxter environment file for simulator: http://share.yellowrobot.xyz/quick/2023-7-28-BBB0A9C6-1C35-4D0B-BE9D-38ED51D27B0A.zip

Baxter environment file for real-robot: http://share.yellowrobot.xyz/quick/2023-7-28-1BE5CE92-89E9-4133-81F9-6B93F9CC0E0D.zip

Updated VM: https://drive.google.com/file/d/10FDTlS00mk7hqmCL28lc8hwT1q5fKbCX/view?usp=share_link

 

13.3. Improve baxter_head package

Improve the baxter_head package and submit it as a ZIP.

Any improvements are welcome, but please test them on the real robot or in the Gazebo simulator.

Some possible improvements:

  1. Add an option to remember multiple arm keypoints and execute them as a sequence

  2. Recognize a hand near the gripper using the camera and IR sensor, then grab it with the gripper

  3. Implement an additional Python 3.8+ node with better face detection and face embeddings to remember faces

  4. Implement image segmentation to recognize objects

  5. Implement speech-to-text and language models to receive commands, and respond using text-to-speech models
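Improvement 1 above is mostly bookkeeping around the arm API. A minimal, ROS-free sketch of that bookkeeping follows; in a real node the angles would come from baxter_interface.Limb.joint_angles() and each stored keypoint would be replayed with Limb.move_to_joint_positions(), while this class only manages the sequence itself:

```python
# Hedged sketch of improvement 1: remember multiple arm keypoints and play
# them back as a sequence. In a real node the snapshots would come from
# baxter_interface.Limb.joint_angles() and be executed with
# Limb.move_to_joint_positions(); this class only keeps the bookkeeping.

class KeypointSequence:
    def __init__(self):
        self._keypoints = []

    def record(self, joint_angles):
        """Store a snapshot of joint angles (dict: joint name -> radians)."""
        self._keypoints.append(dict(joint_angles))  # copy, so later mutation
                                                    # of the source dict is safe

    def clear(self):
        """Forget all recorded keypoints."""
        self._keypoints = []

    def playback(self):
        """Yield stored keypoints in recording order for execution."""
        for keypoint in self._keypoints:
            yield keypoint
```

Binding record/clear/playback to three extra keyboard keys in the existing node would be enough to demonstrate the feature on the robot or in Gazebo.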