Video: https://youtube.com/live/GcECaGVu2Uw?feature=share
Jamboard: https://jamboard.google.com/d/1sDvr47MHUQGh1haXVy4CX-aU3o-7d5wUMksZopvdFjs/edit?usp=sharing
Implement control of the Turtlesim bot that listens on the topic /driveRectangle, then drives a rectangle and returns the robot to its starting position.
Implement control of the Turtlesim bot as an action server /driveRectangleAction that gives feedback every time the robot turns and when it finishes the action (minimal sketches for both tasks follow below).
Submit the code as a ZIP along with screenshots of the best results.
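A minimal sketch of the topic-driven version, assuming /driveRectangle carries an empty trigger message (std_msgs/Empty, an assumption) and that open-loop timing is acceptable; a robust submission would use /turtle1/pose feedback to verify the return to the starting position:

```python
#!/usr/bin/env python3
import math
import rospy
from std_msgs.msg import Empty
from geometry_msgs.msg import Twist


class RectangleDriver:
    def __init__(self):
        self.pub = rospy.Publisher('/turtle1/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/driveRectangle', Empty, self.on_trigger)

    def move(self, linear, angular, duration):
        """Publish a constant velocity for `duration` seconds, then stop."""
        twist = Twist()
        twist.linear.x = linear
        twist.angular.z = angular
        rate = rospy.Rate(10)
        end = rospy.Time.now() + rospy.Duration(duration)
        while rospy.Time.now() < end and not rospy.is_shutdown():
            self.pub.publish(twist)
            rate.sleep()
        self.pub.publish(Twist())  # zero velocity = stop

    def on_trigger(self, _msg):
        # Two sides of 2 m and two of 1 m with four 90-degree turns
        # bring the turtle back to its starting pose.
        for side in range(4):
            length = 2.0 if side % 2 == 0 else 1.0
            self.move(1.0, 0.0, length)       # drive one side at 1 m/s
            self.move(0.0, math.pi / 4, 2.0)  # pi/4 rad/s for 2 s = 90 degrees


if __name__ == '__main__':
    rospy.init_node('drive_rectangle')
    RectangleDriver()
    rospy.spin()
```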
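A sketch of the action-server version. No DriveRectangle action type ships with ROS, so the definition in the comment is hypothetical and would go into your own package (here called my_package, also an assumption); the sketch reuses the RectangleDriver.move() helper from above, so it assumes that class is in the same file or importable:

```python
#!/usr/bin/env python3
# Hypothetical DriveRectangle.action, placed in the package's action/
# directory and built with catkin:
#   float32 width     # goal
#   float32 height
#   ---
#   bool success      # result
#   ---
#   string status     # feedback
import math
import rospy
import actionlib
from my_package.msg import (DriveRectangleAction, DriveRectangleFeedback,
                            DriveRectangleResult)


class DriveRectangleServer:
    def __init__(self, driver):
        self.driver = driver  # RectangleDriver from the sketch above
        self.server = actionlib.SimpleActionServer(
            'driveRectangleAction', DriveRectangleAction,
            execute_cb=self.execute, auto_start=False)
        self.server.start()

    def execute(self, goal):
        for side in range(4):
            length = goal.width if side % 2 == 0 else goal.height
            self.driver.move(1.0, 0.0, length)        # drive one side
            self.driver.move(0.0, math.pi / 4, 2.0)   # 90-degree turn
            # feedback after every turn, as the task requires
            self.server.publish_feedback(
                DriveRectangleFeedback(status='completed turn %d' % (side + 1)))
        # result when the whole rectangle is done
        self.server.set_succeeded(DriveRectangleResult(success=True))


if __name__ == '__main__':
    rospy.init_node('drive_rectangle_action')
    DriveRectangleServer(RectangleDriver())  # class from the previous sketch
    rospy.spin()
```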
Video: https://youtube.com/live/G0OVcx7C_xM?feature=share
Jamboard: https://jamboard.google.com/d/1p2xqBMTT6hdtrw9ZH3vD0BXKnVjNfh6TWFrOXGMNY8Y/edit?usp=sharing
Implement a ROS node to control Baxter's head (a minimal sketch follows below):
Pan left and right with the keyboard's left and right arrow keys
Add eye animations from either of these packages: https://github.com/runbren/baxter-eyes/tree/master or https://github.com/cwru-robotics/baxter_facial_animation
Test on the real Baxter robot or in the Gazebo simulator.
Submit the code as a ZIP along with screenshots of the best results.
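A minimal sketch of the keyboard pan control, assuming the standard baxter_interface SDK is sourced via baxter.sh. Head.pan() and Head.set_pan() exist in the public SDK, but the set_pan() speed argument differs between SDK versions, and the 1.4 rad limit is an approximation of the head-pan joint range, so verify both against your setup:

```python
#!/usr/bin/env python3
import sys
import termios
import tty
import rospy
import baxter_interface


def getch():
    """Read one raw character from stdin (blocking)."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        ch = sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)
    return ch


def main():
    rospy.init_node('head_keyboard_control')
    head = baxter_interface.Head()
    pan = head.pan()          # current pan angle in radians
    step = 0.1                # radians per key press
    print("left/right arrows pan the head, 'q' quits")
    while not rospy.is_shutdown():
        ch = getch()
        if ch == 'q':
            break
        if ch == '\x1b':      # arrow keys arrive as ESC [ D / ESC [ C
            seq = getch() + getch()
            if seq == '[D':                  # left arrow
                pan = min(pan + step, 1.4)   # positive pan turns left
            elif seq == '[C':                # right arrow
                pan = max(pan - step, -1.4)
            else:
                continue
            head.set_pan(pan)


if __name__ == '__main__':
    main()
```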
Video: https://youtube.com/live/X30jmGTYfXY?feature=share
Jamboard: https://jamboard.google.com/d/1vb2imRCfqC9p7OI9WArv-P0cLu6RlHpoO61uOaa8x0w/edit?usp=sharing
cd ~/ros_ws_py3/src
git clone https://github.com/evaldsurtans/course-baxter-head-2023-q3.git
rename using: mv ./course-baxter-head-2023-q3 ./baxter_head
cd ~/ros_ws_py3
catkin_make
Make sure the IP addresses in ~/ros_ws_py3/baxter.sh and ~/ros_ws_py2/baxter.sh are correct for both Baxter and the local computer.
In one terminal, run ~/ros_ws_py2/baxter.sh, then rosrun baxter_interface joint_trajectory_action_server.py
In another terminal, run ~/ros_ws_py2/baxter.sh, then rosrun baxter_tools camera_control.py -l
Configure the camera resolution and open the cameras: head_camera must be open, plus one of the hand cameras.
In a third terminal, run ~/ros_ws_py3/baxter.sh, then python3 ~/ros_ws_py3/src/baxter_head/scripts/main.py
Afterwards you should be able to operate Baxter.
Simulator: roslaunch baxter_gazebo baxter_world.launch
RViz 3D robot control: roslaunch baxter_t_config demo_baxter.launch
Both need rosrun baxter_interface joint_trajectory_action_server.py running in the background.
Baxter environment file for simulator: http://share.yellowrobot.xyz/quick/2023-7-28-BBB0A9C6-1C35-4D0B-BE9D-38ED51D27B0A.zip
Baxter environment file for real-robot: http://share.yellowrobot.xyz/quick/2023-7-28-1BE5CE92-89E9-4133-81F9-6B93F9CC0E0D.zip
Updated VM: https://drive.google.com/file/d/10FDTlS00mk7hqmCL28lc8hwT1q5fKbCX/view?usp=share_link
Improve the baxter_head package and submit it as a ZIP.
Any improvements are welcome, but please test them on the real robot or in the Gazebo simulator.
Some possible improvements (sketches for a few of them follow after this list):
Add an option to remember multiple arm keypoints and execute them as a sequence
Recognize a hand near the gripper using the camera and IR sensor, then grab with the gripper
Implement an additional Python 3.8+ node that handles face detection better and uses embeddings to remember faces
Implement image segmentation to recognize objects
Implement speech-to-text and language models to receive commands, and respond using text-to-speech models
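For the keypoint-sequence idea, a minimal sketch using baxter_interface.Limb to record joint angles on demand and replay them in order; the method names follow the public SDK, but verify them against your SDK version:

```python
#!/usr/bin/env python3
import rospy
import baxter_interface


def main():
    rospy.init_node('keypoint_sequence')
    limb = baxter_interface.Limb('left')
    keypoints = []
    print("ENTER records a keypoint, 'p' replays the sequence, 'q' quits")
    while not rospy.is_shutdown():
        cmd = input('> ').strip()
        if cmd == 'q':
            break
        elif cmd == 'p':
            for angles in keypoints:
                # blocks until the arm reaches each stored pose
                limb.move_to_joint_positions(angles)
        else:
            keypoints.append(limb.joint_angles())
            print('recorded keypoint %d' % len(keypoints))


if __name__ == '__main__':
    main()
```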
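For the grab-on-approach idea, a sketch that closes the gripper when the wrist IR range sensor reports a nearby object. The topic name /robot/range/left_hand_range/state, the 5 cm threshold, and the use of Gripper.position() as a percent-open reading are assumptions to adapt to your robot:

```python
#!/usr/bin/env python3
import rospy
import baxter_interface
from sensor_msgs.msg import Range


class GrabOnApproach:
    def __init__(self):
        self.gripper = baxter_interface.Gripper('left')
        self.gripper.calibrate()
        rospy.Subscriber('/robot/range/left_hand_range/state',
                         Range, self.on_range)

    def on_range(self, msg):
        # close when something comes within 5 cm of the wrist IR sensor
        # and the gripper is still open (position() reports percent open)
        if 0.0 < msg.range < 0.05 and self.gripper.position() > 90.0:
            self.gripper.close()


if __name__ == '__main__':
    rospy.init_node('grab_on_approach')
    GrabOnApproach()
    rospy.spin()
```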
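For the face-detection node, a sketch that runs an OpenCV Haar cascade on Baxter's head-camera stream. The topic name /cameras/head_camera/image and the cv2.data.haarcascades path are assumptions (the latter is present in the pip opencv-python package but may differ in a ROS-installed cv2); embedding-based recognition to remember faces would be layered on top of this detection step:

```python
#!/usr/bin/env python3
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image


class FaceDetector:
    def __init__(self):
        self.bridge = CvBridge()
        self.cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
        rospy.Subscriber('/cameras/head_camera/image', Image, self.on_image)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = self.cascade.detectMultiScale(
            gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            rospy.loginfo('face at x=%d y=%d (%dx%d)', x, y, w, h)


if __name__ == '__main__':
    rospy.init_node('face_detector')
    FaceDetector()
    rospy.spin()
```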