Hand Gesture Control in Robosuite for Block Lifting Task

[Image: industrial robot arm]

Click here to see our code

Demo Video

Click here to see our demo video

Project Overview

Writing policies by hand in Robosuite for a robot to complete a task can be difficult and time-consuming. Our project addresses this by tracking hand gestures and mapping them directly onto the Panda robot in Robosuite. Specifically, we solve the block lifting task by using our hand to guide the robot to pick up the block.

To track hand gestures, we used OpenCV and Python to map the joints of the hand, then transferred the coordinates of the hand landmarks through a socket to an mjpython process, which uses them to adjust the robot arm's position in Robosuite. Moving the hand up, down, left, or right moves the arm in the corresponding direction. To close the end effector, we pinch the index finger and thumb together; when the program detects that the distance between these two fingertips is small, it prompts the simulated robot to close its grip as well. The three pieces of this pipeline, tracking, socket transfer, and action mapping, are sketched below.

Our project demonstrates one way to teleoperate a virtual robot, for example to collect demonstrations for imitation learning, which could remove some of the cost barriers associated with robotics research.
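The tracking loop below is a minimal sketch of the camera side. The write-up names only OpenCV, so using MediaPipe Hands for the landmark detection is an assumption (it is a common companion to OpenCV for this task), and the pinch threshold is hypothetical. Landmark indices follow MediaPipe's layout: 0 is the wrist, 4 the thumb tip, and 8 the index fingertip.

```python
import math

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
PINCH_THRESHOLD = 0.05  # hypothetical cutoff, in normalized image units

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            thumb, index = lm[4], lm[8]
            # The grip closes when thumb tip and index tip nearly touch.
            pinch = math.dist((thumb.x, thumb.y), (index.x, index.y))
            grip_closed = pinch < PINCH_THRESHOLD
            # The wrist landmark stands in for the hand's frame position.
            wrist = lm[0]
            # (wrist.x, wrist.y, grip_closed) is what gets sent onward.
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```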
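Next, the hand state crosses from the tracking process to the simulator. The sketch below assumes a local TCP socket on port 50007 and newline-delimited JSON messages; the write-up only says the coordinates are transferred through a socket, so the port and message format are illustrative.

```python
import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("localhost", 50007))  # the mjpython process listens here

def send_hand_state(x, y, grip_closed):
    """Ship one frame's hand state as a newline-delimited JSON record."""
    msg = json.dumps({"x": x, "y": y, "grip": bool(grip_closed)}) + "\n"
    sock.sendall(msg.encode("utf-8"))

# Called once per camera frame from the tracking loop above:
# send_hand_state(wrist.x, wrist.y, grip_closed)
```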
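Finally, on the mjpython side, the received coordinates become delta-pose actions for the Panda. This sketch assumes Robosuite's Lift environment with the default OSC_POSE controller, whose action vector is [dx, dy, dz, dax, day, daz, gripper] with gripper = 1 closing and -1 opening; the motion gain and the hand-to-robot axis mapping are our assumptions.

```python
import json
import socket

import numpy as np
import robosuite as suite

env = suite.make("Lift", robots="Panda", has_renderer=True,
                 use_camera_obs=False)
env.reset()

# Accept the tracker's connection on the same port it dials.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("localhost", 50007))
server.listen(1)
conn, _ = server.accept()

SCALE = 2.0   # hypothetical gain from hand motion to end-effector delta
prev = None   # previous (x, y) hand position
buf = b""
while True:
    data = conn.recv(1024)
    if not data:
        break
    buf += data
    while b"\n" in buf:
        line, buf = buf.split(b"\n", 1)
        state = json.loads(line)
        x, y, grip = state["x"], state["y"], state["grip"]
        if prev is None:
            prev = (x, y)
        # Left/right hand motion drives the arm's y axis; up/down drives z.
        # Image y grows downward, so it is negated for the robot's z axis.
        dy = (x - prev[0]) * SCALE
        dz = (prev[1] - y) * SCALE
        prev = (x, y)
        action = np.array([0.0, dy, dz, 0.0, 0.0, 0.0,
                           1.0 if grip else -1.0])
        env.step(action)
        env.render()
```

Sending deltas rather than absolute positions keeps the mapping robust to where the hand happens to start in the camera frame.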