Our paper, “Vision-Based Approximate Estimation of Muscle Activation Patterns for Tele-Impedance,” has been accepted to IEEE Robotics and Automation Letters! I am really grateful to all co-authors, and especially to Prof. Lee for suggesting such a good research topic.
Vision-Based Approximate Estimation of Muscle Activation Patterns for Tele-Impedance
Hyemin Ahn, Youssef Michel, Thomas Eiband, and Dongheui Lee.
Abstract: It lies in human nature to properly adjust muscle force in order to perform a given task successfully. While transferring this control ability to robots has been a major concern among researchers, there has been no attempt to make a robot learn how to control its impedance solely from visual observations. Rather, research on tele-impedance usually relies on special devices such as EMG sensors, which are less accessible and generalize less well than simple RGB webcams. In this letter, we propose a system for vision-based tele-impedance control of robots, based on approximately estimated muscle activation patterns. These patterns are obtained from the proposed deep-learning-based model, which takes RGB images from an affordable commercial webcam as inputs. Notably, our model does not require humans to attach any visible markers to their muscles. Experimental results show that our model enables a robot to mimic how humans adjust their muscle force to perform a given task successfully. Although our experiments focus on tele-impedance control, our system can also serve as a baseline for improving vision-based learning from demonstration, which would then also incorporate variable stiffness control information for successful task execution.
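For readers curious about what such a pipeline could look like in code, below is a minimal, purely illustrative sketch (not the model from the paper): a small CNN maps a webcam RGB frame of the operator's arm to a few normalized muscle activation values, which are then converted into a Cartesian stiffness command. All layer sizes, the number of muscle groups, and the activation-to-stiffness mapping are my own assumptions for illustration only.

```python
# Hypothetical sketch (not the paper's actual architecture): a small CNN that
# maps a webcam RGB frame of the operator's arm to approximate muscle
# activation levels, which are then turned into a stiffness command for
# tele-impedance control. All names and dimensions here are assumptions.

import torch
import torch.nn as nn


class MuscleActivationEstimator(nn.Module):
    """Toy CNN: RGB image -> activation levels in [0, 1] for a few muscle groups."""

    def __init__(self, num_muscles: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, num_muscles),
            nn.Sigmoid(),  # normalize estimated activations to [0, 1]
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(rgb))


def activation_to_stiffness(activation: torch.Tensor,
                            k_min: float = 100.0,
                            k_max: float = 1000.0) -> torch.Tensor:
    """Linearly map mean muscle activation to a Cartesian stiffness value (N/m)."""
    return k_min + (k_max - k_min) * activation.mean(dim=-1)


if __name__ == "__main__":
    model = MuscleActivationEstimator()
    frame = torch.rand(1, 3, 224, 224)   # placeholder for a webcam frame
    activations = model(frame)           # estimated muscle activation pattern
    stiffness = activation_to_stiffness(activations)
    print(activations, stiffness)
```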