Technical Name |
Human-Robot Co-Dancing: A Computer Vision-Based, No-Code, Intuitive Robot Arm Choreography Interface and Human-Robot Collaborative Creation System |
Project Operator |
National Cheng Kung University |
Project Host |
沈揚庭 (Yang-Ting Shen) |
Summary |
The "Human-Robot Co-Dance" project targets intuitive, no-code robotic-arm control, enabling artists to co-create with robots through body language. Two main workflows were developed: Teach (offline), in which a dancer's movements are captured, processed, translated, and then reproduced by the arm; and Mirror (online), in which real-time pose perception and data translation let the arm synchronously mimic the dancer for improvisational co-dance. |
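The perception-to-motion step in both workflows ultimately reduces to inverse kinematics: converting a tracked body landmark into joint angles the arm can execute. Below is a minimal sketch using an analytic two-link planar solver; the link lengths and the two-link simplification are illustrative assumptions, not the project's actual arm or solver.

```python
import math

def two_link_ik(x, y, l1=0.4, l2=0.3):
    """Analytic inverse kinematics for a planar 2-link arm.

    Given a target position (x, y), e.g. a wrist landmark from a
    pose-estimation frame, return (shoulder, elbow) angles in
    radians that place the arm tip on the target. Link lengths
    l1, l2 are illustrative values only.
    """
    d2 = x * x + y * y
    # Law of cosines for the elbow; clamp to stay in the reachable
    # workspace and avoid math-domain errors from rounding.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))
    elbow = math.acos(c2)  # elbow-down solution
    k1 = l1 + l2 * math.cos(elbow)
    k2 = l2 * math.sin(elbow)
    shoulder = math.atan2(y, x) - math.atan2(k2, k1)
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.4, l2=0.3):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

s, e = two_link_ik(0.5, 0.2)
print(forward(s, e))  # ≈ (0.5, 0.2): the angles reproduce the target
```

A real deployment would use the arm vendor's 6-axis IK solver and a full skeleton-to-joint mapping; the round trip through forward kinematics above is just a quick correctness check on the sketch.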
Scientific Breakthrough |
The scientific breakthrough of the "Human-Robot Co-Dance" project lies in empowering artists with code-free, intuitive robot choreography tools. Through real-time motion capture and inverse kinematics, coupled with node translation via machine learning, it achieves synchronous human-robot mirroring and deeply coupled interaction. Ultimately, through co-creation and co-choreography, it explores the "pseudo-life" characteristics of robots, pioneering a new direction for techno-art creation through co-dance. |
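Synchronous mirroring has a practical wrinkle: per-frame skeleton estimates jitter, and streaming raw angles makes the arm shake. A common stabilising step is to low-pass the joint targets before sending them; the exponential smoother below is a plausible sketch of such a stage, not the project's documented pipeline.

```python
class JointSmoother:
    """Exponential smoothing for per-frame joint-angle targets.

    alpha near 1.0 tracks the dancer tightly but passes jitter;
    alpha near 0.0 is smooth but lags. The value here is an
    illustrative default, not a tuned project parameter.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None  # last smoothed joint vector

    def update(self, angles):
        if self.state is None:
            # First frame: adopt the measurement as-is.
            self.state = list(angles)
        else:
            self.state = [(1 - self.alpha) * s + self.alpha * a
                          for s, a in zip(self.state, angles)]
        return self.state

# Feed a noisy stream of 2-joint targets and smooth it frame by frame.
smoother = JointSmoother(alpha=0.5)
for frame in ([0.0, 0.0], [1.0, 0.2], [1.0, 0.2]):
    smoothed = smoother.update(frame)
```

The same structure extends to any joint count, and the single `alpha` knob is what an artist-facing, no-code interface could expose as a "smoothness" slider.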
Industrial Applicability |
This project's technologies, including intuitive robot teaching, real-time motion capture, and AI-driven motion translation, offer broad application potential. In Human-Robot Collaboration, they address skill gaps in smart manufacturing and construction, enabling flexible, safe co-working and fostering digital craftsmanship. In Techno Art, these tools give artists a direct, intuitive means of expression, creating novel interactive experiences and opening the exploration of Human-Robot Co-Creation. |
Keywords |
Robotic Arm; Human-Robot Collaboration; Skeleton Recognition; Computer Vision; Choreography; Techno Art; Human-Computer Interaction; Code-Free; Machine Learning; Inverse Kinematics |