emg2tendon: From sEMG Signals to Tendon Control in Musculoskeletal Hands

Micropilot
RSS 2025

This video introduces the motivation behind the work. It contrasts joint-actuated and tendon-driven robotic hands with natural human hand motion, highlighting the advantages of tendon-driven systems in achieving biomechanical realism and dexterity. The animation demonstrates the limitations of vision-based pose tracking using motion capture, and presents surface EMG (sEMG) as a robust alternative for capturing neuromuscular activity. The final sequence shows how sEMG signals, recorded from the wrist, are mapped to tendon control forces using the MyoHand musculoskeletal model, enabling responsive and accurate control of robotic hands.

Abstract

Tendon-driven robotic hands offer unparalleled dexterity for manipulation tasks, but learning control policies for such systems presents unique challenges. Unlike joint-actuated robotic hands, tendon-driven systems lack a direct one-to-one mapping between motion capture (mocap) data and tendon controls, making the learning process complex and expensive. Additionally, visual tracking methods for real-world applications are prone to occlusions and inaccuracies, further complicating joint tracking. Wrist-wearable surface electromyography (sEMG) sensors offer an inexpensive, robust alternative for capturing hand motion. However, mapping sEMG signals to tendon control remains a significant challenge despite the availability of EMG-to-pose datasets and regression-based models in the existing literature.

We introduce the first large-scale EMG-to-Tendon Control dataset for robotic hands, extending the emg2pose dataset, which includes recordings from 193 subjects spanning 370 hours and 29 stages with diverse gestures. The dataset incorporates tendon control signals derived using the MyoSuite MyoHand model, addressing limitations of prior methods such as invalid poses. We provide three baseline regression models to demonstrate the utility of emg2tendon and propose a novel diffusion-based regression model for predicting tendon control from sEMG recordings. Together, this dataset and modeling framework mark a significant step forward for tendon-driven dexterous robotic manipulation, laying the groundwork for scalable and accurate tendon control in robotic hands.

Generate Tendon Data

Our QForce Tendon implementation provides a high-performance C++ solution for converting motion capture (MoCap) pose data into tendon control forces for anthropomorphic musculoskeletal hand models such as MyoHand (MyoSuite). It runs roughly 20x faster than the Python version and has processed 1700 hours of MoCap data in just 24 hours on a 48-core CPU.

Key Features

  • High Performance: C++ implementation with 20x speed improvement over Python
  • Scalable Processing: Handles large-scale datasets efficiently
  • Robust Algorithm: QForce tendon algorithm with inverse dynamics and muscle modeling
  • Video Generation: Built-in comparison video generation between reference and achieved trajectories
  • Cross-Platform: Supports Linux, macOS, and Windows

Quick Start

Prerequisites: C++ compiler (GCC 7+ or Clang 6+), CMake 3.16+, Python 3.7+, MuJoCo, OSQP, and HDF5 libraries.

Installation

# Clone the repository
git clone https://github.com/micropilot/pose2tendon_ID
cd pose2tendon_ID

# Install system dependencies (Ubuntu/Debian)
sudo apt update
sudo apt install build-essential cmake libhdf5-dev

# Install MuJoCo and OSQP (see full README for details)
# Build the project
mkdir build && cd build
cmake ..
make -j$(nproc)

Usage

# Process HDF5 trajectory data
./qforce_tendon ../data/sample.hdf5 ../output/

# Generate comparison videos
python generate_video.py --bin_path ../output/sample.bin \
                        --hdf5_path ../data/sample.hdf5 \
                        --output_path ../videos/
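For large runs, it can help to drive the binary over an entire directory of recordings. Below is a minimal Python sketch of such a batch driver using subprocess; the directory layout and build path are illustrative assumptions, and the CLI arguments mirror the single-file usage shown above.

# Hypothetical batch driver for qforce_tendon (paths are illustrative).
import subprocess
from pathlib import Path

data_dir = Path("data")
out_dir = Path("output")
out_dir.mkdir(exist_ok=True)

for hdf5_file in sorted(data_dir.glob("*.hdf5")):
    # Same CLI as above: qforce_tendon <input.hdf5> <output_dir>/
    subprocess.run(["./build/qforce_tendon", str(hdf5_file), f"{out_dir}/"], check=True)
    print(f"processed {hdf5_file.name}")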

Algorithm Overview

The QForce tendon algorithm processes trajectory data through several stages:

  1. Trajectory Processing: Reads joint angles and time from HDF5 compound dataset
  2. Inverse Dynamics: Computes generalized forces for target positions
  3. Muscle Modeling: Applies tendon dynamics and muscle properties
  4. Quadratic Programming: Solves for optimal control signals using OSQP (see the sketch after this list)
  5. Control Generation: Produces actuator control values
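To make the quadratic-programming step concrete, here is a minimal Python sketch of the kind of bounded least-squares problem OSQP can solve at each timestep. The moment-arm matrix R, the torque vector tau, and the [0, 1] activation bounds are illustrative assumptions, not the repository's exact formulation.

# Per-timestep QP sketch (illustrative; not the repository's exact code).
# Solve: min_a ||R^T a - tau||^2  subject to  0 <= a <= 1,
# where a are muscle activations, R maps activations to joint torques,
# and tau is the generalized force vector from inverse dynamics.
import numpy as np
import scipy.sparse as sp
import osqp

nu, nv = 39, 23                    # MyoHand-like sizes (39 muscles, 23 joints); illustrative
rng = np.random.default_rng(0)
R = rng.standard_normal((nu, nv))  # placeholder moment-arm matrix
tau = rng.standard_normal(nv)      # placeholder inverse-dynamics torques

# Expand ||R^T a - tau||^2 into OSQP's (1/2) a^T P a + q^T a form.
P = sp.csc_matrix(2.0 * R @ R.T)
q = -2.0 * R @ tau

A = sp.identity(nu, format="csc")  # identity constraint matrix for the box bounds
prob = osqp.OSQP()
prob.setup(P, q, A, np.zeros(nu), np.ones(nu), verbose=False)
res = prob.solve()
activations = res.x                # optimal control signals for this timestep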

Performance

  • Processing Speed: ~1000 trajectory points per second
  • Memory Usage: ~50MB for 100k trajectory points
  • Video Generation: ~30 seconds for 100k points (25 FPS output)

Dataset Videos

Our EMG-to-Tendon Control dataset includes recordings from 193 subjects spanning 370 hours across 29 diverse gesture stages. Below are sample videos showcasing the variety of hand movements and gestures captured in our dataset.

  • Thumbs Up/Down & Rotations
  • Hook 'Em, OK, Scissors
  • Unconstrained Movement
  • Thumb Swipes & Whole Hand
  • Two-Handed Free Style
  • Grasp, Punch, Close/Far
  • Poke, Draw, Pinch, Rotate
  • All Finger Pinches & Thumb Actions
  • Counting & Finger Movements
  • One-Handed Free Style
  • Index & Middle Pinches
  • Doorknob, Grasp, Fist
  • Fast Pong & Throwing
  • Hand Desk & Clasped Chest
  • Poke & Pinch Close/Far
  • Finger Touch & Palm Clap
  • Hand Claw, Grasp, Flicks
  • Shaka, Vulcan, Peace
  • Counting Face/Side/Away
  • Finger Wiggling & Spreading
  • Finger Abduction Series
  • Single & Multiple Finger Pinches
  • Finger Freeform
  • Individual Finger Pointing & Snap
  • Wrist Flexion & Abduction
  • Play Blocks & Chess
  • Coffee Panic Pete
  • Hand-over-Hand Pinches & Thumb Actions
  • Hand-over-Hand Counting & Finger Movements
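To look inside one of the recordings behind these stages, a generic HDF5 inspection script is a reasonable starting point. The snippet below assumes only that the files are HDF5; the group and field names it prints come from the file itself, and the path is illustrative.

# Generic HDF5 inspection (the path is illustrative; names come from the file).
import h5py

def summarize(path):
    """Print every dataset in the file with its shape, dtype, and compound fields."""
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")
                if obj.dtype.names:        # compound dataset (e.g. joint angles + time)
                    print("  fields:", obj.dtype.names)
        f.visititems(visit)
        print("attrs:", dict(f.attrs))     # file-level metadata, if any

summarize("data/sample.hdf5")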

Video Presentation

Poster

BibTeX

@inproceedings{verma2025emg2tendon,
  title     = {{emg2tendon: From sEMG Signals to Tendon Control in Musculoskeletal Hands}},
  author    = {Verma, Sagar},
  year      = {2025},
  booktitle = {Robotics: Science and Systems}
}

@inproceedings{verma2025emg,
  title     = {{EMG} Signals to Tendon Control Forces for {MyoHand} Actuation},
  author    = {Verma, Sagar},
  year      = {2025},
  booktitle = {Conference on Soft Robotics}
}