Tendon-driven robotic hands offer unparalleled dexterity for manipulation tasks, but learning control
policies for such systems presents unique challenges. Unlike joint-actuated robotic hands, tendon-driven
systems lack a direct one-to-one mapping between motion capture (mocap) data and tendon controls, making
the learning process complex and expensive. Additionally, visual tracking methods for real-world
applications are prone to occlusions and inaccuracies, further complicating joint tracking. Wrist-wearable
surface electromyography (sEMG) sensors present an inexpensive, robust alternative to capture hand motion.
However, mapping sEMG signals to tendon control remains a significant challenge despite the availability
of EMG-to-pose data sets and regression-based models in the existing literature.
We introduce the first large-scale EMG-to-Tendon Control dataset for robotic hands, extending the
\emph{emg2pose} dataset, which includes recordings from 193 subjects, spanning 370 hours and 29 stages
with diverse gestures. This dataset incorporates tendon control signals derived using the MyoSuite MyoHand
model, addressing limitations such as invalid poses in prior methods. We provide three baseline regression
models to demonstrate the utility of \emph{emg2tendon} and propose a novel diffusion-based regression model
for predicting tendon control from sEMG recordings. Together, this dataset and modeling framework mark a
significant step forward for tendon-driven dexterous robotic manipulation, laying the groundwork for
scalable and accurate tendon control in robotic hands.
Our QForce Tendon implementation provides a high-performance C++ pipeline for converting motion capture (MoCap) pose data into tendon control forces for anthropomorphic hand models such as the MyoSuite MyoHand. It achieves a roughly 20x speedup over the Python version and has processed 1700 hours of MoCap data in just 24 hours on a 48-core CPU.
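At its core, converting a pose trajectory into tendon controls is an inverse problem: given the joint torques a pose demands, find tendon forces that reproduce them. The snippet below is a minimal Python sketch of that idea, not the repository's C++ code: it assumes a fixed moment-arm matrix `R` mapping tendon forces to joint torques and recovers nonnegative forces by projected gradient descent. All matrices and numbers are illustrative, not taken from the MyoHand model; the actual implementation solves this with MuJoCo and OSQP.

```python
# Sketch: given joint torques tau and a tendon moment-arm matrix R
# (joints x tendons), find nonnegative tendon forces f minimizing
# ||R f - tau||^2. Illustrative only; values are not from MyoHand.

def solve_tendon_forces(R, tau, iters=2000, lr=500.0):
    """Projected gradient descent for min ||R f - tau||^2  s.t.  f >= 0."""
    n_joints, n_tendons = len(R), len(R[0])
    f = [0.0] * n_tendons
    for _ in range(iters):
        # Residual r = R f - tau
        r = [sum(R[i][j] * f[j] for j in range(n_tendons)) - tau[i]
             for i in range(n_joints)]
        # Gradient of the squared residual is 2 R^T r; step then
        # project each force back onto the feasible set f >= 0.
        for j in range(n_tendons):
            g = 2.0 * sum(R[i][j] * r[i] for i in range(n_joints))
            f[j] = max(0.0, f[j] - lr * g)
    return f

# Toy example: two joints actuated by three tendons (hypothetical numbers)
R = [[0.02, -0.015, 0.0],
     [0.0,   0.01, -0.02]]
tau = [0.004, -0.003]
f = solve_tendon_forces(R, tau)
```

In the real system this solve runs per frame over the full MyoHand tendon routing, which is why the C++/OSQP implementation matters at the scale of hundreds of hours of MoCap data.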
```bash
# Clone the repository
git clone https://github.com/micropilot/pose2tendon_ID
cd pose2tendon_ID

# Install system dependencies (Ubuntu/Debian)
sudo apt update
sudo apt install build-essential cmake libhdf5-dev

# Install MuJoCo and OSQP (see full README for details)

# Build the project
mkdir build && cd build
cmake ..
make -j$(nproc)
```

```bash
# Process HDF5 trajectory data
./qforce_tendon ../data/sample.hdf5 ../output/

# Generate comparison videos
python generate_video.py --bin_path ../output/sample.bin \
    --hdf5_path ../data/sample.hdf5 \
    --output_path ../videos/
```
The QForce tendon algorithm processes trajectory data in several stages, from parsing the HDF5 pose trajectories to solving for the per-frame tendon control forces.
Our EMG-to-Tendon Control dataset includes recordings from 193 subjects spanning 370 hours across 29 diverse gesture stages. Below are sample videos showcasing the variety of hand movements and gestures captured in our dataset.
```bibtex
@inproceedings{verma2025emg2tendon,
  title     = {{emg2tendon: From sEMG Signals to Tendon Control in Musculoskeletal Hands}},
  author    = {Sagar Verma},
  year      = {2025},
  booktitle = {Robotics: Science and Systems}
}

@inproceedings{verma2025emg,
  title     = {{EMG} Signals to Tendon Control Forces for {MyoHand} Actuation},
  author    = {Sagar Verma},
  year      = {2025},
  booktitle = {Conference on Soft Robotics}
}
```