Scalable Physics-Informed DExterous Retargeting (SPIDER) is a general framework for physics-based retargeting from humans to diverse robot embodiments, including both dexterous hands and humanoid robots. It is designed to be a minimal, flexible, and extendable framework for human-to-robot retargeting. This codebase provides the full pipeline from human video to robot actions:
| Inspire Pick Tea Pot (GigaHands dataset) | Xhand Play Glass (HOT3D dataset) | Schunk Pick Board (OakInk dataset) | Allegro Pick Cat Toy (reconstructed from a single RGB video) |
|---|---|---|---|
| ![]() | ![]() | ![]() | ![]() |
| G1 Pick | G1 Run | H1 Kick | T1 Skip |
|---|---|---|---|
| ![]() | ![]() | ![]() | ![]() |
Visualizers:

| MuJoCo | Rerun |
|---|---|
| ![]() | ![]() |
Simulators:

| Genesis | MuJoCo Warp |
|---|---|
| ![]() | ![]() |
| Pick Cup | Rotate Bulb | Unplug Charger | Pick Duck |
|---|---|---|---|
| ![]() | ![]() | ![]() | ![]() |
- First general physics-based retargeting pipeline for both dexterous hands and humanoid robots (a conceptual sketch follows this list).
- Supports 9+ robots and 6+ datasets out of the box.
- Seamless integration with RL training and data augmentation for behavior cloning (BC) pipelines.
- Native support for multiple simulators (MuJoCo Warp, Genesis) and multiple downstream training pipelines (HDMI, DexMachina).
- Sim2real ready.
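To make "physics-based" concrete: rather than copying human poses joint by joint, this style of retargeting searches for robot actions whose simulated rollout tracks the kinematic reference. The toy sketch below illustrates that idea with predictive sampling on a 1-D point mass. It is a conceptual illustration under simplified assumptions, not SPIDER's implementation, which rolls out full robot and object models in MuJoCo Warp or Genesis:

```python
# Conceptual illustration of physics-based tracking via predictive sampling.
# This is NOT SPIDER's implementation -- just the core idea on a 1-D point mass.
import numpy as np

rng = np.random.default_rng(0)
dt, horizon, n_samples = 0.02, 20, 64

def rollout(x, v, actions):
    """Simulate a unit-mass 1-D point under a force sequence; return positions."""
    xs = []
    for u in actions:
        v = v + u * dt  # unit mass: acceleration = force
        x = x + v * dt
        xs.append(x)
    return np.array(xs)

# Kinematic reference: a sine wave standing in for a retargeted human motion.
t = np.arange(220) * dt
reference = np.sin(t)

x, v = 0.0, 0.0
plan = np.zeros(horizon)  # nominal action sequence, refined each step
for k in range(len(t) - horizon):
    # Sample perturbations of the plan; score each by simulated tracking error.
    samples = plan + rng.normal(0.0, 2.0, size=(n_samples, horizon))
    costs = [np.sum((rollout(x, v, u) - reference[k:k + horizon]) ** 2) for u in samples]
    plan = samples[int(np.argmin(costs))]
    # Execute the first action, then shift the plan (receding horizon).
    v += plan[0] * dt
    x += v * dt
    plan = np.roll(plan, -1)
    plan[-1] = 0.0
    if k % 50 == 0:
        print(f"step {k:3d}: x={x:+.3f}  ref={reference[k]:+.3f}")
```

Because every candidate is scored by a physics rollout, the executed trajectory is dynamically feasible by construction, which is what distinguishes this approach from purely kinematic retargeting.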
Clone the example datasets:

```bash
sudo apt install git-lfs
git lfs install
git clone https://huggingface.co/datasets/retarget/retarget_example example_datasets
```

Create the environment and install the dependencies (make sure uv uses Python 3.12, which is what the project targets):

```bash
uv sync
```

If you already have the example datasets cloned, you can skip the preprocessing steps that convert the human data into robot kinematic trajectories and run SPIDER directly on a processed trial:

```bash
uv run examples/run_mjwp.py
```

For the full workflow, please refer to the Workflow section.
Alternatively, set up the environment with conda:

```bash
conda create -n spider python=3.12
conda activate spider
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
python -m pip install --no-deps -e .
```

Run MJWP on a processed trial:

```bash
python examples/run_mjwp.py
```

SPIDER is designed to support multiple workflows, depending on your simulator of choice and downstream tasks:
- The native MuJoCo Warp (MJWP) workflow is the default and supports both dexterous hand and humanoid robot retargeting.
- The Genesis simulator is supported via DexMachina; this workflow is useful for further training a dexterous-hand policy with RL.
- The HDMI workflow supports humanoid robot retargeting plus an RL workflow for humanoid-object interaction tasks; it uses MjLab as its backend simulator.
The default MJWP workflow supports both dexterous hand and humanoid robot retargeting. A full run, from raw human data to deployment-ready robot trajectories, looks like this (using a GigaHands task as the example):

```bash
TASK=p36-tea
HAND_TYPE=bimanual
DATA_ID=10
ROBOT_TYPE=xhand
DATASET_NAME=gigahand
# put your raw data under raw/{dataset_name}/ in your dataset folder
# read data from a self-collected dataset
uv run spider/process_datasets/gigahand.py --task=${TASK} --embodiment-type=${HAND_TYPE} --data-id=${DATA_ID}
# decompose object
uv run spider/preprocess/decompose_fast.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}
# detect contact (optional)
uv run spider/preprocess/detect_contact.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}
# generate scene
uv run spider/preprocess/generate_xml.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE}
# kinematic retargeting
uv run spider/preprocess/ik.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE} --open-hand
# physics-based retargeting with SPIDER
uv run examples/run_mjwp.py +override=${DATASET_NAME} task=${TASK} data_id=${DATA_ID} robot_type=${ROBOT_TYPE} embodiment_type=${HAND_TYPE}
# read data for deployment (optional)
uv run spider/postprocess/read_to_robot.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --robot-type=${ROBOT_TYPE} --embodiment-type=${HAND_TYPE}
```
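If you want to sanity-check a result outside the pipeline, a minimal replay sketch along the following lines can help. Everything here is an assumption for illustration: the scene XML path, the export file name, and the `qpos` key are hypothetical, so adapt them to what generate_xml.py and read_to_robot.py actually write for your task:

```python
# Kinematic playback sketch -- all paths and array keys below are HYPOTHETICAL;
# adapt them to the actual outputs of generate_xml.py / read_to_robot.py.
import time

import mujoco
import mujoco.viewer
import numpy as np

model = mujoco.MjModel.from_xml_path("scene.xml")  # scene from generate_xml.py (hypothetical path)
data = mujoco.MjData(model)
traj = np.load("robot_traj.npz")["qpos"]           # (T, nq) joint trajectory (hypothetical file/key)

with mujoco.viewer.launch_passive(model, data) as viewer:
    for q in traj:
        data.qpos[: q.shape[0]] = q                # set joint positions directly (no dynamics)
        mujoco.mj_forward(model, data)             # recompute kinematics for rendering
        viewer.sync()
        time.sleep(model.opt.timestep)
```

This plays the trajectory back kinematically; for a dynamic check, drive the actuators through `data.ctrl` and step the simulation with `mujoco.mj_step` instead.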
For the DexMachina + Genesis workflow, install the DexMachina conda environment following the official instructions (https://mandizhao.github.io/dexmachina-docs/0_install.html), then:

```bash
conda activate dexmachina
# note: install SPIDER only, without MuJoCo Warp, since this workflow uses only the optimization part
pip install --ignore-requires-python --no-deps -e .
# run retargeting
python examples/run_dexmachina.py
```

For the HDMI workflow, install the HDMI uv environment following their official instructions, then install SPIDER from within the hdmi folder:
```bash
uv pip install --no-deps -e ../spider
```

To use the Rerun web viewer instead of the native MuJoCo viewer:

```bash
# start the rerun server
uv run rerun --serve-web --port 9876
# run SPIDER with only the rerun viewer
uv run examples/run_mjwp.py viewer="rerun"
```

Then open the web viewer URL printed by the Rerun server in your browser.

SPIDER is released under the Creative Commons Attribution-NonCommercial 4.0 license. See LICENSE for details.
We expect everyone to follow the Contributor Covenant Code of Conduct in CODE_OF_CONDUCT.md when participating in this project.
- Thanks to Mandi Zhao for help with the DexMachina workflow for SPIDER + Genesis.
- Thanks to Taylor Howell for help in the early stages of integrating MuJoCo Warp for SPIDER + MJWP.
- Thanks to Haoyang Weng for help with the HDMI workflow for SPIDER + sim2real RL.
- The inverse kinematics design is ported from GMR and LocoMuJoCo.
- Dataset processing is ported from HOT3D, OakInk2, ManipTrans, and GigaHands.
- The visualization is inspired by other sampling-based repos such as Hydrax and Judo.
```bibtex
@misc{pan2025spiderscalablephysicsinformeddexterous,
      title={SPIDER: Scalable Physics-Informed Dexterous Retargeting},
      author={Chaoyi Pan and Changhao Wang and Haozhi Qi and Zixi Liu and Homanga Bharadhwaj and Akash Sharma and Tingfan Wu and Guanya Shi and Jitendra Malik and Francois Hogan},
      year={2025},
      eprint={2511.09484},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2511.09484},
}
```