
facebookresearch/spider

🕸️ SPIDER: Scalable Physics-Informed DExterous Retargeting



Overview

Scalable Physics-Informed DExterous Retargeting (SPIDER) is a general framework for physics-based retargeting from humans to diverse robot embodiments, including both dexterous hands and humanoid robots. It is designed to be a minimal, flexible, and extendable framework for human-to-robot retargeting. This code base provides the following pipeline from human video to robot actions:

pipeline

Gallery

Simulation results:

  • Inspire Pick Tea Pot (GigaHands dataset)
  • Xhand Play Glass (HOT3D dataset)
  • Schunk Pick Board (OakInk dataset)
  • Allegro Pick Cat Toy (reconstructed from a single RGB video)
  • G1 Pick, G1 Run, H1 Kick, T1 Skip

Multiple viewer support:

MuJoCo, Rerun

Multiple simulators support:

Genesis, MuJoCo Warp

Deployment to real-world robots:

Pick Cup, Rotate Bulb, Unplug Charger, Pick Duck

Features

  • First general physics-based retargeting pipeline for both dexterous hands and humanoid robots.
  • Supports 9+ robots and 6+ datasets out of the box.
  • Seamless integration with RL training and data augmentation for BC (behavior cloning) pipelines.
  • Native support for multiple simulators (MuJoCo Warp, Genesis) and multiple downstream training pipelines (HDMI, DexMachina).
  • Sim2real ready.

Quickstart

Clone example datasets:

sudo apt install git-lfs
git lfs install
git clone https://huggingface.co/datasets/retarget/retarget_example example_datasets

(Option 1) Quickstart with uv:

Create env and install (make sure uv uses Python 3.12, which is what the project targets):

uv sync

If you already have the example datasets cloned, you can skip the preprocessing step where we convert the human data to robot kinematic trajectories. Run SPIDER on a processed trial:

uv run examples/run_mjwp.py

For the full workflow, see the Workflow section.

(Option 2) Quickstart with conda:

conda create -n spider python=3.12
conda activate spider
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
python -m pip install --no-deps -e .

Run MJWP on a processed trial:

python examples/run_mjwp.py

Workflow

SPIDER is designed to support multiple workflows depending on your simulator of choice and downstream tasks.

  • Native MuJoCo Warp (MJWP) is the default workflow and supports both dexterous hand and humanoid robot retargeting.
  • The Genesis simulator is supported via DexMachina; this workflow is useful for further training a dexterous-hand policy with RL.
  • The HDMI workflow supports humanoid robot retargeting plus RL training on humanoid-object interaction tasks. It uses MjLab as its backend simulator.

Native MuJoCo Warp Workflow

  • Supports dexterous hand and humanoid robot retargeting.

TASK=p36-tea
HAND_TYPE=bimanual
DATA_ID=10
ROBOT_TYPE=xhand
DATASET_NAME=gigahand

# put your raw data under raw/{dataset_name}/ in your dataset folder

# read the raw human data (here, from the GigaHands dataset)
uv run spider/process_datasets/gigahand.py --task=${TASK} --embodiment-type=${HAND_TYPE} --data-id=${DATA_ID}

# decompose object
uv run spider/preprocess/decompose_fast.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}

# detect contact (optional)
uv run spider/preprocess/detect_contact.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}

# generate scene
uv run spider/preprocess/generate_xml.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE}

# kinematic retargeting
uv run spider/preprocess/ik.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE} --open-hand

# retargeting
uv run examples/run_mjwp.py +override=${DATASET_NAME} task=${TASK} data_id=${DATA_ID} robot_type=${ROBOT_TYPE} embodiment_type=${HAND_TYPE}

# read data for deployment (optional)
uv run spider/postprocess/read_to_robot.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --robot-type=${ROBOT_TYPE} --embodiment-type=${HAND_TYPE}
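To process many trials, the single-trial commands above can be wrapped in a small shell loop. This is a minimal sketch, not part of the official tooling: the data IDs `10 11 12` are placeholders, and `RUN=echo` makes it a dry run that only prints the commands.

```shell
# Dry-run by default: RUN=echo prints each command instead of executing it.
# Set RUN= (empty) to actually run the pipeline.
RUN="${RUN:-echo}"

TASK=p36-tea
HAND_TYPE=bimanual
ROBOT_TYPE=xhand
DATASET_NAME=gigahand

# Placeholder trial IDs; replace with the trials present in your dataset.
for DATA_ID in 10 11 12; do
  $RUN uv run spider/preprocess/decompose_fast.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}
  $RUN uv run spider/preprocess/generate_xml.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE}
  $RUN uv run spider/preprocess/ik.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE} --open-hand
  $RUN uv run examples/run_mjwp.py +override=${DATASET_NAME} task=${TASK} data_id=${DATA_ID} robot_type=${ROBOT_TYPE} embodiment_type=${HAND_TYPE}
done
```

Keeping the per-trial flags identical to the single-trial example means a failed trial can be re-run in isolation with the same variables.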

DexMachina Workflow

# install dexmachina conda environment following their official instructions: https://mandizhao.github.io/dexmachina-docs/0_install.html
conda activate dexmachina
# note: install SPIDER without MuJoCo Warp, since only the optimization part is used here
pip install --ignore-requires-python --no-deps -e .
# run retargeting
python examples/run_dexmachina.py

HDMI Workflow

# install HDMI uv environment following their official instructions:
# from the hdmi folder, install SPIDER with
uv pip install --no-deps -e ../spider

Remote Development

# start rerun server
uv run rerun --serve-web --port 9876

# run SPIDER only with rerun viewer
uv run examples/run_mjwp.py viewer="rerun"
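When SPIDER runs on a remote machine, the rerun web viewer can be reached by forwarding its port over SSH. A minimal sketch; `remote-host` is a placeholder for your server's address, and `RUN=echo` keeps it a dry run that only prints the command.

```shell
# Dry-run by default: RUN=echo prints the command; set RUN= to open the tunnel.
RUN="${RUN:-echo}"
RERUN_PORT=9876

# Forward the rerun web port from the remote machine to localhost,
# then open http://localhost:9876 in a local browser.
$RUN ssh -N -L ${RERUN_PORT}:localhost:${RERUN_PORT} remote-host
```

The port must match the `--port` value passed to `rerun --serve-web` on the remote side.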

License

SPIDER is released under the Creative Commons Attribution-NonCommercial 4.0 license. See LICENSE for details.

Code of Conduct

We expect everyone to follow the Contributor Covenant Code of Conduct in CODE_OF_CONDUCT.md when participating in this project.

Acknowledgments

Citation

@misc{pan2025spiderscalablephysicsinformeddexterous,
      title={SPIDER: Scalable Physics-Informed Dexterous Retargeting},
      author={Chaoyi Pan and Changhao Wang and Haozhi Qi and Zixi Liu and Homanga Bharadhwaj and Akash Sharma and Tingfan Wu and Guanya Shi and Jitendra Malik and Francois Hogan},
      year={2025},
      eprint={2511.09484},
      archivePrefix={arXiv},
      primaryClass={cs.RO},
      url={https://arxiv.org/abs/2511.09484},
}
