This is the official repo for the paper:
RoboGen: Towards Unleashing Infinite Data for Automated Robot Learning via Generative Simulation
Yufei Wang*, Zhou Xian*, Feng Chen*, Tsun-Hsuan Wang, Yian Wang, Katerina Fragkiadaki, Zackory Erickson, David Held, Chuang Gan
RoboGen is a self-guided and generative robotic agent that autonomously proposes new tasks, generates corresponding environments, and acquires new robotic skills continuously.
RoboGen is powered by Genesis, a multi-material, multi-solver generative simulation engine for general-purpose robot learning. Genesis is still under active development and will be released soon. This repo contains a re-implementation of RoboGen using PyBullet, covering the generation and learning of rigid-body manipulation and locomotion tasks. Our full pipeline, which includes soft-body manipulation and more tasks, will be released later together with Genesis.
We are still in the process of cleaning and testing the code, so please stay tuned!
Clone this git repo.
git clone https://github.com/Genesis-Embodied-AI/RoboGen.git
We recommend working with a conda environment.
conda env create -f environment.yaml
conda activate robogen
If creating the environment from this yaml file doesn't work, manually installing the missing packages should also work.
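For example, if a dependency such as pybullet is reported as missing when running the code, installing it with pip inside the activated environment should be sufficient (the package name here is just an illustration):
pip install pybullet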
RoboGen leverages the Open Motion Planning Library (OMPL) for motion planning as part of the pipeline to solve the generated tasks. To install OMPL, run
./install_ompl_1.5.2.sh --python
which will install OMPL with the system-wide Python. Then, expose the installation to the conda environment so it can be used with RoboGen:
echo "path_to_your_ompl_installation_from_last_step/OMPL/ompl-1.5.2/py-bindings" >> ~/miniconda3/envs/robogen/lib/python3.9/site-packages/ompl.pth
Remember to change the path to your OMPL installation path and your conda environment path.
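To check that the bindings are now visible from the conda environment (a quick sanity check; the import below assumes the standard OMPL Python binding layout), you can run
python -c "from ompl import base, geometric; print('OMPL bindings found')"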
RoboGen uses PartNet-Mobility for task generation and scene population. We provide a parsed version here (which parses the urdf to extract the articulation tree as a shortened input to GPT-4). After downloading, please unzip it and put it in the data folder, so it looks like data/dataset.
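For example, assuming the downloaded archive is named dataset.zip and contains a top-level dataset folder (the actual file name may differ), the placement would be
unzip dataset.zip -d data/
ls data/dataset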
For retrieving objects from Objaverse, we embed the Objaverse object descriptions using SentenceBert. If you want to generate these embeddings yourself, run
python objaverse_utils/embed_all_annotations.py
python objaverse_utils/embed_cap3d.py
python objaverse_utils/embed_partnet_annotations.py
We also provide the precomputed embeddings here in case generation takes too much time. After downloading, unzip and put the files under the objaverse_utils/data/ folder, so it looks like
objaverse_utils/data/default_tag_embeddings_*.pt
objaverse_utils/data/default_tag_names_*.pt
objaverse_utils/data/default_tag_uids_*.pt
objaverse_utils/data/cap3d_sentence_bert_embeddings.pt
objaverse_utils/data/partnet_mobility_category_embeddings.pt
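For reference, object retrieval can be thought of as embedding a text query with SentenceBert and ranking the pre-computed Objaverse embeddings by cosine similarity. Below is a minimal sketch of this idea; the model name and the layout of the .pt file are illustrative assumptions, not necessarily what the codebase uses.
import torch
from sentence_transformers import SentenceTransformer, util

# Hypothetical sketch: embed a text query and rank pre-computed Cap3D embeddings by cosine similarity.
model = SentenceTransformer("all-mpnet-base-v2", device="cpu")  # assumed model; the repo may use a different one
query_embedding = model.encode("a wooden box with a lid", convert_to_tensor=True)

# Assumed to be a (num_objects, dim) float tensor; the actual file layout may differ.
corpus_embeddings = torch.load("objaverse_utils/data/cap3d_sentence_bert_embeddings.pt", map_location="cpu")
scores = util.cos_sim(query_embedding, corpus_embeddings)[0]
top_indices = torch.topk(scores, k=5).indices  # indices of the best-matching objects
print(top_indices)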
Put your OpenAI API key at the top of gpt_4/query.py (a sketch of what this may look like is shown after this step), and simply run
source prepare.sh
python run.py
RoboGen will then generate the task, build the scene in pybullet, and solve it to learn the corresponding skill.
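The key is read directly from gpt_4/query.py. A minimal sketch of what the assignment at the top of that file may look like, assuming the pre-1.0 openai Python client (the exact variable name in the repo may differ):
import openai
openai.api_key = "sk-..."  # replace with your own OpenAI API key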
If you wish to generate manipulation tasks relevant to a specific object, e.g., microwave, you can run
python run.py --category Microwave
If you wish to just generate the tasks, run
python run.py --train 0
which will generate the tasks, scene config yaml files, and training supervisions. The generated tasks will be stored at data/generated_tasks_release/.
If you want to generate a task given a text description, you can run
python gpt_4/prompts/prompt_from_description.py --task_description [TASK_DESCRIPTION] --object [PARTNET_ARTICULATION_OBJECT_CATEGORY]
For example,
python gpt_4/prompts/prompt_from_description.py --task_description "Put a pen into the box" --object "Box"
If you wish to just learn the skill with a generated task, run
python execute.py --task_config_path [PATH_TO_THE_GENERATED_TASK_CONFIG] # for manipulation tasks
python execute_locomotion.py --task_config_path [PATH_TO_THE_GENERATED_TASK_CONFIG] # for locomotion tasks
For example,
python execute.py --task_config_path example_tasks/Change_Lamp_Direction/Change_Lamp_Direction_The_robotic_arm_will_alter_the_lamps_light_direction_by_manipulating_the_lamps_head.yaml
python execute_locomotion.py --task_config_path example_tasks/task_Turn_right/Turn_right.yaml
In example_tasks we include a number of generated tasks from RoboGen. We hope this could be useful for, e.g., language-conditioned multi-task learning, transfer learning, and low-level skill learning. We hope to keep updating the list!
- The interface between OMPL and pybullet is based on pybullet_ompl.
- Part of the Objaverse annotations are from Scalable 3D Captioning with Pretrained Models (Cap3D).