This repository contains the code for our paper Prompt a Robot to Walk with Large Language Models by Yen-Jen Wang, Bike Zhang, Jianyu Chen, and Koushil Sreenath.
The LLM directly outputs low-level actions to make a robot walk. In our experiments, the LLM runs at 10 Hz (the simulation is paused while waiting for LLM inference), and the PD controller executes at 200 Hz.
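For intuition, here is a minimal sketch of this two-rate loop, with toy unit-inertia dynamics standing in for Isaac Gym. The names query_llm_for_action and pd_torques, the gains, and the 12-joint layout are illustrative assumptions, not the actual code in src/run.py.

import numpy as np

SIM_DT = 1.0 / 200.0   # PD controller step (200 Hz)
DECIMATION = 20        # 200 Hz / 10 Hz: PD steps per LLM action

def query_llm_for_action(q, qd):
    # Placeholder for the LLM call: in the real pipeline the observation
    # history is serialized into a text prompt and the reply is parsed
    # into joint-position targets. Here we just return zeros.
    return np.zeros_like(q)

def pd_torques(q, qd, q_target, kp=20.0, kd=0.5):
    # PD law tracking the joint-position targets produced by the LLM
    return kp * (q_target - q) - kd * qd

q = np.zeros(12)       # 12 joints, as on a quadruped like the A1
qd = np.zeros(12)
q_target = query_llm_for_action(q, qd)
for step in range(200):
    if step % DECIMATION == 0:
        # In practice the simulation pauses here until the LLM replies
        q_target = query_llm_for_action(q, qd)
    tau = pd_torques(q, qd, q_target)
    qd += tau * SIM_DT   # toy unit-inertia dynamics, not Isaac Gym
    q += qd * SIM_DT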
conda create -n prompt2walk python=3.9
conda activate prompt2walk
pip install -r requirements.txt
Please fill in openai.api_key in src/llm.py.
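For example, the key can be pasted in directly or read from an environment variable; reading from the environment is our assumption here, not necessarily how src/llm.py is structured.

import os
import openai

# Set the OpenAI key (here taken from the OPENAI_API_KEY environment
# variable; you can also assign the key string directly)
openai.api_key = os.environ["OPENAI_API_KEY"]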
python src/run.py
python src/replay.py
Please refer to Isaac Gym Environments for Legged Robots. Currently, we have tested our code on the A1 and ANYmal C robots at 10 Hz.
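For reference, a 10 Hz action rate on top of a 200 Hz PD/simulation step corresponds to a control decimation of 20. Below is a sketch of the relevant timing fields, mirroring the layout of legged_gym's config classes; treating these exact values as this repository's settings is an assumption.

# Illustrative timing config in the style of legged_gym's LeggedRobotCfg
class Cfg:
    class sim:
        dt = 0.005       # 200 Hz physics / PD step
    class control:
        decimation = 20  # one LLM action every 20 PD steps -> 10 Hz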
Please read our paper! If you have further questions, please feel free to contact Yen-Jen.
Please cite our paper if you use this code or parts of it:
@article{wang2023prompt,
  title={Prompt a Robot to Walk with Large Language Models},
  author={Wang, Yen-Jen and Zhang, Bike and Chen, Jianyu and Sreenath, Koushil},
  journal={arXiv preprint arXiv:2309.09969},
  year={2023}
}