This is the official implementation of our paper:
Bowen Wen, Wenzhao Lian, Kostas Bekris, and Stefan Schaal. ["CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation."](https://arxiv.org/abs/2109.09163) IEEE International Conference on Robotics and Automation (ICRA) 2022.
<p align="center" width="100%">
<img src="./media/intro.jpg" width="600" />
</p>
<p align="center">
<img src="./media/catgrasp_qual_result.gif" width="1000" />
<img src="./media/pipeline.jpg" width="1000" />
</p>
# Abstract
Task-relevant grasping is critical for industrial assembly, where downstream manipulation tasks constrain the set of valid grasps. Learning how to perform this task, however, is challenging, since task-relevant grasp labels are hard to define and annotate. There is also no consensus yet on proper representations for modeling task-relevant grasps, nor on off-the-shelf tools for performing them. This work proposes a framework to learn task-relevant grasping for industrial objects without the need for time-consuming real-world data collection or manual annotation. To achieve this, the entire framework is trained solely in simulation, including supervised training with synthetic label generation and self-supervised hand-object interaction. In the context of this framework, this paper proposes a novel, object-centric canonical representation at the category level, which allows establishing dense correspondence across object instances and transferring task-relevant grasps to novel instances. Extensive experiments on task-relevant grasping of densely cluttered industrial objects are conducted in both simulation and real-world setups, demonstrating the effectiveness of the proposed framework.
# Bibtex
```bibtex
@article{wen2021catgrasp,
title={CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation},
author={Wen, Bowen and Lian, Wenzhao and Bekris, Kostas and Schaal, Stefan},
journal={ICRA 2022},
year={2022}
}
```
# Supplementary Video
Click to watch
[<img src="./media/suppl-video-thumbnail.jpg" width="400">](https://www.youtube.com/watch?reload=9&v=rAX-rFSKAto)
# ICRA 2022 Presentation Video
[<img src="./media/thumbnail.png" width="400">](https://www.youtube.com/watch?v=hFK8JfR5ZO0)
# Quick Setup
We provide a Docker environment, so setup takes only a few commands.
- If you haven't installed Docker, install it first (https://docs.docker.com/get-docker/).
- Run
```
docker pull wenbowen123/catgrasp:latest
```
- To enter the Docker container, run:
```
cd docker && bash run_container.sh
cd /home/catgrasp && bash build.sh
```
Now the environment is ready for training or testing. Later, you can re-enter the launched Docker container without re-compiling by:
```
docker exec -it catgrasp bash
```
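Optionally, you can sanity-check the environment from inside the container. The snippet below only assumes that the image ships PyTorch with CUDA support; adjust it if your setup differs:
```python
# Optional sanity check inside the container (assumes PyTorch is installed in the image).
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```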
# Data
- [Download object models and pretrained network weights from here](https://archive.cs.rutgers.edu/pracsys/catgrasp/). Then extract them and replace the files in this repo so that the layout looks like:
```
catgrasp
├── artifacts
├── data
└── urdf
```
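As a quick sanity check (optional, and not part of the repo), you can confirm the extracted folders are in place before running anything:
```python
# Hypothetical helper: confirm the extracted folders are in place.
from pathlib import Path

repo_root = Path(".")  # run from the catgrasp repo root
for folder in ["artifacts", "data", "urdf"]:
    status = "found" if (repo_root / folder).is_dir() else "MISSING"
    print(f"{folder}: {status}")
```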
# Testing
`python run_grasp_simulation.py`
You should see the demo start as shown below. You can play with the settings in `config_run.yml`, including changing to different object instances within the category while using the same framework.
<p align="left">
<img src="./media/pybullet.jpg" width="800" />
</p>
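If you want to script runs over several instances rather than editing the file by hand, one option is to rewrite `config_run.yml` programmatically before each run. This is only a sketch; the key `object_name` and the instance names are hypothetical placeholders, so match them to the keys actually present in your `config_run.yml`:
```python
# Sketch: loop over several object instances by rewriting config_run.yml.
# The key "object_name" and the instance names are hypothetical placeholders.
import subprocess
import yaml

instances = ["nut_01", "nut_02"]  # hypothetical instance names within the category
for name in instances:
    with open("config_run.yml") as f:
        cfg = yaml.safe_load(f)
    cfg["object_name"] = name
    with open("config_run.yml", "w") as f:
        yaml.safe_dump(cfg, f)
    subprocess.run(["python", "run_grasp_simulation.py"], check=True)
```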
# Training
In the following, we take the `nut` category as an example to walk through the pipeline.
- Compute signed distance function for all objects of the category
```
python make_sdf.py --class_name nut
```
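For intuition, a signed distance function maps a 3D query point to its distance from the object surface, with the sign distinguishing inside from outside. The snippet below is a conceptual illustration using `trimesh` (with a hypothetical mesh path), not the repo's `make_sdf.py` implementation:
```python
# Conceptual sketch of a signed distance query (not the repo's make_sdf.py).
# In trimesh's convention, points outside the mesh get negative distances.
import numpy as np
import trimesh

mesh = trimesh.load("data/object_models/nut_example.obj", force="mesh")  # hypothetical path
points = np.array([[0.0, 0.0, 0.0],      # near the object's center
                   [0.2, 0.2, 0.2]])     # likely outside the object
sdf_values = trimesh.proximity.signed_distance(mesh, points)
print(sdf_values)
```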
- Pre-compute offline grasps for the training objects. This generates and evaluates grasp qualities regardless of their task relevance. To visualize and debug the grasp quality evaluation, change to `--debug 1`:
```
python generate_grasp.py --class_name nut --debug 0
```
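For background on what grasp quality means here: a common geometric proxy for a two-finger grasp is the antipodal condition, i.e. the line connecting the two contacts must lie inside both friction cones. The sketch below illustrates that idea only and is not necessarily the metric used by `generate_grasp.py`:
```python
# Illustrative antipodal check for a two-finger grasp (not the repo's actual metric).
import numpy as np

def antipodal_score(p1, n1, p2, n2, friction_coef=0.3):
    """Return 1.0 if both unit contact normals (the directions the fingers push)
    lie inside the friction cone around the line connecting the contacts, else 0.0."""
    axis = p2 - p1
    axis = axis / np.linalg.norm(axis)
    half_angle = np.arctan(friction_coef)  # friction cone half-angle
    ok1 = np.arccos(np.clip(np.dot(n1, axis), -1.0, 1.0)) <= half_angle
    ok2 = np.arccos(np.clip(np.dot(n2, -axis), -1.0, 1.0)) <= half_angle
    return float(ok1 and ok2)

# Example: two opposing contacts on the flat sides of an object.
print(antipodal_score(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                      np.array([0.02, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])))
```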
- Self-supervised task-relevance discovery in simulation
```
python pybullet_env/env_semantic_grasp.py --class_name nut --debug 0
```
By changing `--debug 0` to `--debug 1`, you can debug and visualize the process.
<p align="left">
<img src="./media/affordance_discovery.gif" width="250" />
</p>
The affordance results are saved in `data/object_models`. The heatmap file `XXX_affordance_vis` can be visualized as in the image below, where warmer areas indicate regions with a higher task-relevant grasping probability P(T|G).
<p align="left">
<img src="./media/affordance.jpg" width="150" />
</p>
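Conceptually, the per-point value P(T|G) can be read as the fraction of simulated grasps touching that region that also completed the downstream task. The following is a loose sketch of that counting idea with hypothetical, randomly generated trial data, not the repo's actual code:
```python
# Loose sketch of estimating a per-point task-relevance heatmap P(T|G).
# "contact_point_ids" and "task_success" are hypothetical arrays that would come
# from the self-supervised simulation trials.
import numpy as np

num_points = 2048                                                  # surface points on the object
contact_point_ids = np.random.randint(0, num_points, size=500)    # contacted point per trial
task_success = np.random.rand(500) > 0.5                          # downstream task outcome per trial

grasp_counts = np.zeros(num_points)
success_counts = np.zeros(num_points)
np.add.at(grasp_counts, contact_point_ids, 1)
np.add.at(success_counts, contact_point_ids, task_success.astype(float))

# P(T|G) per point: successes / attempts, left at 0 where no grasp touched the point.
affordance = np.divide(success_counts, grasp_counts,
                       out=np.zeros(num_points), where=grasp_counts > 0)
```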
- Make the canonical model that stores category-level knowledge
```
python make_canonical.py --class_name nut
```
<p align="left">
<img src="./media/canonical.jpg" width="150" />
</p>
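For intuition, the canonical space normalizes every instance of a category into a shared, per-axis-normalized frame so that dense correspondences can be established across instances. A rough sketch of such a normalization (not the repo's `make_canonical.py`) could look like:
```python
# Rough sketch of mapping an object point cloud into a normalized unit space,
# in the spirit of a non-uniform normalized object coordinate space.
import numpy as np

def to_normalized_space(points):
    """Shift the cloud to its minimum corner and scale each axis independently into [0, 1]."""
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    scale = np.maximum(maxs - mins, 1e-9)  # avoid division by zero on flat axes
    return (points - mins) / scale

cloud = np.random.rand(1000, 3) * np.array([0.05, 0.03, 0.01])  # hypothetical nut-sized cloud
canonical = to_normalized_space(cloud)
```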
- Generate training data of object piles
```
python generate_pile_data.py --class_name nut
```
<p align="left">
<img src="./media/0000048rgb.jpg" width="200" />
</p>
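Pile generation boils down to dropping object instances into a scene in physics simulation and recording the settled poses; the repo's `generate_pile_data.py` additionally renders and labels the result. A minimal, standalone PyBullet sketch of just the dropping part, with a hypothetical URDF path, is below:
```python
# Minimal sketch of dropping objects to form a pile in PyBullet
# (not the repo's generate_pile_data.py; the nut URDF path is hypothetical).
import numpy as np
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)                                     # headless physics
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")

object_ids = []
for i in range(10):
    pos = [np.random.uniform(-0.05, 0.05), np.random.uniform(-0.05, 0.05), 0.2 + 0.05 * i]
    object_ids.append(p.loadURDF("urdf/nut/nut.urdf", basePosition=pos))  # hypothetical path

for _ in range(2000):                                   # let the pile settle
    p.stepSimulation()

for oid in object_ids:
    print(p.getBasePositionAndOrientation(oid))
p.disconnect()
```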
- Process training data, including generating ground-truth labels
```
python tool.py
```
- To train NUNOCS net, examine the settings in `config_nunocs.yml`, then
```
python train_nunocs.py
```
- To train grasping-Q net, examine the settings in `config_grasp.yml`, then
```
python train_grasp.py
```
- To train instance segmentation net, examine the settings in `PointGroup/config/config_pointgroup.yaml`, then
```
python train_pointgroup.py
```