Home
This repository has the base scripts for the Niryo One robot (official docs). It allows us to run a real-world grasping demo using the Grasp Pose Detection (GPD) library for grasp generation and ArUco markers for camera pose estimation.
- GPD
- aruco markers
- irlab_point_cloud_filtering
- niryo_one_ros
- rosdocked-irlab (used to build docker image with all dependencies included)
To run the demo, the irlab_point_cloud_filtering repo should track the master branch. On the robopaas/rosdocked-kinetic-workspace-included image the repo may be tracking the two_pcl_filtering branch, so check the local repo's branch and run `git checkout master` if necessary.
Additionally, the files on_robot_calibrate_arm.py, on_robot_learning_mode.py and on_robot_motors_on.py should be copied to the robot and executed there when needed.
- `on_robot_calibrate_arm.py`: used to calibrate the arm before use.
- `on_robot_learning_mode.py`: used to turn the motors off.
- `on_robot_motors_on.py`: used to turn the motors on; otherwise the arm can't execute any trajectories.
If the robot cannot keep the joints still while in learning mode, edit on_robot_calibrate_arm.py to disable learning mode. Specifically, edit the niryo_demo() function into the following:

```python
def niryo_demo():
    rospy.init_node('moveit_niryo_robot')
    n = NiryoOne()
    n.calibrate_auto()
    n.activate_learning_mode(False)
```
We're using the RealSense D415 for this demo. If you encounter any problems, check realsense_info.txt in the main icclab_grasping_niryo directory; it might be of some help.
Any other Python script mentioned in this wiki is located in ~/icclab_grasping_niryo/scripts.
Niryo One Studio can be used to connect the robot to a WiFi network. You can find more information in the Niryo One docs.
Each robot has its own hostname set up; you can find it on a sticker on one side of the robot (format: niryo-x). You should be able to find the robot's IP by running `nslookup niryo-x`. At this point, you can edit your /etc/hosts file and add the robot's IP and hostname.
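As an illustration (the IP address below is a placeholder, not a real robot's address), the /etc/hosts entry would look like:

```
10.0.0.42    niryo-1
```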
To set up proper ROS communication between the robot and your local machine, run `export ROS_MASTER_URI=http://<niryo IP or hostname>:11311` and either `export ROS_IP=<local machine's IP>` or `export ROS_HOSTNAME=<local machine's hostname>`. This has to be done in every new terminal you bring up.
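For example, assuming the robot's hostname is niryo-1 and your machine's IP is 192.168.1.100 (both are placeholders for your own values), a new terminal would be set up like this:

```shell
# Point ROS at the master running on the robot (hostname is a placeholder).
export ROS_MASTER_URI=http://niryo-1:11311
# Advertise this machine by IP (placeholder) -- or set ROS_HOSTNAME instead.
export ROS_IP=192.168.1.100
echo "ROS master: $ROS_MASTER_URI, local IP: $ROS_IP"
```

Adding these lines to your ~/.bashrc avoids retyping them in every terminal.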
First you need to disable move_group from launching when bringing up the robot. To do that, go into the niryo_one_bringup/launch directory on your robot and disable the robot_commander.launch file by wrapping its contents in XML comment brackets. You'll need to restart the robot after this. To run move_group on your local machine, run `roslaunch niryo_one_bringup robot_commander.launch simulation_mode:=true`.
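For reference, "commenting out" a launch file means keeping the `<launch>` root element but wrapping its body in XML comments, roughly like this (the inner contents shown here are placeholders, not the actual file):

```
<launch>
  <!--
  <include file="..." />   original contents stay here, commented out
  -->
</launch>
```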
Any time you move the camera or the robot, you will need to recalibrate to get the correct camera pose with respect to the robot. To do this, first run `roslaunch icclab_grasping_niryo camera_niryo.launch` and then `python calibrate_camera.py`. This will launch a transform listener, which will save the robot-to-camera transformation parameters in camera_params.yaml for later use. If the camera and the robot are not moved between runs, you can skip this procedure. Also note that the markers are not needed after the calibration.
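The exact contents of camera_params.yaml depend on calibrate_camera.py, but conceptually it stores a robot-to-camera transform. A minimal sketch of writing and reading such a file with PyYAML (the key names and values here are illustrative assumptions, not the script's actual output):

```python
import yaml

# Hypothetical robot-to-camera transform: translation in metres plus a
# quaternion (x, y, z, w). Real values come from the tf listener.
camera_params = {
    "translation": [0.25, 0.0, 0.40],
    "rotation": [0.0, 0.0, 0.0, 1.0],
}

# Save the parameters, as calibrate_camera.py conceptually does.
with open("camera_params.yaml", "w") as f:
    yaml.safe_dump(camera_params, f)

# Read them back, as test_camera.py conceptually does for its broadcaster.
with open("camera_params.yaml") as f:
    loaded = yaml.safe_load(f)
print(loaded)
```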
To run the demo, start with `roslaunch icclab_grasping_niryo niryo_one_aruco_view_gpd.launch`. Then run `python test_camera.py`, which will create a transform broadcaster using the camera_params.yaml file. After this, all that remains is `python pick_and_place_niryo.py`. The neural net inference takes a while, so please be patient. After that the arm should grasp the object and then place it.
You can run the demo as many times as you want; all you need to do is rerun `python pick_and_place_niryo.py`. If for some reason the gripper is in the camera's field of view, you'll need to move it out of the way using the MoveIt RViz plugin: go into the Planning tab, under Select Start State choose current from the drop-down list, under Select Goal State choose resting, then click Plan, check that the trajectory looks OK, and click Execute.
1. Calibrate arm
2. Set up the connection between robot and remote machine
3. Switch the branch of irlab_point_cloud_filtering to master, `catkin build` from ~/catkin_ws/ and `source ~/catkin_ws/devel/setup.bash`
4. (Optional) `roslaunch niryo_one_bringup robot_commander.launch` from local machine / cloud
5. (Optional) `roslaunch icclab_grasping_niryo camera_niryo.launch` for the camera calibration
6. (Optional) `python calibrate_camera.py` in /icclab_grasping_niryo/scripts/
7. After the calibration is done, you can kill camera_niryo.launch
8. `roslaunch icclab_grasping_niryo niryo_one_aruco_view_gpd.launch`
9. `python test_camera.py` in /icclab_grasping_niryo/scripts/
10. `python pick_and_place_niryo.py` in /icclab_grasping_niryo/scripts/
Add to the bashrc of your laptop:

```shell
source ~/catkin_ws/devel/setup.bash
export ROS_MASTER_URI=http://niryo-1:11311
```

Then source it with the command below (it won't hurt):

`. ~/.bashrc`
For the demo:

On the robot, run:

`./calibrate_arm.py`

The launch files below are located in the package icclab_grasping_niryo. On your laptop, run:

`roslaunch icclab_grasping_niryo moveit_and_camera.launch`

Open a new terminal (usually Ctrl-T or Ctrl-Shift-T) and run:

`roslaunch icclab_grasping_niryo calibrate_camera.launch`

Wait for it and press Ctrl-C when it is done.

Start RViz, the grasping, the point cloud filtering and the camera transformation with the command below (the trailing `&` puts it in the background):

`roslaunch icclab_grasping_niryo rviz_camera_trans.launch &`

Press Ctrl-L to clear the screen and, in the same terminal, launch GPD:

`roslaunch icclab_grasping_niryo start_gpd.launch`

You should now see the robot arm grasp an available item.