We used the Unity game engine as our main simulation tool, together with a Kinova Jaco robotic arm. The arm uses machine learning models to detect objects with computer vision, then picks up the detected objects and places them at a predetermined position.
We implemented this project during our internship at Athena RC.
The Kinova Jaco arm is a 6-DOF robotic arm that can perform a wide range of tasks. In this project we use the j2n6s200 model of the Jaco arm.
To replicate the project on your computer, follow these steps:
- Clone the repo.
- Navigate to arm_ws (you need to have ROS Noetic or Melodic installed) and execute:
catkin_make
source devel/setup.bash
- Open the Jaco_arm project in Unity. We used Unity version 2020.3.11f1.
- Follow the Quick Installation Instructions in the Unity Robotics Hub tutorial.
- Change the ROS IP Address in Unity:
  - Go to ROS Settings and change the ROS IP Address.
  - Find the ROSConnectionPrefab and change the IP there too.
- Install the Python packages required for the detection models. They are listed in object_detection/installations.txt (see the import sanity check below).
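As a quick sanity check after installing, you can verify that the main detection dependencies import cleanly. The package list below is an assumption inferred from the models described later (TensorFlow, OpenCV, NumPy); object_detection/installations.txt is the authoritative reference.

```python
# Quick sanity check that the main detection dependencies are importable.
# The package list is an assumption inferred from the models used in this
# project; object_detection/installations.txt is the authoritative list.
for name in ("tensorflow", "cv2", "numpy"):
    try:
        module = __import__(name)
        print(f"{name}: OK ({getattr(module, '__version__', 'unknown version')})")
    except ImportError as exc:
        print(f"{name}: MISSING ({exc})")
```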
This repository contains files for simulating the robot arm in the Gazebo simulator using ROS. To launch the robot arm in Gazebo and control it using RViz, execute:
roslaunch moveit_config my_robot_planning_execution.launch
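Once Gazebo and MoveIt are up, the arm can also be commanded from a Python node instead of RViz. Below is a minimal sketch using moveit_commander; the planning group name "arm" and the named target "Home" are assumptions about this repository's moveit_config, so adjust them to match your setup.

```python
#!/usr/bin/env python3
# Minimal sketch: command the simulated Jaco arm through MoveIt.
# Assumptions (not verified against this repo's moveit_config): the
# planning group is called "arm" and a named pose "Home" is defined.
import sys

import moveit_commander
import rospy

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("jaco_move_demo")

arm = moveit_commander.MoveGroupCommander("arm")
print("End-effector pose:", arm.get_current_pose().pose)

arm.set_named_target("Home")  # plan and move to the assumed named pose
arm.go(wait=True)
arm.stop()

moveit_commander.roscpp_shutdown()
```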
We created three different Unity scenes, found in the Assets/Scenes directory.
The first scene, SampleScene, is mostly based on the Pick-and-Place tutorial from the Unity Robotics Hub. In a terminal, execute:
roslaunch jaco_unity simple_pick_place.launch
Open SampleScene and start the Unity simulation.
The second scene, GrapeLeavesScene, contains a few leaves, some with disease spots and some healthy. The robot carries a camera near its end effector and takes a photo of the table; if a spot is detected, the arm picks up the leaf and places it in the bin. To run this, execute in a terminal:
roslaunch jaco_unity camera_pick_place.launch
Then in another terminal:
python3 leaves_detection_model.py
Open GrapeLeavesScene and start the Unity simulation.
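For reference, a single detection pass of the kind this scene performs could look like the sketch below, which runs a YOLOv4 model through OpenCV's DNN module. The file names, class handling, and confidence threshold are illustrative assumptions; see leaves_detection_model.py for the actual implementation.

```python
# Hypothetical sketch of a single YOLOv4 detection pass with OpenCV.
# File names and the threshold are illustrative assumptions; see
# leaves_detection_model.py for the real pipeline.
import cv2

net = cv2.dnn.readNetFromDarknet("yolov4.cfg", "yolov4.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

image = cv2.imread("table_photo.png")  # photo taken by the Unity camera
class_ids, confidences, boxes = model.detect(image, confThreshold=0.5)

# Report whether any disease spot was detected, so the arm knows to pick.
spot_found = len(class_ids) > 0
print("disease spot detected:", spot_found)
```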
The third scene, GarbageSortingScene, contains some plastic bottles and tin cans. The robot carries a camera near its end effector, takes a photo of the table, and places the object classified with the highest confidence into the suitable bin. The classification is based on the material (metal, plastic, paper, and glass). To run this, execute in a terminal:
roslaunch jaco_unity camera_pick_place.launch
Then in another terminal:
python3 garbage_detection_model.py
Open GarbageSortingScene and start the Unity simulation.
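The core of the sorting logic, picking the single detection with the highest confidence and mapping its class to a bin, could be sketched as follows. This is a hypothetical example around a TensorFlow Object Detection API SavedModel; the model path, class-to-material mapping, and input file are assumptions, and garbage_detection_model.py contains the real logic.

```python
# Hypothetical sketch: choose the highest-confidence detection and map
# it to a bin. Model path, class mapping, and file names are assumptions;
# see garbage_detection_model.py for the real logic.
import numpy as np
import tensorflow as tf

detect_fn = tf.saved_model.load("ssd_mobilenet_v2_taco/saved_model")

image = tf.io.decode_png(tf.io.read_file("table_photo.png"), channels=3)
detections = detect_fn(tf.expand_dims(image, 0))

scores = detections["detection_scores"][0].numpy()
classes = detections["detection_classes"][0].numpy().astype(int)

best = int(np.argmax(scores))  # index of the most confident detection
material_by_class = {1: "metal", 2: "plastic", 3: "paper", 4: "glass"}
bin_name = material_by_class.get(classes[best], "unknown")
print(f"best detection: class={classes[best]} "
      f"score={scores[best]:.2f} -> {bin_name} bin")
```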
We implemented two Python scripts, one for garbage detection and one for leaf disease detection, using two already-trained models we found. The scripts interact with the Unity environment through the text files found in the Jaco_Robotic_Arm/Jaco_arm/temp_txt directory.
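As an illustration of this file-based exchange, the loop below shows how a detection script could poll for a new photo and write its verdict back for Unity to read. The file names and message format here are hypothetical; the actual protocol is defined by the scripts and the files in temp_txt.

```python
# Hypothetical sketch of the text-file handshake with Unity. File names
# and message formats are assumptions; the actual protocol is defined by
# the scripts and the files in Jaco_arm/temp_txt.
import time
from pathlib import Path

TEMP_DIR = Path("Jaco_arm/temp_txt")
REQUEST = TEMP_DIR / "photo_ready.txt"        # hypothetical: written by Unity
RESPONSE = TEMP_DIR / "detection_result.txt"  # hypothetical: read by Unity


def run_detection(photo_path: str) -> str:
    """Placeholder for the model inference (see the sketches above)."""
    return "plastic 0.93"


while True:
    if REQUEST.exists():
        photo_path = REQUEST.read_text().strip()  # path of the new photo
        RESPONSE.write_text(run_detection(photo_path))
        REQUEST.unlink()  # mark the request as consumed
    time.sleep(0.1)  # poll at 10 Hz
```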
Additional installation instructions for the Python libraries can be found in Jaco_Robotic_Arm/object_detection/installations.txt.
The leaves detection model (YOLOv4) was trained on a dataset created by Athena RC.
The garbage detection model (SSD MobileNetV2) was trained on the TACO dataset.
The pretrained garbage detection model, along with some inference code our script is based on, can be found at https://www.kaggle.com/code/bouweceunen/garbage-detection-with-tensorflow.