This repository is based on ultralytics/yolov5, modified by Seeed to better suit the AIoT hardware devices launched by Seeed.
Model | Input size | FLOPs | Params | Size (MB) | mAP@0.5 | mAP@0.5:0.95 |
---|---|---|---|---|---|---|
yolov5n6-xiao | 192×192 | 0.07G | 0.4M | 0.5 | 0.28 | 0.15 |
Equipment | Computing backend | Input | Format | Time (ms) |
---|---|---|---|---|
Intel | [email protected] | 192×192 | pt | 15 |
Himax6537 | 400MHz | 192×192 | tflite | 700 |
Install
Python>=3.7.0 is required, with all dependencies in requirements.txt installed, including PyTorch>=1.7:
$ git clone https://github.com/Seeed-Studio/yolov5-swift
$ cd yolov5-swift
$ pip install -r requirements.txt
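To quickly confirm the environment meets these requirements, you can run a short check in Python (a minimal sketch; it only verifies the Python and PyTorch versions stated above):

# Hypothetical environment check, not part of this repository
import sys
import torch

assert sys.version_info >= (3, 7), "Python>=3.7.0 is required"
print(f"Python {sys.version.split()[0]}, PyTorch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")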
Inference with detect.py
detect.py runs inference on a variety of sources, downloading the yolov5n6-xiao model automatically and saving results to runs/detect.
$ python detect.py --source 0  # webcam
                            file.jpg  # image
                            file.mp4  # video
                            path/  # directory
                            path/*.jpg  # glob
                            'https://youtu.be/NUsoVlDFqZg'  # YouTube
                            'rtsp://example.com/media.mp4'  # RTSP, RTMP, HTTP stream
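detect.py can also be called from Python instead of the shell. Upstream YOLOv5 exposes a run() function in detect.py; assuming this fork keeps that interface, a minimal sketch looks like this (the weights, source, and output path are illustrative values):

# Programmatic inference sketch, assuming the upstream detect.run() signature is unchanged
import detect

detect.run(
    weights='yolov5n6-xiao.pt',  # trained checkpoint
    source='file.jpg',           # accepts the same source types as the CLI above
    imgsz=(192, 192),            # match the 192x192 input size used in the tables
    project='runs/detect',       # results are saved under runs/detect
)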
Training
$ python train.py --data coco.yaml --cfg yolov5n6-xiao.yaml --weights yolov5n6-xiao.pt --batch-size 128
Export TFLite
The trained .pt model can be exported to an int8 TFLite model with the following command:
$ python export.py --data coco.yaml --cfg yolov5n6-xiao.yaml --weights yolov5n6-xiao.pt --imgsz 192 --int8
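To check the export, you can inspect the resulting model with the TFLite interpreter (a minimal sketch; it assumes TensorFlow is installed and that the export produced yolov5n6-xiao-int8.tflite, so adjust the filename to whatever export.py actually wrote):

# Inspect the quantized model's input/output tensors
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='yolov5n6-xiao-int8.tflite')  # assumed output name
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
print('input :', inp['shape'], inp['dtype'])   # expect a 1x192x192x3 quantized tensor
print('output:', out['shape'], out['dtype'])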
Export UF2
UF2 is a file format developed by Microsoft. Seeed uses this format to convert .tflite to .uf2, allowing TFLite models to be stored on the AIoT devices launched by Seeed.
Currently, Seeed's devices support up to 4 models, and each model (.tflite) must be smaller than 1 MB.
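Before converting, you can verify that a model fits this limit (a small sketch based only on the 1 MB constraint above; xxx.tflite is a placeholder filename):

# Check the .tflite file size against the per-model limit of Seeed's devices
import os

size = os.path.getsize('xxx.tflite')   # placeholder path
limit = 1024 * 1024                    # up to 4 models, each under 1 MB
print(f'{size / 1024:.1f} KiB -', 'OK' if size < limit else 'too large for the device')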
You can use -t to specify the index at which the model is placed.
$ python uf2conv.py -f GROVEAI -t 1 -c xxx.tflite -o xxx.uf2 # Place the model at index 1
$ python uf2conv.py -f GROVEAI -t 1 xxx.tflite -o xxx.uf2 # Place the model at index 1 and flash it