This directory contains instructions for viewing the video stream from a Tello drone in the browser. The video can then be processed with TensorFlow.js, running inference against the DroneAid model.
- Install Node.js
- Install FFmpeg (this may take a while):

  ```
  brew install ffmpeg
  ```

- Clone this repository:

  ```
  git clone https://github.com/Code-and-Response/DroneAid.git
  ```

- Change to the `tello-demo` directory:

  ```
  cd ../tello-demo
  ```

- Install dependencies:

  ```
  npm install
  ```

- In a terminal window, start the server from the `tello-demo` directory:

  ```
  npm start
  ```

- Open a browser and go to http://127.0.0.1:3000/
- Connect your computer to the Tello drone's WiFi
- Click **Start stream**
You should now see a live feed from the Tello drone in the browser window! If you enable the **Prediction** switch, inference will run against the video feed with the DroneAid model (using TensorFlow.js) and the browser will display the predictions annotated onto the video.

In some brief initial tests, the live feed (without running predictions) had less than one second of latency. With predictions turned on, latency increased to about 1-2 seconds.
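For reference, the sketch below shows the general pattern for this kind of in-browser inference: jsmpeg decodes the stream into a canvas, `tf.browser.fromPixels()` turns a frame of that canvas into a tensor, and the predictions are drawn onto an overlay. The model URL, element ids, preprocessing, and output parsing here are placeholder assumptions and will differ from the actual code in `droneaid-tello.js`.

```javascript
// Minimal sketch of in-browser inference against the decoded stream.
// MODEL_URL, the element ids, and the output handling are placeholders,
// not the actual values used in droneaid-tello.js.
import * as tf from '@tensorflow/tfjs';

const MODEL_URL = '/model/model.json'; // hypothetical path to the converted model
let model;

async function loadModel() {
  model = await tf.loadGraphModel(MODEL_URL);
}

// sourceCanvas: the canvas jsmpeg decodes into; overlay: a canvas stacked on top of it.
async function predictFrame(sourceCanvas, overlay) {
  const ctx = overlay.getContext('2d');

  // Capture the current frame as a tensor and add a batch dimension.
  const input = tf.tidy(() => tf.browser.fromPixels(sourceCanvas).expandDims(0));
  const output = await model.executeAsync(input);
  input.dispose();

  // Clear the overlay and draw boxes/labels parsed from `output`
  // (parsing depends on the model's output signature, omitted here).
  ctx.clearRect(0, 0, overlay.width, overlay.height);

  tf.dispose(output);
  requestAnimationFrame(() => predictFrame(sourceCanvas, overlay));
}

loadModel().then(() => {
  predictFrame(
    document.getElementById('stream-canvas'),  // hypothetical element ids
    document.getElementById('overlay-canvas')
  );
});
```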
Further performance improvements may be possible with changes to:

- the arguments passed to `ffmpeg` (in `server.js`), as sketched below
- the options used for `jsmpeg` (in `droneaid-tello.js`), also sketched below
- the DroneAid TensorFlow.js model
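As a starting point, here is a rough sketch of the kind of low-latency flags that can be passed when launching `ffmpeg` from Node.js. This assumes `server.js` spawns `ffmpeg` with `child_process` and relays MPEG-TS to the browser; the stream addresses and encoding settings are illustrative, not the values actually used.

```javascript
// Sketch only: assumes ffmpeg is launched from Node.js with child_process.
// The Tello sends H.264 video to UDP port 11111 once streaming is enabled;
// the output URL, bitrate, and frame rate below are illustrative.
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-fflags', 'nobuffer',         // do not buffer input before decoding
  '-flags', 'low_delay',         // favor latency over throughput
  '-probesize', '32',            // spend less time probing the input stream
  '-analyzeduration', '0',
  '-i', 'udp://0.0.0.0:11111',   // Tello video stream
  '-f', 'mpegts',                // MPEG-TS container, which jsmpeg expects
  '-codec:v', 'mpeg1video',      // jsmpeg decodes MPEG-1 video
  '-b:v', '800k',
  '-r', '30',
  'http://127.0.0.1:8081/stream' // wherever the WebSocket relay listens
]);

ffmpeg.stderr.on('data', (chunk) => process.stdout.write(chunk));
```

On the browser side, jsmpeg's player options are another place to trade buffering for latency. Again, the WebSocket URL, canvas id, and buffer size below are assumptions, not a quote from `droneaid-tello.js`:

```javascript
// Sketch only: the WebSocket URL, canvas id, and buffer size are illustrative.
const player = new JSMpeg.Player('ws://127.0.0.1:3000/stream', {
  canvas: document.getElementById('video-canvas'),
  audio: false,                 // the Tello stream is video-only
  videoBufferSize: 512 * 1024,  // a smaller buffer lowers latency but risks stutter
  pauseWhenHidden: false
});
```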
While `ffmpeg` is used here, it could be replaced with `mplayer` (or some other video streaming application).