Nano Yolo

    I grabbed an Nvidia Jetson Nano a few weeks ago and I've been playing with it a bit. The included tutorials are pretty interesting, with some very nice classification and regression demos. Within an hour I had trained a network to recognize how many fingers I was holding up (fairly robustly: different orientations, different hands, different people, varying distance/position/lighting). After a bit of trial and error I also trained a network to localize the handle on a coffee cup (not quite as robust, but still pretty cool). I'd love to see if I could use this board for imprecise object manipulation. Then I ventured beyond the Nvidia demos and started looking for a more standardized comparison to other systems. I decided to see how difficult it would be to deploy YOLO v3 on the Jetson Nano. Turns out: not that difficult.

    With some very slight re-configuration, you can run YOLO v3 on the Nano. Tiny YOLO runs at ~8 fps and full YOLO at 3-4 fps. I used a Raspberry Pi camera, but of course it works with USB webcams too. I threw together a few setup scripts to make the install process relatively painless: one downloads, configures, and builds YOLO v3, and the others do what you'd imagine they do. Those are posted on Gitlab.
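    For the curious, the "slight re-configuration" boils down to a few flags in darknet's Makefile. This is a sketch of the kind of settings involved (assuming the stock darknet source tree; not a copy of my actual script):

    ```makefile
    # Enable the GPU-accelerated build (CPU-only darknet is far too slow here)
    GPU=1
    # Use cuDNN for the convolution layers
    CUDNN=1
    # OpenCV is needed for the live camera demo window
    OPENCV=1
    # The Nano's Maxwell GPU is compute capability 5.3
    ARCH= -gencode arch=compute_53,code=[sm_53,compute_53]
    ```

    After `make`, something along the lines of `./darknet detector demo cfg/coco.data cfg/yolov3-tiny.cfg yolov3-tiny.weights` starts the live Tiny YOLO demo.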

    Something else to keep in mind: the Nano SoC heats up pretty good, so you'll want to add a fan. The included heatsink has holes for a 40mm 12V fan, and you can either secure it with zip ties or tap the holes (M3) and screw it in place. Nvidia graciously included an automatic fan control program, which can be enabled and set to run automatically on startup.
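    Under the hood, the fan is just a sysfs PWM file, so you can also drive it yourself. Here's a minimal sketch (not Nvidia's actual controller; the temperature thresholds are made up for illustration, while the sysfs paths are the standard Nano ones):

    ```shell
    #!/bin/sh
    # Map the SoC temperature (degrees C) to a fan PWM value (0-255).
    # Thresholds are illustrative; tune them for your own case/airflow.
    pwm_for_temp() {
        t=$1
        if [ "$t" -lt 40 ]; then echo 0
        elif [ "$t" -lt 50 ]; then echo 96
        elif [ "$t" -lt 60 ]; then echo 160
        else echo 255
        fi
    }

    # Read the SoC temperature (reported in millidegrees) and set the fan.
    # Guarded so the sketch is harmless on machines without the fan node.
    if [ -w /sys/devices/pwm-fan/target_pwm ]; then
        temp_mc=$(cat /sys/devices/virtual/thermal/thermal_zone0/temp)
        pwm_for_temp $((temp_mc / 1000)) > /sys/devices/pwm-fan/target_pwm
    fi
    ```

    Run from cron or a systemd timer, something like this gives you a crude temperature-stepped fan curve; the stock controller does the same job with less fuss.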

    I also made a printable case for the Nano because the cardboard box it comes in is bulky and flimsy and I wanted to be able to throw the board in my backpack without worrying about it.

    Next up, I'm going to see if I can get Nvidia's DOPE package or something similar running on this thing. For static object manipulation, 1 fps would be enough, so we'll see.