Laser-Guided Autonomous Cat Bot

Why?

As you may have picked up from a few of my recent posts, I have been studying up on AI, Machine Learning, and Deep Learning. I purchased a $99 NVIDIA Jetson Nano for testing, and NVIDIA was nice enough to send me the Nano's big sibling, the Xavier, as a thank-you for writing an article. After spending some time solving problems with stationary GPUs, I thought it would be fun to investigate an autonomous GPU on wheels. Sort of like a mini Tesla. NVIDIA has published the code as well as the hardware requirements for a Nano-controlled autonomous vehicle, JetBot, on GitHub. Once I had my JetBot's autonomous driving working, I thought it would be fun to give our bored cat some exercise, so I added a laser to the JetBot. This was fun and educational, so I thought I would share the process.

Hardware


There are many options for JetBot kits. I started with the SparkFun JetBot kit. That kit requires a lot of soldering, ended up being a bit top-heavy, and its collision avoidance code does not work with the version of CUDA provided. Unfortunate. I returned the SparkFun kit and purchased a Waveshare JetBot kit. The Waveshare kit is ideal: no soldering required, a low center of gravity, a compact metal chassis, small rechargeable batteries, a wireless game controller in the box, and all the software works. Oh, and it is less expensive. Yay. The 18650 batteries are not included, so I bought them online; many stores carry them, including Amazon.

The hardware install instructions for the Waveshare JetBot can be found here.

The only other hardware I bought was a small rechargeable laser pointer with an on switch.


Software


The main NVIDIA GitHub repository for JetBot is here. The Waveshare-specific software install instructions can be found here. NVIDIA has made testing different autonomous driving scenarios easy by controlling the JetBot through a Python Jupyter notebook.
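To give a sense of what that looks like, here is a minimal sketch of driving the JetBot from a notebook cell using the Robot class from the jetbot package. The method names follow the basic-motion example in the NVIDIA repository; the speeds here are just placeholders.

      from jetbot import Robot
      import time

      robot = Robot()             # interface to the JetBot's motor driver

      robot.forward(speed=0.3)    # roll forward at 30% power
      time.sleep(1.0)
      robot.left(speed=0.3)       # spin left in place
      time.sleep(0.5)
      robot.stop()                # always stop the motors when done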

I only made slight changes to the collision avoidance code for my JetBot. You can find all the changed code on my GitHub catbot page. Here is an overview of the code changes:
  1. collision_avoidance/live_demo.ipynb (my modified copy is live_demo-DF.ipynb)
    1. I added comments to remind myself how long each section of code takes to run before the Jupyter kernel goes from Busy to Idle and the next section can be run. The full collision avoidance notebook takes about five minutes to run before the JetBot starts moving.
    2. I added a GPU% slider traitlet alongside the existing blocked/free slider.
    3. I reduced the forward speed to 25%, the avoidance speed to 15%, and the blocked-probability threshold to 40%. The Waveshare motor controller and motors are quite fast, so lowering these numbers made a big difference in accuracy (see the update() sketch after this list).
    4. I added a half-second sleep before robot.stop(), which works around an issue where the robot sometimes fails to stop.
  2. Auto Startup
    1. catbot.py: Once I got everything working, my goal was for collision avoidance to start automatically whenever I switched on the JetBot. Our cat loves the laser so much that he is willing to stare at the unmoving red dot for the five-minute startup time. Jupyter notebooks are interactive, and I needed a non-interactive script to run at startup, so I exported live_demo-DF.ipynb to catbot.py and added a five-minute sleep at the end. The JetBot drives around the house for five minutes with the cat chasing it, then stops (see the catbot.py sketch after this list).
    2. catbot.sh: I needed a shell script that runs catbot.py at JetBot startup. It is a very simple script, which I placed in my home directory and made executable:
      #!/bin/bash
      cd /home/jetbot/Notebooks/collision_avoidance
      python3 catbot.py
    3. /etc/rc.local: To keep the startup process simple, I created the file /etc/rc.local and made it executable:
      #!/bin/sh -e
      #
      # rc.local
      #
      # This script is executed at the end of each multiuser runlevel.
      # Make sure that the script will "exit 0" on success or any other
      # value on error.
      #
      # In order to enable or disable this script just change the execution
      # bits.
      #
      # By default this script does nothing.
      /home/jetbot/catbot.sh > /dev/null 2>&1 &
      exit 0
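
To make the tuning in item 1.3 concrete, here is a minimal sketch of the decision callback with those values plugged in. It follows the structure of NVIDIA's collision avoidance notebook, where preprocess(), model, blocked_slider, camera, and robot are all defined in earlier cells; the constant names are mine, and the GPU% slider is omitted.

      import time
      import torch.nn.functional as F

      # preprocess, model, blocked_slider, camera, and robot come from earlier notebook cells
      FORWARD_SPEED = 0.25       # item 1.3: 25% forward speed
      TURN_SPEED = 0.15          # item 1.3: 15% avoidance (turn) speed
      BLOCKED_THRESHOLD = 0.40   # item 1.3: >= 40% blocked probability counts as blocked

      def update(change):
          x = preprocess(change['new'])    # latest camera frame -> model input tensor
          prob_blocked = float(F.softmax(model(x), dim=1).flatten()[0])
          blocked_slider.value = prob_blocked
          if prob_blocked < BLOCKED_THRESHOLD:
              robot.forward(FORWARD_SPEED)   # path looks clear, keep going
          else:
              robot.left(TURN_SPEED)         # probably blocked, turn away
          time.sleep(0.001)

      camera.observe(update, names='value')  # call update() on every new camera frame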
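And here is a sketch of how the tail of catbot.py can implement items 1.4 and 2.1: let the collision avoidance callback run for five minutes, then detach it, pause half a second, and stop the motors. The unobserve/stop sequence mirrors the notebook's shutdown cell; treat the exact names as assumptions rather than the code on my catbot GitHub page.

      # Tail of catbot.py: everything above this point is the exported notebook code
      import time

      time.sleep(300)                          # item 2.1: roam (and be chased) for five minutes

      camera.unobserve(update, names='value')  # stop feeding frames to the model
      time.sleep(0.5)                          # item 1.4: brief pause so the stop command sticks
      robot.stop()                             # halt the motors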

Summary


This was a lot of fun. It really is amazing that the NVIDIA Jetson Nano can process camera images in real time, use a neural network to decide whether its path is free or blocked, and drive around obstacles. Giving our cat something to do was icing on the cake.

I hope you have enjoyed this post and I welcome your feedback.


