Introduction: Candy-Throwing Robot With Arduino, Recycled Parts, and Dlib

With a few recycled parts, an Arduino with a motor shield, and the Dlib computer-vision library, you can build a working face-detecting candy thrower.

Materials:

  • Wood frame
  • Laptop/computer (preferably one more powerful than a Raspberry Pi!)
  • Arduino (preferably an Uno, or whatever board fits your motor shield)
  • Arduino motor shield (I used the old Adafruit shield, which is still sold here)
  • Any standard webcam
  • Small washer

Recycled parts:

  • Casing (an old metal box works well)
  • Stepper motor and DC motor from a disassembled old printer
  • Old printer power supply
  • Candy dispenser (a large yogurt container)

The innards of the finished product will look somewhat like the attached overview image.

Warning!

Make sure you unplug power to the Arduino/motor setup before wiring or rewiring, and double-check that the power is connected with the correct polarity before plugging it back in!

This is an intermediate-level project using an Arduino and software you need to install or compile on your computer. The instructions were tested working on Ubuntu; they may vary on other systems.

If you are not using the old Adafruit motor shield, you may need to adjust the Arduino code for whichever motor shield you are using.

Step 1: Drill, Connect and Mount Motor

Drill holes in the case and mount the stepper motor so that the webcam can move around on top and the dispenser can rotate at the bottom.

Stepper motors (4 wires) can move in small, precise increments, unlike DC motors (2 wires), which simply run forward or backward rather than in steps.

The DC motor's 2 wires can be connected either way. The optional stepper motor has 4 wires forming two coils; test with a multimeter's resistance setting to find which pairs of wires belong to which coil, as described here.
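The stepping idea can be sketched in a few lines: a bipolar stepper turns by energizing its two coils in a repeating four-state pattern. This is a simplified model for illustration only; the motor shield's library handles the actual drive sequence for you.

```python
# Full-step drive sequence for a two-coil (bipolar) stepper.
# Each tuple is (coil A polarity, coil B polarity); walking through
# the list in order advances the rotor one step at a time.
FULL_STEP = [(1, 1), (-1, 1), (-1, -1), (1, -1)]

def coil_states(step_index):
    """Return the coil polarities for a given step number (wraps around)."""
    return FULL_STEP[step_index % len(FULL_STEP)]

def walk(n_steps, direction=1):
    """List the coil states for n_steps steps; direction=-1 reverses."""
    return [coil_states(i * direction) for i in range(n_steps)]
```

Running the sequence backward (direction=-1) is exactly how the motor reverses, which is what lets the webcam pan both ways.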

If you are using the old Adafruit shield, as in my example, connect the DC candy-spinner motor to motor port #3 and the stepper to the first two motor ports (#1 and #2), as described in their documentation.

With the motor shield attached to the Arduino and the motors wired in, it's best to power the motors from a second supply: the DC output of a $2 wall wart from a secondhand store works well.

Step 2: Attaching the Container

Measure out the center of a yogurt container or other large plastic container, and drill a hole just bigger than the end of the motor shaft.

Attach the container to the box using a small washer: Krazy-glue it to the bottom of the container and to the electric motor's spindle.

Let the glue dry for a day or two to fully cure. You may want to place a small spacer between the box and the spinning container to make sure it doesn't set crooked.

Step 3: Make a Small Hole for Candy Thrower

Once the spinner is attached properly, use the tip of a sharp knife to outline a small hole for candy to be thrown out of - this should be just above the bottom, where the lip is.

(For best results, about 30 pieces of candy can be loaded onto the lid end of the dispenser, which will be the bottom.)

Keep gently outlining with a sharp knife until it pops out, leaving a small hole (you can expand the hole later as necessary).

Step 4: Arduino Setup

If you haven't already installed the Arduino software, get it from

https://www.arduino.cc/en/Main/Software

Any recent version should work.

Test the DC and stepper motors with the examples in your motor shield's example code.

If you happen to be using this shield (still available from some resellers) you can use my code directly:

https://github.com/programmin1/HowToTrainYourRobot...

Once you've tested and can dispense candy by entering "d" in the Arduino serial window, it's time to connect this to Dlib's recognizer.
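You can also send that "d" command from a short Python script instead of the Arduino serial window - this is the same trick the face-detection script uses later. A minimal sketch, assuming pyserial is installed, the device is /dev/ttyACM0, and the sketch listens at 9600 baud (adjust all three to match your setup):

```python
# Sketch: trigger one candy dispense over serial from the computer.
DISPENSE = b"d"  # single-character command the Arduino sketch listens for

def dispense_command():
    """The byte sequence that triggers one dispense on the Arduino side."""
    return DISPENSE

if __name__ == "__main__":
    import serial  # pip install pyserial
    # Port name and baud rate are assumptions -- match them to your board.
    with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port:
        port.write(dispense_command())
```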

Step 5: Dlib Setup

Dlib (http://dlib.net/) is an open-source, easy-to-use library for image recognition. Install the Python dlib module using:

sudo pip install dlib

- or -

sudo easy_install dlib

Wait for dlib to compile and install. (Preferably use a computer with at least a couple of gigabytes of RAM; otherwise the build will take a long time and you may need to expand swap space.) The code also uses OpenCV for webcam capture, so run:

sudo apt-get install python-opencv

Installation steps may vary if you use MacOS or other operating systems.

Now grab the face detection landmarks data from

http://dlib.net/files/shape_predictor_68_face_land...

Extract it (with Archive Manager or 7-Zip) and place it at HOME/Downloads/shape_predictor_68_face_landmarks.dat
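A quick way to double-check that the data file ended up where the script expects it (the path and filename here are simply the ones given above; adjust if your script looks elsewhere):

```python
import os

def predictor_path(home=None):
    """Expected location of the unpacked landmarks file (~/Downloads)."""
    home = home or os.path.expanduser("~")
    return os.path.join(home, "Downloads",
                        "shape_predictor_68_face_landmarks.dat")

if __name__ == "__main__":
    path = predictor_path()
    if os.path.isfile(path):
        print("Found landmarks file:", path)
    else:
        print("Missing! Extract the download to", path)
```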

Step 6: Connecting Face-recognizer to Your Robot

Plug the Arduino's USB cable into the computer and verify that the device file "/dev/ttyACM0" exists (this is the device the serial commands are sent to). If a different, similarly named device shows up in /dev when you plug it in, replace /dev/ttyACM0 in the faceDetectThreadCorrelationCV2FaceSmile.py file in the repo with that name.
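Rather than checking /dev by hand, you can pick up the first Arduino-like device automatically. A small helper, assuming Linux device naming (ttyACM for genuine boards, ttyUSB for many clones):

```python
import glob

def find_arduino_port(candidates=None, default="/dev/ttyACM0"):
    """Return the first ACM/USB serial device, or the default if none found."""
    if candidates is None:
        candidates = sorted(glob.glob("/dev/ttyACM*") +
                            glob.glob("/dev/ttyUSB*"))
    return candidates[0] if candidates else default

if __name__ == "__main__":
    print("Using serial port:", find_arduino_port())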

Plug in the webcam (if the computer you are using has no built-in webcam), and run that .py file from the command line or with the Run/F5 command in the Geany text editor. To use a second, external webcam on a laptop, change "VideoCapture(0)" to "VideoCapture(1)" in the script; that second webcam is the one you can place on the stepper motor atop the robot's box.
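The VideoCapture(0)/VideoCapture(1) choice above is just OpenCV's camera index: 0 is usually the built-in webcam and 1 the first external one. A small smoke test to confirm OpenCV can grab a frame before running the full script:

```python
def camera_index(external=False):
    """OpenCV device index: 0 for the built-in webcam, 1 for an external one."""
    return 1 if external else 0

if __name__ == "__main__":
    import cv2  # installed by python-opencv
    cap = cv2.VideoCapture(camera_index(external=False))
    ok, frame = cap.read()
    if ok:
        print("Got a frame:", frame.shape)
    else:
        print("No frame -- check the webcam connection or try index 1.")
    cap.release()
```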

If all goes well you will see an outline of a smile when a face is in front of the webcam.

Read the overview and the source code of the .py file to see the mathematics of how smile detection works from the facial landmark points that Dlib provides. :)
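To give a rough flavor of that math (a hypothetical heuristic for illustration, not necessarily the exact one in the repo): in Dlib's 68-point landmark scheme, points 48 and 54 are the mouth corners and points 0 and 16 are the outermost jaw points, so the mouth-to-jaw width ratio is one simple smile signal.

```python
def smile_ratio(landmarks):
    """Mouth width / jaw width from a list of 68 (x, y) landmark points.

    Indices follow Dlib's 68-point scheme: 48 and 54 are the mouth
    corners, 0 and 16 the outermost jaw points.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return dist(landmarks[48], landmarks[54]) / dist(landmarks[0], landmarks[16])

def is_smiling(landmarks, threshold=0.38):
    """The threshold is a guess -- tune it against your own webcam footage."""
    return smile_ratio(landmarks) > threshold
```

A wide mouth relative to the face suggests a smile; the actual script combines Dlib's landmarks in its own way, so treat this as a starting point for reading its source.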