Introduction: Egor V.2 - Robo-Animatronic
Egor V.2 - Robo-Animatronic (Carl Strathearn, University of Huddersfield: Multimedia Design 2013-14)
Step 1: Actuality and Artificiality (Egor V.2)
The title of my final-year Multimedia Design project, Actuality and Artificiality, is an adaptation of the post-modernist philosopher Jean Baudrillard's ideology in Simulacra and Simulation: the simulacrum as object and simulation as process. The robot can track movement and respond to questions typed on a wireless keyboard; voice recognition can also be implemented using the Mac's Dictation and Speech function in System Preferences, with input taken from the Xbox Kinect's internal quad-microphone array. The robot uses an adapted version of the Eliza algorithmic framework (a forerunner of assistants such as Siri) to respond to participants' questions. The response text is then output to an AppleScript voice app so it can be heard through the robot's internal speaker system.

The project is intended as an interactive exhibit, but it is highly adaptable and would suit museum displays, help desks or interactive theatre. The Eliza script can be changed to mimic any individual or language (it is currently based on Marvin from The Hitchhiker's Guide to the Galaxy) and can answer complex questions, making this a highly interactive and knowledgeable system. The voice can also be modelled on specific individuals, and the mouth and lips react to the sound level at the computer's audio output, so the lip movement stays more or less in time with the speech.

The system tracks people's movement via a Kinect sensor. The current build uses an open software library that tracks the nearest pixel to the sensor, but the script also includes a skeleton-tracking output, which can be activated by un-commenting the skeleton-tracking code and commenting out the point-tracking library. This means multiple individuals can be tracked and interacted with at once, allowing for larger audiences.
Full instructions, with image and video reference guides, are available at http://carlstrathearn2014.blogspot.co.uk/
Step 2: Equipment
1x wooden circle (20cm dia)
1 x check plate (aluminium) same size as wooden circle
Black paint
Processing free from Processing.org
Arduino free from Arduino.cc
Mac computer
Bolts and Nuts of Various sizes 1mm-4mm (m3)
2 x 12g micro servos
5 x hex screws
2 x meccano L brackets, small (fish plate)
3 mm wire (sturdy)
1 x kinect sensor
1 x kinect sensor stand
1 x arduino (leonardo)
1 x servo shield v2
3 x Robot L bracket
1 x Robot circular rotating base (aluminium not plastic) - (ebay) (around £30)
1 x 10kg servo (base)
3 x silver small 'short' robotic U brackets (ebay)
1 x guitar neck plate - black
1 x 15 kg servo (neck) u / d
3 x large robot brackets in black (ebay)
false teeth
Artificial eyes
Milliput (black)
3 x 9g micro servos
Guitar string (E)
2 x Springs from pegs
1 x pocket speaker (anker - front)
1 x pocket speaker (larger - back)
4 x metal rods 15 cm
1 x short 7 cm
8 x brass screw bricks from wire connector blocks
2 x circular servo horns
1 x meccano long plate
1 x breadboard
reel of tribind wire (90lb fishing wire)
10 x servo lead extension leads
2 x multifunctional robot brackets
8 x brass spacers (m4)
4 x m4 bolts 8 cm
2 x ball and socket joints (lego)
extension lead 2.1mm audio
1 x 5 kg servo (jaw)
1mm wire
black thermo plastic (small bag)
2 mm eyelets
Step 3: Base
Take the wooden circular base (1.1) and paint it black (1.2). Drill 4 holes around the outer edge of the base to attach the aluminium plate (1.3), matching the holes in the aluminium check plate. Drill an additional 4 holes to match the M4 screws (1.4) in the base of the robotic rotational base (1.5). The rotational base comes as a kit and needs assembling (look on eBay under "robot arm base"); it is a fairly straightforward installation and has a placement for the base servo (1.6). Attach the aluminium plate to the ends of the M4 screws on the rotational base, then attach the assembly to the wooden base. This completes the base structure.
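Before bolting the servo horn into the rotational base, it helps to drive the base servo to its centre position so the head starts out facing forward. Below is a minimal Arduino sketch for that, assuming the base servo's signal lead is on digital pin 2 and that the shield passes the standard pins straight through to the Servo library (both assumptions; change them to match your own shield and wiring):

#include <Servo.h>

Servo baseServo;          // 10 kg servo in the rotational base

void setup() {
  baseServo.attach(2);    // assumed: base servo signal on digital pin 2
  baseServo.write(90);    // centre the servo before fixing the horn
}

void loop() {
  // nothing to do; the servo simply holds its centre position
}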
Step 4: Neck
Take a multifunctional robot bracket (2.1) and attach a servo (2.2) and a large robotic U bracket (2.3). Take the L bracket and bolt it to the back of the multifunctional bracket using M3 bolts and nuts (2.4). Fit the other end of the L bracket to the upper aluminium platform of the rotational robotic base (2.5). Attach a large black U bracket to the servo via a circular servo horn on one side (2.6) and a ball and flange cup bearing on the other (standard formation). To the end of the large L bracket, attach another L bracket facing the opposite way to form an H shape. To the front of this mechanism, attach the guitar neck plate using the bolts supplied with the fitting to finish it off. This completes the neck set-up. (2.7)(2.8)
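Once the neck servo is mounted it is worth checking its travel slowly before the head is fully loaded. A small sketch like the one below sweeps it gently; the pin number (3) and the 60-120 degree range are assumptions, so adjust them to whatever clears your brackets:

#include <Servo.h>

Servo neckServo;                 // 15 kg neck (up/down) servo

void setup() {
  neckServo.attach(3);           // assumed: neck servo signal on digital pin 3
}

void loop() {
  // sweep slowly within an assumed safe range so nothing fouls the brackets
  for (int angle = 60; angle <= 120; angle++) {
    neckServo.write(angle);
    delay(20);
  }
  for (int angle = 120; angle >= 60; angle--) {
    neckServo.write(angle);
    delay(20);
  }
}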
Step 5: Jaw
The jaw is made up of a multifunctional bracket (3.1) with a small silver aluminium U bracket (3.2) attached to the end in standard formation (3.3). To make the lower jaw, take the lower part of the false teeth and some 3 mm metal wire (3.4), bend the wire into the shape of the lower jaw and use black Milliput to fix the teeth in place on the wire; at the ends, fix Meccano L fish plates into the Milliput and under the wire for extra support. Fit the wire through the holes in the U bracket and curl the wire to fix it in place, then use Milliput to seal these ends and form a stable structure. Follow the same procedure for the upper jaw: take two multifunctional brackets and attach them back to back, then use two small L-shaped Meccano parts to secure the upper jaw wire to the two multifunctional brackets. Example set-up (3.5), video (3.6).
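To check that the wire jaw opens and closes freely before the lips go on, a short test sketch helps. This sketch assumes the 5 kg jaw servo sits on digital pin 4 and that roughly 80 degrees is closed and 110 degrees is open; both angles are assumptions you will need to tune against your own linkage:

#include <Servo.h>

Servo jawServo;                  // 5 kg jaw servo

const int JAW_CLOSED = 80;       // assumed closed angle, tune to your linkage
const int JAW_OPEN   = 110;      // assumed open angle

void setup() {
  jawServo.attach(4);            // assumed: jaw servo signal on digital pin 4
}

void loop() {
  jawServo.write(JAW_OPEN);      // open the jaw
  delay(600);
  jawServo.write(JAW_CLOSED);    // close it again
  delay(600);
}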
Step 6: Eyes
Take two medium aluminium U brackets (4.1) and bolt them back to back, ][ like this (4.2); there are four holes at each end of the bracket. These holes are used to thread the 3 mm x 100 mm straight rods through (4.3). To make the eyes move up and down we use the top holes, and for left and right we use the left hole on the left side and the left hole on the right side (4.4). Attach the 70 mm shorter rods (4.5) to the ends of the upper (eyes up/down) and lower (eyes left/right) rods with brass bricks taken from cable connector blocks (4.6), and bolt the frame together with the supplied bolts. This forms two large U shapes that slide in and out of the holes in the two medium aluminium U brackets.

Take the artificial eyes (4.7) and, using Milliput (4.8), secure the socket joint of the ball-and-socket element (4.9) into the middle of the back of each eye. Then take 4 small brass screw eyelets (2 for each eye) (4.2.1) and secure them into the Milliput. These eyelets should be level with the rods coming out of the upper two medium aluminium brackets. On each end of the rods, bolt a brass wire connector block halfway onto the metal rod. Make a small loop from 1 mm wire, attach it to the spare hole in the end of each brass connector and secure it in place with Milliput. Take the ball part of the ball-and-socket joint and secure it to the middle lower part of each of the medium aluminium U brackets, fixing it onto the bracket with black thermoplastic (4.2.4). Connect the ball-and-socket joints together, run 1 mm wire through each of the brass eyelets, close the eyelets with pliers and tighten the 1 mm wire (with caution) to form a snug fit. Use Milliput on the remaining parts of the brass connectors and 1 mm wire loops to secure them in place.

Take two micro servos and glue them to each other upside down by their bases. Add standard plastic servo arms, then take some 3 mm wire and cut two 4 cm strips. Loop one end of each strip and fit it through one of the holes in the servo arm, and loop the other end around the back part of the large U bracket that controls the eyes up/down and left/right. Secure this in place on each side with thermoplastic (this allows the loop to turn with the servo arm but not move along the metal rod). Attach the two servos to the upper part of the back of the robot with Milliput (see picture). This completes the eye mechanism; see the video for test footage and the sketch below for a quick servo check.
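For a quick check of the finished eye mechanism, the sketch below wiggles the two eye servos a few degrees either side of centre. The pin numbers (5 for up/down, 6 for left/right) and the 15 degree deflection are assumptions; start small and only increase the throw once you are sure nothing binds:

#include <Servo.h>

Servo eyesUpDown;                 // micro servo driving the upper (up/down) rods
Servo eyesLeftRight;              // micro servo driving the lower (left/right) rods

void setup() {
  eyesUpDown.attach(5);           // assumed pins; match them to your shield
  eyesLeftRight.attach(6);
  eyesUpDown.write(90);           // start both axes centred
  eyesLeftRight.write(90);
  delay(500);
}

void loop() {
  // small deflections only: the 1 mm wire links do not tolerate large throws
  eyesUpDown.write(75);     delay(400);
  eyesUpDown.write(105);    delay(400);
  eyesUpDown.write(90);     delay(400);
  eyesLeftRight.write(75);  delay(400);
  eyesLeftRight.write(105); delay(400);
  eyesLeftRight.write(90);  delay(400);
}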
Step 7: Lips.
Attach 2 micro servos (5.1) to each side of the middle head bracket (the aluminium multifunctional robot brackets bolted back to back), one servo here > ][ < on here (5.2). These motors will drive the lips. To make the lips, take a long 18-hole Meccano strip (5.3) and attach brass wire blocks (5.4) to the middle and either side, securing them in place with black thermoplastic (5.5). Over the end of each block place a small spring from a peg (5.6); this will retract the lips back into place. Take the guitar string (5.7) and cut it to size using the teeth as a guide; leave plenty spare on each end so you can play around with getting the fit right before securing each end to the frame with Milliput. Take some thick fishing wire (5.8), thread it through the arms of the micro servos and through the brass blocks, and secure it to the upper lip with a knot. Secure it in place with Milliput (I used metal clasps in the picture but replaced them because they kept slipping along the guitar string). Use the same set-up for securing the lower lip into the bottom jaw aluminium L bracket (5.9) and secure it with thermoplastic. Attach a micro servo to the underside of the bottom jaw using two bolts and thermoplastic. Take a brass block wire connector, attach it to the middle of the lower jaw (underneath) and use Milliput to secure it in place. Run wire from the servo arm through the brass block, attach it to the lower lip with a knot and secure it with Milliput.
This completes the lip functions; see the test video and photos of the finished set-up, and the short test sketch below.
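Because the peg springs return the lips, the servos only ever need to pull against them and then release. A rough test sketch, assuming the two upper-lip servos sit on pins 7 and 8 and the lower-lip servo on pin 9 (all assumptions, as are the angles), looks like this:

#include <Servo.h>

Servo upperLipLeft;    // micro servos either side of the middle head bracket
Servo upperLipRight;
Servo lowerLip;        // micro servo under the bottom jaw

void setup() {
  upperLipLeft.attach(7);     // assumed pins; change to suit your wiring
  upperLipRight.attach(8);
  lowerLip.attach(9);
}

void loop() {
  // pull the lips back against the peg springs...
  upperLipLeft.write(120);
  upperLipRight.write(60);    // mirrored servo turns the opposite way
  lowerLip.write(120);
  delay(500);
  // ...then relax and let the springs pull the lips home
  upperLipLeft.write(90);
  upperLipRight.write(90);
  lowerLip.write(90);
  delay(500);
}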
Step 8: Fixing the Head Together and Securing Speakers
Attach the eye mechanism to the top of the multifunctional brackets (bolted back to back in the middle of the head); use thermoplastic to secure it in place, as this gives a strong, tight hold (6.1). Take the bigger pocket speaker (giz wiz) (6.2) and place it at the back of the robot (it will fit perfectly between the large upper black U bracket if you buy the one stated in the equipment list). Take the smaller speaker (6.3) and secure it to the upper top part of the jaw mechanism. Extend the reach of the cables using 2.1 mm extension leads.
Step 9: Arduino Set Up and Audio Trigger
Take an Arduino and place the servo shield on top of it, then plug each of the servos into a digital port on the board. (Note: I run a 6 V 3 A PSU/battery into the servo shield to power the motors; remember to remove the crossover on the shield if you are going to do this.) Take a 2.1 mm headphone/audio cable and remove the female end. Take the wires and twist the grounds together and the signals together. Place the ground in a ground port on the Arduino and the signal into one of the analog ports. (It does not matter which port, because you can change the code to suit the port of your liking.) Plug the other end of the cable into the audio output of an Apple laptop or computer.
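As a starting point for the audio trigger, a sketch along these lines reads the level on the analog port and maps it onto the jaw servo, so the mouth opens roughly in time with the speech. It assumes the signal wire is on A0, the jaw servo is on digital pin 4 and the shield simply passes the Arduino's own pins through to the servo headers; the ceiling value and angle range will need tuning for your speaker volume and linkage:

#include <Servo.h>

Servo jawServo;                     // 5 kg jaw servo

const int AUDIO_PIN   = A0;         // assumed: audio signal wire on analog port A0
const int JAW_PIN     = 4;          // assumed: jaw servo on digital pin 4
const int JAW_CLOSED  = 80;         // tune these angles to your own jaw linkage
const int JAW_OPEN    = 110;

void setup() {
  jawServo.attach(JAW_PIN);
  jawServo.write(JAW_CLOSED);
}

void loop() {
  // take the loudest reading over a short window to smooth out the waveform
  int peak = 0;
  for (int i = 0; i < 50; i++) {
    int level = analogRead(AUDIO_PIN);
    if (level > peak) {
      peak = level;
    }
  }
  // map the audio level onto a jaw angle; 300 is an assumed ceiling for a
  // line-level signal, so raise or lower it to match your volume setting
  int angle = map(constrain(peak, 0, 300), 0, 300, JAW_CLOSED, JAW_OPEN);
  jawServo.write(angle);
  delay(20);
}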
Step 10:
Step 11: Processing Script and Kinect Libraries
First of all we need to download the Kinect libraries for Processing (look them up on Google). Make sure you install the point-tracking Kinect library into your Processing libraries folder. When you can run the demo for point tracking, you know it is installed correctly on your system. We also need to install the Eliza chatbot library, which comes with a test package.
You will need to make your own background image for your program and call it bg1.jpeg.
You will also need to make your own Eliza script; I've called mine something like newscript.txt, but call it what you want, or un-comment the internet sample code if you just want to use the original.
You can also comment out point tracking and enable skeleton-tracking mode to interact with multiple participants.
Step 12: AppleScript Voice App and Voice Recognition Output
Make an AppleScript app with the following:
-- built-in macOS voices you can choose from
set theVoices to {"Alex", "Bruce", "Fred", "Kathy", "Vicki", "Victoria"}
-- read the latest Eliza response written to test.txt on the desktop
set thePath to (path to desktop as Unicode text) & "test.txt"
set the_file to thePath
set the_text to (do shell script "cat " & quoted form of (POSIX path of the_file))
-- copy the text to the clipboard and read it back into theSentence
set the clipboard to the_text
set theSentence to the clipboard
log (theSentence)
-- speak the response; swap "Bruce" for any voice in theVoices above
say theSentence using "Bruce" speaking rate 140 modulation 5 pitch 15

-- helper for reading a file from a POSIX path (not called in the main flow above)
on readFile(unixPath)
    return (do shell script "cat /" & unixPath)
end readFile
To use voice recognition, simply activate the Dictation function on your Apple Mac from System Preferences and direct its output into the Processing/Eliza chat interface instead of using a keyboard. (You will need to set up the microphones in the Kinect sensor for this to work.)
Step 13: Setting Up the Kinect Sensor
This is a little bit tricky, but if you buy a Kinect stand or make one you can move the sensor around and find the best position for your robot. I find this easier than re-programming the positions every time, although re-programming is sometimes practical if the robot is going to be stationary for a long period, as it lets you refine the positions.