Introduction: Robot - Voice Recognition

About: Electronic Engineering student, Universidad de los Llanos!

I apologize if you find spelling errors or awkward phrasing; my native language is Spanish and translating has not been easy. I will keep improving my English so I can continue writing Instructables.

Step 1: Parts.

It is always interesting to find parts like these when planning the construction of a robotics project: they are cut from 2 mm aluminum, which makes them light and very strong.

These parts are designed and manufactured by a company called Lynx Motion; on its YouTube channel, the same company offers a step-by-step tutorial on how to build a biped robot like the one shown in this Instructable.

For this project the parts were designed in SolidWorks, a fairly stable platform with many design options. Here are some screenshots:

The interesting thing about using these parts in robotics projects is that they allow a variety of designs: instead of being limited to a single layout, they give us the opportunity to test our imagination and build whatever we want. These are some of the possible designs:

And now the good news: attached to this Instructable is a file with the biped robot's parts, ready to download.

Procedure:

Step 2: Battery.

Materials:
  1. 6 Li-Ion batteries.
  2. Mini USB 1A lithium battery charger module.

Li-Ion battery

Specifications:
  Manufacturer: SAMSUNG
  Battery type: Li-Ion
  Battery size: MR18650
  Capacity: 2200 mAh
  Nominal voltage: 3.6 V
  Dimensions: Ø18.25 x 65 mm
  Maximum current: 4400 mA

Mini USB 1A lithium battery charger module

Specifications:
  Chipset: TP4056
  Interface: Mini USB
  Charge type: linear
  Input voltage: 4.5 V - 5.5 V
  Full-charge voltage: 4.2 V
  Accuracy: ±1.5%
  Operating temperature: -10 °C to +85 °C

Scheme:

This arrangement is used for the following reason: the Mini USB 1A lithium charger module only lets me charge batteries in parallel, so on its own I can only get 3.7 V at the output; but if I switch the batteries into series, I get 7.2 V, sufficient for driving the robot.

When charging, I leave switch 1 open and close switches 2 and 3, so the batteries are in parallel. To get the 7.2 V I do the opposite, exactly as in the picture.
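As a sanity check on the numbers above, the pack voltage and capacity follow from the simple series/parallel rules: voltages add across series groups, capacities add across parallel cells. A quick sketch, assuming the six 3.6 V / 2200 mAh cells listed above are wired as two series groups of three parallel cells when running (a 2S3P layout, my assumption about what the switches produce):

```python
# Series/parallel battery pack arithmetic for the 3.6 V, 2200 mAh Li-Ion cells above.
CELL_VOLTAGE = 3.6      # nominal volts per 18650 cell
CELL_CAPACITY = 2200    # mAh per cell

def pack(series_groups, parallel_per_group):
    """Return (voltage, capacity_mAh) of an SxP pack.

    Voltage adds across series groups; capacity adds across the
    parallel cells inside a group.
    """
    voltage = series_groups * CELL_VOLTAGE
    capacity = parallel_per_group * CELL_CAPACITY
    return voltage, capacity

# Charging configuration: all six cells in parallel (1S6P)
print(pack(1, 6))   # 3.6 V nominal, 13200 mAh
# Running configuration: two series groups of three (2S3P)
print(pack(2, 3))   # 7.2 V nominal, 6600 mAh
```

This matches the text: charging in parallel gives one cell's voltage, while switching to series doubles it to 7.2 V for the robot.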

Step 3: Pi NoIR Camera As a Server.

After leaving my Raspberry Pi idle for several months, I plan to give it the use it deserves: I will build an inexpensive home video server. It is true that there are IP cameras on the market at a set price, but none of them give you the possibilities the Raspberry Pi does.

Materials:

  1. Raspberry Pi
  2. Pi NoIR camera

Procedure:

Connect the camera's ribbon cable.

Enable the camera with raspi-config:

sudo raspi-config

and select the option to enable the camera.

Now we connect via SSH, or open LXTerminal, and run the update commands:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install rpi-update
sudo rpi-update

Be patient, this takes a while; then reboot with:

sudo reboot

Now install the required libraries:

sudo apt-get install -y libjpeg62 libjpeg62-dev libavformat53 libavformat-dev libavcodec53 libavcodec-dev libavutil51 libavutil-dev libc6-dev zlib1g-dev libmysqlclient18 libmysqlclient-dev libpq5 libpq-dev

Next, run the following commands to create the folder and download the software:

cd /
mkdir mmal
cd mmal
wget https://www.dropbox.com/s/xgny7z7ltluussa/motion-mmal.tar.gz
tar -zxvf motion-mmal.tar.gz

Edit the configuration file with the following command:

sudo nano motion-mmalcam.conf

In the nano editor you can search for a word with Ctrl+W: type the word and press Enter.

You must make the following changes in the configuration file.

width 640
height 480
target_dir /home/pi/m-video
output_pictures off
text_left Pi-cam %t
logfile  /home/pi/mmal/motion.log

With this configuration we set the resolution and a folder, /home/pi/m-video, where the recorded video will be stored. Once the changes are made, save and close the file. In the same folder, rename the binary so it is easy to distinguish:

mv motion motion-mmal

To start the software, use the command:

./motion-mmal -n -c motion-mmalcam.conf

Now you can view the camera output on port 8081 using the Raspberry Pi's IP address, like this:

http://ip_raspberry_pi:8081

To close it, simply press Ctrl+C. A sample screen is shown below.

For the startup script

sudo nano startmotion

type the following

#!/bin/sh
nohup ~/mmal/motion-mmal -n -c motion-mmalcam.conf 1>/dev/null 2>&1 </dev/null &

Save the file. For the stop script:

sudo nano stopmotion

Type the following

#!/bin/sh
ps -ef | grep motion-mmal | awk '{print $2}' | xargs kill

Save the file and use the following commands.

chmod 755 startmotion
chmod 755 stopmotion

This makes both files executable. To use them, simply run:

./startmotion
./stopmotion

Step 4: USB Sound Card

As you know, our dear 'made in UK' board has an audio output where we can connect powered speakers or headphones, but if we want better sound quality, or an audio input for recording, we need an external sound card.

Material:

  1. Raspberry Pi.
  2. USB sound card and a microphone.

The features of these USB cards are much more limited, since they usually have only one audio input and one output, but the good thing about them is their price: I recently bought one on eBay for just 2 euros (shipping included), a saving of more than €20 compared to Wolfson cards. This type of sound card is ideal for small projects. Here is an image of the one I purchased:

This particular sound card uses a C-Media audio chipset, 100% supported by ALSA (Advanced Linux Sound Architecture) in Raspbian. In this tutorial we will learn how to install the card and set it as the default audio input and output device.

Preparing the Raspberry Pi

First, boot the Raspberry Pi with the sound card connected to a USB port. Once the operating system has finished booting, enter this command in the console to list all connected USB devices:

lsusb

It will display something like this, listing the connected USB devices:

The USB audio card has been detected correctly; you will see it in the last position as 'C-Media Electronics, Inc. Audio Adapter'. Since I want to use the USB card rather than the Raspberry Pi's own audio output, I had to edit the configuration file that controls the operating system's audio devices. Open /etc/modprobe.d/alsa-base.conf in a text editor with this command:

sudo nano /etc/modprobe.d/alsa-base.conf

The file will look like this:

Now edit the line that says options snd-usb-audio index=-2. This line prevents the USB audio device from being used as the default device; put a # in front of it to comment it out, so it reads:

# options snd-usb-audio index=-2

Optionally, you can add or edit these two lines, which force the Raspberry's built-in audio output into the background:

options snd-usb-audio index=0
options snd_bcm2835 index=1

Finally, press Ctrl+X and then Enter to save the changes to the file, then restart the Raspberry with this command:

sudo reboot

Testing the new audio settings:

If everything went well, the USB card should now be the default audio device. Upload an audio file to the Raspberry and try to play it with the following command:

aplay /home/pi/test.wav

If you've done everything right you'll hear the audio without problems.

Adjusting the input/output volume

The audio might be too quiet or too loud; to adjust the input or output volume, enter the following command in the console:

alsamixer

It will look like this. After adjusting the levels, save them permanently with:

sudo alsactl store

END

Step 5: Voice Recognition

Of all the projects that haunted my mind, the one I found most interesting is speech recognition: not just for transcription, but for performing actions with the Pi's GPIO.
So, after a few days of trials and tests, I managed to set up continuous listening on the voice input, executing commands such as turning LEDs connected to the GPIO on or off. From here you can do whatever your imagination allows. Do you want to know how to do this on your Raspberry? Well, keep reading!

Materials:

  1. Raspberry Pi

The first thing we do is install the free voice recognition software, called PocketSphinx.

To do this, execute the following commands (some require elevated privileges). This installs and updates the software, and gives preference to the USB audio card input when doing voice recognition.

sudo apt-get install rpi-update
sudo apt-get install git-core
sudo rpi-update

PocketSphinx:

We recommend using the latest versions of the software.

Once installed, you can run ./pocketsphinx_continuous to try whether it recognizes your voice. If you get an error that it cannot find the audio input, run it as administrator with sudo.

Now we will create a dictionary with the words you want to assign to actions.
This helps the software perform speech-to-text, since it only has to match the audio input against the entries we create in the dictionary.

In my case, I created a dictionary with a few words, such as: GREEN ON, RED ON, GREEN OFF, RED OFF, to turn the LEDs on and off. To create the dictionary, make a plain text file (.txt) with one word or phrase per row.
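The word list really is just plain text, one phrase per line, so it can also be generated with a couple of lines of Python. A minimal sketch (the file name corpus.txt and the exact phrase list are my own choices for illustration):

```python
# Write a word list for the language-model tool: one command phrase per line.
commands = ["GREEN ON", "GREEN OFF", "RED ON", "RED OFF", "TEST", "EXIT"]

with open("corpus.txt", "w") as f:
    for phrase in commands:
        f.write(phrase + "\n")

# Quick check: read the file back
print(open("corpus.txt").read())
```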

Then go to the following address (LM-TOOL) and upload the txt file.

It will show the links to download your dictionary.

Download and unzip the TAR into the folder where you will run the scripts. For the first tests, go to the dictionary folder and run the following command:

sudo pocketsphinx_continuous -lm 9640.lm -dict 9640.dic > capture.txt -samprate 16000/8000/48000

Replace 9640.lm and 9640.dic with the number of the dictionary you have just created.

While the command is running you can speak the words of the dictionary; once you finish (Ctrl+C to close), you can read the capture.txt file to check that everything was recognized correctly.

Now we connect the LEDs and start playing with them. To begin, connect the LEDs to pins GPIO17 and Ground, and GPIO1 and Ground.


Now run the following commands to export the pins and set them as outputs:

echo 17 > /sys/class/gpio/export
echo 1 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio17/direction
echo out > /sys/class/gpio/gpio1/direction

We can turn them on and off by changing the value with the following commands:

echo 1 > /sys/class/gpio/gpio17/value
echo 0 > /sys/class/gpio/gpio17/value
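The export/direction/value writes above follow a fixed pattern, so they can be wrapped in a small Python helper. This is a sketch of my own (not part of the original project); the write function is injected so the logic can be exercised without real hardware, and sysfs_write shows what you would pass on an actual Pi:

```python
# Minimal sysfs GPIO helper mirroring the echo commands above.
# write_fn(path, data) is injected so the logic works without hardware.

def gpio_path(pin, node):
    """Path of a sysfs GPIO control file, e.g. /sys/class/gpio/gpio17/value."""
    return "/sys/class/gpio/gpio%d/%s" % (pin, node)

def setup_output(pin, write_fn):
    """Export the pin and configure it as an output."""
    write_fn("/sys/class/gpio/export", str(pin))
    write_fn(gpio_path(pin, "direction"), "out")

def set_value(pin, value, write_fn):
    """Drive the pin high (1) or low (0)."""
    write_fn(gpio_path(pin, "value"), str(value))

def sysfs_write(path, data):
    """Real write function for use on the Pi itself."""
    with open(path, "w") as f:
        f.write(data)
```

On the Pi you would call setup_output(17, sysfs_write) followed by set_value(17, 1, sysfs_write), which performs exactly the same writes as the echo commands.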

Now, to start triggering actions from voice commands, I've written some Python scripts. The idea of this code is that it runs PocketSphinx and keeps listening, so that every time a command is recognized it automatically checks it and executes the corresponding action.

So we can create the following files and copy their content for testing. Remember that the words I use were created in my dictionary. You can also see that, although commented out, the scripts call a program named Festival, which does exactly the opposite: it is a speech synthesizer, in case we want the Raspberry to answer every time we perform an action.

File read.py:

#!/usr/bin/python
import os
import time

i = 0

while i != 1:
    # PocketSphinx appends recognized text to capture.txt; scan it for commands
    infile = open('capture.txt', 'r')

    for line in infile:
        if line.find("GREEN OFF") != -1:
            os.system("echo 0 > /sys/class/gpio/gpio17/value")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Green led off\")'")
        if line.find("GREEN OK") != -1:
            os.system("echo 1 > /sys/class/gpio/gpio17/value")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Green led ON\")'")
        if line.find("RED OFF") != -1:
            os.system("echo 0 > /sys/class/gpio/gpio1/value")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Red led Off\")'")
        if line.find("RED OK") != -1:
            os.system("echo 1 > /sys/class/gpio/gpio1/value")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Red led ON\")'")
        if line.find("TEST") != -1:
            os.system("echo 1 > /sys/class/gpio/gpio1/value")
            os.system("echo 1 > /sys/class/gpio/gpio17/value")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Green and red led ON\")'")
        if line.find("EXIT") != -1:
            os.system("sudo pkill -9 pocketsphinx")
            os.system("true > capture.txt")
            #os.system("festival -b '(SayText \"Goodbye!\")'")
            i = 1

    infile.close()
    time.sleep(2)

And now the all.py file, which runs PocketSphinx in the background:

#!/usr/bin/python
import os
import subprocess, time

os.system("rm capture.txt")
#os.system("./shut.py &")
os.system("sudo pocketsphinx_continuous -lm 3906.lm -dict 3906.dic > capture.txt -samprate 16000/8000/48000 &")
os.system("./read.py &")

Now we give both programs execute permission:

chmod +x all.py read.py

And we can run all.py for testing.

This piece of code is still in development and I will keep trying to improve it. I offer it freely for anyone to use and play around with.


Step 6: Glasses.

Materials:

  1. Balsa wood
  2. Silicone

Construction:

The construction process is simple: using balsa wood, build a box (the glasses) that fits the face perfectly, trying to eliminate any discomfort it may cause.

We started to build the basic structure of the box.

Next, shape the side boards of the box to the exact contour of your face.

Besides that, we cut a vertical slot in the box so that the mobile phone can be inserted.

Now we just need to build the electrical circuit.

Step 7: Glasses Circuit.

We divide this into two stages: the glasses circuit and the servos.

Let us begin with stage 1: the glasses circuit.

Gyroscope MPU6050

The InvenSense MPU-6050 sensor contains, in a single package, a 3-axis MEMS accelerometer and a 3-axis MEMS gyroscope. With the gyroscope we can measure a body's angular velocity around its own axes, while the accelerometer measures acceleration along each axis. It is very accurate, as it has a 16-bit AD (analog-to-digital) converter for each channel, capturing the x, y and z channels simultaneously. The sensor uses the standard I²C communications protocol, making it easy to interface with the Arduino world.

Connections for Arduino Uno:

  GY-521  ->  ARDUINO UNO
  VCC     ->  3.3V
  GND     ->  GND
  SCL     ->  A5
  SDA     ->  A4

N.B.: The layout and connections shown are for the Arduino Uno, but the tutorial is also valid for every other Arduino board. The only thing that changes is the two I2C pins, SDA and SCL (e.g. on the Arduino Uno SCL is on pin A5, while on the Arduino Mega it is on pin 20). Just consult the datasheet, or search online, to find the I2C pins of your board.

The MPU6050 library is attached at the end.
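To make the 16-bit readings concrete: the MPU-6050 returns each axis as a signed 16-bit raw value, which is scaled by the selected full-scale range. Assuming the default ranges (±2 g for the accelerometer and ±250 °/s for the gyroscope, i.e. 16384 LSB/g and 131 LSB/(°/s) per the datasheet), the conversion looks like this. This is my own illustration, not code from the attached library:

```python
# Convert raw 16-bit two's-complement MPU-6050 readings to physical units.
ACCEL_LSB_PER_G = 16384.0    # +/-2 g full-scale range (default)
GYRO_LSB_PER_DPS = 131.0     # +/-250 deg/s full-scale range (default)

def to_signed16(raw):
    """Interpret a 16-bit register value as two's complement."""
    return raw - 65536 if raw >= 32768 else raw

def accel_g(raw):
    """Raw accelerometer word -> acceleration in g."""
    return to_signed16(raw) / ACCEL_LSB_PER_G

def gyro_dps(raw):
    """Raw gyroscope word -> angular rate in degrees per second."""
    return to_signed16(raw) / GYRO_LSB_PER_DPS

print(accel_g(16384))    # 1.0 (one g on that axis)
print(gyro_dps(0xFFFF))  # raw -1, a tiny negative rate
```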

Bluetooth HC-05

Bluetooth is a wireless communication standard that enables data transmission via radio frequency in the 2.4 GHz band. There are many Bluetooth modules for our electronics projects, but the most common are the JY-MCU modules, because they are inexpensive and easy to find on the market. They are small and have very low power consumption, which lets us add Bluetooth functionality to our Arduino projects. These modules contain the Bluetooth chip on a development board with the pins required for serial communication.

There are two models of Bluetooth module: the HC-05, which can act as master or slave, and the HC-06, which can only act as a slave. The difference is that in slave mode the external device connects to the module, while in master mode the module connects to a device.

Physically, the two modules are very similar; only some connections vary. The pins we find are:

-VCC: Module power between 3.6V and 6V.

-GND: The mass of the module.

-TXD: Data transmission.

-RXD: Receiving data at a voltage of 3.3V.

-KEY: Set high to enter the module's configuration mode (HC-05 model only).

-STATE: Output where an LED can be connected to show when data is being communicated.

AT Commands

AT commands are commands used to configure the Bluetooth module from a microcontroller, a computer, or any device with a serial port (Tx/Rx). They are instructions that let us change the module's baud rate, PIN, name, etc. To use AT commands, the Bluetooth module must not be connected to any device (the module's red LED flashing). According to the module's specifications, there must be at least 1 second between sending one AT command and the next; if a command is sent less than a second after the previous one, the module returns no response.
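The 1-second spacing rule can be enforced in software. Here is a sketch of my own using pySerial on the Pi side (the port name /dev/ttyAMA0, the 9600 baud rate, and the example command names are assumptions; any serial-like object with write/readline methods works):

```python
import time

def send_at_commands(port, commands, gap_s=1.0):
    """Send AT commands to an HC-05, waiting >= 1 s between commands.

    port     -- a serial-like object with write() and readline()
    commands -- list of AT command strings, without line endings
    Returns the raw reply read after each command.
    """
    replies = []
    for cmd in commands:
        port.write((cmd + "\r\n").encode("ascii"))
        time.sleep(gap_s)          # the module ignores commands sent too quickly
        replies.append(port.readline())
    return replies

# On a real setup this would look like (names are assumptions):
# import serial
# port = serial.Serial("/dev/ttyAMA0", 9600, timeout=2)
# print(send_at_commands(port, ["AT", "AT+NAME=ROBOT"]))
```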

Materials:
  1. Arduino Mini Pro
  2. Bluetooth HC-05
  3. Gyroscope MPU6050
  4. Button
  5. Switch
  6. Battery 3.7V-900mA
Scheme:
Construction:
For the construction of this circuit I used universal perfboard (bakelite); the goal was to make the circuit as small as possible.

The gyroscope should be mounted as centered as possible so that the program does not start with offset errors by default. The button's role is to allow the gyroscope data to be sent over Bluetooth only while it is pressed; this way the robot's servos are activated only when the button is held, which saves a little energy.

Some tests:
Measurement error

This is all the noise and interference that affects electronic devices. The accelerometer can measure any angle, but its readings are noisy and have a certain margin of error.

If you plot an accelerometer's measurements over time, you'll see something like this:

The ideal angle is marked in blue, and the actual measurements in red; clearly they are far from perfect. To correct these errors we will use the filter known as the complementary filter. It is ideal to implement on Arduino: easy to use, computationally cheap, and very accurate.

Attached is code implementing the complementary filter to manage three servos, ideal if you need to drive a robotic arm.
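The idea behind the complementary filter is simple: integrate the gyroscope rate (accurate in the short term, but it drifts) and blend it with the noisy but drift-free accelerometer angle. The attached code targets Arduino; as an illustration of the same math, here is a Python sketch (the 0.98 coefficient and the 10 ms time step are typical choices, my own assumptions):

```python
# Complementary filter: blend integrated gyro rate with the accelerometer angle.
def complementary_filter(angle, gyro_rate_dps, accel_angle_deg, dt, alpha=0.98):
    """One filter step.

    angle           -- previous filtered angle (degrees)
    gyro_rate_dps   -- gyroscope rate (degrees/second)
    accel_angle_deg -- angle computed from the accelerometer (degrees)
    dt              -- time step (seconds)
    alpha           -- trust in the gyro; (1 - alpha) trusts the accelerometer
    """
    return alpha * (angle + gyro_rate_dps * dt) + (1 - alpha) * accel_angle_deg

# With a still sensor (gyro rate ~ 0), the estimate converges to the
# accelerometer angle while heavily smoothing its noise:
angle = 0.0
for _ in range(300):
    angle = complementary_filter(angle, 0.0, 10.0, 0.01)
print(angle)  # approaches 10 degrees
```

The gyro term dominates instant-to-instant changes, while the small accelerometer term slowly pulls the estimate back, cancelling the gyro's drift.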