Introduction: Differential Wheeled Robot With IR Sensing

This project aimed to create cheap and easily manufactured swarm robots. These robots would model the random search behaviours exhibited by many organisms, ranging from microscopic bacteria to large snowy albatrosses. The movements of these organisms when searching for stimuli such as food are characterised by mostly small displacements with occasional large displacements. This random search strategy is termed a Levy flight. It can be modelled using a Levy distribution for the distance of each movement and a uniform distribution for its direction.
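As a quick illustration (not part of the robot code), a Levy-style step can be sampled with a simple power-law (Pareto-type) inverse transform for the length and a uniform draw for the direction. The exponent alpha and minimum step length below are arbitrary demonstration values:

import random, math

alpha = 1.5       # tail exponent: smaller alpha gives more frequent long flights
min_step = 1.0    # shortest possible step length (arbitrary units)

def levy_step():
    # Inverse-transform sampling of a power-law (Pareto) distribution
    u = 1.0 - random.random()                  # u in (0, 1]
    length = min_step * u ** (-1.0 / alpha)
    heading = random.uniform(0, 2 * math.pi)   # direction drawn uniformly
    return length, heading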

While this robot isn't yet capable of communicating with other robots, which is characteristic of a robot swarm, it is capable of performing the precise movements needed to replicate Levy flights. It is also equipped with IR sensors, allowing algorithms to be implemented that merge the Levy flight with temporal and spatial sensing. Finding the most efficient balance between Levy flights, temporal sensing, and spatial sensing is crucial for developing optimal search algorithms, which have a wide array of applications such as search and rescue.

In this instructable, I will go over all the steps needed to create the physical robot, as well as example code that shows how the hardware can be utilised.

Supplies

Step 1: CAD Model and 3D Printing

The Fusion 360 and STL files for the robot can be found below. The printer settings used are shown above. During post-processing, be careful when removing the support material around the clips so they don't break!

Step 2: Construction Part 1 - Inserting Heatserts

Place heatserts:

Use the soldering iron to insert the heatserts. A useful trick is to press a flat object over a freshly inserted heatsert to prevent lumps from forming around it. Note that the updated design has 2 extra slots for heatserts at the bottom compared to the one shown in the picture above.

Step 3: Construction Part 2 - Motors and Wheels

Glue motors:

Place the motors as shown above. When gluing the motors, ensure that only the silver section of the motor is in contact with the glue. The black wheel at the back needs to be free to spin to ensure that the encoder works accurately. Ensure that glue does not get on the other parts of the motor, such as the gears.


Glue wheel to the motor shaft:

Place the O-rings around the wheels. Next, place the wheels on the ends of the motor shafts and glue them in place.

Step 4: Construction Part 3 - Fastening Ball Caster and Sensors

Ball caster:

Push the chrome ball into the housing and bolt it down using M3 bolts and a hex key. Note that the updated design has an extra slot for the second ball caster compared to the one shown in the picture above.


Fasten sensors:

Fasten the IR sensors as shown above using the two 5 mm M3 bolts. Note that the IR sensors misbehave when bolted down too tightly. I suggest testing them with a slightly looser fit of the bolt and checking that the voltage readings over black and white surfaces behave as expected: you should see higher voltages when the sensors are over a black surface and very low voltages when over a white surface.
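If you want a quick way to check the readings, a minimal MicroPython sketch along these lines can be run from the REPL. The ADC pin number here is only an assumption and should match your own wiring:

from machine import ADC
import time

ir = ADC(26)   # assumed ADC pin; use the pin your IR sensor output is wired to

while True:
    voltage = ir.read_u16() * 3.3 / 65535   # convert the 16-bit reading to volts
    print(voltage)
    time.sleep(0.5)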

Step 5: Construction Part 4 - Creating the Circuit

Solder header pins to boards and wire (optional)

Solder pins onto the boards as shown above. I soldered male header pins to the boards and used female-to-female jumper cables. However, the wires can simply be soldered directly to the pinouts if preferred.


Place boards into clips

Place the boards into the appropriate clips as shown above.


Wiring

Wire according to the circuit diagram above.

Step 6: Construction Part 5 - Attaching Top Layer

Insert the battery into the top layer and connect the battery to the PowerBoost 1000. Afterwards, bolt down the top layer. That's it; all the physical components should now be ready.

Step 7: Flashing Raspberry Pi Pico

Some of the code discussed later makes use of a numpy-like module for MicroPython called ulab. To ensure that the Pico has access to this module, flash the firmware found at this link to the Pi Pico. The file is called RPI_PICO.uf2. This is done by holding down the BOOTSEL button while plugging the Pi Pico into your computer, then copying the file from the link onto the drive that appears.
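To confirm that the firmware flashed correctly, you can open a MicroPython REPL (for example in Thonny) and check that ulab imports without errors:

from ulab import numpy as np
print(np.arange(5))   # should print an array containing 0 to 4 if ulab is available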

Step 8: Code for Testing Motors and Achieving Full Rotation

The code below generates an array of angles of size n that the robot tries to achieve. Currently, the code is set to rotate the robot 360 degrees once. This can be altered by deleting or commenting out line 25 ("direc = np.array([fullrot]) # code for full rotation"). If the PID controller isn't working as expected, a good place to start troubleshooting is the enca and encb pins: try swapping them.
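For context, a stripped-down sketch of how encoder pins are typically read on the Pico is shown below. The pin numbers are assumptions, but it illustrates why swapping enca and encb reverses the sign of the count, and hence the direction the controller thinks it is turning:

from machine import Pin

enca = Pin(2, Pin.IN, Pin.PULL_UP)   # assumed encoder channel A pin
encb = Pin(3, Pin.IN, Pin.PULL_UP)   # assumed encoder channel B pin
count = 0

def on_a_rising(pin):
    # On each rising edge of A, the level of B indicates the spin direction.
    # Swapping the two pins therefore flips the sign of every count.
    global count
    count += 1 if encb.value() else -1

enca.irq(trigger=Pin.IRQ_RISING, handler=on_a_rising)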

Note that all the code is written in MicroPython. Also note that if you want to execute the code without the robot being connected to the PC, add a delay of a few seconds at the start and save the code as main.py on the Pi Pico so that it runs as soon as the Pico is powered on. Disconnect the Pico from the PC, then disconnect and re-attach the VS pin on the PowerBoost 1000, and the robot should execute the code.
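A minimal main.py skeleton for untethered running might look like this (the delay length is just an example):

# main.py - runs automatically when the Pico is powered on
import time

time.sleep(5)   # startup delay: gives you time to place the robot before it moves

# ... the rest of the movement code from the attached file goes here ...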


The first video below shows the robot fully rotating.

The second video shows the robot trying to achieve 3 separate desired angles.

Step 9: Code for Executing Levy Walk/Flight

The code below executes a Levy walk/flight. The variable alpha controls the extent of the tail of the Levy distribution. Currently, the code is set to make the robot move in a square, but this can be removed by altering lines 60 to 62:

#Code below is to make robot move in a square
direc = np.array([0, fullrot/4, fullrot/4, fullrot/4, fullrot/4])
Levy_list = [500, 500, 500, 500, 0]
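To illustrate the idea, the random headings and Levy-distributed step lengths could be generated roughly as follows. The variable names mirror the attached code, but alpha, n, and min_step are placeholder values, and fullrot should be whatever value the attached code uses for a full rotation:

import random
from ulab import numpy as np

alpha = 1.5       # tail exponent of the Levy distribution
fullrot = 360     # value representing a full rotation (match the attached code)
n = 5             # number of steps in the walk
min_step = 50     # shortest step length (placeholder units)

# Uniform random headings: how far to rotate before each step
direc = np.array([random.uniform(0, fullrot) for _ in range(n)])

# Heavy-tailed step lengths: mostly short steps with occasional long flights
Levy_list = [min_step * (1.0 - random.random()) ** (-1.0 / alpha) for _ in range(n)]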


The first video shows the robot moving in a square.

The second video shows the robot executing 5 iterations of the Levy walk/flight.

Step 10: Gradient Sensing Code

The code below is used to detect a gradient using the two IR sensors and move towards either a white or a black source. When the gradient changes, the robot randomly rotates, detects the positive gradient using the sensors, and moves accordingly. This code is an example of combining spatial data (the gradient across the two sensors) with temporal data (used to determine when the gradient changes) in order to locate, for example, a food source. This is just one of the many algorithms that could be used.

During the calibration stage, ensure that the robot is positioned so that both IR sensors pass over both a purely white and a purely black area as it rotates; this is needed for accurate calibration.
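For reference, the calibration and gradient calculation could be sketched roughly as follows. The ADC pins, sample count, and delays are assumptions, not the values from the attached code:

from machine import ADC
import time

left_ir = ADC(26)    # assumed ADC pin for the left IR sensor
right_ir = ADC(27)   # assumed ADC pin for the right IR sensor

def calibrate(samples=200):
    # Spin the robot over both white and black while sampling, recording
    # the minimum and maximum each sensor sees for later normalisation.
    lo = [65535, 65535]
    hi = [0, 0]
    for _ in range(samples):
        for i, adc in enumerate((left_ir, right_ir)):
            v = adc.read_u16()
            lo[i] = min(lo[i], v)
            hi[i] = max(hi[i], v)
        time.sleep_ms(10)
    return lo, hi

def gradient(lo, hi):
    # Normalised difference across the two sensors; the sign tells the robot
    # which side currently sees the darker surface.
    vals = []
    for i, adc in enumerate((left_ir, right_ir)):
        vals.append((adc.read_u16() - lo[i]) / (hi[i] - lo[i] + 1))
    return vals[1] - vals[0]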


The first video shows the calibration step, in which the robot rotates in place.

In the second video attached, the code is used to confine the robot to a black triangle.

In the third video attached, the code is used to confine the robot to a white triangle.

Step 11: 360 Rotation Code and Video

The code below needs only one sensor and uses purely temporal information from the IR sensors. It works by comparing the current average reading from one of the sensors to its previous average. When the average drops, the robot rotates, finds the heading with the maximum signal, and moves in that direction. The video below shows the robot executing this algorithm.
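The core of that temporal comparison can be sketched as follows. The ADC pin, window size, and the rotation routine are assumptions rather than the attached code:

from machine import ADC
import time

ir = ADC(26)    # assumed ADC pin for the single IR sensor used here
WINDOW = 20     # number of readings averaged per comparison

def window_average():
    total = 0
    for _ in range(WINDOW):
        total += ir.read_u16()
        time.sleep_ms(5)
    return total / WINDOW

prev = window_average()
while True:
    cur = window_average()
    if cur < prev:
        # Signal has dropped: rotate in place and pick the heading with the
        # highest reading before moving again (motor code omitted here).
        pass
    prev = cur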