
Autonomous Mobile Robot

Robot Picture
Ain't she cute?

Overview

Pictured above is the Autonomous Mobile Robot, a robot that navigates an obstacle course, finds a target, and returns to its starting location. It was a pretty nifty project! I programmed and co-designed this for my Senior Design Class.

My partner, David Waithaka, co-designed the robot and was responsible for the chassis and most of the circuitry on this project.

My advisors on this project were Dr. Hanyi Dai and Dr. Frank Severance of Western Michigan University. Many thanks go to them for their valuable advice and input.

The project was largely designed using the Arduino development suite. The robot uses a combination of sonar and infrared sensors to navigate and locate its target. It is programmable via Bluetooth and can be given commands during operation wirelessly.

A custom PCB and chassis were designed for this project, upon which the hardware was mounted.

Read more about the robot below - this is an adaptation of parts of our final report on the project. Check out the video of the robot facing a treacherous obstacle course of soda bottles and cardboard boxes!

Sensors

The sensors are arguably the most important components of the robot; without any sensors the robot would just indifferently run into various objects until either the battery ran out or it bashed itself into robot heaven.

Three different kinds of sensors were used: infrared proximity sensors to prevent bumping, an ultrasonic sensor to measure the distance to objects directly in front of the robot, and an active infrared sensor to lock onto the target.

Infrared Proximity Sensors

Proximity Sensor

To minimize blind spots (especially when approaching an obstacle at a sharp angle), Sharp GP2Y0D805Z0F proximity sensors were used. These sensors have a range of 0.5 cm to 5 cm and a binary, active-low output. For general navigation they were possibly more useful than even the sonar sensor. Three of these sensors were used, placed on the front-left, front-right, and the back of the robot.

Difficulties with the Proximity Sensors

The proximity IR sensors rely on reflection – the robot would sometimes fail to detect an object in its path if it was not sufficiently opaque; soda-pop bottles, for example, would sometimes not trigger these sensors.

Ultrasonic Sensor

Ultrasonic Sensor

The Parallax ultrasonic distance sensor was used to detect objects directly in front of the robot. This sensor has a range of 2 cm to 3 m (0.8 in to 3.3 yd) and a viewing angle of about 40 degrees for a 3.5” diameter target placed 4 feet away. The microcontroller was programmed to interface with this module using the following timing diagram.

sonar timing pic
Ultrasonic Module Timing Diagram
Source           Signal                      Symbol      Duration
Microcontroller  Input Trigger Pulse         tOUT        2 μs (minimum)
Ping Sensor      Echo Holdoff                tHOLDOFF    750 μs
Ping Sensor      Burst Frequency             tBURST      200 μs
Ping Sensor      Echo Return Pulse Minimum   tIN-MIN     115 μs
Ping Sensor      Echo Return Pulse Maximum   tIN-MAX     18.5 ms
Ultrasonic Operation Time Durations

The ATmega328’s Timer1, with its output compare and input capture capabilities, was used to interface with the sensor. To make the PING sensor execute a sonar burst, its input/output pin must be pulsed for at least 2 μs. The output compare functionality (interrupt based) was used to do the following:

  1. Set the pin low for a short time to create a clean rising edge.
  2. Set the pin high for at least 2 μs.
  3. Set the pin low for a short time to prepare the pin for PING’s output.

Next, input capture mode is used to measure the incoming pulse from the PING sensor. The pulse represents the time elapsed between the sonar burst and the echo detection – since the speed of sound is known, the distance between the sensor and an object within its viewing angle can be calculated.
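For reference, the same exchange can be written with simple busy-waiting instead of interrupts. Below is a minimal polling sketch, assuming the module's SIG line is on pin 7 (the project's actual code is interrupt-driven, in the “sonar.ino” module):

    // Simplified polling version of the sonar exchange. The robot itself
    // used Timer1 output-compare/input-capture interrupts instead.
    const int SIG_PIN = 7;  // assumed wiring

    unsigned long pingMicroseconds() {
      pinMode(SIG_PIN, OUTPUT);
      digitalWrite(SIG_PIN, LOW);     // 1. clean low for a sharp rising edge
      delayMicroseconds(2);
      digitalWrite(SIG_PIN, HIGH);    // 2. trigger pulse of at least 2 us
      delayMicroseconds(5);
      digitalWrite(SIG_PIN, LOW);     // 3. release the line for PING's reply
      pinMode(SIG_PIN, INPUT);
      return pulseIn(SIG_PIN, HIGH);  // echo pulse: 115 us to 18.5 ms
    }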

Calculation

Interpreting the ultrasonic module's output is pretty simple. The speed of sound is 1126 ft/s, and the pulse is measured in microseconds. Thus, to convert the input to inches, the following ratio must be used:

1126 ft/s × 12 in/ft × (1 s / 10^6 μs) = 0.0135 in/μs

However, using a floating point data type in a conversion is generally unwise in an embedded environment. The reciprocal of the above ratio is:

(0.0135 in/μs)^-1 ≈ 74 μs/in

Thus, the equation for converting an input x, in microseconds, to an output y, in inches, is:

y = (x μs) / (74 μs/in) / 2 = x / 74 / 2 in

The result must be cut in half (the final division by 2), since the pulse time includes both the time to the target and the time back to the sensor.
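In code, this comes down to two integer divisions. The helper below mirrors the conversion in the Arduino sonar example, part of which was used in this project (see the Acknowledgements):

    // Convert an echo pulse width to a distance using integer math only:
    // 74 us of pulse per inch of travel, halved to account for the round trip.
    long microsecondsToInches(long microseconds) {
      return microseconds / 74 / 2;
    }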

Difficulties with the Ultrasonic Sensor

The Parallax sensor has difficulty with surfaces that aren’t particularly solid or flat, which is partly what necessitated the IR proximity sensors. The obstacle courses the robot was tested on were filled with chairs, soda-pop bottles, tables, cardboard boxes, and other everyday objects. At distances greater than a couple of feet, the sensor would often report false distances. This also means that if a sufficiently thin object obstructed the robot's path without blocking the active IR sensor, a false target acquisition could occur.

Active IR Sensor

Active Infrared

The TSOP48 active IR sensor is the most important of the sensors, as it detects the 38 kHz IR target. It is active low, outputting a low voltage when it detects a 940 nm optical signal modulated at 38 kHz. A simple read of the associated pin is all that is necessary to interface with the sensor. The maximum detection range, when paired with the LED target, is about 10 feet.
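A minimal polling check might look like the following (the pin number is an assumption):

    // The TSOP output is active low: LOW means the 38 kHz carrier is present.
    const int TSOP_PIN = 2;  // assumed wiring

    void setup() {
      pinMode(TSOP_PIN, INPUT);
    }

    bool targetDetected() {
      return digitalRead(TSOP_PIN) == LOW;
    }

    void loop() {
      // navigation logic branches on targetDetected()
    }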

Difficulties with the Active IR Sensor

If a surface is sufficiently reflective, the target's infrared signal may be reflected, causing the sensor to falsely "lock on" to the target. Usually the robot would move out of the angle of reflection as it approached the false positive, so this wasn't much of a problem.

Target

Target
The secret ingredient is sticky tack.

The target is a 940 nm LED driven by an Arduino board, contained in a project enclosure (6 by 4 by 2 in). The robot uses both the Parallax ultrasonic module and the active IR sensor to acquire this target. The circuit for the target is simply an IR LED, a battery, and an Arduino board. The IR LED differs from normal LEDs in that it can handle much more current without undergoing irreversible breakdown, so it is connected to pin 0 on the Arduino board with no current-limiting resistor.

Two of these were used – one for the “initial target” and one for the “base,” the starting position to return to after the target is found.
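The report does not specify how the 38 kHz modulation was generated; Arduino's tone() function is one way to produce it. A hypothetical sketch covering both units:

    // Hypothetical target/base firmware. tone() generates the 38 kHz square
    // wave on the LED pin; IS_BASE selects the unit acting as the "base".
    const int LED_PIN = 0;       // the IR LED sits on pin 0, per the text
    const bool IS_BASE = false;

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      if (!IS_BASE) tone(LED_PIN, 38000);  // target: constant carrier
    }

    void loop() {
      if (IS_BASE) {
        // The base gates its carrier every 32 ms so the robot can tell it
        // apart from the target (see "Differentiating between Target and
        // Base" below).
        tone(LED_PIN, 38000);
        delay(32);
        noTone(LED_PIN);
        delay(32);
      }
    }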

Motors

Servo Motor

Two Hitec HS-1425CR servo motors were used, as shown above. These servos were selected for their size, voltage requirements, torque, and ease of integration with the microcontroller. They operate from a 4.8 - 6.0 V supply at 44 to 52 RPM under no load. Each produces a torque of about 38.8 to 42 oz-in (0.27 to 0.30 N-m), depending on the supply voltage.

Driving the Motors

The motor has three states: stopped, reverse, and forward. Because no speed control was necessary for the project, this motor was selected for its simplicity: no external gearing was required. The orientation (forward vs. reverse) was arbitrarily picked. Each of these three states is selected by pulsing the motor’s input pin at a certain frequency. The frequencies used for the three states are listed in the table below. Note that Timer0, set to a resolution of 1 ms, is used to generate these frequencies – this means the fastest the robot can effectively make a decision is once every few milliseconds. This will be important later when designing the robot’s pathfinding algorithm.

Direction   Frequency   Duty Cycle
Stopped     0 Hz        0%
Forward     490 Hz      75%
Reverse     326.67 Hz   50%
Servo Signalling Frequencies
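For illustration, the table's waveforms can be approximated in software from a timer tick of roughly half a millisecond. The sketch below is a reconstruction, not the project's code – it uses Timer2 so that Arduino's Timer0-based millis() and delay() keep working, whereas the robot drove these signals from Timer0 itself:

    // Software signal generator approximating the table above.
    const int LEFT_PIN = 9, RIGHT_PIN = 10;  // assumed pin assignments

    enum MotorState { STOPPED, FORWARD, REVERSE };
    volatile MotorState leftState = STOPPED, rightState = STOPPED;

    void setup() {
      pinMode(LEFT_PIN, OUTPUT);
      pinMode(RIGHT_PIN, OUTPUT);
      // Timer2 in CTC mode: 16 MHz / 64 / 128 = one interrupt per ~0.51 ms
      TCCR2A = _BV(WGM21);
      TCCR2B = _BV(CS22);   // prescaler = 64
      OCR2A  = 127;
      TIMSK2 = _BV(OCIE2A);
    }

    // Forward: 4-tick period (~2.05 ms, ~490 Hz), high 3 of 4 ticks (75%).
    // Reverse: 6-tick period (~3.07 ms, ~326 Hz), high 3 of 6 ticks (50%).
    // Stopped: line held low (0 Hz, 0%).
    bool level(MotorState s, uint8_t phase) {
      if (s == FORWARD) return (phase % 4) < 3;
      if (s == REVERSE) return (phase % 6) < 3;
      return false;
    }

    ISR(TIMER2_COMPA_vect) {
      static uint8_t phase = 0;
      digitalWrite(LEFT_PIN,  level(leftState,  phase));
      digitalWrite(RIGHT_PIN, level(rightState, phase));
      if (++phase == 12) phase = 0;  // 12 = lcm(4, 6) keeps both waves aligned
    }

    void loop() {
      // navigation code sets leftState/rightState
    }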

Maneuvering

Because of the motors' simplicity, robot locomotion was likewise simple. The robot can perform four maneuvers: move forward, reverse, turn left, and turn right.

Going Forward

Going Forward
Forces When Moving Forward

To move forward, both wheels must rotate forward. The two motors are identical, which means they mirror each other when attached; the left motor must therefore be directed to move in “reverse” so that both wheels rotate in the same direction (noticeable in the “motors.ino” module). The net force is forward and the net torque is zero, moving the robot forward.

Going in Reverse

Going in Reverse
Forces When Moving in Reverse

As with moving forward, the microcontroller must flip the left motor's commanded direction so that both wheels rotate backward. The net force is in the reverse direction and the net torque is again zero, moving the robot backward.

Turning Left

Turning Left
Forces and Net Torque When Turning Left

To turn left, the left motor moves in reverse while the right motor moves forward. This creates a net force of zero and a net counterclockwise torque, causing the robot to rotate to the left.

Turning Right

Turning Right
Forces and Net Torque When Turning Right

To turn right, the left motor moves forward while the right motor moves in reverse. This creates a net force of zero and a net clockwise torque, causing the robot to rotate to the right.

These four maneuvers (in addition to, of course, stopping) were sufficient to fully navigate an obstacle course. A video example of these is given below, where the robot is being manually controlled over Bluetooth.

An Example of Each Locomotive Maneuver
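In code, the four maneuvers reduce to pairs of wheel directions, with the mirrored left motor's command flipped internally. The helper names below are illustrative – the real implementation lives in the “motors.ino” module:

    // Wheel-level maneuver helpers. setMotor() stands in for however the
    // project actually commands a motor.
    enum Dir { STOP, FWD, REV };
    const int LEFT = 0, RIGHT = 1;

    void setMotor(int motor, Dir d) { /* drive the motor's signal state */ }

    // The mirrored left motor needs its command flipped to get the intended
    // wheel rotation.
    Dir flip(Dir d) { return d == FWD ? REV : (d == REV ? FWD : STOP); }
    void leftWheel(Dir d)  { setMotor(LEFT, flip(d)); }
    void rightWheel(Dir d) { setMotor(RIGHT, d); }

    void goForward()  { leftWheel(FWD); rightWheel(FWD); }  // zero net torque
    void goReverse()  { leftWheel(REV); rightWheel(REV); }  // zero net torque
    void turnLeft()   { leftWheel(REV); rightWheel(FWD); }  // net CCW torque
    void turnRight()  { leftWheel(FWD); rightWheel(REV); }  // net CW torque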

Bluetooth Module

The JY-MCU is used for two separate purposes: programming the robot wirelessly, and for communicating with the robot after its program has been initialized.

JY-MCU Bluetooth Module
The JY-MCU Bluetooth Module

Configuring the module is done using traditional AT commands (“attention commands,” as for modems). This can be done either over the TX and RX pins or by streaming the commands via Bluetooth. By default, the JY-MCU is configured for 9600 baud, and the Arduino board could communicate with it directly over its TX and RX pins. However, the Arduino bootloader communicates at 115200 baud; in the interest of avoiding rebuilding the bootloader, the module was set to operate at 115200 baud instead.
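As an illustration, a one-time configuration sketch might look like the following, assuming the common HC-06-style firmware found on these modules (its AT commands take no line terminator):

    // One-time setup: switch the JY-MCU from its 9600 baud factory default
    // to 115200 baud to match the Arduino bootloader.
    void setup() {
      Serial.begin(9600);         // module's factory default rate
      delay(1000);                // give the module time to settle
      Serial.print("AT+BAUD8");   // "8" selects 115200 in the module's AT table
      // The module answers "OK115200"; all further traffic runs at 115200.
    }

    void loop() {}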

Communicating with the JY-MCU module was done by pairing it with a generic PC Bluetooth transceiver. Since security was not a concern on this project, the default pairing pin was used: “1234.”

Communicating with the Arduino Over Bluetooth

The Python scripting language was used to communicate with the robot over Bluetooth. The script, hereafter referred to as the “Control Script,” can be found in the appendix under “control.py.” Python was selected for its extensive libraries and ease of use – it is an interpreted and interactive language, lending itself to rapid prototyping and development. A paired Bluetooth connection manifests itself as a serial port and can be readily interfaced with as if it were a standard RS-232 connection. The Python library “pySerial” was used to interface with the Bluetooth connection. On the robot's side, the Arduino serial library is used to communicate with the Bluetooth module; the code that does so can be found in the “bluetooth.ino” module.

Programming the Arduino Over Bluetooth

The bootloader, by default, has a 500 ms window in which it waits for a binary to be streamed to it. If the bootloader receives valid input, it writes the assembled machine code (in the form of a hex file) to the ATmega328’s flash memory (the program space). If the bootloader receives no valid input, it hands control to the program already in flash. The key to writing a new program to the robot is hitting that 500 ms window.

The Control Script invokes an In-System Programmer (ISP) called “AVRDUDE” (AVR Downloader/UploaDEr) to communicate with the bootloader and upload a new program. The process for uploading a program is as follows:

  1. User selects the hex file to be written to flash memory
  2. Control Script tells the robot to reset
  3. Robot halts operation and delays for one second before entering the bootloader
  4. As the robot waits to reset, the Control Script invokes AVRDUDE
  5. As AVRDUDE initializes, the ATmega328 enters the bootloader
  6. During the 500 ms window, AVRDUDE uploads the hex file in conjunction with the bootloader
  7. The bootloader hands control to the new program, and the Control Script attempts to establish communication with it

So long as the Bluetooth code remains intact from iteration to iteration of the program, the programming process is seamless – the Control Script remains running through and after the process.
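The AVRDUDE invocation looks something like the following, where the serial port and hex file name are illustrative:

    avrdude -p m328p -c arduino -P COM5 -b 115200 -U flash:w:robot.hex:i

The “arduino” programmer type speaks the bootloader's STK500-style serial protocol, which here simply travels over the Bluetooth link.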

Circuits

Power Distribution

To determine the power distribution system needed, the power requirements of all the loads were first calculated. The rated currents of the loads were added together, assuming full load for each. The total current required was about 341 mA, with supply voltages ranging from 3.6 to 6.0 V. The table below shows the voltage and current requirements for the various loads.

Load                  Quantity   Voltage (V)   Current (mA)   Total Current (mA)
Servo Motor           2          4.8 - 6.0     100            200
Ultrasonic sensor     1          5             30             30
Bluetooth             1          3.6 - 6.0     35             35
ATmega328             1          4.5 - 5.5     0.2            0.2
Active IR sensor      1          2.5 - 5.5     0.45           0.45
Proximity IR sensor   3          2.7 - 6.0     5              15
LEDs                  3          2.7 - 5.0     20             60
Total                                                         340.65
Power Budget

To determine what type of voltage supply to use, a calculator tool from the Society of Robots website was used with the values from the table above. The table below shows the results.

Total Continuous Power Draw      6.48 watts
Min Required Battery Voltage     5.20 volts
Idle Motors Battery Life         1.47 hours
Minimum Battery Life             0.675 hours
Typical Battery Life             0.0694 hours
Obtained Power Calculation Results

From these results, it appeared that a 9 V battery would be sufficient to power the robot. To meet the voltage requirements, the 9 V battery was combined with two voltage regulators, as shown in the figure below. One voltage regulator supplied power to the microprocessor while the other supplied the sensors and motors. Splitting the supply in two was necessitated by the microprocessor’s sensitivity to voltage fluctuations: if the microprocessor were fed from the same voltage regulator as the motors, voltage drops due to changes in loading conditions would cause it to reset.

Power Circuit Schematic
Split Power Supply Schematic

Unfortunately, relying on this web app was not sufficient to ascertain the robot's power needs. A second 9 V battery had to be jerry-rigged in to provide sufficient current – the sensors would otherwise cut out. A rechargeable battery would also have been much better suited to the task; many batteries were consumed during development.

Salute our fallen comrades, the batteries.
Pictured: A Small Sample of the Casualties of War

A soft-latching momentary switch (soft power switch), as shown in the schematic below, was used for turning the power to the motors and sensors on and off while leaving the microprocessor running. Two switching methods were implemented in parallel: one using a momentary pushbutton and the other using a microprocessor-controlled relay. The relay switching could potentially have enabled toggling the peripherals' power remotely and switching the microprocessor into a low-power/sleep mode, reducing power consumption during idle periods and providing a means to wake the robot's peripherals remotely. Unfortunately, this particular circuit was never actually tied to a pin on the microcontroller, so no power-mode toggling was implemented.

Soft Latch Circuit
Soft Latching switch power switch

A relay (schematic below) was connected in parallel with the pushbutton to make software-controlled switching possible.

Power Switching Circuit
Power switching relay circuit

The next step was to design a circuit for the microprocessor so that it could be mounted on the PCB. Since an Arduino evaluation board was used for development, an Arduino-to-ATmega328 pin mapping was needed; the figure below shows this mapping.

Atmega328p Pin Mapping
Arduino to ATmega328 pin mapping

The circuit below shows how the microprocessor was mounted onto the PCB. A 16 MHz crystal was used as an external clock, and a momentary pushbutton was used for resetting the microprocessor. This readily enabled programming of the microcontroller.

Atmega328p Pin Mapping
ATmega328 PCB Mounting Circuit (Click to See Schematic in Detail)

Printed Circuit Board and the Chassis

PCB Design

DipTrace®, a PCB design package, was used to design the PCB and generate all the files required for fabrication. The package includes PCB layout and schematic capture tools with extensive component libraries.

The circuits in the previous section were created in the DipTrace® schematic capture tool, much as in PSpice. The schematic was then exported into the DipTrace® PCB layout tool, where the components were arranged as desired and the traces were auto-routed. Special consideration was given to the spacing between components and traces, since the board would be hand-soldered; generous spacing would also make it easier to fix any problems arising after the manufacturing process.

The "Gerber" and "NC drill" files necessary for fabrication were generated and sent out to a PCB fabrication vendor.

Fabricated PCB Populated PCB
Fabricated PCB (left), Populated PCB (right)

Chassis Design

AutoCAD® was used to create the preliminary designs for the chassis and body, and to create the templates needed for cutting, milling, and drilling. Pictured below are some of the drawings created in AutoCAD.

Plexiglass was used to build the chassis since it is easy to cut, drill, and form. An aluminum sheet was used to form the servo mounting flange.

Bottom Plane Servo Mounting Flange
Bottom Plate and Servo Mounting Flange

One-inch standoffs were used to mount the PCB on the robot, creating room for the battery and space for routing the cables.

Chassis and Mounted Motors/Wheels
Chassis with Servos and Wheels Mounted

Searching for the Target

Once the basic functionality of each subsystem was completed, development of the pathfinding could begin in earnest. The intention was to make the pathfinding algorithm as simple as possible while still completing the search in a relatively short time. The algorithm started out with basic object avoidance – the robot would go forward if possible, and turn left or right when an obstacle was encountered. When the target was detected with the active IR sensor, the robot would move forward until it reached the target.

The first problem with this approach is that there is no “randomness” to the navigation: depending on the obstacle course, the robot could be trapped in a loop, avoiding obstacles along the same path time after time. Somewhat paradoxically, the lack of randomness also makes it difficult for the robot to find the target: it has to get lucky and approach the target from within the LED’s cone of emission. Both of these problems are addressed by “swiveling,” or sweeping, at certain intervals. The robot periodically rotates to the right (between 45 and 90 degrees) and then rotates back to the left (back to neutral and then another 45 to 90 degrees to the left). If the target is detected, the robot stops swiveling and heads toward it. If the robot does not discover the target, it ends up rotated 45 to 90 degrees to the left of its original heading, adding some randomness to room navigation. Of course, if the robot cannot continue rotating in a direction, it stops.

In addition to the “swiveling,” the robot has a preference between turning left and right when an obstacle is encountered and the robot is not blocked in either direction. The preference between left and right changes after 10 non-forced turns. With these two behaviors, the robot has avoided getting stuck in a looped path in all tests.

While the periodic swiveling helps increase the chance of finding the target signal, there is still the matter of tracking the signal: each time the robot makes a decision, it checks whether the target was detected the last time it made a decision. If the target was previously detected but is no longer, the robot immediately does a sweep to try to get back on target. This is absolutely necessary because of the conical nature of the LED signal – as the robot gets near the target, the cone becomes smaller and smaller. Tracking ensures that the robot stays on target.

Going in Reverse

The last major concern with navigation is getting stuck in a corner. If the robot approaches a wedge, like a corner of a rectangular room, it is liable to get stuck, unable to turn either left or right. The software determines that the robot is stuck when it oscillates between trying to turn left and right while making no progress forward. After 14 such attempted turns, the robot moves in reverse for 20 decision cycles. This, in combination with the randomness introduced by swiveling, ensures that the robot eventually gets out of these stuck situations.

Decision Cycles

Decision making is tied to the sonar sensor – after a distance measurement has been taken, the software goes through a series of if-else statements to determine its next move. After making a decision, the robot waits 4.2 milliseconds before making another distance measurement. The sonar sensor could be interfaced with more often than this (the minimum trigger pulse is only 2 μs); the delay is there to put less strain on the battery. The minimum echo response pulse of the sonar module is 115 μs, and the maximum pulse is 18.5 ms; compared to these two delays, any other lag in the software is negligible. A decision is therefore made every 4.2 ms to 22.7 ms (the 4.2 ms wait plus up to the 18.5 ms maximum echo pulse). This uncertainty in execution is undesirable in a real-time system, and is discussed further below.

A simplified flowchart of how the robot makes decisions is given below. Note that “turn preference” here refers to whether the robot is swiveling or not, and the “secondary preference” is the left or right turning preference that was discussed in the previous section. The decision making code can be found in the interrupt functions in the “sonar.ino” module.

Decision Flowchart
Decision Making Flowchart - Click the Picture to See Full Size
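Condensed to code, one decision cycle amounts to a short if-else ladder. Everything below is a stub-level sketch – the helper names are invented, and the actual logic lives in the interrupt functions in “sonar.ino”:

    // One decision cycle, run after each sonar measurement. The stubs stand
    // in for the state tracking described in the previous sections.
    enum Move { MOVE_FORWARD, MOVE_REVERSE, TURN, SWEEP };

    bool targetVisible()      { return false; }  // TSOP sees the 38 kHz signal
    bool targetJustLost()     { return false; }  // seen last cycle, gone now
    bool stuckInWedge()       { return false; }  // 14 fruitless left/right turns
    bool sweepDue()           { return false; }  // periodic swivel timer expired
    bool blocked(long inches) { return inches < 8; }  // illustrative threshold

    Move decide(long frontInches) {
      if (targetVisible())      return MOVE_FORWARD;  // close in on the target
      if (targetJustLost())     return SWEEP;         // sweep to re-acquire
      if (stuckInWedge())       return MOVE_REVERSE;  // back out for 20 cycles
      if (sweepDue())           return SWEEP;         // 45-90 deg right, then left
      if (blocked(frontInches)) return TURN;          // preference flips after 10 turns
      return MOVE_FORWARD;
    }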

Differentiating between Target and Base

Detecting and acquiring the target is relatively simple: detect the 38 kHz IR signal and use the sonar sensor to maneuver within 8 inches of the target housing. After the target is acquired, however, the robot must return to its starting position, the “base.” The exact same hardware used for the “target” is used for the “base”; to differentiate the two, the base alternates between turning the 38 kHz IR signal on and off every 32 milliseconds.

The robot makes a decision every 4.2 ms to 22.7 ms. To differentiate between the two signals (target signal and base signal), an array of 10 samples is used. If decisions are made every 4.2 ms, then over 42 ms about 3 of the 10 samples will be low and 7 high, or vice versa. If decisions are made every 22.7 ms, then over 227 ms there will be about 7 changes in the signal (227/32 = 7.09). The maximum number of “carrier seen” samples is thus 7 and the minimum is 3: if between 3 and 7 such samples are counted, the base has been detected. If the balance of samples falls outside those bounds, either the target is being detected, or no signal is detected at all.
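A sketch of this discriminator is below; the pin and helper names are assumptions:

    // Rolling 10-sample window over the TSOP output, one sample per
    // decision cycle.
    const int TSOP_PIN = 2;   // assumed wiring (active low)
    uint8_t samples[10];
    uint8_t idx = 0;

    void recordSample() {
      samples[idx] = (digitalRead(TSOP_PIN) == LOW) ? 1 : 0;  // 1 = carrier seen
      idx = (idx + 1) % 10;
    }

    // 3 to 7 "carrier seen" samples out of 10 means the blinking base;
    // about 10 means the always-on target; about 0 means no signal at all.
    bool baseDetected() {
      uint8_t seen = 0;
      for (uint8_t i = 0; i < 10; i++) seen += samples[i];
      return seen >= 3 && seen <= 7;
    }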

This system of differentiation is far from perfect – it causes occasional false reads when differentiating between the target and the base. These false reads are temporary, however, and the robot moves on and continues navigating the room. This could have been avoided if the system made decisions based on real-time scheduling, instead of whenever a response pulse arrives from the sonar module.

Video of Target Acquisition

A video of the robot acquiring a target and returning to its "base" is shown below. Note its difficulties detecting the transparent soda bottles with its proximity sensors. The video is sped up for brevity's sake.

Acknowledgements and Source Code

There were three code examples that were crucial to completing the project (clickable links):

  1. Serial RS232 Connections in Python by Versano showed how to synchronize a Python control script with the Bluetooth connection.
  2. David Yoo’s Python example for processing a stream of input – this made direct control of the robot over Bluetooth possible.
  3. The examples on the nongnu interrupt page made interrupt programming for the robot so much easier.

These following references were heavily used while coding the software (clickable links):

  1. ATmega Datasheet (the most used documentation by far).
  2. List of AT commands for configuring the Bluetooth module.
  3. Parallax datasheet for interfacing with the sonar sensor.
  4. PySerial documentation for coding the control script.

In addition, the examples in the Arduino library were quite useful for getting familiar with the IDE and the platform. Part of the sonar example in the example library was used to perform the conversion from microseconds to inches. The library is open source, and can be found here.

My partner, David Waithaka, designed the printed circuit board and related circuits, as well as the chassis (an elegant molding of plexiglass and fiberglass!). I was responsible for the programming, debugging, and the targets. A very special thanks to my partner – without him, the robot's body would consist of a breadboard glued to some foamboard!

And once again, thanks to our advisors Hanyi Dai and Frank Severance for their helpful input on the project.

Source Code

All files are of the ".ino" filetype. These are syntactically correct C++ source files that are wrapped by the Arduino IDE.

The code is located in this directory.

Contact: welbyseely@gmail.com

©2014-2017 Welby Seely