Andy

In this project, I designed and constructed an interactive robot that can recognize specific people and react differently to them.

Link to my GitHub repository for this project
Link to the live website for this project (if available)
Link to the Medium article associated with this project (if available)

Arduino

OpenCV

Python

Andy

Overview:

In this project I created an interactive robot capable of communicating through movement and recognizing specific people using facial recognition and computer vision. I designed the robot in Fusion 360 so that it could be 3D printed, giving it a personable appearance with rounded features. An Arduino microcontroller provides the robot's functionality, while a webcam and an OpenCV-based Python script handle facial recognition.

The Design

I used Fusion 360 to create the outer shell of the robot, which I decided to name Andy. The design leans on rounded shapes to give the robot a friendlier appearance. Additionally, I took into account the fact that it would need to be 3D printed, which constrained some parts of the design. The following sections break down the design decisions and procedure for each component of the robot.

The Head

I wanted Andy's head to have a friendly appearance, so I decided to use rounded shapes while keeping the functionality of the head in mind. That is, it needed to be able to rotate 360 degrees and tilt up and down. Additionally, the head needed to connect with the body and house an ultrasonic sensor for obstacle avoidance.

Sketch for Andy's head.

I used Fusion 360's revolve function to create a three-dimensional shape from a two-dimensional sketch. I wanted it to have a somewhat 'bubbly' shape, rather than a perfect hemisphere, so I increased the radius of the sketch towards the middle of the head and decreased it at the top and bottom. I also chose a height of about 4 inches and a radius of about 2.5 inches so that the head would sit nicely on the body component (discussed below).

After revolving the sketch, I used Fusion 360's shell command to hollow the head, leaving a wall only about 0.2 inches thick. Next, I created holes so that the ultrasonic sensor could be mounted. I did this by making a dimensioned sketch and extruding it past the inner wall of the head.

I also wanted to give the robot some additional character, so I decided to add cylindrical 'ears.' I made a sketch on one side of the head by creating an offset plane from the original y-z plane, drew a simple circle, and extruded it towards the head to make a cylinder. I then used Fusion 360's mirror function to construct a second cylinder on the other side of the head.

Finally, I rounded most of the edges and features on Andy's head to emphasize the softer, friendlier nature of the robot. I did this using Fusion 360's fillet feature, which rounds the edge joining two faces or bodies.

Andy's Head Design

The final design for Andy's head is shown above. I think it accurately captures the idea I started with: the rounded features and overall shape give the robot a friendly, affectionate feel.

The Head Base

I also designed a simple base for the head component. It snaps together with the head and allows wires to be fed down to the body. The head itself only contains the ultrasonic sensor because I wanted it to remain as light as possible. Additionally, the head base allows a pan-tilt mechanism, which is responsible for the rotation and tilting motions, to be attached to the head as a whole.

Head Base Design

I designed this base by extruding a circle with the same radius as the head. I then repeated the technique with a circle matching the inner radius of the head, which creates the snap fit between the two parts. Finally, I made another circular sketch and extruded it through the other parts of the base, making a hole for the wires to run down to the body. The final design for the head base is shown above.

The Body

I wanted Andy's body to be reminiscent of a sphere. To that end, I started with a sphere element in Fusion 360 and used offset planes to split it into three parts. I only needed the middle component, so I hid the other two outputs from the split operation. I then hollowed the spherical shape with the shell function and cut a hole in the top of the body so that I could fit all of the components inside the robot as well as attach the head. Finally, I needed a way to attach the wheels. I planned to use two DC motors for Andy's locomotion, so I used the dimensions of their output shafts to sketch a matching profile with enough room for each shaft to turn a full 360 degrees. After creating the sketch, I generated a hole by extruding the sketch into the body. Once I had finished one hole, I simply mirrored it across the central x-z plane to create the second. Andy's main body component is shown below.

Andy's Body

The Assembly

The image below shows the final assembly of Andy. It includes the head and head base, the body, and wheels that I roughly modeled using circle sketches, Fusion 360's fillet command, and the mirror function.

Andy's Final Design

The Electronics

Andy has a lot of different functions and capabilities, so quite a few sensors and actuators are needed. First, an ultrasonic sensor ensures that Andy doesn't run into anything. Next, an accelerometer/gyroscope is paired with a PID controller for self-balancing, which also requires two DC motors to control the speed and direction of the wheels. Finally, two servo motors are needed to support the full range of motion of Andy's head.
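To make the self-balancing idea more concrete, the sketch below outlines the control loop, assuming an MPU6050 accelerometer/gyroscope and a simple PWM motor driver. The pin numbers and PID gains are placeholders, only one motor channel is shown, and a real implementation would also fuse in the gyroscope readings (for example with a complementary filter) for a stable angle estimate.

```cpp
// Rough sketch of the self-balancing loop: estimate the tilt angle from an
// MPU6050 accelerometer, run it through a PID controller, and drive a DC
// motor accordingly. Pins, gains, and the motor-driver wiring are
// placeholders for illustration.
#include <Wire.h>

const int MPU_ADDR = 0x68;     // default I2C address of the MPU6050
const int MOTOR_PWM_PIN = 5;   // assumed speed pin on the motor driver
const int MOTOR_DIR_PIN = 4;   // assumed direction pin on the motor driver

float Kp = 20.0, Ki = 0.5, Kd = 1.0;   // placeholder gains, tuned by hand
float setpoint = 0.0;                  // target tilt angle (upright)
float integral = 0.0, lastError = 0.0;

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);            // power management register
  Wire.write(0);               // wake the sensor up
  Wire.endTransmission(true);
  pinMode(MOTOR_PWM_PIN, OUTPUT);
  pinMode(MOTOR_DIR_PIN, OUTPUT);
}

float readTiltAngle() {
  // Read the raw accelerometer values and estimate the tilt from them.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);            // first accelerometer data register
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);
  int16_t ax = Wire.read() << 8 | Wire.read();
  int16_t ay = Wire.read() << 8 | Wire.read();
  int16_t az = Wire.read() << 8 | Wire.read();
  (void)ax;  // not needed for a simple tilt estimate about one axis
  return atan2((float)ay, (float)az) * 180.0 / PI;
}

void loop() {
  float angle = readTiltAngle();
  float error = setpoint - angle;
  integral += error;
  float derivative = error - lastError;
  lastError = error;

  float output = Kp * error + Ki * integral + Kd * derivative;

  // Direction from the sign of the correction, speed from its magnitude.
  digitalWrite(MOTOR_DIR_PIN, output > 0 ? HIGH : LOW);
  analogWrite(MOTOR_PWM_PIN, constrain(abs(output), 0, 255));
  delay(10);
}
```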

Design Decisions

I decided to use two SG-90 servo motors primarily because they are inexpensive and readily available. However, due to the weight of the head, more heavy-duty servo motors might need to be considered if the robot is in use for longer periods of time. These servo motors are capable of 90 degrees of rotation in either direction about their center position, allowing for the desired mobility in Andy's head. The output position can be controlled using pulse width modulation with a 20 ms, or 50 Hz, period.
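As a minimal example of driving the two servos, the Arduino Servo library can generate the 50 Hz signal described above; the pin numbers and sweep pattern here are assumptions for illustration.

```cpp
// Minimal pan/tilt sketch for two SG-90 servos using the Arduino Servo
// library, which produces the 20 ms (50 Hz) PWM signal the servos expect.
// Pins 9 and 10 are assumed for illustration.
#include <Servo.h>

Servo panServo;   // rotates the head left/right
Servo tiltServo;  // tilts the head up/down

void setup() {
  panServo.attach(9);
  tiltServo.attach(10);
  panServo.write(90);    // 90 corresponds to the center position
  tiltServo.write(90);
}

void loop() {
  // Sweep the pan servo 90 degrees either side of center and back.
  for (int angle = 0; angle <= 180; angle += 5) {
    panServo.write(angle);
    delay(50);
  }
  for (int angle = 180; angle >= 0; angle -= 5) {
    panServo.write(angle);
    delay(50);
  }
}
```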

I chose to use an HC-SR04 ultrasonic sensor so that Andy does not run into various objects while moving. This sensor works on the principle of ultrasonic sound: it sends a signal and essentially 'listens' for an echo, which would stem from a nearby object. It is a digital sensor and can be read directly from a microcontroller.
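A minimal reading loop looks something like the following; the trigger and echo pin assignments are assumptions for illustration.

```cpp
// Minimal HC-SR04 distance measurement: send a 10 microsecond trigger pulse,
// time the echo pulse, and convert the duration to centimeters.
const int TRIG_PIN = 7;   // assumed trigger pin
const int ECHO_PIN = 6;   // assumed echo pin

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Trigger the measurement.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // The echo pin stays HIGH for the round-trip travel time of the pulse;
  // sound covers roughly 0.034 cm per microsecond, and the pulse travels
  // out and back, hence the division by two.
  long duration = pulseIn(ECHO_PIN, HIGH);
  float distanceCm = duration * 0.034 / 2.0;

  Serial.println(distanceCm);
  delay(100);
}
```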

Though I chose to use an Arduino Uno for the scope of this project, it will likely need to be upgraded for future work. This is because there are not enough PWM pins to support all of the functionality I wanted to build into Andy. Additionally, the Uno offers little capability for onboard machine learning, as opposed to running the model in a program on a connected computer. Therefore, in further development, I would likely switch to something like an Arduino Nano 33 BLE Sense because of its onboard sensors, PWM pins, and ability to run some computer vision models. This microcontroller also has relatively low power consumption and offers a power-saving mode for when Andy is not directly in use.

The Computer Vision Model

I used Python and OpenCV to create a facial recognition model that can be trained on a specific person. Most of the functions implemented come from a facial recognition library; see the reference section below. Once a photo is uploaded, the algorithm creates an encoding of the face and uses it to check for matching features in the webcam feed. I also included commands for serial communication between the Python code and the Arduino microcontroller. My idea was that, rather than running the computationally expensive program on the board itself, a simple result of 0 or 1 could be sent over serial and used to determine the way Andy interacts with his environment. This was done using the serial and time libraries in Python. The code is available to download on my GitHub.
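As a hypothetical sketch of the Arduino side of this exchange, the microcontroller only needs to watch the serial port for the 0 or 1 flag and choose a behavior; greet() and idle() below are placeholders for Andy's actual movement routines, which live in the GitHub repository.

```cpp
// Hypothetical Arduino-side handling of the recognition flag sent by the
// Python script over serial: '1' means a known face was recognized,
// '0' means it was not.
void greet() {
  // Placeholder: e.g. nod the head with the tilt servo and drive toward
  // the recognized person.
}

void idle() {
  // Placeholder: e.g. keep wandering while avoiding obstacles with the
  // ultrasonic sensor.
}

void setup() {
  Serial.begin(9600);   // must match the baud rate used by pyserial
}

void loop() {
  if (Serial.available() > 0) {
    char flag = Serial.read();
    if (flag == '1') {
      greet();
    } else if (flag == '0') {
      idle();
    }
  }
}
```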