Autonomous Robot
During my junior year at the University of Denver, two classmates and I took on the challenge of building an autonomous robot for a course project. The robot was required to navigate a room while avoiding obstacles, pick up flags, and deposit them in a drop-off zone.
Along with the other teams in the class, we were given a few supplies to get started, such as a Raspberry Pi 3B and a motor controller, but the overall form and function of the robot were entirely up to each team. We built a tank-style robot with treads along each side for locomotion, two long arms to pull in flags, and a long “pusher” to eject them. The team consisted of one electrical engineer (myself) and two mechanical engineers. As the electrical engineer, I took on all of the electrical and software work, as well as the CAD work on the drive train.
Starting with the drive train, we chose a treaded design because it let the robot pivot about its center while drawing less battery power than a four-motor setup. The treads took a lot of inspiration from electric skateboard belt systems, using a cogged belt and gear arrangement. The system was straightforward: the drive gear attached to the motor, and the belt crossed over and slipped onto the free wheel.
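The pivot-in-place behavior of a treaded (differential) drive comes from running the two treads at different speeds. As a minimal sketch of that idea (the function name and signal ranges are my own, not from the project), mixing a forward throttle and a turn input into left/right tread speeds might look like this:

```python
def tank_mix(throttle, turn, max_speed=1.0):
    """Mix a forward throttle and a turn command (each -1.0..1.0)
    into (left, right) tread speeds for a differential/tank drive.

    Equal speeds drive straight; opposite speeds pivot the robot
    about its center, which is what the treaded design enables.
    """
    left = throttle + turn
    right = throttle - turn
    # Scale down proportionally if either side exceeds full speed,
    # so the commanded turn/throttle ratio is preserved.
    scale = max(abs(left), abs(right), 1.0)
    return (max_speed * left / scale, max_speed * right / scale)

# Full turn input with no throttle spins the treads in opposite
# directions: a pivot about the robot's center.
print(tank_mix(0.0, 1.0))   # (1.0, -1.0)
print(tank_mix(1.0, 0.0))   # (1.0, 1.0) -> straight ahead
```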
As seen in the picture, the gears and framing were all FDM 3D printed in PLA, and the belt was FDM 3D printed in TPU. The design took three iterations and a couple of printing attempts, but it turned out very functional.
Next came the electrical work. Because the physical wiring on the robot was a chaotic mess, this flow chart explains the electrical system most cleanly in its most basic mode: remote control. As seen above, the main brain of the robot was a Raspberry Pi 3B. This single-board computer took in the remote inputs via Bluetooth and then handled the remainder of the work in code. For the drive train, the Pi spoke directly to a Teensy (an Arduino-compatible microcontroller) over USB, and the Teensy drove the motor controller via the “hat” that connected the two. From there, the motor controller could power and control each motor individually. As for the servos that collected and ejected the flags, the Pi sent commands to a servo driver, which in turn sent PWM signals to each servo to set its angle.
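A hobby servo's angle is set by the width of the PWM pulse it receives, and I2C servo driver boards typically express that pulse as an on-time count out of a fixed-resolution PWM period. As a hedged sketch (the 500–2500 µs pulse range, 50 Hz period, and 12-bit resolution are common defaults, not values confirmed by the project), the angle-to-count conversion the Pi-to-servo-driver path relies on looks like this:

```python
def angle_to_ticks(angle_deg, min_us=500, max_us=2500,
                   period_us=20000, resolution=4096):
    """Convert a servo angle (0-180 degrees) into a PWM on-time count.

    The servo driver holds the output high for `ticks` out of
    `resolution` slices of the 20 ms (50 Hz) period; the resulting
    pulse width is what the servo interprets as a target angle.
    """
    angle = max(0.0, min(180.0, angle_deg))          # clamp to valid range
    pulse_us = min_us + (max_us - min_us) * angle / 180.0
    return round(pulse_us * resolution / period_us)

print(angle_to_ticks(90))    # 307 -> a 1.5 ms center pulse
print(angle_to_ticks(180))   # 512 -> a 2.5 ms full-travel pulse
```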
This system worked quite well and gave us solid control of the robot from the remote:
Finally, the code was implemented. It was written entirely in Python and run natively on the Pi through Thonny, the onboard IDE. The Teensy itself was programmed in C; however, that firmware was written by a previous graduate for everyone to use, and a simple Python library was provided to control the Teensy and, through it, the motor controller.
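Since the Pi talked to the Teensy over USB, each motor command ultimately had to be serialized into bytes. I don't have the actual wire format the provided library used, so the following is a purely hypothetical sketch of such a packet encoder (header byte, per-side speed bytes, and checksum are all my own invention) to illustrate the Pi-to-Teensy link:

```python
def encode_motor_packet(left, right):
    """Encode left/right tread speeds (-1.0..1.0) into a hypothetical
    4-byte serial packet: header, left byte, right byte, checksum.

    Speeds are clamped and mapped to 0..255, with 128 meaning stop.
    """
    def to_byte(v):
        v = max(-1.0, min(1.0, v))
        return int(round((v + 1.0) * 127.5))

    payload = bytes([to_byte(left), to_byte(right)])
    checksum = (payload[0] + payload[1]) & 0xFF     # simple sum check
    return bytes([0xAA]) + payload + bytes([checksum])

# Stopped: both sides encode to 128, checksum wraps to 0.
print(encode_motor_packet(0.0, 0.0).hex())   # 'aa808000'
```

In practice the Pi would write these bytes to the Teensy's USB serial port (e.g. with pySerial), and the Teensy firmware would validate the checksum before forwarding speeds to the motor controller.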
It’s also worth noting that the onboard camera could run the image-recognition model YOLO. However, the school’s server went down before the system could be deeply integrated.
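Had the integration gone further, the robot would have needed to turn YOLO's raw detections into a single target to drive toward. As a small sketch of that post-processing step (the `(label, confidence, box)` tuple format and the "flag" class name are assumptions, not the project's actual output format):

```python
def pick_flag_target(detections, min_conf=0.5):
    """From YOLO-style detections [(label, conf, (x, y, w, h)), ...],
    return the highest-confidence 'flag' detection, or None.

    Filtering by class and a confidence floor rejects spurious
    detections before the robot commits to chasing a target.
    """
    flags = [d for d in detections if d[0] == "flag" and d[1] >= min_conf]
    return max(flags, key=lambda d: d[1], default=None)

detections = [
    ("flag", 0.42, (10, 20, 30, 40)),   # below the confidence floor
    ("flag", 0.91, (50, 60, 30, 40)),   # best flag candidate
    ("cone", 0.97, (5, 5, 10, 10)),     # wrong class, ignored
]
print(pick_flag_target(detections))   # ('flag', 0.91, (50, 60, 30, 40))
```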
The final result of this project was a robot capable of avoiding obstacles, collecting flags, and ejecting them, both via remote control and with preprogrammed maps. The final test was not recorded; however, a simplified demonstration of the final product can be seen below.