[vc_row][vc_column][heading title="Projects" subtitle="PyPiBot"][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]Work in progress; expect regular updates and blog entries on specific parts of this project.
Project description and plans
A robot controlled by Python on a Raspberry Pi. The build is rover-like, with four driven wheels in a skid-steering configuration. I plan to use it mainly for computer-vision experiments and for modeling different cognitive systems to solve tasks like navigation and mapping. I also plan to evaluate different learning mechanisms for object and place memory using this platform. It will be equipped with an RGB camera, a 360-degree lidar, ultrasonic range sensors, and wheel encoders. The build should be easily reproducible using off-the-shelf parts, so anyone can replicate this robot. Below is a rendering of the planned robot, alongside the current status:
Milestone 0: Assembly and power distribution
I first aligned the motor mounts on the plexiglass sheets and drilled holes, then fastened them using regular M2 screws and nuts. Every base is connected to the one below using threaded standoffs with a 1cm thread (the thread needs to go through the 4mm plexiglass and still be long enough to screw into the threaded standoff of the lower level).
As detailed in my blog post about powering the Raspberry Pi from a battery, I found a stable configuration that provides power to the motors (up to 7.4V) as well as a stable 5V to the Raspberry Pi by using a DC/DC converter.
Milestone 1: Driving untethered
After fixing the power issues, the robot is now able to drive untethered. For this experiment I implemented a quick prototype that sets the direction of each motor and then applies pulse-width-modulated signals to their speed pins. For the video below, I simply applied a direction and speed for 2 seconds at a time based on keyboard input.
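The mixing from a drive command to per-side PWM duty cycles and directions for such a skid-steering prototype can be sketched as below. This is a minimal sketch of my own; the function name, key bindings, and duty-cycle scaling are assumptions for illustration, not the actual prototype code:

```python
def mix_skid_steer(speed, turn):
    """Mix forward speed and turn rate (both in [-1, 1]) into per-side
    PWM duty cycles and directions for a skid-steering rover.
    Returns ((left_duty_pct, left_forward), (right_duty_pct, right_forward))."""
    left = max(-1.0, min(1.0, speed + turn))    # clamp so duty stays in range
    right = max(-1.0, min(1.0, speed - turn))
    return (abs(left) * 100.0, left >= 0), (abs(right) * 100.0, right >= 0)

# Hypothetical keyboard mapping: key -> (speed, turn)
COMMANDS = {"w": (1.0, 0.0), "s": (-1.0, 0.0), "a": (0.0, -1.0), "d": (0.0, 1.0)}
```

On the real robot, the returned duty cycles would be applied to the PWM speed pins and the direction flags to the input pins of the motor driver.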
Milestone 2: Software
The first implementation of the system running on the Raspberry Pi can be found on GitHub: https://github.com/TobiasWeis/pypibot. Every module runs as its own Python process (via multiprocessing), and processes communicate with each other through a managed dictionary.
Milestone 3: Localization and Navigation
Although this milestone has not been fully reached yet, a huge leap has been accomplished by successfully interfacing the Piccolo Laser Distance Sensor. For calibration purposes and tests of my software, I created an artificial environment with a small box exactly 1m away from the center of the laser distance sensor:
The scan seems pretty accurate! Let's see how this looks when we let the robot drive forwards and backwards again:
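For reference, the polar-to-Cartesian conversion needed to check such a calibration scan can be written as a short helper. This is a minimal sketch assuming one range reading per fixed angular step, with index 0 pointing along +x; it is not taken from the repo:

```python
import math

def scan_to_points(ranges_m):
    """Convert one 360-degree scan (list of ranges in meters, one per equal
    angular step, index 0 pointing along +x) into Cartesian (x, y) points."""
    angle_step = 2.0 * math.pi / len(ranges_m)
    points = []
    for i, r in enumerate(ranges_m):
        if r <= 0.0:  # zero/negative readings are treated as invalid
            continue
        a = i * angle_step
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

A reading of the calibration box straight ahead should then come out at roughly (1.0, 0.0) in the sensor frame.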
Also, simulated odometry and simulated laser-scanner input, as well as a first mapping algorithm, have been implemented:
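The core of such a mapping step, projecting scan endpoints through an odometry pose into a grid, can be sketched like this. It is a toy hit-count grid for illustration, not the algorithm used in the repo:

```python
import math

def update_grid(grid, pose, scan, cell_size=0.1):
    """Accumulate scan endpoints in an occupancy-style grid.
    grid: dict mapping (ix, iy) -> hit count
    pose: (x, y, theta) from (simulated) odometry, in meters/radians
    scan: list of (angle, range_m) pairs in the sensor frame"""
    x, y, theta = pose
    for angle, r in scan:
        wx = x + r * math.cos(theta + angle)  # endpoint in world coordinates
        wy = y + r * math.sin(theta + angle)
        cell = (int(math.floor(wx / cell_size)), int(math.floor(wy / cell_size)))
        grid[cell] = grid.get(cell, 0) + 1
    return grid
```

Repeated hits on the same cell accumulate, so cells with high counts mark likely obstacles while untouched cells stay unknown.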
Milestone 4: Visual input
With the camera in place, it's time to have a look at what the robot is seeing, so I dumped some frames while driving the robot manually (that's where the rugged movement comes from):
Appendix – Part list
The current part list consists of the following items:
- Raspberry Pi B+ V2 (amazon, 34.99 EUR)
- Edimax EW-7811UN wireless USB adapter (amazon, 7.99 EUR)
- Motors: 4x DC 6.0V, 35:1 gear ratio, 4mm shaft diameter (V-TEC 6V Mini 25D DC Motor, Eckstein via ebay, 63.80 EUR)
- Motor mounts (4x DROK 25mm DC, amazon, 11.59 EUR)
- Motor driver: 2x L298 dual H-bridge modules (STL298N for Arduino, ebay, 10.70 EUR)
- Tires: 2x Absima Wheel Set Buggy "Street" 1:10 (conrad, 2x 7.69 EUR)
- Tire mounts: 12mm hex wheel adapters for 4mm shaft
- Battery pack: 7.4V 2400 mAh NiCd RC battery (conrad, approx. 25 EUR)
- DC/DC converter: LM2596 step-down module, 3-40V in to 1.5-35V out (amazon, 6.79 EUR)
- 8x mini-breadboards, 170 holes (amazon, 10.22 EUR)
- Logitech C920 webcam (amazon, 59.99 EUR)
- 4x 4mm plexiglass sheets, 15x15cm (ebay, approx. 10 EUR)
- Jumper wires, screws, spacers, threaded standoffs, nuts, etc. (approx. 20 EUR)
- Arduino Nano (amazon, approx. 8 EUR)
[/vc_column_text][/vc_column][/vc_row]