my blog

Category: Robotics
Measuring voltage on a Raspberry Pi and displaying it in style

After I completed the circuitry to power my Raspberry Pi from a battery pack, I wanted a way to display the battery pack's voltage and to read the voltage level from the Raspberry Pi, so it can shut down automatically when a critical voltage level is reached. This prevents damage to the filesystem and keeps the battery from being drained too deeply.

Our Arduino's analog inputs can stand at most 5V per pin (the Pi itself has no analog inputs, which is why an Arduino does the measuring), so how do we measure voltages like the 7.4V of our battery pack and above (if it's full, it has more than 7.4V)? We have to scale the highest expected voltage down to 5V! The easiest way to do this is a voltage divider. Suppose the 9V battery in the schematic below is our battery pack, and the Arduino is powered over USB. We connect the GND of both and divide the 9V of the battery with a voltage divider. If both resistors in the schematic below are equal, we effectively halve the voltage. Remember the formula for voltage dividers (without load):

U_{out} = U_{in} * \dfrac{R2}{R1 + R2}

In our case, that means if the battery is full and has 9V:

U_{out} = 4.5V = 9V * \dfrac{10k}{10k + 10k}

So the maximum expected voltage at our analog input is 4.5V!

By using the Arduino function analogRead(A0), we get a value between 0 and 1023, which represents the voltage scale from 0 to 5V. We know that whatever we read is half of the actual external voltage, so to convert this value back to the actual voltage:

int sensorValue = analogRead(A0);
float voltage = sensorValue * (10.0 / 1023.0);

Now this can be put into a nice little sketch that outputs the value over serial to the Raspberry Pi, which displays it somewhere. I chose a 1.44 inch SPI TFT LCD color screen (ST7735) with 128×160 pixels and wrote some code to save and display measurements at certain time intervals in a rolling fashion, which results in this:
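The Pi side of this link is not shown above, so here is a minimal sketch of the idea: read the voltage the Arduino prints over serial and trigger the automatic shutdown mentioned earlier. The port (/dev/ttyACM0), baud rate, line format ("7.38" per line) and cutoff voltage are assumptions, not measured values; tune them to your setup.

```python
import subprocess

CRITICAL_VOLTAGE = 6.4  # assumed cutoff for a 7.2V pack; tune to your cells


def parse_voltage(line):
    """Parse one serial line like '7.38' into a float, or None on garbage."""
    try:
        return float(line.strip())
    except ValueError:
        return None


def is_critical(voltage, threshold=CRITICAL_VOLTAGE):
    """True once a valid reading falls below the shutdown threshold."""
    return voltage is not None and voltage < threshold


if __name__ == "__main__":
    import serial  # pyserial; only needed for the hardware loop below
    with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as port:
        while True:
            reading = parse_voltage(port.readline().decode("ascii", "ignore"))
            if is_critical(reading):
                subprocess.call(["sudo", "shutdown", "-h", "now"])
                break
```

Keeping the parsing separate from the serial loop makes the threshold logic easy to test without the hardware attached.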

Powering a raspberry pi from battery

The Raspberry Pi will be the main processing unit of my pypibot. I want to power 6V motors, so I decided to go with a 7.2V battery pack. I had one lying around with 2600mAh, which should be enough for testing the setup for now.

I originally planned on going the easy way and ordered a converter from 8–36V to 5V with a micro-USB connector already wired (from DROK). Without any other load this worked nicely, even though the input voltage was below 8V. But as soon as the motors were wired up, the voltage dropped too low for this thing to still output 5V, and in consequence the Pi went down.

So, here is the definitive way to go if you want to power a Raspberry Pi robustly from a 7.2V battery (or anything above that voltage):

Use an adjustable DC/DC power converter! These units cost a little over 5 EUR (for 5 units in total), they take anything from 4V to 35V as input, and the output voltage can be configured by turning a little screw on a potentiometer. In my experiments, the input voltage could drop as low as 6.1V and this unit would still supply a rock-steady 5V to the Pi (once set up, it delivers a steady 5V output across a wide input range). It can withstand 3A max, which should be enough for the Raspberry Pi and any sensors I hook up to it.

I ended up soldering a micro-USB connector to it myself:

In a first test run with the 7.2V, 2600mAh NiCd battery pack I had lying around (it is quite old, so its real capacity is probably far lower), the Raspberry Pi lasted 1 hour and 42 minutes while driving around with the motors from time to time: up 1:42, load average: 0.48, 0.33, 0.33.

Neato XV Laser scanner (LIDAR)

So today my Neato XV LIDAR module arrived, and I had to test it directly with the Raspberry Pi. For everyone who does not know this wonderful piece of hardware yet: it is a low-cost 360-degree spinning laser scanner that is usually scavenged from the Neato XV vacuum robots. In Germany it is quite hard to get your hands on one, so I ordered one via eBay from the US.


According to the wiki, the wires of the LIDAR unit have the following pinout:

Red: 5V
Brown: LDS_RX
Orange: LDS_TX
Black: GND

Although the logic unit is supplied with 5V, the interface (RX/TX) runs at 3.3V. Perfect for talking to a Raspberry Pi!

As stated in the wiki, the sensor (without the motor!) draws ~45mA in idle and ~135mA when in use (rotating).

For these first tests, I wired the power lines of the logic unit to an external 5V power supply (one that can definitely provide the needed current), connected the TX of the scanner directly to the RX of the Raspberry Pi, and connected the GNDs. Without the motor connected yet, just powering the logic unit on while attached to the serial port (115200 baud, 8N1), it greeted me with the following welcome message:

Piccolo Laser Distance Scanner
Copyright (c) 2009-2011 Neato Robotics, Inc.
All Rights Reserved

Loader	V2.5.15295
CPU	F2802x/c001
Serial	KSH14415AA-0358429
LastCal	[5371726C]
Runtime	V2.6.15295
#Spin...3 ESCs or BREAK to abort
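Beyond the welcome banner, the interesting part is the scan data the unit streams once the motor spins. As a hedged sketch of decoding it, following the packet layout documented in the wiki (assumptions: 22-byte packets, start byte 0xFA, index bytes 0xA0–0xF9, motor speed as a little-endian value in 1/64 rpm, four distance readings per packet; verify against your firmware):

```python
def parse_packet(pkt):
    """Decode one 22-byte XV-11 data packet (layout per the wiki; treat
    it as an assumption). Returns (rpm, [(angle_deg, dist_mm or None)])
    or None for malformed input. Checksum verification is omitted here."""
    if len(pkt) != 22 or pkt[0] != 0xFA or not 0xA0 <= pkt[1] <= 0xF9:
        return None
    index = pkt[1] - 0xA0                   # packet index 0..89
    rpm = (pkt[2] | pkt[3] << 8) / 64.0     # rotation speed of the dome
    readings = []
    for i in range(4):
        off = 4 + 4 * i
        dist = pkt[off] | (pkt[off + 1] & 0x3F) << 8  # distance in mm
        invalid = bool(pkt[off + 1] & 0x80)           # 'invalid data' flag
        readings.append((index * 4 + i, None if invalid else dist))
    return rpm, readings
```

Each packet covers 4 degrees, so 90 packets make one full 360-degree revolution.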

Simulating robots with MORSE

It is quite challenging and costly to build up a robot lab, especially if you just want to conduct some experiments with sensors and a moving platform. In today's search for affordable robot platforms, I discovered MORSE, a simulation platform built on the Blender game engine. This article will show how to set it up, select an environment, add sensors and read from them.

It already comes with the infrastructure, several environments, pre-built robots, sensors (camera, GPS, laser scanner, IR, etc.) and actuators to play with, and it can be installed directly via apt (Ubuntu and Debian). It took me less than an hour to skim through the tutorials, set up a basic environment, add a laser range sensor to an existing robot and visualize the results. Pretty amazing! You can find all of my project files here:
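To give a taste of reading from a simulated sensor, here is a minimal sketch using pymorse, MORSE's Python client library. The robot name `robot`, the sensor name `scan` and the `range_list` field are assumptions based on a typical tutorial scene; adjust them to whatever your builder script defines.

```python
def closest_obstacle(scan):
    """Smallest range in a laser scan message (assumption: the scan
    dict carries a 'range_list' key, as MORSE's laser sensors export)."""
    return min(scan["range_list"])


if __name__ == "__main__":
    import pymorse  # needs a running MORSE simulation to connect to
    with pymorse.Morse() as sim:
        # assumptions: the scene contains a robot named 'robot'
        # carrying a laser scanner named 'scan'
        data = sim.robot.scan.get()
        print("closest obstacle: %.2f m" % closest_obstacle(data))
```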



Displacement priors

What is the target of all this? Driving in an automotive scenario with a given speed and turnrate at any moment, we want to predict the displacement of a 2D projection (pixel) between two frames:
p(\vec{uv}_{x,y} \mid \mathrm{speed}, \mathrm{turnrate}, \mathrm{camera\ matrix}, \mathrm{world\ geometry})

By using the camera calibration, I can create artificial curves and walls as 3D point sets and project them back to 2D. Using discretized values for speed, turnrate, street width and wall height, I can then simulate the displacement of these 3D points when they are projected to 2D (our image).
(Note for me: this is the backprojection-code, main-file:
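Since the back-projection code itself is not shown here, a minimal sketch of the idea under an assumed pinhole model: project camera-frame 3D points through an intrinsic matrix K, apply one planar motion step built from speed and turnrate, and take the pixel difference. The motion model (yaw about the vertical axis, z pointing forward) is an illustrative assumption, not the code the note refers to.

```python
import numpy as np


def project(K, points_3d):
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    p = points_3d @ K.T            # apply intrinsics
    return p[:, :2] / p[:, 2:3]    # perspective divide


def motion_step(points_3d, speed, turnrate, dt=1.0):
    """Express points in the camera frame after one planar motion step
    (assumed convention: z forward, y up, yaw = turnrate * dt)."""
    yaw = turnrate * dt
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    t = np.array([0.0, 0.0, speed * dt])   # forward translation
    return (points_3d - t) @ R             # inverse of the camera motion


def displacement(K, points_3d, speed, turnrate, dt=1.0):
    """Pixel displacement of each point between the two frames."""
    before = project(K, points_3d)
    after = project(K, motion_step(points_3d, speed, turnrate, dt))
    return after - before
```

Evaluating this over discretized speeds, turnrates, street widths and wall heights yields exactly the kind of displacement priors described above.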