How quickly can you read the GPG battery voltage and get valid results?

To give myself a break, I decided to play with battery voltage measurements and writing data to files in Python.

I created a Python file (attached below).  It periodically reads the GoPiGo battery and 5v voltages and prints them to the console until I press Ctrl-C.  When I do, an exception handler catches the KeyboardInterrupt; I then aggregate all the data, take the average of the difference between each value as-read and the voltage I’m actually supplying, and write the result to a file opened in append mode.
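
In outline, the aggregate-and-append step looks like this (a hypothetical sketch; the function name and file path are mine, not the attached program's):

```python
import statistics

def summarize(deltas, reference, outfile="voltage_test.txt"):
    """Append a summary of (reference - reading) differentials to a file."""
    line = ("We took {} measurements and the average differential was {:.3f} "
            "(based on an input reference voltage of {})\n").format(
                len(deltas), statistics.mean(deltas), reference)
    with open(outfile, "a") as f:   # append mode, as in the test program
        f.write(line)
    return line

# On the robot the deltas come from repeated get_voltage_battery() calls;
# here, illustrative values stand in for hardware readings:
print(summarize([2.684, 2.692, 2.699], 12.0, outfile="/tmp/voltage_test.txt"))
```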

I noticed something interesting:
The more quickly I took readings, the lower the offset reading was.  If I take readings at 0.10 second intervals, the “delta” between “12v” and the read value is relatively small.

If I increase the pause between reads, the delta value also goes up.

Here’s the software: (2.4 KB)

. . . and the results:

voltage_test.7z.txt (1.7 KB)
. . . which contains the program file and copies of the outputs.

Why should reading at a higher rate (one reading every 0.10 seconds) give greater accuracy than reading once per second?


Interesting indeed.

I don’t have a fixed voltage source, so I made some measurements based on an initial voltage reading on Carl.
(Carl also running voicecommander, juicer, wheel logger, health check, life logger)

I added an optional command line argument for the sleep_interval,
and removed the rounding of the measured voltage differential after testing showed {:.3f} printed identical means to the rounded means.
(I was worried about compounding rounding error into the means and std deviations.)
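
The compounding concern is real in general, even though it didn't show up here at three decimals.  Rounding each sample before averaging can shift the mean relative to rounding only once at the end (illustrative numbers, not run data):

```python
import statistics

samples = [0.004, 0.004, 0.014]          # volts, illustrative

round_then_mean = statistics.mean([round(s, 2) for s in samples])
mean_then_round = round(statistics.mean(samples), 2)

print(round_then_mean)   # rounded copies are 0.0, 0.0, 0.01 -> mean ~0.0033
print(mean_then_round)   # raw mean 0.00733... rounds to 0.01
```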

Additionally, I wanted to see how the min/max/mean/sdev varied so here are:

  • two 10 second runs at 1.0 s
  • two 10 second runs at 0.1 s
  • two 60 second runs at 1.0 s
  • two 60 second runs at 0.1 s
  • two 120 second runs at 1.0 s
  • two 120 second runs at 0.1 s
We took 10 measurements at 1.0 seconds (based on reference voltage of 10.057)
min: -0.128  max: 0.000  mean: -0.071  sdev: 0.043 

We took 103 measurements at 0.1 seconds (based on reference voltage of 9.997)
min: -0.205  max: 0.000  mean: -0.160  sdev: 0.037 

We took 105 measurements at 0.1 seconds (based on reference voltage of 10.091)
min: -0.103  max: 0.060  mean: -0.056  sdev: 0.026 

We took 10 measurements at 1.0 seconds (based on reference voltage of 10.048)
min: -0.120  max: 0.000  mean: -0.075  sdev: 0.033 

We took 60 measurements at 1.0 seconds (based on reference voltage of 10.022)
min: -0.129  max: 0.085  mean: -0.084  sdev: 0.038 

We took 60 measurements at 1.0 seconds (based on reference voltage of 9.98)
min: -0.145  max: 0.043  mean: -0.108  sdev: 0.038 

We took 605 measurements at 0.1 seconds (based on reference voltage of 9.98)
min: -0.120  max: 0.120  mean: -0.072  sdev: 0.033 

We took 638 measurements at 0.1 seconds (based on reference voltage of 9.851)
min: -0.214  max: 0.017  mean: -0.158  sdev: 0.036 

We took 1200 measurements at 0.1 seconds (based on reference voltage of 9.466)
min: -0.154  max: 0.146  mean: -0.091  sdev: 0.044 

We took 121 measurements at 1.0 seconds (based on reference voltage of 9.466)
min: -0.128  max: 0.035  mean: -0.083  sdev: 0.037 

We took 120 measurements at 1.0 seconds (based on reference voltage of 9.483)
min: -0.103  max: 0.086  mean: -0.056  sdev: 0.043 

We took 1202 measurements at 0.1 seconds (based on reference voltage of 9.431)
min: -0.146  max: 0.154  mean: -0.096  sdev: 0.042 

I don’t think I can see a significant difference between 1.0 s readings and 0.1 s readings,
but it might take a lot more tests to be confident of that.

Having a nice fixed voltage source would allow longer runs,
but perhaps the standard deviation is related to the varying processor load more than A2D variation.
I have no idea how to characterize the constantly changing processor load.

If I were going to hypothesize why 0.1 s readings might have greater accuracy than 1.0 s readings,
I would propose that the processor load changes less in 0.1 seconds than in 1.0 seconds,
but that is just wild free running brain storming.

How I changed the program:



# PURPOSE: Characterize GoPiGo3 battery voltage measurements
#
# USAGE: ./ [-h] [-i INTERVAL]
#
#   optional arguments:
#     -h, --help            show this help message and exit
#     -i INTERVAL, --interval INTERVAL (n.n in seconds, default=1.0)

import sys
from easygopigo3 import EasyGoPiGo3
from time import sleep
import argparse
import statistics

argparser = argparse.ArgumentParser(description='Characterize GoPiGo3 battery voltage measurements')
argparser.add_argument('-i', '--interval', dest='interval', type=float, help="n.n in seconds, default=1.0", default=1.0)

args = argparser.parse_args()
sleep_interval_seconds = args.interval

mybot = EasyGoPiGo3()

values = []
count = 0
# Reference_Input_Voltage = 12.0
Reference_Input_Voltage = mybot.get_voltage_battery()

file1 = open("./voltage_test.txt", "a")

def round_up(x, decimal_precision=2):
    #  "x" is the value to be rounded using 4/5 rounding rules,
    #  always rounding away from zero.
    #  "decimal_precision" is the number of decimal digits desired
    #  after the decimal divider mark.
    #  It returns the value rounded to the **LESSER** of:
    #     (a) The number of digits requested
    #     (b) The number of digits in the number, if less
    #         than the number of decimal digits requested
    #     Example:  (Assume decimal_precision = 3)
    #         round_up(1.123456, 3) will return 1.123  (4 < 5)
    #         round_up(9.876543, 3) will return 9.877  (5 >= 5)
    #         round_up(9.87, 3) will return 9.87
    #         because there are only two decimal digits and we asked for 3
    if decimal_precision < 0:
        decimal_precision = 0

    exp = 10 ** decimal_precision
    x = exp * x

    if x > 0:
        val = int(x + 0.5) / exp
    elif x < 0:
        val = int(x - 0.5) / exp
    else:
        val = 0

    if decimal_precision <= 0:
        return int(val)
    else:
        return val

print("batt_test with read interval {:.2f} seconds".format(sleep_interval_seconds))
try:
    while True:
        Measured_Battery_Voltage = mybot.get_voltage_battery()
        Five_v_System_Voltage = round_up(mybot.get_voltage_5v(), 3)
        Measured_voltage_differential = Reference_Input_Voltage - Measured_Battery_Voltage
        values.append(Measured_voltage_differential)
        count = count + 1
        print("Measured Battery Voltage =", Measured_Battery_Voltage)
        print("Measured voltage differential = ", Measured_voltage_differential)
        print("5v system voltage =", Five_v_System_Voltage, "\n")
        print("Total number of measurements so far is ", count)
        sleep(sleep_interval_seconds)

except KeyboardInterrupt:
    print("\nThat's All Folks!\n")
    data = "\nWe took " + str(count) + " measurements at {:.1f}".format(sleep_interval_seconds) + " seconds (based on reference voltage of " + str(Reference_Input_Voltage) + ")\n"
    file1.write(data)
    data = "min: {:.3f}  max: {:.3f}  mean: {:.3f}  sdev: {:.3f} \n".format(min(values), max(values), statistics.mean(values), statistics.pstdev(values))
    file1.write(data)
    file1.close()

The intent of this research was to determine a “correction factor” to include in the voltage measurement so that the battery voltage measurement is identical to the actual voltage at the battery connector.

I used a known, fixed, voltage supply so that the reference input voltage is a constant value.

It would be interesting to have you run the procedure as submitted so that we’re comparing apples to apples; though to me, statistically, the mode (modal value) would be of more interest than the mean or median: what value is most common, and/or which value is at the center of the bell curve?  That would be, (IMHO), a more accurate calibration offset constant than the simple average of all the values.

IMHO, there should not be such a wide variation in the VCC voltage.

This tells me that something is wrong.

  • There is excessive series resistance between the actual barrel connector and the point where the measurement is taken.
  • The power source cannot supply the required current and current spikes briefly overload the supply.
  • The AC bypass around the supply and the internal circuit rails is insufficient.
  • The impedance of the measuring instrument is too low and is drawing significant current.
  • Something else we have not discovered yet.

I am presently favoring the first possibility because the Pi on my robot has a tendency to fold back the 5v rail when even lightly loaded.  It should be noted that, (at least), the USB-3 connectors on the Pi-4 are rated at an amp or so, so there should be enough power to drive a small external SSD.

The best way to test that would be to take a “test” GoPiGo3 board and wire the barrel connector direct to VCC, and provide a high-current buck-converter direct to +5, bypassing all the on-board circuitry.  A high-resolution 'scope across my power supply’s output probably wouldn’t hurt either.


I’ve been frustrated for years with this very issue, even before seeing it on the GoPiGo3. On my prior robot, I had a voltage divider VCC input and an ACS712 Current Sensor (Vout=current) to an MCP3208 12bit A2D for my Raspberry Pi 3B robot conversion of my RugWarriorPro robot “Pogo” ( Handyboard 68HC11 originally).

The voltage readings and current readings jumped up and down so much that I had no faith even with the mean of 10 readings (at the A2D settling time with a safety factor).

Does Charlie have a spare A2D port open? It would be interesting to see what the std dev of readings is when you put a rock steady known 4-5 volts into a GoPiGo3 A2D port.

I have wondered if the vref of the ATMEL is trustworthy.

Without a scope, finding the source(s) of the A2D variation is impossible, and even with one I would be scared I would short something out and kill my robot.

And WRT that A2D settling time, I don’t know what the ATMEL A2D settling time plus the GoPiGo3 reading time is, but if we read the battery voltage faster than that, we are going to get a whole lot of duplicate readings in the mean and deviation.
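
One crude way to check for that (a hypothetical sketch, not from the test program - and note that two genuinely independent conversions can also return identical values, so this only gives an upper bound on the duplication):

```python
from itertools import groupby

def effective_samples(readings):
    """Collapse runs of identical consecutive readings.

    If we poll faster than the A2D produces new conversions, the same
    converted value is returned several times; only the distinct runs
    carry new information about the battery voltage.
    """
    return [value for value, _run in groupby(readings)]

# Illustrative readings (volts), polled faster than the converter updates:
readings = [9.997, 9.997, 9.989, 9.989, 9.989, 9.997, 9.980]
print(len(readings), len(effective_samples(readings)))  # 7 4
```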

(Although I passed my college statistics class with a good grade, I really did not come away with any comfort to understand any of the measures. I thought I knew what an average was, but then they had to confuse me with mean, median, mode, small vs large samples, known and unknown. I got mono during the semester and the teacher must have felt sorry for me! )


ICUMI: added after you saw that last post


Sounds like either the dining hall should have been condemned or you were having a good time. :wink:  At least it wasn’t a :clap:.

If I remember correctly, (shrug):

  • “mean” = “average” = sum_of_all_samples / number_of_samples
  • “median” means the same thing in statistics as it does in driving:  It’s what’s in the middle.
    • In the set “12345” the median is “3” because that’s smack-dab in the middle.
    • In the set “123456” the median is “3.5” because that’s in the exact center of the sample set.
  • “mode” represents the value, or group of values, that shows up the most often.
    • If you have a reasonably large sample of numbers, and there is a bell-curve distribution around “15”, (15 occurs the most often, other numbers around it less often, tapering as you move away), and there are a few samples in the 500 range, the mean will be skewed way north of “15” (the median much less so, since it only looks at rank order), but the mode will be “15” because that’s the value that occurs the most often.
    • In the census of a reasonably large population, (say in Detroit), you might get a lot of racial types distributed around “African American”, another whole bunch around “Caucasian”, with a smattering of other racial types filling in the cracks.  This would be considered “multi-modal” - as there are two distinct types that occur far more often than the other types.  This is the kind of demographic data that advertisers live for. (:wink:)
  • Ideally, if the sample is reasonably large, and things are distributed well, then mean, median, and mode will be centered around the same small set of numbers in your sample.  This tells you that your sample is both homogeneous and well distributed, resulting in a high level of confidence in the outcome.
  • “sample size” represents one of the factors used to determine if the level of confidence is worth spitting on or not.
    • In a group of two million individuals, a sample size of “fifteen” doesn’t lend itself to credibility.
    • Determining what size the sample must be to be believable is a whole 'nother can of worms.
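
Python’s statistics module makes the bell-curve-plus-outliers point easy to see (illustrative sample, not measurement data):

```python
import statistics

# A cluster around 15 plus a few far-out samples near 500
samples = [13, 14, 15, 15, 15, 16, 17, 500, 510]

print(statistics.mean(samples))    # dragged way north of 15 by the outliers
print(statistics.median(samples))  # 15 - the middle value, barely disturbed
print(statistics.mode(samples))    # 15 - the value that occurs most often
```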

When dealing with statistics, just remember the famous quip: “There are lies, damned lies, and statistics.”

Wikipedia has an interesting article on its origins, attributing it to numerous people, including Mark Twain and Benjamin Disraeli.



statistics.mode(values) will spit that out


The 60 and 66 measurements are at 1.0 second and the 601, 602, and 663 were at 0.1 second:

pi@Carl:~/Carl/Examples/voltagereadspeed/orig $ more voltage_test.txt 

We took 60 measurements and the average differential was 2.684
(based on an input reference voltage of 12.0)

We took 601 measurements and the average differential was 2.692
(based on an input reference voltage of 12.0)

We took 663 measurements and the average differential was 2.699
(based on an input reference voltage of 12.0)

We took 66 measurements and the average differential was 2.707
(based on an input reference voltage of 12.0)

We took 60 measurements and the average differential was 2.713
(based on an input reference voltage of 12.0)

We took 602 measurements and the average differential was 2.716
(based on an input reference voltage of 12.0)


The one thing you need to do is measure and set the reference input voltage so that the differential values make sense.

I have no idea where these measurements came from as they don’t appear in the data.

Oh snap! That’s the number of measurements taken!
[cartoon curse words!]


Don’t have a way to do that really, especially not to 3-digit precision.  The 3-digit-precision voltage measurement at the battery jumps around too fast to read on my meter.

I looked in my junk drawer for a decent power supply, but all were either too weak or too high voltage.
I don’t often need a solid variable voltage supply but that and an oscilloscope have always been on the want list but never make it to the top of list.
Besides, us “software guys” can’t be trusted with tools - union rules when I worked for Lockheed-Martin Marietta. I’m surprised they didn’t flag me for pulling out my own desk chair.


Question: Lacking an oscilloscope, if I put my multimeter in AC voltage mode and measure across the battery under load, will the multimeter read the load induced voltage variation peak/rms in mV?

Using the reading at the start of the test as the reference (I probably should use an average of ten readings), and testing for a short time so that the battery voltage doesn’t decrease very much during the test, should measure the GoPiGo3’s systemic battery-voltage reading variability.  Matt was saying the readings were +/- 4% accurate, which is a whopping +400 mV to -400 mV at 10 V, with an 8.6 mV precision and with the 5v rail as the reference.  Watching your printout of the 5v reading against the 5v reference suggests the 5v rail is varying by around 70 mV, but then asking the fox how many chickens are in the coop is probably just as foolish.
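
Taking the +/- 4% figure at face value, the arithmetic is simply:

```python
accuracy = 0.04          # claimed reading accuracy (fraction)
v_batt = 10.0            # volts
error_band_mv = accuracy * v_batt * 1000   # one-sided error band in millivolts
print("+/- {:.0f} mV at {:.0f} V".format(error_band_mv, v_batt))  # +/- 400 mV at 10 V
```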

At any rate: Running with mode output shows that the most probable reading, if you only take one reading, is probably not near the median/mean/average or whatever that “add ’em up and divide ’em out” number is:

We took 60 measurements at 1.0 seconds (based on reference voltage of 8.917)
min: -0.180  max: 0.077  mean: -0.134  mode: -0.155   sdev: 0.056 

We took 60 measurements at 1.0 seconds (based on reference voltage of 8.952)
min: -0.137  max: 0.026  mean: -0.098  mode: -0.120   sdev: 0.041 

We took 601 measurements at 0.1 seconds (based on reference voltage of 8.926)
min: -0.171  max: 0.129  mean: -0.115  mode: -0.137   sdev: 0.047 

We took 602 measurements at 0.1 seconds (based on reference voltage of 8.909)
min: -0.188  max: 0.112  mean: -0.129  mode: -0.154   sdev: 0.048 


Probably not.

What you will get is the AC component of the DC voltage - i.e. the ripple voltage.

The question of if you get a simple average, P/P, or RMS is a question for whomever wrote the software/designed the hardware for your meter.

Assuming that you don’t have a MULTI-hundred-dollar Keithley, Fluke, Tektronix, or similar meter accurate to God’s Own Left Toenail, (i.e., not an el-cheapo WallyWorld meter like mine), then it’s anyone’s guess.  I suspect it would be accurate to, (ahem!), “commercial” standards, but what those standards are is a mystery to me.

Like you, a set of precision voltage references, calibrated resistors, and current shunts hasn’t reached the top of my “Holiday Gift List” either, though a reasonably precise digital scope with a minimum of 30 MHz bandwidth should do as a voltmeter, if the people who wrote the software have a clue. :man_facepalming:

On a similar front, I am working on trying to figure out how to turn the JYE DSC-068 'scope I have into a respectable USB scope on Win-7 or XP.  I was able to find an older laptop to experiment with and I have the remnants of my TechNet subscription’s downloads and keys, so once I get a moment or two, I am going to mess with this.

A Saleae logic analyzer (that plugs into your laptop/desktop) is on my list too, but I think I’ll get back to the States before I have even the Ghost Of A Chance of getting one.  They weigh in at the cost of a laboratory-grade meter or so, and that’s just not in my budget until that federal grant to invent something comes through. (yea, right!)



I have test results from my battery test.

  1. The software used: (4.6 KB)

  2. The test configuration:

    I added binding posts to the robot itself and connected them across the barrel connector with (relatively) beefy wire to eliminate errors due to line drop.

  3. The tests:

    • There were two test series, one with the big power-supply I built - and discovered that the leads to the barrel connector I used are too small - and the other with a multi-amp 12v power brick.
  4. The results:

    • Power supply:

      • 0.25 sec/measurement = 0.511 voltage differential between the measured voltage at the barrel connector and the voltage as-read by the robot.

      • 0.35 sec/measurement = 0.513 and 0.516

      • 0.50 sec/measurement = 0.515

      • Average = 0.514 and the range was 0.005

    • Power Brick

      • 0.25 sec/measurement = 0.512 and 0.512

      • 0.35 sec/measurement = 0.513 and 0.510

      • 0.50 sec/measurement = 0.509 and 0.510

      • The average was 0.511 and the range was 0.004

The overall average is 0.513
The range of all the readings is 0.007, though I consider the second set more reliable as the voltage was more stable during the test.

  1. The test data: (2.6 KB)

I am going to set an offset of 0.512 to compensate for the circuit drops when measuring the battery’s voltage.
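
A minimal sketch of applying that offset (the constant and function names are mine; 0.512 is the figure measured above):

```python
BARREL_TO_READING_OFFSET = 0.512   # measured drop: barrel connector vs. as-read value

def corrected_battery_voltage(raw_reading, offset=BARREL_TO_READING_OFFSET):
    """Estimate the voltage at the barrel connector from the robot's reading."""
    return raw_reading + offset

# With a hypothetical as-read value of 11.62 v:
print("{:.3f}".format(corrected_battery_voltage(11.62)))  # 12.132
```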


Yes, the wire again.  So many people had mysterious power problems in the first few years after the Raspberry Pi came out, and they turned out to be caused by the tiny, but significant, voltage drop between an “in spec” power supply and the Pi.

Perhaps measuring at the power supply, with the double barrel-plug jumper in line to the GoPiGo, would be more representative of the battery voltage for particular GoPiGo3 readings?

I think that is why my reading-to-battery-voltage constant is more than the usual diode-drop value, even on Dave.


I wired the binding posts directly to the barrel connector.

Because my meter is a 20,000Ω/volt instrument, (I hope!), the current draw through the meter should be negligible and the voltage reading there should be accurate.
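
The “negligible” claim checks out (assuming, hypothetically, a 20 V range setting - the range and values below are mine, for illustration):

```python
sensitivity = 20_000                     # ohms per volt of full-scale range
full_scale_v = 20.0                      # assumed range setting (hypothetical)
r_meter = sensitivity * full_scale_v     # 400 kOhm input resistance on that range
i_draw_ua = 12.0 / r_meter * 1e6         # current drawn measuring ~12 V, in microamps
print("{:.0f} uA".format(i_draw_ua))     # 30 uA - negligible for this purpose
```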

Since what I am hoping to achieve is an increase in the accuracy of the “measure battery voltage” routine so that it accurately measures the battery voltage, that should be sufficient.

IMHO, (from a hardware engineering standpoint), the fundamental problem is that the Raspberry Pi has “outgrown” the GoPiGo - and current Raspberry Pi versions draw considerably more power than the original model 1 and 2 the device was designed for.

As a consequence, the GoPiGo’s power circuits get overwhelmed and my theory is that the voltage droop can become non-trivial.

I am going to continue examining this issue and I will report findings and recommendations.



I tried this test with the Li-Ion battery pack and not only is the voltage dropping relatively quickly from fully charged, but there is about a 40 mV variation in the readings.

How do you build on such quicksand?

OK, things settled down a bit - and I ran some tests.

If I modify line 616 to read return ((value / 1000.0) + 0.51), I get battery values that are consistent with the measured input voltage.

I also changed my measurement resolution from three decimal digits to two - the last digit varies too much to be relevant to accuracy.

Viz.:  (with a measured voltage of 12.13v set into the program)

Test Cycle 1 of 4 (which consists of 10 tests every 0.5 seconds)
The Average Differential for this test cycle was 0.0

The Average voltage for this test cycle was 12.13

Test Cycle 2 of 4 (which consists of 10 tests every 0.5 seconds)
The Average Differential for this test cycle was 0.0

The Average voltage for this test cycle was 12.13

Test Cycle 3 of 4 (which consists of 10 tests every 0.5 seconds)
The Average Differential for this test cycle was 0.0

The Average voltage for this test cycle was 12.13

Test Cycle 4 of 4 (which consists of 10 tests every 0.5 seconds)
The Average Differential for this test cycle was 0.0

The Average voltage for this test cycle was 12.13

The cumulative average of all the test cycles was 0

Anyone else that wants to try this can, and results would be appreciated.

Would you like to try this on Carl/Dave to see how close to the actual voltage measured at the barrel connector you get?

  1. I found: current_Vbatt = egpg.volt()+0.81 to give the most accurate result for both Carl and Dave "at the battery"

  2. I pass the EasyGoPiGo3 object to a helper function:

from easygopigo3 import EasyGoPiGo3

REV_PROTECT_DIODE = 0.81    # The GoPiGo3 has a reverse polarity protection diode drop of 0.6v to 0.8v (n=2)
                            # This value results in average readings vs battery voltage error of +/- 0.03

def vBatt_vReading(egpg):
    vReading = egpg.volt()
    vBatt = vReading + REV_PROTECT_DIODE
    return vBatt, vReading

def voltages_string(egpg):
    vBatt, vReading = vBatt_vReading(egpg)
    return "Current Battery {:.2f}v EasyGoPiGo3 Reading {:.2f}v".format(vBatt, vReading)


import sys
import battery
import lifeLog
from easygopigo3 import EasyGoPiGo3

egpg = EasyGoPiGo3(use_mutex=True)
current_vBatt, current_vReading = battery.vBatt_vReading(egpg)
# OR
current_vBatt = battery.vBatt_vReading(egpg)[0]
# OR for convenient printing and logging:
data = battery.voltages_string(egpg)
print(data)   # example adding a timestamped voltages string to life.log

pi@Carl:~/Carl $ plib/
Current Battery 11.60v EasyGoPiGo3 Reading 10.79v

  1. I’m sorry, but measuring at Carl and Dave’s GoPiGo3 barrel connector is not easily accomplished.

“not easily accomplished”?  <== masterpiece of understatement!

In all seriousness, what I need is both a “Charlie” and a “Charlene”, one for the hard-core hardware lifting and one to do software development on that doesn’t depend on hardware mods.  Or, one that I can migrate successful hardware mods to. . .

Every time I flip Charlie’s red-board over with a soldering iron in my hand, I imagine you cringing while I do it!  (:wink:)

IMHO, (from a hardware perspective), the canonical voltage measurement is the voltage at the barrel connector, (v_batt), or the “12” rail after processing, (VCC), as those are the voltages with the most impact.

In fact, (IMHO again), there should be the ability to take both measurements because the current drain can cause the Vdrop across the active power processing components in the power switching section to become non-trivial.

Additionally, (again IMHO), the Raspberry Pi is beginning to “outgrow” the power capabilities of the GoPiGo’s power provisioning.  What was suitable for a Pi or Pi-2 rapidly becomes insufficient for the later version Pi-3’s or a Pi-4.

As I see it, there are two possible upgrade paths:

  1. Upgrade the power-handling components on the GoPiGo chassis itself.
  2. Provide a method where the existing power handling components can signal off-board components that can handle more power. This way the on-board power switching can provide “enable” signalling to the off-board circuitry.

For example, I have an XL4005-based DC-DC buck converter that can handle a fairly large handful of amps - and I am planning, (eventually), to tie this from the 12v power rail at the barrel plug and bring it to the “USB-C” connector on the Pi to provide a 5v “boost” to help prevent power droop when things are running, (like the servos or a SSD card).

The big issue is that I cannot signal the buck converter to “turn off” when the GoPiGo turns off the robot.  However, I can isolate the enable pin on the controller IC of the buck converter and tie that to something like VCC or the “power_on” line to allow automatic switching.

The really best solution would be a drop-in replacement for the 12v switching MOSFET and the 5v buck converter IC, that will allow the GoPiGo to handle the power requirements of more modern Raspberry Pi’s.  And yes, I firmly expect that there will be a “Pi-5” out there that will want even MORE power and people will want to plug it into their robots or Grove boards. . . (:astonished:)


You’ve mentioned these “helper functions” in the past.  I am going to have to study this in greater detail to understand it/them better.




I have added the following to “my” version of the library file:

def get_voltage_vcc(self):
        """
        Get the VCC rail voltage after the big MOSFET switch and the protection diode

        Returns the VCC voltage in volts
        """
        value = self.spi_read_16(self.SPI_MESSAGE_TYPE.GET_VOLTAGE_VCC)
        return (value / 1000.0)