 # [SOLVED] GoPiGo3.volt() and .get_battery_voltage() vs Battery Voltage

I’m trying to better understand the difference between the battery voltage measured at the battery pack (say 10.53v at one moment) and the output from the ADC battery voltage measurement routine (9.89 volts at that moment). It appears to be a drop of roughly 0.6 volts.

There is a Schottky diode, and the DMP3017SFG power switch, in the path, which I’m guessing are the culprits. I’ve attempted to read the power switch data sheet, but I can’t understand it well enough to get the Vdrop math to match.

Googling Schottky diode voltage drop seems to point to a 0.3v drop there, and the DMP3017SFG data sheet seems to indicate a 0.7v drop there, so I would expect to see a 1.0 volt difference between the battery and the ADC input.

(For anyone trying to follow along on this thread: The battery voltage tap goes through a 12.4k / 200k divider circuit to the 12-bit ADC with the regulated 5 volt supply as the reference, so the battery voltage precision should be (212.4/12.4) ≈ 17 times 0.0012v/bit, or roughly 0.02v.)
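Here’s a quick Python sketch of that precision estimate, using the values assumed in the post (the 5 V reference turns out to be wrong for the GPG3, as the answer below explains, but the arithmetic is the same):

```python
# Precision estimate for the battery-voltage ADC path (values assumed above):
# 12.4k/200k divider, 12-bit ADC, 5 V reference.
R_BOTTOM = 12.4e3   # ohms, lower leg of the divider (tapped by the ADC)
R_TOP = 200e3       # ohms, upper leg of the divider
V_REF = 5.0         # volts, assumed ADC reference in the post
ADC_BITS = 12

scale = (R_TOP + R_BOTTOM) / R_BOTTOM   # battery volts per divider volt, ~17.1
lsb = V_REF / (2 ** ADC_BITS)           # one ADC count, ~1.22 mV
precision = scale * lsb                 # battery-voltage resolution per count
print(f"divider scale: {scale:.2f}, precision: {precision * 1000:.1f} mV/bit")
# prints: divider scale: 17.13, precision: 20.9 mV/bit
```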

Sorry to be so pedantic - it is a curse. I know 0.4v to the good should be a “count your blessings, and shut up”, but I really would love to understand this.

Alan - the “cyclical obsessive”

For the following, I’ll assume a supply voltage of 12v, a load of approximately 1A, and an operating temperature of about 25C. I won’t factor in the small resistance (and tiny voltage drop) of the cable between the battery pack and the GPG3, or of the PCB traces on the GPG3, as they would be negligible even at the maximum current draw of the GPG3.

The RB080L-30TE25 Schottky diode has a forward voltage of about 0.36v. The DMP3017SFG power MOSFET has a typical on resistance of about 10mOhms (0.01v at 1A). Together the diode and MOSFET have a theoretical voltage drop of approximately 0.37v. The “battery” (rail) voltage would be 11.63v.
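That series drop is simple enough to check in a couple of lines (the 12 V supply and 1 A load are the assumptions stated above; the diode and MOSFET figures are the typical datasheet values just cited):

```python
# Rough check of the series drop between the battery pack and the rail.
V_SUPPLY = 12.0   # volts, assumed battery-pack voltage
V_DIODE = 0.36    # volts, RB080L-30TE25 typical forward drop at ~1 A
R_DS_ON = 0.010   # ohms, DMP3017SFG typical on-resistance
I_LOAD = 1.0      # amps, assumed load

v_fet = I_LOAD * R_DS_ON              # ~0.01 V across the MOSFET
v_rail = V_SUPPLY - V_DIODE - v_fet   # voltage seen by the divider
print(f"rail voltage: {v_rail:.2f} V")
# prints: rail voltage: 11.63 V
```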

The voltage divider is a 200k/12.4k pair made with 1% tolerance resistors. With the 11.63v rail as input, that’s an output voltage of 0.666v to 0.692v (nominally 0.679v).
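The worst-case ends of that range come from stacking the 1% tolerances in opposite directions, as this sketch shows:

```python
# Worst-case divider output for the 11.63 V rail, with 1% resistors.
V_RAIL = 11.63
TOL = 0.01
r_bot_lo, r_bot_hi = 12.4e3 * (1 - TOL), 12.4e3 * (1 + TOL)
r_top_lo, r_top_hi = 200e3 * (1 - TOL), 200e3 * (1 + TOL)

v_lo = V_RAIL * r_bot_lo / (r_bot_lo + r_top_hi)   # bottom leg low, top leg high
v_hi = V_RAIL * r_bot_hi / (r_bot_hi + r_top_lo)   # bottom leg high, top leg low
print(f"{v_lo:.3f} V to {v_hi:.3f} V")
# prints: 0.666 V to 0.692 V
```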

The ADC is 12-bit with a reference of 2.007v to 2.089v (2.048v ± 0.041v, i.e. ± 2%). With an input of 0.666v to 0.692v, that’s a raw output of 1305 to 1412.
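The raw range stacks the divider extremes against the opposite reference extremes (lowest input with highest reference, and vice versa):

```python
# Raw ADC output range, combining divider and reference tolerances.
ADC_COUNTS = 4096                     # 12-bit ADC
v_div_lo, v_div_hi = 0.666, 0.692     # divider output range from above
v_ref_lo, v_ref_hi = 2.007, 2.089     # 2.048 V reference, +/- 2%

raw_lo = int(v_div_lo / v_ref_hi * ADC_COUNTS)   # lowest input, highest reference
raw_hi = int(v_div_hi / v_ref_lo * ADC_COUNTS)   # highest input, lowest reference
print(raw_lo, raw_hi)
# prints: 1305 1412
```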

Assuming I got all my math right, taking that raw output and converting it back to a battery voltage (with the same equation the firmware uses) gives 11.179v to 12.096v. Realistically I doubt all the tolerances will stack to one extreme, so this is a sort of worst-case scenario range.
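The back-conversion can be sketched like this, using nominal constants (the exact equation is the firmware’s; this is my reconstruction from the divider and reference values above, so the last digit differs slightly from the figures quoted due to intermediate rounding):

```python
# Firmware-style conversion from raw ADC counts back to battery voltage,
# using nominal constants only.
V_REF = 2.048
ADC_COUNTS = 4096
DIVIDER = (200e3 + 12.4e3) / 12.4e3   # ~17.13, undoes the divider

def raw_to_volts(raw):
    return raw / ADC_COUNTS * V_REF * DIVIDER

print(f"{raw_to_volts(1305):.3f} V to {raw_to_volts(1412):.3f} V")
# prints: 11.177 V to 12.093 V
```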

Running the numbers on an actual battery voltage of 10.53v, the ADC input should be 0.582v to 0.604v, the raw ADC value should be 1141 to 1232, and the battery voltage read should be 9.774v to 10.554v (nominally 10.16v, the rail voltage after the series drop). The 9.89 volts you are reading is well within the tolerance range.
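Putting the whole chain together in one sketch, with every constant at its nominal value (the series drop, divider, and reference are the figures above; this is a model of the measurement path, not the actual firmware code):

```python
# Simulate the measurement chain for a given pack voltage, then invert it
# the way the firmware does (nominal constants only, no tolerances).
def battery_read(v_battery, v_drop=0.37, r_bot=12.4e3, r_top=200e3,
                 v_ref=2.048, counts=4096):
    v_rail = v_battery - v_drop                # after diode + MOSFET drop
    v_adc = v_rail * r_bot / (r_top + r_bot)   # divider output at the ADC pin
    raw = int(v_adc / v_ref * counts)          # 12-bit conversion
    # Back-calculation with the same nominal constants:
    return raw / counts * v_ref * (r_top + r_bot) / r_bot

print(f"{battery_read(10.53):.2f} V")
# prints: 10.16 V
```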

With the GPG3 HW setup, the voltage precision is about 8.5mv (about 0.008566v per ADC count), and, as explained above (after the voltage drop from the diode and MOSFET), the accuracy is about ± 4% (resistor tolerance plus ADC reference tolerance).
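That precision figure is just one ADC count scaled back up by the divider:

```python
# Battery-voltage resolution of the GPG3 path: one ADC count, scaled by the divider.
V_REF = 2.048
ADC_COUNTS = 4096
DIVIDER = (200e3 + 12.4e3) / 12.4e3   # ~17.13

precision = V_REF / ADC_COUNTS * DIVIDER
print(f"{precision * 1000:.2f} mV")
# prints: 8.56 mV
```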

Superb! Thank you. Noteworthy! Love the detail and specificity. Thank you for including the tolerances.

Turning the continuous real world into the choppy discrete values of GoPiGo’s digital “brain” is quite involved. You explained it very clearly.


This topic was automatically closed after 3 days. New replies are no longer allowed.