I’m trying to better understand the difference between the battery voltage measured at the battery pack (say 10.53v at one moment) and the value reported by the ADC battery voltage measurement routine (9.89v at that same moment). That works out to a drop of roughly 0.64v.
There is a Schottky diode and a DMP3017SFG power switch in the path, which I’m guessing are the culprits. I’ve attempted to read the power switch’s data sheet, but I can’t understand enough of it to get the Vdrop math to match.
Googling Schottky diode voltage drop seems to point to about a 0.3v drop there, and the DMP3017SFG data sheet seems to read as a 0.7v drop, so I would expect to be seeing a 1.0 volt difference between the battery and the ADC input, not 0.64v.
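Just to lay the arithmetic out plainly, here is the discrepancy I’m puzzling over, using the numbers from above (the 0.3v and 0.7v figures are my readings of the data sheets, so treat them as assumptions):

```python
# Sanity check: observed vs. expected drop between the pack and the ADC input.
v_batt = 10.53         # measured at the battery pack
v_adc_reported = 9.89  # from the ADC battery voltage routine
v_schottky = 0.3       # assumed typical Schottky forward drop
v_fet = 0.7            # assumed drop from my reading of the DMP3017SFG sheet

observed_drop = v_batt - v_adc_reported   # ~0.64 V
expected_drop = v_schottky + v_fet        # 1.0 V
print(f"observed: {observed_drop:.2f} V, expected: {expected_drop:.2f} V")
```

So I’m about 0.36v short of the drop the data sheets led me to expect.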
(For anyone trying to follow along on this thread: The battery voltage tap goes through a 12.4k / 200k divider circuit to the 12-bit ADC, with the regulated 5 volt supply as the reference. The divider multiplies back up by (200k + 12.4k) / 12.4k, or about 17, so the battery voltage precision should be roughly 17 times 0.0012v/bit, or about 0.02v.)
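For the skeptical, the divider/resolution math above can be sketched like this. I’m assuming the 200k resistor is the upper leg (from the battery tap) and the 12.4k is the lower leg to ground, since that’s the only arrangement that makes the numbers come out near the reported readings:

```python
# Divider and ADC resolution math for the battery measurement channel.
R_TOP = 200_000   # assumed: 200k upper leg, battery tap side
R_BOT = 12_400    # assumed: 12.4k lower leg, to ground
V_REF = 5.0       # regulated 5 V ADC reference
ADC_BITS = 12

lsb = V_REF / (1 << ADC_BITS)        # volts per ADC count at the ADC pin
scale = (R_TOP + R_BOT) / R_BOT      # multiplier from pin voltage back to battery voltage
resolution = lsb * scale             # battery-side volts per ADC count
print(f"scale = {scale:.1f}x, resolution = {resolution * 1000:.1f} mV/count")
```

That gives a scale factor of about 17.1 and a battery-side resolution of about 21mV per count, so the ~0.64v discrepancy is way outside measurement granularity.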
Sorry to be so pedantic - it is a curse. I know being 0.4v to the good should be a “count your blessings, and shut up” situation, but I really would love to understand this.
Alan - the “cyclical obsessive”