If you’re doing this as a fun exercise, for the S’s and G’s so to speak, that’s one thing; we all go “lobster stubborn” on projects from time to time.
In that case, enjoy learning about the finer details of process monitoring.
My concern is that this may be getting out of hand, with you rapidly approaching the point of diminishing returns, not to mention excessive risk and expense.
Maybe I’m missing something here, but it seems that this shouldn’t be this difficult.
Most of these sensors should be, more or less, plug-and-play: plug 'em in, perhaps do a bit of scaling in code to get the precision you need (see the sketch below), and away you go!
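For what it’s worth, the “bit of scaling” usually amounts to no more than this. A minimal sketch; the pin, reference voltage, and scale factor below are stand-ins for whatever your sensor’s datasheet actually specifies:

```cpp
// Scale a raw ADC reading into engineering units.
// SENSOR_PIN, VREF, and the 100.0 factor are placeholders --
// substitute your sensor's actual pin and transfer function.
const int   SENSOR_PIN = A0;     // hypothetical analog input
const float VREF       = 5.0;    // ADC reference voltage, in volts
const float COUNTS     = 1023.0; // full scale for a 10-bit ADC

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSOR_PIN);      // 0..1023 counts
  float volts = raw * (VREF / COUNTS);   // counts -> volts
  float reading = volts * 100.0;         // volts -> units, per datasheet
  Serial.println(reading);
  delay(500);
}
```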
I fear that you may be wandering down that old bogus path, ultimately wasting time on something foredoomed to ignominious failure.
Another thought:
What is the spec’d voltage for this beastie, 3.3 or 5 V DC? If it’s really-and-truly designed for +5 V, 3.3 V may be riding the ragged edge of inoperability. Sensors, especially sensors with on-board A/D converters, can be extremely sensitive to their power supplies, doing bizarre and strange things as VCC drops.
Professional sensors go to great lengths to make sure that voltages like VCC and Vref (usually derived from VCC) are rock-solid, with generous filtering and decoupling. And for good reason: with a 10-bit converter and a 5 V reference, one count is only about 4.9 mV, so even 50 mV of ripple on Vref can wobble a reading by ten counts.
Even if the datasheet says 3.3 V “should work”, if it wants (or can use) +5 V, you might want to go with the higher voltage - if for no other reason than increased noise immunity and stability.
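If you want to see how solid your supply really is, ATmega328-class boards (Uno, Nano, and friends) can measure their own VCC against the internal ~1.1 V bandgap. A rough sketch of that old trick; the bandgap’s tolerance is loose, so treat the result as a sanity check rather than a calibrated measurement:

```cpp
// Rough VCC self-check for ATmega328-class boards:
// measure the internal ~1.1 V bandgap against AVcc and back out VCC.
long readVccMillivolts() {
  // Select AVcc as reference, internal 1.1 V bandgap as input (MUX = 1110)
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                         // let the reference settle
  ADCSRA |= _BV(ADSC);              // start a conversion
  while (bit_is_set(ADCSRA, ADSC))  // wait for it to finish
    ;
  long counts = ADCL;               // must read ADCL before ADCH
  counts |= (long)ADCH << 8;
  return 1125300L / counts;         // 1.1 V * 1023 * 1000, result in mV
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.print("VCC ~ ");
  Serial.print(readVccMillivolts());
  Serial.println(" mV");
  delay(1000);
}
```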
You might want to try running it at +5 V with a level-shifting I2C interface between it and the board. I believe SparkFun has something that might work and plugs right in; absent that, DigiKey or Mouser might be a good bet.
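Once it’s wired up, a quick bus scan will tell you whether the sensor still answers at +5 V through the shifter. A minimal sketch, assuming the standard Arduino Wire library:

```cpp
#include <Wire.h>

// Minimal I2C bus scan: prints the address of every device that ACKs.
void setup() {
  Wire.begin();
  Serial.begin(9600);
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {  // 0 = device ACKed
      Serial.print("Device found at 0x");
      Serial.println(addr, HEX);
    }
  }
}

void loop() {}
```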
What say ye?