Seeed Luminance Sensor

Hi,

I’m interested in using the Luminance Sensor from Seeed. Unfortunately, there is no Python example for it available on GitHub.

As I’m a newbie at coding for the GrovePi, I wonder if there is a how-to, or anyone with experience, on how to get the lux value in Python from this sensor when it is connected to the GrovePi.

Any feedback or help is appreciated.

Thanks,
Claudia

Hi,

Any idea? :slight_smile:

Thanks,
Claudia


(You said "any" - I don’t know anything about the GrovePi or that sensor - so take this FWIW.)

  1. Run ~/Dexter/GrovePi/Software/Python/grove_read_analog.py to get the A2D reading.
  2. Convert the reading to Vout (find it in the GrovePi documentation? Vout = (reading * 5.0) / 1024 ?).
    The Seeed example seems to suggest you use Vout = (reading * 3.0) / 1023,
    no matter what Vref the GrovePi uses ??
    (A sketch of steps 1 and 2 follows this list.)
  3. Interpolate using the Vout and Lux arrays from the Seeed example:

VoutArray = (0.0011498, 0.0033908, 0.011498, 0.041803, 0.15199, 0.53367, 1.3689, 1.9068, 2.3)
LuxArray = (1.0108, 3.1201, 9.8051, 27.43, 69.545, 232.67, 645.11, 873.52, 1000)
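
For steps 1 and 2 in code form, here is a minimal sketch using the grovepi Python library’s analogRead. The port number and the 5.0 V / 1024 scaling are assumptions; verify them before trusting the numbers:

# Minimal sketch: read the sensor's raw analog value on a GrovePi and
# convert it to Vout. The port (A2) and the 5.0 V / 1024 ADC scaling are
# assumptions; swap in (reading * 3.0) / 1023 if the Seeed scaling applies.
import grovepi

SENSOR_PIN = 2                             # Grove analog port A2 (assumption)
grovepi.pinMode(SENSOR_PIN, "INPUT")

reading = grovepi.analogRead(SENSOR_PIN)   # raw 10-bit value, 0..1023
vout = (reading * 5.0) / 1024              # or (reading * 3.0) / 1023 per the Seeed example
print("reading:", reading, "Vout:", vout)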

(There is a scipy routine for interpolation; a sketch using it follows the code below.) Here is how I brute-forced an interpolation (the same linear algorithm as the FMultiMap function in the Seeed example; the data looks logarithmic, so linear interpolation is not the most accurate possible ...):


# #########
# INTERPOLATED ARRAY OBJECT
# (instead of polynomial estimation)
#
# http://www.zovirl.com/2008/11/04/interpolated-lookup-tables-in-python/
class InterpolatedArray(object):

  """An array-like object that provides
  interpolated values between set points."""

  def __init__(self, points):
    self.points = sorted(points)

  def __getitem__(self, x):
    if x < self.points[0][0] or x > self.points[-1][0]:
      raise ValueError("x is outside the range of the set points")
    lower_point, upper_point = self._GetBoundingPoints(x)
    return self._Interpolate(x, lower_point, upper_point)

  def _GetBoundingPoints(self, x):
    """Get the lower/upper points that bound x."""
    lower_point = None
    upper_point = self.points[0]
    for point in self.points[1:]:
      lower_point = upper_point
      upper_point = point
      if x <= upper_point[0]:
        break
    return lower_point, upper_point

  def _Interpolate(self, x, lower_point, upper_point):
    """Interpolate a Y value for x given lower & upper
    bounding points."""
    slope = (float(upper_point[1] - lower_point[1]) /
             (upper_point[0] - lower_point[0]))
    return lower_point[1] + (slope * (x - lower_point[0]))
# You use it like this:
# points = ((1, 0), (5, 10), (10, 0))
# table = InterpolatedArray(points)
# print(table[3.2])  # returns 5.5
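
For the scipy route mentioned above, here is a minimal sketch built on scipy.interpolate.interp1d with the Seeed arrays; the lookup value of 1.0 V is just an example input, not a measured reading:

# Minimal sketch of the scipy alternative: build a linear interpolator
# over the Seeed Vout/Lux arrays and look up lux for a given Vout.
# Note: interp1d raises ValueError for inputs outside the array range.
from scipy.interpolate import interp1d

VoutArray = (0.0011498, 0.0033908, 0.011498, 0.041803, 0.15199, 0.53367, 1.3689, 1.9068, 2.3)
LuxArray = (1.0108, 3.1201, 9.8051, 27.43, 69.545, 232.67, 645.11, 873.52, 1000)

lux_from_vout = interp1d(VoutArray, LuxArray)   # linear by default
print(float(lux_from_vout(1.0)))                # lux estimate for Vout = 1.0 V

The InterpolatedArray class above does the same job: build it with table = InterpolatedArray(tuple(zip(VoutArray, LuxArray))), then look up table[1.0].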