GoPiGo3 Occupancy Grid Project Update

One of the first things folks with ROS robots will “show off” is an occupancy grid or LIDAR map of the room. Since Carl doesn’t have LIDAR and doesn’t run ROS, I had put occupancy grids in the “too hard”, or at the very least “not yet”, category.

That was until robotshop tweeted the work of BigFace83 and his Big-Wheel-Bot project.

He implemented a sensor-data recording program that stores left and right encoder counts, IMU heading, forward- and rear-facing ultrasonic readings, and a pair of ±45-degree off-heading IR sensor readings five times a second to a Data.txt file. It also captures a synchronized 320x240 Motion-JPEG video into an .avi container that can be viewed with VLC on the Raspberry Pi desktop (and can also be read back in later and annotated with objects, ranges, headings, etc.).

So basically four programs:

  • Remote Control Driving
  • Data Gatherer - Python/OpenCV; outputs Data.txt and <datetime>.avi (see the logging sketch after this list)
  • Path Plotter - Python/OpenCV; reads Data.txt and plots the path from encoders and IMU
  • Occupancy Grid Plotter - Python/OpenCV; reads Data.txt and uses sensor probability functions to build the occupancy grid in an OpenCV image array.
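As promised above, here's a minimal sketch of what the Data Gatherer's logging loop might look like. The reader function and the field order are hypothetical stand-ins, not BigFace83's actual format; only the 5 Hz rate and the Data.txt output come from his description:

```python
#!/usr/bin/env python3
# Minimal sketch of a 5 Hz sensor logger writing one record per line to Data.txt.
# read_sensors() is a hypothetical stand-in for the real GoPiGo3/DI sensor calls;
# the field order is an assumption, not BigFace83's format.
import time

RECORD_HZ = 5                 # five records per second, matching the Big-Wheel-Bot logger
PERIOD = 1.0 / RECORD_HZ

def read_sensors():
    # Replace with real calls: wheel encoders, IMU heading, distance sensor, etc.
    return {
        "left_enc": 0, "right_enc": 0,   # encoder readings
        "heading": 0.0,                  # IMU heading, degrees
        "range_mm": 0,                   # ToF distance reading
    }

with open("Data.txt", "w") as log:
    next_tick = time.monotonic()
    while True:                          # stop with Ctrl-C or a run-length check
        s = read_sensors()
        log.write(f"{time.time():.3f},{s['left_enc']},{s['right_enc']},"
                  f"{s['heading']:.1f},{s['range_mm']}\n")
        next_tick += PERIOD
        time.sleep(max(0.0, next_tick - time.monotonic()))
```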

Being crazy obsessed with Carl as I am, I decided this looked like something I could reproduce for my GoPiGo3 robot Carl, with his panning DI ToF Distance Sensor, DI IMU, GoPiGo3 wheel encoders, and PiCamera.

Getting the Data.txt file going was smooth sailing, but that synchronized video really frustrated me this week. After hours and hours of searching the net, trying every possible tweak from the various fragments I found, I thought “this one has beaten me.”

Now, having had a breakthrough by switching image height and image width in one line, I can hardly contain my happiness.
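For anyone fighting the same demon: cv2.VideoWriter wants its frame size as (width, height), while a numpy frame's shape is (height, width, channels). Swap them and OpenCV silently writes a broken .avi. A minimal sketch of the fix (the filename, FPS, frame count, and camera index are placeholders, not my exact code):

```python
import cv2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()                # frame.shape is (height, width, channels)
h, w = frame.shape[:2]

# VideoWriter's frameSize argument is (width, height) -- the REVERSE of
# numpy's shape order.  Passing (h, w) "succeeds" silently but produces
# an empty or unplayable .avi; (w, h) is the fix.
fourcc = cv2.VideoWriter_fourcc(*"MJPG")
writer = cv2.VideoWriter("capture.avi", fourcc, 5.0, (w, h))

for _ in range(50):                   # grab a few seconds of frames
    if not ok:
        break
    writer.write(frame)
    ok, frame = cap.read()

writer.release()
cap.release()
```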

So I finish this week with the data and video gathering program done. Next week: the autonomous explorer program, and hopefully the path display program.

His path display:

His Occupancy Grid display:

Carl is charging up to be ready to explore:


I’m truly impressed! It’s so cool to see what you can get this little robot to do!


Carl has learned to collect the data needed for his occupancy grid investigation:

2 minute video showing a data collection:

Next step - path analysis and display

Getting close: IMU estimated path in red, wheel encoder estimated path in blue:
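For anyone wanting to try this, here's roughly how a path like the red one can be dead-reckoned from Data.txt (the blue one is the same computation with a heading integrated from the encoder difference instead of the IMU). This is a simplified sketch, not my exact plotting code; the record format and the y-down image coordinates are assumptions:

```python
import math

WHEEL_DIA_MM = 66.5          # GoPiGo3 wheel diameter (approximate)
TICKS_PER_REV = 360          # GoPiGo3 encoders report in degrees, so 360 "ticks"/rev
MM_PER_TICK = math.pi * WHEEL_DIA_MM / TICKS_PER_REV

def imu_path(records, start=(200.0, 300.0)):
    """Dead-reckon (x, y) in cm from encoder distance + IMU heading.
    records: iterable of (left_ticks, right_ticks, heading_deg) tuples --
    an assumed simplification of the real Data.txt fields."""
    x, y = start
    path = [(x, y)]
    prev_l = prev_r = None
    for l, r, heading in records:
        if prev_l is not None:
            dist_mm = MM_PER_TICK * ((l - prev_l) + (r - prev_r)) / 2.0
            a = math.radians(heading)
            x += dist_mm / 10.0 * math.sin(a)
            y -= dist_mm / 10.0 * math.cos(a)   # y grows downward in the image
            path.append((x, y))
        prev_l, prev_r = l, r
    return path
```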

and another path, but an early [Errno 121] I2C transfer: Remote I/O error and a throttled=0x20000 meant no range data from the DI Distance Sensor in this 14-minute run :cry:
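For reference, that throttled value can be decoded from vcgencmd get_throttled; per the Raspberry Pi firmware documentation, 0x20000 is bit 17, “arm frequency capping has occurred.” A minimal decoder sketch:

```python
import subprocess

# "has occurred" bits of `vcgencmd get_throttled` (Raspberry Pi firmware docs)
OCCURRED_BITS = {
    1 << 16: "under-voltage has occurred",
    1 << 17: "arm frequency capping has occurred",
    1 << 18: "throttling has occurred",
    1 << 19: "soft temperature limit has occurred",
}

raw = subprocess.check_output(["vcgencmd", "get_throttled"]).decode()
value = int(raw.strip().split("=")[1], 16)   # e.g. "throttled=0x20000" -> 0x20000
for bit, meaning in OCCURRED_BITS.items():
    if value & bit:
        print(meaning)                       # 0x20000 -> "arm frequency capping has occurred"
```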

The IMU path shows Carl returning exactly to the starting point, which he did.

Another day, another data collection, more testing.

  • This 12 min 30 second run only had four I2C errors, which were recoverable somehow.
  • The processor temp only got up to 62 degC, so no throttling.
  • Total travel of this collection was 7076 mm
  • I drove Carl back to approximately [205,297] on the path plot (2" right and 1" forward of the starting point)
  • Ending position estimates: Note [0,0] is the upper-left corner; the starting position was [x:200, y:300] cm
    • imu ended at (x:197, y:322) cm, for an error of 262 mm over 7076 mm of travel = 3.7% error
      Not quite sure what to call this error - position error? (distance between actual and estimated ending positions) / (total distance traveled) - see the sketch after this list
    • encoders ended at (x:222, y:287) cm, for an error of 197 mm = 2.8% of the total travel
  • Ending heading estimates (Carl was at roughly the same heading at start and end):
    • imu began with heading 358.9 and ended with heading 358.8 (0.1 degree off - WOW!)
    • encoders “lost” 7.3 degrees, ending at 352.7 degrees (started at 0.0)
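To make that error definition concrete, here is the arithmetic as a tiny Python check (math.dist needs Python 3.8+); the coordinates and travel distance are the ones from the list above:

```python
import math

def pct_error(actual_cm, estimate_cm, travel_mm):
    """Ending-position error as distance between actual and estimated
    positions, expressed as a percentage of total distance traveled."""
    err_mm = 10.0 * math.dist(actual_cm, estimate_cm)   # cm -> mm
    return err_mm, 100.0 * err_mm / travel_mm

print(pct_error((205, 297), (197, 322), 7076))   # IMU:      (~262 mm, ~3.7%)
print(pct_error((205, 297), (222, 287), 7076))   # encoders: (~197 mm, ~2.8%)
```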

Here’s the annotated plot, proc temp, and I2C errors:

Wow, the reality of uncertainty in my robot’s understanding of reality is confounding me, now that I see a log-probability map for the 25-degree beam width of the VL53L0X Time-of-Flight ranging sensor.

It is not really an obstacle detector - it is a “no obstacle” detector!
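To illustrate what that means for the grid update, here is a deliberately crude log-odds sketch - not BigFace83's method, and the increments, cell size, and grid layout are my assumptions. Every cell inside the 25-degree cone short of the measured range gets a “no obstacle” (free) update, while only a small region at the measured range gets an “occupied” bump:

```python
import numpy as np

L_FREE, L_OCC = -0.4, 0.9        # log-odds increments (tuning assumptions)
BEAM_HALF_DEG = 12.5             # half of the VL53L0X's ~25 degree field of view

def update_beam(logodds, pose, heading_deg, range_cm, cell_cm=2.0):
    """Carve free space through the beam cone and mark the far end occupied.
    logodds: 2D numpy array of cell log-odds; pose: (x, y) in cm, y-down.
    A crude ray-per-degree sweep for illustration only."""
    x0, y0 = pose
    for off in np.arange(-BEAM_HALF_DEG, BEAM_HALF_DEG + 1, 1.0):
        a = np.radians(heading_deg + off)
        for r in np.arange(0.0, range_cm, cell_cm):
            col = int((x0 + r * np.sin(a)) / cell_cm)
            row = int((y0 - r * np.cos(a)) / cell_cm)
            if 0 <= row < logodds.shape[0] and 0 <= col < logodds.shape[1]:
                logodds[row, col] += L_FREE   # short of the return: "no obstacle"
    # a single occupied bump at the measured range, along the beam center only
    a = np.radians(heading_deg)
    col = int((x0 + range_cm * np.sin(a)) / cell_cm)
    row = int((y0 - range_cm * np.cos(a)) / cell_cm)
    if 0 <= row < logodds.shape[0] and 0 <= col < logodds.shape[1]:
        logodds[row, col] += L_OCC
```

The asymmetry is the whole point: one reading confidently clears a big cone of free space but only weakly pins down where, within that wide beam, the obstacle actually sits.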