May Be In Love With The $4 Grove Ultrasonic Ranger

What I see from your result is that single readings will be +/- 14mm, or +/- 2.2%, which is better than the device spec and nearly twice as good as Carl or Dave’s IR sensor.

Additionally, if all Charlie is worried about is bumping into something, stopping at 200mm +/- 4mm is certainly tolerable.

Here is where it might really hurt: using the IR distance sensor to measure the robot’s angle to a wall up ahead from 45-left, 90-ahead, and 45-right measurements.

But then again, we would have to repeat the test with a 45 degree incident target to know what the error is for both sensors with a target that is not exactly normal to the sensor.

2 Likes

Which begged the question: what would it look like with the ultrasonic on Dave?

Distance Readings: 299
Average Reading: 802 mm
Minimum Reading: 800 mm  Min Error: -0.21%
Maximum Reading: 803 mm  Max Error: 0.16%
Std Dev Reading: 0 mm  StdDevError: 0.06
Three SD as a percent of reading:  0.2 %
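For anyone wanting to reproduce printouts like the one above, here is a minimal Python sketch of the same statistics. The reading list is simulated; on the bot these would come from the sensor loop:

```python
import statistics

def summarize(readings, reference=None):
    """Summarize a burst of distance readings (mm).
    With no known reference distance, the mean of the readings
    serves as the reference, as in the stationary test."""
    avg = statistics.mean(readings)
    ref = reference if reference is not None else avg
    mn, mx = min(readings), max(readings)
    sd = statistics.pstdev(readings)
    return {
        "count": len(readings),
        "average_mm": avg,
        "min_mm": mn,
        "min_error_pct": 100.0 * (mn - ref) / ref,
        "max_mm": mx,
        "max_error_pct": 100.0 * (mx - ref) / ref,
        "std_dev_mm": sd,
        "three_sd_pct": 100.0 * 3.0 * sd / ref,
    }

# Simulated burst of readings around 802 mm
stats = summarize([800, 801, 802, 802, 803, 802, 801, 803, 802, 802])
print(stats)
```

For the stationary test the mean itself is the reference, so the min/max errors and the three-SD figure all come out relative to it, just as in the printouts.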

2 Likes

Wow, it would be hard to keep the data straight for that testing: single readings vs. 3-to-5-reading bursts as the bot or target changes known positions, to estimate the error rate on a moving bot.

Can’t use the encoders to adjust the reference distance because they have an error rate as well.

I guess I’d have to

  • set target at a known distance, input to test
  • run the single and burst measurements
  • move the target 100mm, input to test
  • run the measurements again
  • repeat for a total of five to ten distances

The problem with that is having to measure the reference distances to millimeter accuracy, whereas with the stationary test, the average of a bunch of readings serves as the reference.

I remember doing an “error rate vs. number of averaged readings” test very early on to decide how many readings are needed for a trustworthy distance, and I settled on averaging 3 quick readings.
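The 3-reading burst is simple to sketch. `read_distance_mm()` below is a stand-in for the real sensor call, not the actual API, with made-up noise for illustration:

```python
import random

def read_distance_mm():
    # Stand-in for the real sensor call; noisy readings around
    # an assumed true distance of 600 mm, for illustration only.
    return 600.0 + random.gauss(0, 6)

def trusted_distance_mm(samples=3):
    """Average a short burst of readings to tame single-shot noise."""
    return sum(read_distance_mm() for _ in range(samples)) / samples

random.seed(0)
d = trusted_distance_mm()
print(round(d, 1))
```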

2 Likes

Which begs two additional questions:

  1. What was wrong with the way I modified your original program to take 300 samples?

  2. Do you really need sub-millimeter accuracy and incident angle?

AFAIAC, sub-millimeter accuracy is fine if you’re doing a land survey, analyzing a Raman spectrometer’s output, or examining pathogens under a microscope, but for a robot wandering around in a room, a rough approximation of where the walls are and their distances should be fine.  As you get closer to the wall, the resolution improves until you get “close enough” that the actual distance really matters, (within about 10-20 cm or so).

Your own eyes can’t measure distance or incident angle to the kinds of accuracy you’re demanding of Carl and Dave.  And you don’t really need it - you know when you’re “close enough” and precise incident angle isn’t really necessary.

You can get that kind of repeatable accuracy, but it’s going to really cost you - think millimeter-band radar ranging device.

The next question is:  Are you using the optical processing capabilities fully and effectively?  I’m no expert on this, but it seems you should be able to use your fancy-pants Kickstarter camera assembly to do more than you’re using it for.

What say ye?

2 Likes

OK, here’s data from Charlie for you.  I’ve limited the number of samples taken to 300 - it bails after that.

I can confirm one limitation:  Black surfaces.

When on my table, pointing at the front of my (black) printer, with a shiny surface at about 30-45 degrees, the readings are all over the place.

First test at 60 cm.

Distance Readings: 299
Average Reading: 623 mm
Minimum Reading: 601 mm  Min Error: -3.48%
Maximum Reading: 640 mm  Max Error: 2.79%
Std Dev Reading: 6 mm  StdDevError: 1.04
Three SD as a percent of reading:  3.1 %
The number of samples so far is: 300
300 samples have been collected.

Second series:

Distance Readings: 299
Average Reading: 623 mm
Minimum Reading: 601 mm  Min Error: -3.48%
Maximum Reading: 640 mm  Max Error: 2.79%
Std Dev Reading: 6 mm  StdDevError: 1.04
Three SD as a percent of reading:  3.1 %
The number of samples so far is: 300
300 samples have been collected.

What do you think?

My opinion:
It is perfectly fine for its intended purpose.

2 Likes

Wrong?? I don’t see anything wrong with the code you posted.

I am not seeking “sub-millimeter accuracy”.

My interest in knowing the distance measurement error effect on the incident angle computation comes from:

  1. Wall Follow Initiation Turn: What angle does bot need to turn to initiate wall following headed parallel to wall?
  2. Docking Approach Turn: What angle and drive distance is needed to arrive at the “docking approach point”?

Turns out I was wasting cycles: the angle error resulting from a +/- 3% distance measurement error will be roughly +/- 2 degrees, and the robot’s commanded turn angle accuracy is roughly the same.
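For what it’s worth, the +/- 2 degree figure checks out on paper. The sketch below uses one plausible way to estimate the wall angle from the 90-ahead and 45-right readings; the geometry is my assumption, not necessarily the formula on the bot:

```python
import math

def wall_angle_deg(d0_mm, d45_mm):
    """Estimate the wall's angle relative to the robot's heading
    from a straight-ahead reading (d0) and a reading taken 45
    degrees to the side (d45).  0 means the wall is square-on.
    Assumed geometry - illustrative only."""
    c = math.cos(math.radians(45))
    s = math.sin(math.radians(45))
    return math.degrees(math.atan2(d45_mm * c - d0_mm, d45_mm * s))

d0 = 600.0
d45 = d0 * math.sqrt(2)   # exact 45-degree reading for a square-on wall
print(round(wall_angle_deg(d0, d45), 6))   # 0.0 for a perfect measurement

# Perturb the 45-degree reading by +3% and look at the angle error
err = wall_angle_deg(d0, d45 * 1.03)
print(round(err, 2))
```

Perturbing one reading by 3% moves the estimate roughly 1.7 degrees off a square-on wall, in line with the +/- 2 degree figure.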

A real robot needs to live with error - error in measurements, error in command response, and errors in the programmer’s mind.

Well, Duh. I am a procrastinating perfectionist afraid of reality. It is a lot easier to distract myself by investigating VS Code native and remote, and then by investigating the IR sensor vs. the US sensor.

2 Likes

Before I “saw” that LIDAR was totally blind to my black UPS at any non-normal incidence angle, I really had no understanding of what Carl or Dave might be “seeing” with the IR TimeOfFlight Distance Sensor.

I had a faint concept of the distance sensor having a “beam width” and that obstacle size and shape could affect if the obstacle is detected, but I did not have a clue that color or incidence angle could affect the measurement.

2 Likes

The next next question would be:

  • Are you over-engineering this?
    (i.e. Are you making this more complicated than it needs to be?)

The reason I ask is that this is something I do too.  For example:

When working on my New Remote Camera Robot project, I needed to calculate the ratio of the wheel speeds when making a turn.  Obviously, the further the joystick is moved to the right or left, the sharper the turn will be.  So I decided to trot out some trig to determine the magnitudes of the vectors, and use those magnitudes and their signs to set the wheel rotational velocity in either direction, for both the inside and outside wheels.

Not only did I have to deal with left and right, but also forward and backward motion while making the turn.

And what happens when the joystick crosses the “X” axis in either direction?  I need to be careful there because if I am commanding full speed in one direction and suddenly change it to full speed in the other, I can damage the motors or burn up the “H”-bridge that drives them - so I have to compensate for that too.

I spent hours and hours, becoming days and days, working on refining the calculations to make them quick and easy, substituting addition and subtraction for multiplication and division - and using quick-and-dirty tricks to calculate sine, cosine and tangent, (along with the corresponding arc-functions).  I even enlisted the help of my brother who has a PhD in math to critique my techniques.

It was beautiful!  It was a mathematical dream come true!  And I still had this nagging feeling that I was missing something. . . .  I kept hearing this little voice:  “Inside every large program there is a small program struggling to get out.”

So, I looked again at the axis values - I even drew a circle-chart and plotted the X-Y magnitudes for various joystick positions - and then the light-bulb went on:  The best and fastest optimization for complex trig is no trig at all - in fact the fastest optimization is no math at all!  (Well, a tiny bit of math.)

I discovered that I could use the absolute position of the x and y axes to determine direction, velocity, and wheel ratio with the most trivial math possible.  My biggest challenges were calculating percentages and rounding to the correct precision.
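For the curious, the classic no-trig mixing looks something like this (the axis names and the -1..1 scaling are my assumptions, not the project’s actual code):

```python
def mix(x, y):
    """Differential-drive mixing with no trig: y sets the overall
    speed and sign (forward/backward), x shifts speed between the
    wheels to turn.  Joystick axes assumed normalized to -1..1."""
    left = y + x
    right = y - x
    # Scale back into -1..1 if the sum overflows
    m = max(1.0, abs(left), abs(right))
    return left / m, right / m

print(mix(0.0, 1.0))   # straight ahead: both wheels full forward
print(mix(1.0, 0.0))   # spin right in place
print(mix(0.5, 1.0))   # forward-right arc: outside wheel faster
```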

I showed this to my brother and he responded with a face-palm too.

======================

Maybe you need to step back and try to find a stupidly simple way to do what you want, even if it doesn’t have the decimal precision you desire?

What say ye?

2 Likes

Yes, but “stupidly simple” is simply not simple to my simple mind.

2 Likes

Here’s a simplification:

  1. Approach until you get “close enough”.
  2. Turn in the required direction until the distance becomes huge.  (i.e. No longer pointing at the wall.)
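Those two steps can be sketched in a few lines. Everything here is simulated; the thresholds and the pre-recorded reading list are made up for illustration:

```python
CLOSE_ENOUGH_MM = 200   # assumed "close enough" threshold
HUGE_MM = 1000          # "no longer pointing at the wall"

def approach_then_turn(readings):
    """Walk a pre-recorded list of simulated distance readings and
    return the actions taken: drive until close enough, then turn
    until the distance opens up.  Purely illustrative."""
    actions = []
    it = iter(readings)
    for d in it:
        if d <= CLOSE_ENOUGH_MM:
            break
        actions.append("drive")
    for d in it:
        actions.append("turn")
        if d >= HUGE_MM:
            break
    return actions

# Simulated approach (closing in) followed by a turn (opening up)
acts = approach_then_turn([600, 400, 250, 190, 300, 700, 1200])
print(acts)
```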

IMHO, your biggest problem with docking is that you’re doing it backwards.  Using self-docking vacuum robots as an example, they always dock head-in so they can refine their position continuously and head toward the goal.

You have your robots designed to back into their docks, so none of the sensors can help you.

As I see it, you have two choices:

  1. Figure out a way to determine when the robot is directly in front of the dock and then turn to back in - with a mechanical guide to help you.
  2. Use a set of three line-follower sensors, (if you can get them on three different i2c addresses), and a narrow-beam IR source, (maybe an IR laser?) directly above the dock.
    • One line-follower on the left, one on the right, one on the back.
    • Drive in the general direction of the dock, adjusting to be some desired distance.
    • Drive toward the dock, (roughly parallel to it), until a side sensor sees the IR beam.  Continue until the beam is centered on the sensor.
    • Rotate in the appropriate direction until the beam is centered on the rear sensor.
    • Drive backward - correcting if the beam drifts from center.
    • The mechanical guide should then help you dock accurately.
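The back-in centering step might reduce to a correction function like this sketch (the sensor values, scaling, and deadband are all my assumptions):

```python
def docking_correction(left_val, right_val, deadband=0.05):
    """Given normalized IR intensities from the left and right halves
    of the rear line-follower sensor, return a steering correction
    while backing toward the dock.  Positive means the beam is right
    of center; zero within the deadband means keep backing straight.
    Sensor scale and deadband are assumed, not measured."""
    error = right_val - left_val
    if abs(error) < deadband:
        return 0.0
    return error

print(docking_correction(0.5, 0.5))   # centered: back straight
print(docking_correction(0.3, 0.7))   # beam right of center: correct right
```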

Or something like this.

I really think you need to sit down with a piece of paper and forget the trig, calculus, and all the engineering statistics - and figure out how to do it with cardboard and sticky-tape.

What say ye?

1 Like

Nah - I need to sit down with VSCode and test some wall following, instead of constantly thinking about it.

2 Likes

:+1:

That’s one of the beauties of VS Code and GitHub - you can play and play and play and play and - if nothing’s working - back all the way out and try something different.

Tags are your friend!  They serve as “stakes in the ground” marking a particular point in your coding.  If you paint yourself into a corner, you can revert to the nearest, (or any), tagged point very easily and start fresh.

One thought just came to mind.  Have you tried thinking like a pilot instead of a robot?

Assume you have some way of marking the docking station - an IR beam or laser pointing directly out from it.  Think like a pilot and simply drive in the direction of the dock - wherever it is - until you cross the “localizer beam” and then use that to home in on the dock.  No wall following needed.

1 Like

I have the short range unidirectional beam VOR (two LEDs with a separator), but Carl is not great at seeing them until he gets close. The have-to-turn-around part is quite limiting as you mentioned.

The wall following is a whole separate interest as you know - I’ve been thinking on that forever.

2 Likes

Have you considered replacing them with two IR lasers?  Or, if you want a visible light beam, two red laser-pointers or a couple of bright LEDs behind collimator lenses.  Or perhaps one beam?

1 Like

Actually, I’ve been avoiding the whole “finish the docking challenge”; I just can’t seem to get motivated to finish it. As “Cyclical Obsessive”, I am sure I will get back to it, just not sure when.

1 Like

We have a lot more in common than either of us may wish to admit.  I have the same bursts of energy followed by periods of lethargy.  This whole Ukrainian nonsense has made motivation even more difficult - but I’m getting there.

1 Like

I see today that my memory was failing me, as I just found a study I did for IR in 2017 and revised in 2020 to add US.

IR vs US Wall Detection by Distance and Incidence Angle


Analysis:

  • These values are for distance from the bot to a beige-colored wall

  • IR provides valid distance readings at greater incidence angles than US out to 18 inches

    • IR max of +/- 60 degrees at 12 inches from a wall
    • US provides +/- 15 degrees irrespective of distance
  • US provides valid distance readings at greater distance than IR

    • IR max distance around 24 inches (+/-15 degrees incidence)
    • US max distance around 48 inches (+/-15 degrees incidence)
2 Likes

Doesn’t that contradict the data you just captured a few days ago?  I was golden out to about 60 cm or so, (I didn’t try further - I didn’t have the room.)

1 Like

No - The max distance in the old study was the max distance at which the sensor could provide reliable wall detection. The IR sensor can provide reliable “normal to wall” distance readings out to 48 inches, but I did not feel it could reliably provide the two off-normal readings needed to enable “wall detection”.

This was part of my initial study of GoPiGo3 “wall detection”. Before I wrote the wall following, I wanted Carl to recognize there was a wall worth following, not just an obstacle in its path.

I did write a “Guard Closest Object” program that would:

  • perform a 360 degree scan to find the nearest object
  • navigate to the “guard post” in front of the chosen object
  • turn 180 to face the object
  • perform a 160 degree sector scan, and warn:
    • If something moves closer, announce “I saw that”
    • If something moves within the Guard Area, announce “Back off. I am protecting this area”
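Picking the guard object from the 360 scan is the easy part; a sketch, assuming the sweep yields (angle, distance) pairs:

```python
def nearest_object(scan):
    """scan: list of (angle_deg, distance_mm) pairs from a 360
    degree sweep.  Return the (angle, distance) of the closest
    return.  Data format is assumed for illustration."""
    return min(scan, key=lambda ad: ad[1])

# Simulated sweep: closest return is at 180 degrees, 450 mm
scan = [(0, 1200), (90, 800), (180, 450), (270, 950)]
print(nearest_object(scan))
```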

I still want to program my “wall sentry”:

  • find a wall, then repeat wall following loop:
    • wall follow (wall on right) till end of wall (obstacle or opening)
    • 180 turn, rest a moment,
    • wall follow (wall on left) till end of wall (obstacle or opening)
    • 180 turn, rest a moment,

And then, with an enhancement to:

  • pause at the center of the wall to face normal to the wall,
  • perform the sector-scan-warning behavior for 5 or 10 minutes,
  • continue the wall-following loop.
2 Likes

. . . . and what was that you were saying about “beautiful unicorns”?  :wink:

Seriously, this sounds like as ambitious a project for you as the New Remote Camera Robot project was, (and is), for me.

Far be it from me to suggest, but this sounds like a LIDAR task to me.

To me, wall following would work like this:

  • Approach and somehow assume a 90° position relative to the wall at some distance.  How to get to 90° is the tough part.
  • Turn an accurate 90° in one direction or the other.
  • Aim the sensor(s) at a 45° angle toward the wall.
  • Move forward such that the distance does not change.
  • Rinse and Repeat.
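“Move forward such that the distance does not change” is essentially a proportional controller on the 45° reading; a sketch, with the gain and units assumed:

```python
def steering_correction(measured_mm, target_mm, gain=0.05):
    """Simple proportional correction for 45-degree wall following:
    if the 45-degree distance grows, steer toward the wall; if it
    shrinks, steer away.  Gain and units are assumed, not tuned."""
    return gain * (measured_mm - target_mm)

# e.g. hold 300 mm from the wall, read at 45 degrees: ~424 mm
target = 424
print(steering_correction(450, target))   # drifting away: steer toward wall
print(steering_correction(400, target))   # too close: steer away
```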

What say ye?

1 Like