Wall Following Code - Anyone?

In September 2018, I posted a question: Safe Wall Following Ideas?

Anyone know of some wall following code using a single distance sensor?

Or even better yet: wall following code using the pi-cam?

There are so many robot behaviors I can imagine using wall following:

  1. corner bot: (start along a wall, in a corner) follow wall till opening or obstruction, (announce wall length), turn 180, follow wall till corner, (announce wall length), turn 90, follow the second wall till opening or obstruction, (announce wall length), turn 180, return to corner, (announce wall length), turn the other 90, rinse and repeat.

  2. Changing of the guard: (Start along wall), pace to opening or obstruction, 180 turn, pace to opening or obstruction, 180, rinse and repeat. (A rough sketch of this one follows the list.)

  3. Hall Monitor: (start along wall), fwd to opening or obstruction, 90 turn, fwd until at wall-following distance from the opposite wall, 90 turn, rinse and repeat.

  4. room mapper?: this one gets really complex
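
To make #2 concrete, here is a rough, untested sketch of how "Changing of the guard" might be composed once a wall-following primitive exists. follow_wall_until_blocked() is a hypothetical placeholder for exactly the code I'm asking about; only turn_degrees() and stop() are real EasyGoPiGo3 calls.

#!/usr/bin/env python3
# Hypothetical sketch of behavior 2 ("Changing of the guard") - untested.
# follow_wall_until_blocked() is a placeholder for the wall-following
# routine this thread is looking for.
import easygopigo3


def follow_wall_until_blocked(egpg):
    # Placeholder: drive along the wall until an opening or obstruction.
    pass


def changing_of_the_guard(egpg, laps=4):
    for _ in range(laps):
        follow_wall_until_blocked(egpg)   # pace to opening or obstruction
        egpg.turn_degrees(180)            # about-face and pace back
    egpg.stop()


if __name__ == '__main__':
    changing_of_the_guard(easygopigo3.EasyGoPiGo3())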


One thought is that you should be able to use your lane following code to find the “edge” where the wall and floor meet - and the distance sensor to locate obstructions.

After that, it’s just a walk in the park for an old code-hound like you!

True, but another consideration is the higher cost for Carl (processing load, juice, and thermal).

I hope to make a first-cut solution that will be useful for GoPiGo3 Starter Kit owners to build a class of simple bots that demonstrate behaviors which evoke synthetic emotions. (And hopefully self-contained, simple to understand, simple math, and easy to extend.)


I really wish I could help, but you’re already way over my head.

I don’t know how you’d do a low-resource, simple mathematics solution to something that, essentially, involves image processing.

AFAIK, the words “image processing” translate almost directly into “non-trivial”. Roget’s Thesaurus (:wink:) suggests “active cooling” as a synonym. “math co-processor” was also a suggestion. :laughing:

All kidding aside, I’m not sure how you’d get a decent “simple” first-cut without a dedicated image processor.

Doable? No prob!

Easily doable? Not so sure about that one.

You DO have my curiosity riz, though.

I have not implemented it yet, but this is what I’m thinking:

Wall Following: (wall on right roughly parallel to bot heading)

  1. Point distance sensor 45 degrees to right of forward
  2. (Check distance > “safe to drive distance” or exit function)
  3. Start Driving Forward
    Loop:
  4. Check distance > “safe to drive distance” - else stop wheels and exit function
  5. If distance (to wall on the diagonal) is > (105% of desired distance): turn a little to the right
    • (by decreasing the speed of rt wheel)
  6. Else If distance to wall on the diagonal is < (95% of desired distance): turn a little to left
    • (by increasing the speed of rt wheel a little)
  7. Else we’re doing good so set rt wheel speed to equal left wheel speed
    • (I think there may be a reason to slightly bias the bot toward the wall)
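
A minimal sketch of those steps might look like the following (untested). The constants are guesses, the 45 degree servo angle assumes 90 is straight ahead on your mount, and set_motor_dps() is used to trim just the right wheel:

#!/usr/bin/env python3
# Minimal proportional wall-follower sketch (wall on the right) - untested.
# All constants are guesses that will need tuning on the robot.
import easygopigo3
import time

DESIRED_DIST = 8.0    # inches to the wall along the 45 degree diagonal
SAFE_DIST = 6.0       # stop if the diagonal reading drops below this
BAND = 0.05           # +/- 5% dead band around DESIRED_DIST
SPEED_DPS = 150       # left wheel speed in degrees per second
TRIM_DPS = 15         # how much to slow/speed the right wheel to steer


def follow_wall(egpg):
    # Step 1: point the sensor 45 degrees right of forward
    # (assumes 90 is straight ahead for this servo mount)
    egpg.pan.rotate_servo(45)
    time.sleep(0.25)

    # Step 2: exit if it is not safe to drive
    if egpg.ds.read_inches() <= SAFE_DIST:
        return

    # Step 3: start driving, left wheel at the reference speed
    egpg.set_motor_dps(egpg.MOTOR_LEFT, SPEED_DPS)
    try:
        while True:
            dist = egpg.ds.read_inches()
            # Step 4: stop if something closes in on the diagonal
            if dist <= SAFE_DIST:
                break
            # Steps 5-7: steer by trimming the right wheel only
            if dist > DESIRED_DIST * (1 + BAND):     # too far - turn toward wall
                egpg.set_motor_dps(egpg.MOTOR_RIGHT, SPEED_DPS - TRIM_DPS)
            elif dist < DESIRED_DIST * (1 - BAND):   # too close - turn away
                egpg.set_motor_dps(egpg.MOTOR_RIGHT, SPEED_DPS + TRIM_DPS)
            else:                                    # inside the band - go straight
                egpg.set_motor_dps(egpg.MOTOR_RIGHT, SPEED_DPS)
            time.sleep(0.1)
    finally:
        egpg.stop()


if __name__ == '__main__':
    egpg = easygopigo3.EasyGoPiGo3()
    egpg.ds = egpg.init_distance_sensor()
    egpg.pan = egpg.init_servo()
    follow_wall(egpg)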

This can be (must be?) tuned of course, choosing a different percentage band and a different safety distance.

Smarter folks can go math crazy implementing a Proportional-Integral-Derivative (PID) controller and tuning it to maximize correction speed with minimal overshoot, but I’m guessing the simple math algorithm will work well considering the minimal effort to implement and understand it.

I think the line follower example implements a PID controller, so I probably could “fake a line sensor” with the 45 degree diagonal distance to the wall and reuse the line follower PID code, but no one would understand it without pictures.
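
For comparison, a bare-bones PID version of the same idea (not the actual line-follower code, just the diagonal distance treated as the "line" error) might look roughly like this; the gains and constants are placeholders that would need tuning:

# Rough PID sketch, untested: treat (diagonal distance - desired distance)
# as the error, the way the line follower treats offset from the line.
import time

KP, KI, KD = 10.0, 0.1, 2.0
DESIRED_DIST = 8.0    # inches along the 45 degree diagonal
BASE_DPS = 150        # nominal wheel speed in degrees per second


def pid_wall_follow(egpg, run_seconds=30, dt=0.1):
    integral, last_error = 0.0, 0.0
    egpg.set_motor_dps(egpg.MOTOR_LEFT, BASE_DPS)
    for _ in range(int(run_seconds / dt)):
        error = egpg.ds.read_inches() - DESIRED_DIST
        integral += error * dt
        derivative = (error - last_error) / dt
        last_error = error
        correction = KP * error + KI * integral + KD * derivative
        # positive error (too far from the wall) slows the right wheel,
        # drifting the bot toward the wall on the right
        egpg.set_motor_dps(egpg.MOTOR_RIGHT, BASE_DPS - correction)
        time.sleep(dt)
    egpg.stop()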


If it were me, I’d think more like this:

  1. Point distance sensor directly ahead. Measure to determine if it’s safe to move.
  2. Rotate sensor 90° to one side, measure distance.
  3. Rotate sensor 90° to the other side, measure distance.

(Assume: The robot will be placed relatively close to the wall you want it to follow.)

  4. Move forward and continue measuring.
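
In code, that initial look-around might be something like this (untested; the servo angles assume 90 is straight ahead, as in the script later in this thread, so 0 and 180 would be the two sides):

#!/usr/bin/env python3
# Rough sketch of the initial look-around: ahead, 90 degrees right, 90 degrees left.
# Untested; assumes 90 on the pan servo points the sensor straight ahead.
import easygopigo3
import time


def look_around(egpg):
    readings = {}
    for name, angle in (("ahead", 90), ("right", 0), ("left", 180)):
        egpg.pan.rotate_servo(angle)
        time.sleep(0.5)              # give the servo time to settle
        readings[name] = egpg.ds.read_inches()
    egpg.pan.rotate_servo(90)        # point forward again before moving
    return readings


if __name__ == '__main__':
    egpg = easygopigo3.EasyGoPiGo3()
    egpg.ds = egpg.init_distance_sensor()
    egpg.pan = egpg.init_servo()
    print(look_around(egpg))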

Ideally, a visual-field description is the most efficient, since you can determine where you are, where the wall is (and on which side), and the clearance in front, all at the same time.

It might be a bit more processor expensive, but ultimately it may well be more efficient in the long run.

This is the kind of thing I just love to start thinking about at 11pm, after smacking my head on a stair while putting something away. :dizzy_face:

I start thinking about what the 'bot needs to do, then start wondering “but what if the [etc.] happens?”, and I don’t get to bed before five!

It can be…

Here I’ll get you started:

  • save it as: jim.py
  • python3 jim.py
#!/usr/bin/env python3

import easygopigo3
import time

STOP_DISTANCE = 4.0   # inches

def follow_wall(egpg):
    # Drive forward until the distance sensor reads closer than STOP_DISTANCE, then stop.
    print("distance reading: {}".format(egpg.ds.read_inches()))
    while egpg.ds.read_inches() > STOP_DISTANCE:
        egpg.forward()
        print("distance reading: {}".format(egpg.ds.read_inches()))
        time.sleep(1)
    egpg.stop()


def main():
    egpg = easygopigo3.EasyGoPiGo3()
    egpg.ds = egpg.init_distance_sensor()
    egpg.pan = egpg.init_servo()
    egpg.pan.rotate_servo(90)   # center the pan servo so the sensor looks straight ahead
    print("OUTA MY WAY! I'm goin' till I can't")
    follow_wall(egpg)
    print("GUESS THAT'S ALL SHE WROTE")


if __name__ == '__main__':
    main()