Wall Following Code

No problem. I would have preferred that you understood my problem and offered a solution, instead of disagreeing that “I have a problem”, or disagreeing with what you mistook to be my problem.

My problem: “I want wall following code (without having to write it myself).”


Correct me if I am wrong, but didn’t “every robot” come with some kind of skirt sensor for obstacles?

That’s one of the things DI decided not to implement. Absent that, wall following just got a lot more difficult.


It seemed to me that you were saying that “wall following is easy” therefore “ROS should not make me reinvent the wheel to do a simple task” and in my socially inept way, I was trying to agree.

I was also trying to say that “wall following” might indeed be easy, (given the correct sensors), but expecting DI to do EVERYTHING seemed a bit much.

I will apologize profusely to everyone but, (as it occurred to me), ROS seems to go out of its way to make simple tasks difficult and difficult tasks impossible.

If I remember correctly what was said in a number of threads, doing something simple, (like turning on an LED), goes like this:

  1. Find the LED library, or write it if it doesn’t exist.

  2. Discover if there is an LED state “listener”.

  3. Find the associated LED state publisher.

  4. Somehow or other, figure out how to get the publisher to publish the correct state to the correct LED.

  5. (Plus the dozen or so steps I don’t remember.)

Just reading about the pain and angst you go through to make the robot decide it’s actually running has been painful, and I hate to see you working so hard while making so little (apparent) progress.

Then again, I should complain?

MY 'bot hasn’t accomplished one tenth of what Carl can do and nipple.js still has me stymied.

I guess I was just being too sympathetically empathetic.

(But I still think ROS sounds too complicated for its own good.)

I do NOT intend to minimize the effort both you and Keith have put into this.

LIDAR? I’d be happy to get to the point where the IMU I bought, (and is currently collecting dust), can help me make accurate 90° turns!

Carl speaking:
“My battery is low so I will go recharge.”
“Oh, and by the way, would you please check my right wheel? I think it’s rubbing against something.”

(Hangs head in shame)


I’m with you. Carl is way more sophisticated than anything I’ve done with Finmark.


Actually, no - not every robot had contact sensors, and none of them used a skirt for wall following.

What many “old school” robots came equipped with were (effectively) two forward-sector non-contact, short-distance infrared obstacle sensors. Since TV remotes were ubiquitous, 40kHz near-infrared detectors and narrow-beam near-infrared LEDs were some of the least expensive components to implement robot protection circuitry.

One bot used a single 40kHz near-infrared light detector mounted front center, and two narrow-beam, slowly alternating (25Hz switching) 40kHz pulsed near-infrared emitters pointed at 45 degrees to the path. The switching allowed the single detector to serve both emitters. These sensors did not sense distance, only obstruction within some short maximum distance.

Another (more expensive) robot used 45 degree pointed wide beam ultrasonic emitter/detectors that actually sensed distance.

With either configuration, slowly curving toward the wall until the sensor yelled, then slowly curving away until the sensor stopped yelling, produced wall following. It also gave forward obstacle detection for free: if the other side’s sensor began yelling, or the wall-following sensor never stopped yelling, something was in the path; and if the wall-following sensor did not re-acquire the wall when expected, the wall had ended.
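That curve-toward/curve-away behavior is essentially a bang-bang controller, and the decision logic fits in a few lines. This is a sketch under assumed sensor names and return values, not code for any particular robot:

```python
# Bang-bang wall-following decision logic, as described for the
# "old school" two-emitter robots. The sensor names and returned
# command strings are illustrative assumptions.

def wall_follow_step(wall_side_yelling, far_side_yelling):
    """Decide the next action from two boolean obstacle detections.

    wall_side_yelling: the 45-degree sensor aimed at the wall fired
    far_side_yelling:  the opposite-side 45-degree sensor fired
    """
    if far_side_yelling:
        return "obstacle_ahead"   # far-side sensor firing means something in the path
    if wall_side_yelling:
        return "curve_away"       # too close: curve away until it stops yelling
    return "curve_toward"         # no detection: curve back toward the wall
```

A fuller version would also track time, so that a wall-side sensor that never stops yelling is treated as a forward obstacle, and a sensor that fails to re-acquire when expected is treated as end-of-wall.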

One bot didn’t have short distance obstacle or skirt contact sensing - it used stalled motor sensing just like the GoPiGo3 could. That robot could actually map a room and return to its recharge base at a price very close to the GoPiGo3. The company didn’t last past the venture capital phase because all they had was a robot with no market.

I returned that bot. The wife (and I) didn’t like the wheel tracks it left in the rugs all over the house.


So indeed, the GoPiGo3 is the only bot I ever built or bought with no wall following demo. It doesn’t need a skirt. It doesn’t need a distance sensor, or even a distance sensor on a servo. The servo-mounted distance sensor, angled at 45 degrees and using the same curved-driving method, is probably the easiest way to implement a wall following demo, but someone with good trigonometric thinking could write wall following for a base GoPiGo3 with no added sensors.
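The trigonometric part is small: a reading taken 45 degrees off the heading overstates the perpendicular distance to the wall by a factor of 1/cos(45°), so a controller should correct for the angle before steering. A minimal sketch, where the target distance and gain are illustrative assumptions rather than tested values:

```python
import math

def perpendicular_distance_mm(reading_mm, sensor_angle_deg=45.0):
    """Convert a distance reading taken at sensor_angle_deg off the
    heading into the perpendicular distance to the wall (assumes a
    flat wall and travel roughly parallel to it)."""
    return reading_mm * math.cos(math.radians(sensor_angle_deg))

def steering_correction(reading_mm, target_mm=200.0, gain=0.2):
    """Proportional correction from one angled reading.
    Positive means steer away from the wall, negative means steer toward it.
    target_mm and gain are illustrative, to be tuned on a real robot."""
    error = target_mm - perpendicular_distance_mm(reading_mm)
    return gain * error
```

For example, an angled reading of about 283mm corresponds to roughly 200mm of perpendicular clearance, so at the 200mm target the correction is near zero; longer readings produce a negative (steer-toward) correction.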

Almost every low cost robot before the GoPiGo3 targeted the university robotics education market, and wall following was a basic robot function always taught. The GoPiGo3 was designed for a wider educational market, but even the high-school fire-fighting contest robots of the last 20 years needed wall following. The latest Pi-Wars robot contests have raised the robot competency requirements to beyond the dreams of university robotics of 20 years ago, but wall-following is still a part of the challenges.

I understand it sounds like I am all wrapped around the axle on this wall following topic, bringing it up over and over every year. I think we’ve beat this topic into submission again.

Meanwhile, it looks like Dave will probably get wall-following before his older brother Carl.


(Resurrecting an old topic)

Just had a brain-wave:


  • Use existing GoPiGo sensors and capabilities without investing in expensive LIDARs and such like.

  • Not depend on ROS, but rather be able to be programmed in Dexter Blockly or using R4R.

Thought #1:

Use two micro-switches with short wands on the front corners as collision sensors.

Use sensor contact to establish the path to follow.

Thought #2:

Use the existing distance sensor on a servo and do a scan alternating between 45° left and 45° right. Use distance to establish wall following and collision avoidance.

Thought #3:

Thought #1 AND thought #2 in combination.

Unfortunately I can’t offer any actual code, but it shouldn’t be any more difficult than fighting with ROS.
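That said, a rough, untested sketch of Thought #2’s decision logic might look like this, keeping the servo/sensor I/O separate. All names, thresholds, and the gain are illustrative assumptions, not GoPiGo3 API calls:

```python
def thought2_step(left45_mm, right45_mm, follow_side="right",
                  target_mm=200.0, collision_mm=120.0, gain=0.2):
    """One control step for Thought #2: a servo alternates the distance
    sensor between 45 degrees left and 45 degrees right. The wall-side
    reading drives wall following; the nearer of the two readings drives
    collision avoidance. Thresholds here are illustrative only."""
    if min(left45_mm, right45_mm) < collision_mm:
        return ("stop", 0.0)                    # something too close: stop first

    wall_mm = right45_mm if follow_side == "right" else left45_mm
    correction = gain * (target_mm - wall_mm)   # + steer away, - steer toward
    return ("drive", correction)
```

Thought #3 would then be a matter of letting the bumper switches override this loop: any contact wins over the distance-based steering.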


The thought of using “stalled motor detection” as @cyclicalobsessive suggests scares me, especially since the motors are reasonably high powered and have relatively high ratio gearing. I’d be concerned about stripping the gears since I have never been able to stall Charlie’s motors.



The GiggleBot has already solved this problem using a micro:bit and a scanning distance sensor using a servo.


This is an implementation of Thought #2 above.

If the micro:bit can do it, surely the GoPiGo can. :wink:


Perhaps, if the GiggleBot were actually doing it!

In this case the human is doing the sensor data interpretation and the commanding - no wall following by the bot.

Maybe one day we’ll find out someone actually thought about this problem. It seems so simple until you get into the boundary cases:

  • recognizing a wall exists

  • turning parallel to the wall

  • the actual wall following

  • recognizing/handling loss of the wall (e.g. a doorway)

  • recognizing/navigating an interior corner (90-135 degree angles)

  • recognizing/handling an exterior corner (225-270 degree angles)

  • recognizing, and deciding what to do about, obstacles that turn out to be neither interior nor exterior corners
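Those boundary cases suggest a small state machine rather than a single control loop. A skeleton, with illustrative (untested) states and transitions driven by two assumed boolean observations:

```python
# Skeleton wall-following state machine implied by the boundary cases.
# States, transitions, and observation names are illustrative assumptions.

FIND_WALL, ALIGN, FOLLOW, WALL_LOST, INNER_CORNER, OUTER_CORNER = range(6)

def next_state(state, wall_seen, wall_ahead):
    """Pick the next state from two hypothetical boolean observations:
    wall_seen  - the side sensor currently sees the wall
    wall_ahead - the forward sector reports an obstruction
    """
    if state == FIND_WALL:
        return ALIGN if wall_seen else FIND_WALL
    if state == ALIGN:
        return FOLLOW if wall_seen else FIND_WALL
    if state == FOLLOW:
        if wall_ahead:
            return INNER_CORNER      # wall crosses the path: turn away from it
        if not wall_seen:
            return WALL_LOST         # doorway or exterior corner ahead
        return FOLLOW
    if state == WALL_LOST:
        return OUTER_CORNER if not wall_seen else FOLLOW
    # after either corner maneuver, go back to re-acquiring the wall
    return ALIGN
```

The hard part the list above hints at is distinguishing WALL_LOST-as-doorway from WALL_LOST-as-exterior-corner, which needs distance history rather than a single reading.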

Perhaps you can see why I am hoping for someone to have written this for us.



It was late and I did not read down far enough.

However, (at least in concept), that kind of autonomous wall following should not be any more difficult than what you are doing with ROS.

Maybe I will take a look at it as a subsequent project sometime.