Line Following Sensor for GoPiGo3, GoPiGo1

G’day - I’ve got two different GoPiGo robots: a GoPiGo3, and a second that is just called GoPiGo…I’m assuming this is v1.

I run a volunteer code club for kids here in Melbourne, Australia, and would like to use these robots for a line following challenge in class.

I have two questions -

  1. Would you know someone who sells these here in Australia at an affordable price? (Your shipping costs make it prohibitive to order just one sensor from the US.)
  2. Can I use two of the Grove line following sensors instead (assuming the libraries are available for Linux)?

Thanks and appreciate your support.



Suggest sending a copy of your post to

As for using two of the Grove Line Finders - they have a strictly binary output (1 for black, 0 for white), and line following is possible with two of them…

The GoPiGo3 has two “Grove A/D” ports so you could plug one into “AD1” and the other into “AD2”.
Software wise there is no “Grove Line Finder” example but several approaches are possible:

  1. Set up AD1 and AD2 as digital input ports and read the values - available in Python and other languages.
  2. Use the button sensor example and interpret “button pressed” as black and “not pressed” as white (I think that is the correct mapping - didn’t look too closely).
  3. Use GoPiGo OS (GoPiGo3 only), and the kids can program in the visual Bloxter drag-and-drop browser environment.
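To make approach 1 concrete, here is a rough Python sketch. The steering decision is plain Python you can test anywhere; the hardware part assumes the DexterInd GoPiGo3 library, and the port/constant names and the black=1/white=0 mapping are my assumptions - double-check them against the library before relying on this:

```python
def steer(left_black, right_black):
    """Map two Grove Line Finder readings (1 = black, 0 = white)
    to a steering action for a dark line on a light surface."""
    if left_black and right_black:
        return "forward"   # both sensors on the line
    if left_black:
        return "left"      # line drifting toward the left sensor
    if right_black:
        return "right"     # line drifting toward the right sensor
    return "search"        # neither sees the line - it's been lost


def follow_line():
    """Call this on the actual robot. Assumes Line Finders plugged into
    AD1 and AD2; constant names are from the gopigo3 Python library."""
    import time
    import gopigo3

    GPG = gopigo3.GoPiGo3()
    # Configure both Grove A/D ports as digital inputs (pin 1 of each port)
    GPG.set_grove_type(GPG.GROVE_1, GPG.GROVE_TYPE.CUSTOM)  # AD1
    GPG.set_grove_type(GPG.GROVE_2, GPG.GROVE_TYPE.CUSTOM)  # AD2
    GPG.set_grove_mode(GPG.GROVE_1_1, GPG.GROVE_INPUT_DIGITAL)
    GPG.set_grove_mode(GPG.GROVE_2_1, GPG.GROVE_INPUT_DIGITAL)

    while True:
        left = GPG.get_grove_state(GPG.GROVE_1_1)
        right = GPG.get_grove_state(GPG.GROVE_2_1)
        print(steer(left, right))  # replace with motor commands
        time.sleep(0.05)
```

The `steer()` part is deliberately separate so the kids can reason about (and test) the four cases on paper before the robot ever moves.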

I don’t know anything about the GoPiGo v1 - the Git repository has a button sensor example that illustrates a digital read, but it talks about pins, so I’m really lost there. Again, a question for support@modrobotics.

The other issue with the GoPiGo v1 may be what OS is available. Like I said, I don’t know anything about the early bot.


Thanks @cyclicalobsessive. Much appreciated.

I stumbled across a local supplier who has the sensor in stock and have asked them the same question. Let’s see what comes of it. If that doesn’t go anywhere, I’ll write to the support address you’ve provided.

Thank you for your feedback on the Grove line following sensor. Appreciate the support mate.

Have a nice one.



No pressure, just a thought - (I have no idea what level of kids you are designing your challenge for) - vision is the future of autonomous vehicles and a focus area for robotics. If you have an outstanding student who can grasp Python programming, and you have a PiCamera to attach to your Raspberry Pi, demonstrating line “recognition” using OpenCV is surprisingly straightforward. It is then possible to write the commands to drive a GoPiGo3 based on the angle of the line.
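As a rough sketch of what that line “recognition” boils down to: threshold a grayscale frame, find the dark pixels near the bottom, and see how far their centre sits from the middle of the image. The NumPy function below is my own illustration (the threshold and band size are arbitrary choices); OpenCV would supply the actual camera capture and any fancier angle estimation:

```python
import numpy as np

def line_offset(gray, threshold=60, band=20):
    """Estimate where a dark line sits in the bottom `band` rows of a
    grayscale frame (2-D uint8 array). Returns roughly -1 (far left)
    to +1 (far right), or None if no line pixels are found."""
    mask = gray[-band:, :] < threshold   # dark pixels count as "line"
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # line lost - stop or search
    centre = (gray.shape[1] - 1) / 2.0
    return (xs.mean() - centre) / centre
```

On the robot, the offset could drive proportional steering (e.g. speed up one wheel and slow the other by `k * offset`), with frames grabbed via OpenCV’s `cv2.VideoCapture` and converted with `cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)`.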

It might need to drive slower than the twin-line-sensor method, but as a technology demonstration the goal is to follow a line without a time constraint.

The GoPiGo OS comes with OpenCV installed (I think…) and has tutorials. (GoPiGo OS is the latest educational operating system, with the GoPiGo3 API installed as well as TensorFlow Lite object recognition, so someone could even build a “follow the human” quite easily. Again, although it is called GoPiGo OS, it is only for GoPiGo3 robots.)


G’day @cyclicalobsessive,

That’s a great suggestion. Much appreciated. In fact, it’s the reason I purchased those bots.

However, the challenge we have in class is that most of our “engaged” kids are between 6 and 13. We struggle to hold onto them past 13 (so far, at least - there are various known reasons for that, e.g. video games, YouTube, friends, getting more independent), and we’ve never been able to find a child who could progress to using OpenCV on the bot.

We use many other affordable robots (micro:bit, Arduino, mBot, etc.) to help the kids work through basic robotics challenges. I’m hoping that once we get some line-tracking experience with Python on the GoPiGo, some of the kids might move on to more complex challenges on the Pi bot…however that’s just wishful thinking for now…lol.

You can check out the work we do at

Thanks again for the suggestions, and appreciate all the support you’ve provided.

Have a nice one mate.