Following a line from a projector


I am working on a project that projects a graph onto a table with a projector. My GoPiGo needs to be able to follow certain lines that will be highlighted in some color on the projection, let's say, w.l.o.g., a black line. The problem is that the projector is above the robot, and the robot itself covers the lines it needs to follow, so placing a line follower below the robot makes no sense in this case. Is it possible to place the line follower in front of the GoPiGo, even tilt it a bit? Will it still work properly? If not, what component should I use instead?

I still have yet to buy any of the extra parts so I would like some insight before actually purchasing anything.

Thank you in advance.


I am not aware of anyone testing a forward tilted line follow sensor, so you will be leading - not following!

Does the sensor work if, rather than tilting it, you mount it looking straight up toward the projector in a very dark room? If there is enough contrast between the projected light areas and the projected dark line, it may work. Perhaps you will need to customize the line follower driver software to recognize the projected dark line.
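If you do end up customizing the driver, the core of it is a calibration pass plus a threshold test; a minimal pure-Python sketch (the sample values and the assumption that darker areas give lower raw readings are hypothetical - check against your actual sensor output):

```python
def calibrate(lit_samples, line_samples):
    """Pick a threshold halfway between readings taken over the projected
    lit area and readings taken over the projected dark line."""
    lit = sum(lit_samples) / len(lit_samples)
    line = sum(line_samples) / len(line_samples)
    return (lit + line) / 2

def classify(raw_values, threshold):
    """1 = that element sees the projected dark line, 0 = sees lit area
    (assumes darker surfaces give lower raw readings)."""
    return [1 if v < threshold else 0 for v in raw_values]
```

With a threshold calibrated this way, the rest of the line-follower logic (steering from which elements report 1) could stay unchanged.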


The fact is I am working on DexterOS, so I feel limited to the environment that has been provided to me, since it is from the educational kit, I think. Do you think a camera or an RGB sensor would be a better choice? I have not found any materials on how the RGB sensor works. Maybe you have some insight into that as well?


Working within the limitations (reality) is always a factor in designing a problem solution.

I always design for success but consider the risks and impact of not succeeding with my initial design. In an educational setting, often a well documented effort which does not succeed at the stated goal will be scored as an educational success. Be sure to plan for post effort documentation time to document the design, the implementation, the result, analysis of the approach, and in the case of not solving the stated goal - alternate approaches for future investigation.

In the DexterOS / GoPiGo OS environment you have several options:

  • Graphical Blockly: may allow success with the line follower sensor, may not - limited adaptability, I think
  • Python: I am not that familiar with it, but there is a choice between Jupyter and the command line (not sure if there are others). Here you have full access to the GoPiGo3 sensor APIs and the drivers.

Every sensor has advantages and disadvantages:

  • camera:
    – advantages: can use OpenCV line recognition (if installed on DexterOS/GoPiGo OS - I believe it is)
    – disadvantage: seriously complex for newbies to understand
  • RGB sensor:
    – advantages: ?? - if you can choose the projected line color, perhaps better discrimination of line vs. no line
    – disadvantage: single sensor, so harder to handle the line edge and line loss
  • LineFollower:
    – advantages: multiple sensors make recognizing a pending line loss much easier
    – disadvantage: designed for limited-range sensing - unknown how it performs with a projected line

It seems you could test the projected line and line follower output very quickly - I'm not sure about the Blockly output - I would start with that and see what the data output consists of and how the line follower reacts to a projected line.
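For the camera route, the heart of "line recognition" is smaller than it sounds, even without OpenCV; a sketch of the idea using only NumPy (the darkness threshold is a made-up value - a projected scene would need its own calibration):

```python
import numpy as np

def line_center(gray, dark_thresh=60):
    """Given a grayscale frame (2-D array, 0=black .. 255=white), return the
    horizontal position of the dark line's centroid as a fraction
    (0.0 = far left, 1.0 = far right), or None if no pixel is dark enough
    (line lost)."""
    mask = gray < dark_thresh          # True where the projected line is
    if not mask.any():
        return None
    cols = np.nonzero(mask)[1]         # column index of every dark pixel
    return cols.mean() / (gray.shape[1] - 1)
```

The steering decision then reduces to comparing that fraction against 0.5, which is essentially what an OpenCV-based follower does after its thresholding step.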


If you are using a GoPiGo-3, you should upgrade to the new GoPiGo O/S.

Dexter O/S has been deprecated and the new O/S is a much better environment to work with.

Can you project the line-image upward to a translucent surface to create a pattern that the robot can drive on?


I think it would be hard to pull off with the current materials that I have (for the projecting upwards). Any tutorials on how to upgrade the OS?


Another question regarding connecting extra sensors to the GoPiGo:

I tried connecting an external camera. It was connected directly to the Raspberry Pi, but it could not be detected by the robot. My mentor told me that I could try using the TCRT5000 Line Tracking Sensor, since it is a cheaper alternative for line following.

Do you think it will be compatible with the GoPiGo? Or will it be the same as with the camera, and the robot will not be able to detect that it is connected?


This is the starting point for downloading/installing/using the GoPiGo OS:

  • Was this an official Raspberry Pi Camera?
    Having a camera attached is fun, but line following with a camera is seriously advanced programming. Probably not worth investing much time to get it working.

That sensor is no good for a projected line - it would only detect a painted or printed black line on a white surface or paper. With an appropriately crafted cable that sensor could be used with one of the data ports of the GoPiGo3, but there is no built-in sensor driver.

An additional negative about that sensor is that it has only a single binary output: "see line" (1) or "no line" (0). While this does work, it is more difficult to program the robot for the "lost line" condition than with a multi-element sensor such as this official GoPiGo3 sensor, or this one or this one - BUT these also are for a black line on a white area, such as electrical tape on a white floor or paper.

For a black or white line projected from above, I would think using two of these sensors would work best with the GoPiGo3. The sensor puts out a voltage that varies with the amount of light it sees. I would mount them close together with a wall between them, so each one sees up and to one side.

  • Both sensors see the line: drive straight
  • Only one sensor sees the line: drive in an arc slightly toward the line (toward the side whose sensor sees it)
  • Neither sensor sees the line: spin until a sensor sees the line or the robot has spun 360 degrees

Since the sensors output a variable voltage, a more advanced control algorithm could vary the arc to balance the values from the two sensors. This might prevent loss of the line by one sensor, resulting in much smoother/faster line following.
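That control scheme might look something like this in Python (a sketch only - `v_line`, `base_dps`, and `gain` are made-up tuning values, the wheel speeds would be fed to something like the GoPiGo3 motor API, and it assumes a sensor over the dark line reads a LOWER voltage than one over the lit area):

```python
def steer_from_light(left_v, right_v, v_line=0.4, base_dps=150, gain=200):
    """Differential wheel speeds (degrees/sec) from two light-sensor voltages.
    Returns (left_dps, right_dps), or None when neither sensor sees the line
    (caller should spin and search)."""
    if left_v > v_line and right_v > v_line:
        return None
    # error > 0 means the line is under the LEFT sensor (left reads darker),
    # so slow the left wheel and speed the right wheel to arc left.
    error = right_v - left_v
    return (base_dps - gain * error, base_dps + gain * error)
```

When both sensors straddle the line the two voltages roughly match, the error is near zero, and the robot drives straight; as one sensor drifts off the line, the arc tightens proportionally instead of waiting for an outright line-loss event.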

Again, you will need to purchase the connectors to attach to the sensor and to the GoPiGo3, such as these.

Another sensor that would work for a projected line - again, two of these with a wall between them. These connect via I2C, so it is a little more complicated to write an interface for, but it offers some advantages over a variable-voltage interface.


Thank you for all the feedback. I have one more question:

Do you know if there is a possibility of creating a GUI inside the robot?

Example: A person is shown a window, and then they can draw something on it. Is it possible to do this in the robot's coding environment in Python? I tried doing this a while back but I wasn't successful.

If I manage to do this there will be no need for the line follower.


Possible? Yes, but not simple for those of us that have never done it before.

Possible "in the robot's coding environment in Python"? I don't know which environment you are referring to, but I am not aware of any GoPiGo3 examples we could follow to create a "mousable/touchable window".

Showing a window in a browser with driving buttons is obviously possible, since the GoPiGo OS has just such a thing, but I haven't a clue what the design is or whether the code is accessible to us. My webpage days are long forgotten, even if the code could be found. Sorry.


I have a project called the "New Remote Camera Robot" that I was working on a while back (before Covid hit and I was seriously ill for quite a while).

The 2000 pound gorilla in the room isn’t so much creating the code, (obviously doable), but interacting with the interface.

The Powers That Be at the W3C and the web browser companies have decided for us that any device that wants to interact with a web page, especially joysticks and/or game controllers, has to be behind a payware, legitimate SSL site certificate from a real trusted root certificate provider.

Eventually, I accomplished my goal of driving a robot around using a joystick to guide it - and viewing a first-person image via the camera on a pan-and-tilt - but I had to buy a domain, get a real certificate for it, and install NGINX on the robot to provide a SSL (HTTPS) interface to the browser.
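For reference, the reverse-proxy part of that setup is small; a minimal sketch (the domain, certificate paths, and backend port below are all hypothetical - yours will differ):

```nginx
server {
    listen 443 ssl;
    server_name myrobot.example.com;                  # hypothetical domain

    ssl_certificate     /etc/ssl/certs/myrobot.crt;   # from a trusted CA
    ssl_certificate_key /etc/ssl/private/myrobot.key;

    location / {
        proxy_pass http://127.0.0.1:5000;             # robot's plain-HTTP app
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```

The robot's own web app stays plain HTTP on localhost; NGINX terminates the SSL so the browser considers the page "secure" and unlocks the gamepad/joystick APIs.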

Depending on how you do it, (if your pen/touchpad emulates a mouse, for example), you may be able to skip that.  Otherwise, it’s going to be “interesting”.  You’re welcome to look at my GitHub repo. . .

. . . and make use of whatever you find there.



I was talking about the Jupyter Notebook that is used to run Python. I want to make a GUI inside of it, but there aren't many resources online. The GUI should be like a canvas that people can draw on. I already have the environment in JS, but I just wanted to be able to do it on the GoPiGo as well. But I guess it's not going to be that easy. I will try some different solutions I have thought of next week and update if I am successful.
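For what it's worth, the notebook-side plumbing for such a canvas is mostly bookkeeping; a plain-Python sketch of the stroke-recording state one would wire to a canvas widget's mouse events (ipycanvas, for example, exposes `on_mouse_down`/`on_mouse_move`/`on_mouse_up` callbacks - whether it installs under the robot's environment is an open question):

```python
class StrokeRecorder:
    """Collects freehand strokes as lists of (x, y) points.
    Hook pen_down/pen_move/pen_up to a notebook canvas widget's
    mouse-event callbacks to accumulate a drawable path."""

    def __init__(self):
        self.strokes = []      # finished strokes
        self._current = None   # stroke in progress, or None

    def pen_down(self, x, y):
        self._current = [(x, y)]

    def pen_move(self, x, y):
        if self._current is not None:        # ignore moves with pen up
            self._current.append((x, y))

    def pen_up(self, x, y):
        if self._current is not None:
            self._current.append((x, y))
            self.strokes.append(self._current)
            self._current = None
```

Once the strokes are plain lists of points, turning them into a path for the robot to follow is a separate (and arguably easier) problem than the widget itself.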

@jimrh Thank you for sharing your git repo with me. I will check it out to see if I find something helpful.

Again, thank you all for all the great feedback.



Unfortunately, no one here has much experience in Jupyter.  I spend time in both Bloxter and Python, @cyclicalobsessive spends his time (mostly) programming ROS, and others have similarly minute experience in Jupyter.  In fact, I never knew you could do graphical interfaces in Jupyter.

You would be the only person who has done anything in Jupyter here on the GoPiGo3 in quite a long time.  In fact, we’d appreciate you staying around as the resident Jupyter expert. (:+1:)

Please let us know if there is anything else we can help you with.


I am only a humble beginner and no expert at all, but I appreciate the hospitality. I will share my git repo once the full project is finished. I will also probably write a blog here on how I did everything, so anyone who wants to improve it or do anything similar can avoid my mistakes, of which I am sure there are many.


That’s true in great measure for all of us, especially me.

No matter how skilled you are, (or are not :wink:), there’s always something we can learn from you, so don’t hesitate to jump right in and ask whatever you want.

Don’t hesitate to jump in with an opinion or suggestion either.  Your idea may well be the flash of light that solves someone else’s frustrating problem!

Welcome in and don’t hesitate to stick around.


I am a fervent user of written tutorials, but there are some things for which video is easier to produce and conveys the user experience much better.

If I might suggest - a video showing using Jupyter for GoPiGo3 programming/debugging would be very interesting to me.

I tend to develop using a very basic text editor, and still debug with print statements. I know that environments such as Jupyter and VSCode are much better, but somehow my unfamiliarity with working in these environments always pushes me back to ssh, nano, and print().

A video showing Jupyter setup for the GoPiGo3, writing and debugging a simple program with Jupyter would help me and new GoPiGo3 owners greatly, but understand that it is not part of your assignment.

Wishing you success in this project, and hope that you will be able to continue using the GoPiGo3 robot after your class ends.
