New GoPiGo Pi B+ Dexter Raspbian Speed Difference with Older GoPiGo Pi B

Yes, that is the one.

Raspbian Stretch can only be used on pre-Pi-4 systems.

Install Buster, try it, and then clone the SD card.

Then do a DEXTER update, see what happens, and then clone the card again.

Then, after you’ve cloned the card again, do an apt-get update and apt-get upgrade (as root), try it again, and see what happens.

At that point, you should have enough information to decide what operating system you wish to use.

@loringw That “experimental” is possibly the best starting point for a Raspbian For Robots based GoPiGo3.

The latest, most supported OS from ModRobotics is their GoPiGo3 OS 3.0.1, which is fully Python3-ified. It offers many ways to get to know your bot, all the way up to using TensorFlow Lite for face recognition, with programming either in Bloxter or in Python in the JupyterLab IDE. The tutorials are supported and current, whereas the Raspbian For Robots tutorials no longer seem to be actively maintained by ModRobotics.

The extensive Grove sensor example code was written in the Raspbian For Robots “heyday” two years prior. I am pretty sure they work from the command-line in GoPiGo3 OS, but that is also a good question for Mitch as GoPiGo3 OS seems to be more oriented toward graphical development environments.

I have a three-year-old robot, Carl, which is running Raspbian For Robots; I have needed to be oh-so-careful about updates because something always seems to break.

For the last three months, I have been developing robot Dave, which is running ROS2 over 64-bit Ubuntu Server. This was definitely not an easy path, but it was the only way to use the GoPiGo3 to learn ROS2.

Before you get too far down any path, I suggest you pop the question off to @mitch.kremm, explaining your students’ levels and what you want them to accomplish. I am guessing they would like to help you to use GoPiGo3 OS if it will meet all your requirements.


True, every bit of it.

However, you do NOT have to use JupyterLab.  In fact, Jupyter makes me crazy since it seems to be “PowerPoint for programming” - i.e. if you want to demo a piece of code with line-by-line commentary as an educational aid, you can’t beat it.

However, for any kind of (so called) “serious” programming, use something else, and push everything to GitHub.

Geany, Thonny, etc., are all good.

I use Visual Studio Code on my laptop and execute directly on my 'bot, though I may reconsider that in the future.

I’m not nearly the programming guru that @cyclicalobsessive is, and I really like the new GoPiGo OS because I can pop back and forth between Bloxter and full-bore Python.

I “try things out” in Bloxter to smoke test them for feasibility and then do the heavy lifting in Python.

Though, I would be willing to guess that there’s little you can do in Python that you can’t do in Bloxter - though it might be less efficient.

Sending a line to the folks at MR is a good idea.

Just remember to tell us what they say!


Thanks for the tip about Geany - much faster and nicer than Thonny. It runs the Python script in a terminal very nicely while I am prototyping.
I like Raspbian For Robots much, much better for my purposes. It will also do C, Java, and other languages, and my students learn more about Linux. The GoPiGo3 OS has way too many training wheels for my community college student training needs.


We are publishing our code and tutorials at


Are there “find the bugs” in


I am not sure what you mean, they all work. I was working on them this afternoon, and finished testing them all a bit ago.


I saw two suspicious statements and didn’t want to spoil if they were intentional:

  • Lines 67 vs 68
  • Lines 70 vs 71

BTW, there are some new orbit commands in the API that are pretty cool also.


That was a mistake, it was supposed to print 180.
I did add in an orbit, it is pretty nice!


I didn’t look over those. Will students ever use multi-threading, multi-processing, or multiple programs accessing the same sensor, or accessing the I2C bus?

The EasyGoPiGo3() class defaults to use_mutex=False. I believe it is a philosophical error to skip the protection mutex in preference to speed. I believe in optimizing only if and where needed, and in coding for errors/exceptions/conflicts/edge cases from the start. Thus, I suggest all instances be created with use_mutex=True, which will get passed down to sensor instantiation where it matters.
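The effect of the mutex can be sketched without a robot. Below, a plain threading.Lock stands in for DI’s protection, and each “bus transaction” (write a command, read the reply) is made atomic - the names here are illustrative, not the DI API:

```python
import threading

i2c_mutex = threading.Lock()   # stand-in for the use_mutex protection
bus_log = []                   # records everything crossing the shared "bus"

def read_sensor(reader_id):
    # With the mutex held, a command and its reply stay paired on the bus.
    # Without it, two threads' transactions could interleave and corrupt both.
    with i2c_mutex:
        bus_log.append(("cmd", reader_id))
        bus_log.append(("reply", reader_id))

threads = [threading.Thread(target=read_sensor, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every command is immediately followed by its own reply - no interleaving.
for k in range(0, len(bus_log), 2):
    assert bus_log[k][0] == "cmd"
    assert bus_log[k + 1] == ("reply", bus_log[k][1])
```

The same serialization is what use_mutex=True buys you when two programs (or threads) share the one physical I2C bus.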


You might find this interesting:


I don’t think so. I can get them up to accessing the sensors while doing something else. I am going to try to get them up to multi-threading.
I appreciate the info on use_mutex=True. I haven’t dug far enough into the API yet to quite understand how the mutex relates to multithreading. Always something more to learn!


DI (Dexter Industries) implemented optional mutex protection around the I2C bus to prevent multiple threads or processes from accessing the bus at the same time. If each egpg object is instantiated with egpg = EasyGoPiGo3(use_mutex=True), then any process or thread that wants to access a sensor via the bus either acquires the mutex or waits until it can acquire it, so that only one process at a time will be sending commands across the bus.

The concept of a single resource with multiple users becomes important when complex programs get broken up into smaller independent programs - and beyond the I2C bus as well. On my robot I have an independent health-check program that instantiates an EasyGoPiGo3() object to check the battery and perform a system shutdown if the battery level falls too low. This way, I don’t have to write battery-level checks into every single program I write for the bot.

BUT - every instantiation of the EasyGoPiGo3() class performs a set_speed(300) under the covers. This can set up a race condition of who started when. My “juicer” program that puts the bot back on the dock wants the speed set at 150 DPS for the most accurate turns. If juicer starts first, resets the speed to 150 DPS, and then “thinks” the speed will not change, it will not know that the healthcheck program, starting later, is going to set the speed back to 300 DPS.

This last case is not solved by mutex, but illustrates some of the complexities of multiple processes accessing a single resource (the GoPiGo3 red board in this case).
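That clobbering can be shown with a hardware-free sketch - FakeBoard and FakeEasyGoPiGo3 are purely illustrative stand-ins (not DI classes), mimicking only the set_speed(300)-on-instantiation behavior:

```python
class FakeBoard:
    """Stands in for the single shared GoPiGo3 red board."""
    def __init__(self):
        self.speed = None

class FakeEasyGoPiGo3:
    """Mimics only the behavior discussed: __init__ does a set_speed(300)."""
    def __init__(self, board):
        self.board = board
        self.set_speed(300)        # every instantiation resets the shared speed

    def set_speed(self, dps):
        self.board.speed = dps

board = FakeBoard()
juicer = FakeEasyGoPiGo3(board)
juicer.set_speed(150)                  # juicer wants 150 DPS for accurate turns
healthcheck = FakeEasyGoPiGo3(board)   # a later program silently resets it
print(board.speed)                     # 300 - juicer's 150 DPS setting is gone
```

No mutex helps here, because both writes are individually well-formed; the conflict is in the shared state they both assume they own.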

One of the most difficult transitions in my programming career was to stop thinking linear execution of software. Today with every processor having multiple cores and algorithms so complex that execution gets farmed out across multiple cores in nets or graphics, avoiding linear thinking is a must.

Just a thought.


Along the lines of a first look at threading - an exercise could adapt the following “WiFi LED Blinker” excerpt for the eyes. (I have disabled antenna_wifi.service so that I can use the WiFi LED, but the concept works for the “Eyes” as well.)

import threading
import time

WHITE_BRIGHT = (255, 255, 255)  # color 0
RED = (255, 0, 0)               # color 1
ORANGE = (255, 125, 0)          # color 2
YELLOW = (255, 255, 0)          # color 3
YELLOW_GREEN = (125, 255, 0)    # color 4
GREEN = (0, 255, 0)             # color 5
TURQUOISE = (0, 255, 125)       # color 6
CYAN = (0, 255, 255)            # color 7 light blue
CYAN_BLUE = (0, 125, 255)       # color 8
BLUE = (0, 0, 255)              # color 9
VIOLET = (125, 0, 255)          # color 10
MAGENTA = (255, 0, 255)         # color 11
MAGENTA_RED = (255, 0, 125)     # color 12

wifi_blinker_thread = None
wifi_blinker_thread_quit = False

def do_wifi_blinking(egpg, color=RED):
	global wifi_blinker_thread_quit
	try:
		r, g, b = color
		while wifi_blinker_thread_quit is not True:
			# set_led() and LED_WIFI come from the underlying gopigo3.GoPiGo3 API
			# (use LED_LEFT_EYE / LED_RIGHT_EYE here to blink the eyes instead)
			egpg.set_led(egpg.LED_WIFI, r, g, b)    # LED on
			time.sleep(0.5)
			egpg.set_led(egpg.LED_WIFI, 0, 0, 0)    # LED off
			time.sleep(0.5)
	except Exception as e:
		print("do_wifi_blinking: Exception {}".format(str(e)))
		raise e
	# print("do_wifi_blinking() exiting")
	wifi_blinker_thread_quit = False	# acknowledge the quit request

def wifi_blinker_on(egpg, color=RED):
	global wifi_blinker_thread, wifi_blinker_thread_quit

	if wifi_blinker_thread:
		pass	# already blinking
	else:   # need to start thread
		wifi_blinker_thread_quit = False
		wifi_blinker_thread = threading.Thread(target=do_wifi_blinking, args=(egpg, color,), daemon=True)
		wifi_blinker_thread.start()

def wifi_blinker_off(egpg):
	global wifi_blinker_thread, wifi_blinker_thread_quit

	if wifi_blinker_thread:
		wifi_blinker_thread_quit = True	# tell thread to quit
		# wifi_blinker_thread.join()	# or simply wait for thread to quit
		timer = 0
		while wifi_blinker_thread_quit and (timer < 5):
			time.sleep(1)	# wait (up to 5 s) for the thread to acknowledge
			timer += 1
		wifi_blinker_thread_quit = False
		wifi_blinker_thread = None
		egpg.set_led(egpg.LED_WIFI, 0, 0, 0)	# leave the LED off

You’ve got a lot on your plate with teaching - I’m sorry for going on here. What you have done looks great. Glad you have chosen GoPiGo3 to be a part of your pedagogy.


Very nice, I appreciate it. You will probably see a version of this appearing in our github site at some point.
Do you have a github for your code?
Thanks again!


I would call that a bug/design flaw.

BTW, technically that’s not a “race condition” since it can be temporally distant.  However it does illustrate the kind of conflict that can happen when a class/method doesn’t respect that other instances or classes may have changed something.

Is it possible to query the current set speed?

It sounds like there needs to be some kind of “dirty” flag, (kilroy_was_here = True), to indicate that it has already been initialized so that subsequent instantiations don’t reset it.

Can classes have privately global flags/variables?  That is, something local to the class that can be set and read by any instance of the class?
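In Python they can - a class attribute is one copy shared by every instance within a process, so the “Kilroy was here” flag is easy to sketch (SpeedGuard and all its names are hypothetical, not part of the DI API):

```python
class SpeedGuard:
    kilroy_was_here = False   # class attribute: ONE copy shared by all instances

    def __init__(self):
        if not SpeedGuard.kilroy_was_here:
            self.did_reset_speed = True       # stand-in for set_speed(300)
            SpeedGuard.kilroy_was_here = True
        else:
            self.did_reset_speed = False      # later instances leave speed alone

first = SpeedGuard()
second = SpeedGuard()
print(first.did_reset_speed, second.did_reset_speed)   # True False
```

The catch: a class attribute lives inside one Python process, so it would not stop a separate healthcheck program from resetting the speed. As for querying, easygopigo3 does provide a get_speed() method, though I believe it returns the locally cached value rather than reading it back from the board.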

You can access the eyes directly within the easygopigo class.

@cyclicalobsessive had to jump through hoops to mess with the antenna LED because it’s dedicated to a different use.

Another tid-bit:

The three signal lines to the on-board neopixels are brought out to pads on the PCB so you can actually add neopixel LEDs if you want!  I haven’t tried this, (yet!), but it should be possible to address them by simply incrementing the LED address.


Yes, github serves as a backup for Carl, Dave, DeskPi, and the ROS2 Ubuntu VM I run on my Mac.

Carl’s repository contains some notable stuff:

  • /plib contains all his operational code
  • /Examples/threads compares Python 3.7 threading vs multi-processing for i/o or cpu bound ops
  • /Examples/pycam motion detect, several streaming techniques
  • /Examples/classVars test if Python class variables are available across multiple processes (hint: no)
  • /Projects/LogLightValueAndPlot uses the picamera to sample room light levels over time and plot the logged levels
  • /Projects/raspberry_pi_sound_meter_and_plot uses a microphone to sample noise levels over time
    and plot the logged levels
  • /Projects/easyPiCamSensor Offers PiCam as four sensors: light intensity, color, motion, camera
  • /Projects/servoscan and scan360 perform distance sensor scans and plot to a terminal window. I never got around to doing a windowed plot for desktop use b/c most of my dev is remote shell terminal

and there is some unfinished, dead-end stuff to avoid all over the place.
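The classVars finding can be reproduced without the robot - a child process gets its own copy of the class, so its change never reaches the parent (RobotConfig is an illustrative name; the fork start method is assumed, i.e. Linux/Raspberry Pi OS):

```python
import multiprocessing as mp

class RobotConfig:
    speed = 300   # class variable in the parent process

def child_changes_speed():
    RobotConfig.speed = 150   # mutates only the child process's copy

ctx = mp.get_context("fork")  # fork keeps this runnable without a __main__ guard
p = ctx.Process(target=child_changes_speed)
p.start()
p.join()
print(RobotConfig.speed)      # 300 - the child's change is invisible here
```

This is exactly why a class-attribute “dirty flag” cannot coordinate separate programs: each program is its own process with its own copy of the class.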

Carl’s /systests contains test programs I have written to test every subsystem - processor, motors, encoders, sound, you name it, there is probably a test. The most useful:

  • motors/ allows trying various wheel diameters to find most accurate
  • motors/ allows trying various wheel base values for most accurate turns

I am sure glad I posted to this forum. I am learning all sorts of good things.
I have Raspberry Pi Cameras, IR cameras, and a couple of piCams to experiment with.
So little time, so many toys . . .
Thanks for the github address, there is a lot of interesting code and shell scripts to look through. I will keep you posted on my progress.


Just a quick note because it’s Thanksgiving for me today.
GoPiGo OS is now an open platform and allows you to connect to the normal Raspberry Pi desktop, where you can code with whatever languages and editors strike your fancy.

You start with a WiFi access point, but you can switch to your normal network and make that permanent if you prefer.


Happy Thanksgiving!