[Stale] Testers Wanted: "v3 image" ROS2 for GoPiGo3

PLEASE TEST (No ROS knowledge needed):

Any GoPiGo3 will work - no sensors required

New image link and modified instructions

  • Removed the existing netplan from the image and re-shrunk it
  • Added an Ethernet configuration to the "edit network-config" step
  • Requires three boots before the first login attempt

Follow the new instructions at: GitHub - Install ROS2 For GoPiGo3 From Image


Great - will try to test it tomorrow night.


Don’t know if you will need an “emergency stop”, but I just did, so I added one to my ros2ws/


#!/usr/bin/env python3

# FILE: stop.py
# PURPOSE: stop a runaway bot

# import the GoPiGo3 drivers
import easygopigo3 as easy

# Create an instance of the EasyGoPiGo3 class.
# egpg will be the GoPiGo3 object.
egpg = easy.EasyGoPiGo3()
egpg.stop()  # halt both motors immediately
print("stop.py: EasyGoPiGo3 commanded with stop()")

Make it executable:
$ chmod +x stop.py

To use it:
$ ./stop.py


Well, didn’t get to it. Was one of those days. 3D printer went haywire - have to troubleshoot that later. Wanted to see what was wrong with Finmark, but the SD card with Finmark seems totally dead. Tried another SD to burn with the backed-up image - also totally dead (? something happened during the move). A third SD was able to burn the image and boot, but the system is acting strangely. Commands not taking - the system just hangs. CTRL-C gets me back to the command prompt, but then the next command just hangs. And the keyboard mapping is wrong. No idea why.

Off to bed - FIRST LEGO League judging tomorrow, and trip to visit friends Sunday. May be next week before I get back to it.


Decided to take a page from @cyclicalobsessive’s playbook (with a nod to @jimrh who had asked in an earlier thread about GoPiGo OS). To be more deliberate in testing I downloaded GoPiGo OS and installed it. Everything seems to work fine. So my Raspberry Pi and GoPiGo board seem OK as far as hardware goes. That’s a relief.

Re-downloaded the 10/27 image - will burn it to the card I am now sure is good. But testing will have to wait until tomorrow night.



I downloaded and burned the image last night.
Tonight I tested it on Finmark (Raspberry Pi 3B+ - stock GoPiGo3 model). I wasn’t looking at the instructions, and I had a keyboard and monitor attached. First boot took forever (and more time after logging in). Second boot was reasonably fast (just over a minute). Ros2hh showed up on my network, and I was able to ssh in.

On the machine itself I ran the two test scripts.

$ python3 /home/pi/Dexter/GoPiGo3/Software/Python/Examples/Read_Info.py  

yielded this:

$ python3 /home/pi/Dexter/GoPiGo3/Software/Python/Examples/Motor_Turn.py  

Motors turned 3 times as expected. Seems like it might have been a bit slow, but I didn’t use a stopwatch.

All I have time for tonight, but off to a good start.

Thanks @cyclicalobsessive for the ROS2 image.


Perhaps try deleting /home/pi/Dexter/gpg3_config.json. It should be created again with the correct info for your bot.

I don’t think I deleted it before the shrink.

You can also run
python3 /home/ubuntu/systests/gpg_config/print_gpg3_config.py
to check it, or of course
more /home/pi/Dexter/gpg3_config.json

I just checked v4 - sure enough my script was deleting:


but the filename is:


so it did not get deleted.

(Dave is a “new GoPiGo3” with 16-ticks-per-rev encoders, whereas yours probably has the 6-ticks-per-rev ones. I think that mismatch is what is making your bot slow.)

Thanks for testing.


OK - I’ll check.
Will also try the new version once it’s available. I’ll probably go ahead and use the Raspberry Pi 4 I have as well at that point, and I need to get ROS2 running on a laptop (although I may try WSL again; I don’t think I’ll run it natively on Windows since I’m really not as familiar with those command-line tools).


Advanced user you.

I am still running ROS2 Galactic on Ubuntu 20.04 (Focal) Desktop to visualize the ROS2 Humble GoPiGo3 with rviz2 and rqt.

The only Humble feature I miss on the Galactic desktop is the --once option to ros2 topic echo --flow-style /xyzzy. It echoes only one topic message and then quits. For the 30Hz topics, the --once option is very convenient.
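For anyone stuck on Galactic, the same one-shot behavior is easy to reproduce inside a node by dropping the subscription after the first callback. Here is a toy sketch of that pattern in plain Python - the Bus class is a stand-in I made up for the middleware, not a real ROS2 API; in an actual rclpy node you would use the node’s create_subscription and destroy_subscription:

```python
# Toy stand-in for "ros2 topic echo --once": take one message, then unsubscribe.

class Bus:
    """Minimal pub/sub bus standing in for the ROS2 middleware (made up)."""
    def __init__(self):
        self.subs = {}  # topic name -> list of callbacks

    def subscribe(self, topic, cb):
        self.subs.setdefault(topic, []).append(cb)

    def unsubscribe(self, topic, cb):
        self.subs[topic].remove(cb)

    def publish(self, topic, msg):
        # Copy the list so a callback may unsubscribe itself mid-delivery.
        for cb in list(self.subs.get(topic, [])):
            cb(msg)

def echo_once(bus, topic, out):
    """Capture exactly one message from topic, then drop the subscription."""
    def cb(msg):
        out.append(msg)
        bus.unsubscribe(topic, cb)  # one-shot: stop listening immediately
    bus.subscribe(topic, cb)
```

For a 30Hz topic this means the node sees one sample and then stops burning CPU on messages it doesn’t need.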

I find it really ironic that the 2nd ros.org “Beginner: CLI Tutorial” requires a desktop ROS2 to run the turtlesim and rqt. So much of ROS education is targeted to having the desktop tools available and using simulation. I was very antsy going through the tutorials to “play with my robot.” I don’t care about simulating a robot when a real one is sitting on the floor staring back at me - but then I have always been in a hurry.

I intend to create an “Installing ROS2 Desktop for the ROS2 GoPiGo3” guide eventually, to allow using the ROS2 GoPiGo3 to learn rviz2 and rqt (without having to learn about the whole TurtleBot simulation thing).

Simulation is a big part of learning and using ROS, but I am more interested in ROSification of the GoPiGo3 than in learning about ROS. I have always wanted a self-contained autonomous robot, whereas a main concept of ROS is to enable distributed processing.


Sign me up when that happens!

Now I am puzzled.

I thought the whole idea of ROS/2 was to have a working robot that could do useful things, as opposed to having a robot tied to an external system.  IMHO, having a robot that REQUIRES an external system is no better than flying a tethered airplane in endless circles. [1]

Maybe that’s one of the reasons I’ve not been in a hurry to jump on the ROS bandwagon - not only do you need some hellaciously expensive robot, (which is what we’re trying to disprove here :+1:), only to find that you need to perform The Twelve Labors of Hercules [2] to get ROS running on it.  And then they expect you to tie up potentially valuable/expensive desktop resources so that the 'bot has the wherewithal to pick its own nose?

I can do that in Bloxter on a plain vanilla GoPiGo OS install without a separate system, (using a keyboard, mouse, and monitor), or remotely using anything with a browser!

The idea of running my 'bot around using an inexpensive tablet running Android has its own special je ne sais quoi, (if you don’t need fancy, Android tablets can be had for a song nowadays), and you don’t need a peta-hertz processor, massive amounts of RAM, an insane video card, etc., to get things running.

Sigh. . . . I just don’t get it. :thinking:

[1]  By comparison, I consider my joystick controlled “Remote Camera Robot” more like that same plane running R/C, instead of spinning around in mindless circles tied to a tether.

[2]  If you prefer something more interesting, (and with a little bit of irreverent French humor thrown in), there’s The Twelve Tasks of Asterix, a cute, full-length animated film about how Gaul, (France), thumbs its nose at Imperial Rome.

Here’s hoping this cheers you up a bit while you’re not well.  You are in my prayers.


Not hardly - certainly not compared to you.

It is - and I’ve actually liked it. I haven’t done any ROS2 tutorials so that’s also on the agenda as I migrate platforms. The nice thing about an online tutorial with simulation is that you know if things aren’t working it’s not because of an issue with the hardware platform - it’s something you did. I think I’ve mentioned that I’ve done tutorials on The Construct and have learned a lot.

ROS (and ROS2) don’t require it per se. ROS is more resource intensive than something like Bloxter, but also more capable. So it depends on what you want to do.


That’s OK - you don’t have to. (I don’t get “cat people”, but that’s a different discussion).


Discovered Asterix and Obelix when I was in college studying in Europe. Haven’t thought about them in a long time.


It’s easy.

“Cat” people are just like “dog” people, only different.  :wink:

Actually, I’m cool with both.

Snakes, birds, goats, horses, pigs, etc. - some of the most decent people I know are animals.