How Much ROS Can Raspberry Pi 3B+ on GoPiGo3 Support

This is one of the most interesting characterizations I plan to pin down with my ROSbot Dave.

I am hoping that Dave will support:

Realtime Actions:

  • Native fusion of odometry and IMU data for localization
  • Path-plan execution with localized obstacle-avoidance replanning against the safe box or reference map
    (bag locations, distance sensor, optionally bagged LIDAR)


Non-Realtime Activities:

  • Autonomous Path Plan generation (List of Goals?) for:
    • Wander in room/named_area “safe box”
    • Plan to Return to starting point
    • Navigate to center of particular “room/named_area” or room/named_area “safe box”
  • Periodic Stationary localization “check” from a brief LIDAR scan
    against a static reference map
  • Develop room/named_area “safe box” from recent wander bags
  • Post-excursion map creation from LIDAR scan bag
  • Lower boundary confidences in reference map
    from differences between post-excursion map and reference map



  • Do you think the activities listed will be possible on the Raspberry Pi 3B+
    (with the non-aspirated GoPiGo3)?
  • Does ROS have non-realtime “path plan” generation, separate from plan generation at goal-execution time?
    If so, what is the concept called?
  • Does ROS have a concept of a “Goal List”?

I think you mean “normally aspirated” as opposed to nitrous (NOS) injection or turbo/supercharging.

As much as you don’t like the Pi-4, that’s one of the reasons I bought it: it allows me to do local development using the Visual Studio Code local server talking to my laptop without crushing the device’s performance.

Trying that on a Pi-3 was painful.

You may want to try some things on a Pi-4 to ease the development burden, and then try on the Pi-3 and tweak for power performance.

But that’s me. You do you.


I was never into muscle or sport cars - right - “naturally aspirated” as opposed to forced air (50mA) or “super charged” (Pi4 with liquid cooler):


Holy Christmas!

I can smell the nitromethane fumes all the way to here!

Not 'fer 'nuthin, but when you start putting active refrigeration technology, or liquid nitrogen cooling, on a Pi of any flavor, it’s time to:

  1. Have the IRS check the sources of your income.
  2. Get your head examined. (Are you really dead-set on finding the next Mersenne prime/mining Bitcoins on your Pi?)
  3. Get a system that is “slightly” more capable than that. (Anyone got Cray’s phone number?)

$19 on Amazon with over 500 (89%) five-star ratings (but GPIO connector is blocked)

But to really impress the robot club, better outfit your GoPiGo3 with this:

Boy did I take this thread OT.


And I promise not to beat you up for it.

It is fun and interesting, and that’s good enough in my book.



That’s for WIMPS!

You want liquid nitrogen as a minimum!!


That’s hilarious.

I used to work in a lab that had huge cylinders of liquid nitrogen. Definitely fun to play with after hours.



It would be worth trying. I’m just not sure how well it will handle the ongoing localization and planning. I’m guessing it won’t be able to handle those too well on its own, at least not in a reasonable time frame. It might be able to generate a global plan based on a static map, but I don’t think it would be able to keep up with local planning as the robot moves.

And as far as I understand, using a bag file is still essentially real time, just deferred. One thing you might be able to do is record the LIDAR readings to a bag file, and then generate a map file later on a PC.

I’m trying to think how you’d generate a “safe box” - I guess you could put make-believe walls on the map to define boundaries. But I’m not sure how that would mess with localization. There are probably other ways I just don’t know about.
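One way to sketch the make-believe-walls idea: ROS maps are occupancy grids (0 = free, 100 = occupied, -1 = unknown), so a “safe box” could be drawn as a rectangle of artificially occupied cells. This is a hedged, ROS-free sketch on a plain list-of-lists; a real nav_msgs/OccupancyGrid arrives flattened into a 1-D array, so indexing would need its width/height metadata.

```python
def add_safe_box(grid, x0, y0, x1, y1, occupied=100):
    """Mark the boundary of a rectangle as occupied cells ("fake walls")
    in a ROS-style occupancy grid (0 = free, 100 = occupied, -1 = unknown).
    Coordinates are grid cell indices, inclusive."""
    for x in range(x0, x1 + 1):
        grid[y0][x] = occupied   # top wall
        grid[y1][x] = occupied   # bottom wall
    for y in range(y0, y1 + 1):
        grid[y][x0] = occupied   # left wall
        grid[y][x1] = occupied   # right wall
    return grid

# Example: a 10x10 all-free grid with a safe box from cell (2,2) to (7,7)
grid = [[0] * 10 for _ in range(10)]
add_safe_box(grid, 2, 2, 7, 7)
```

Whether AMCL would tolerate walls that the LIDAR never sees is exactly the open question above; an alternative is to enforce the box in the planner’s costmap rather than in the localization map.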

move_base does have a service called /make_plan that you can call - it will generate the path but not start moving the robot. That’s the closest thing I know of, and you do have to have move_base up and running (which in turn requires the global and local planners).
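As a sketch of what that call looks like from Python in ROS 1: the service is of type nav_msgs/GetPlan and by default lives at /move_base/make_plan. This assumes a running roscore, a loaded map, and move_base already up; the start/goal coordinates below are made-up values for illustration.

```python
#!/usr/bin/env python
# Sketch: ask move_base for a path without commanding the robot to move.
# Requires a live ROS system; will block at wait_for_service otherwise.
import rospy
from nav_msgs.srv import GetPlan
from geometry_msgs.msg import PoseStamped

rospy.init_node("plan_preview")
rospy.wait_for_service("/move_base/make_plan")
make_plan = rospy.ServiceProxy("/move_base/make_plan", GetPlan)

def stamped(x, y):
    """Build a PoseStamped in the map frame, facing +x (identity quaternion)."""
    p = PoseStamped()
    p.header.frame_id = "map"
    p.pose.position.x = x
    p.pose.position.y = y
    p.pose.orientation.w = 1.0
    return p

# tolerance: how close (in meters) the plan's endpoint may be to the goal
resp = make_plan(start=stamped(0.0, 0.0), goal=stamped(2.0, 1.5), tolerance=0.25)
rospy.loginfo("Planned path with %d poses", len(resp.plan.poses))
```

The response is a nav_msgs/Path, so you could generate and inspect candidate plans while the robot is stationary, which fits the non-realtime path-plan-generation idea from the original question.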

Not that I know of. What I’ve done is keep the goals in a Python data structure of some type, and then send them one by one (as poses) to the move_base action server. Once one action is completed I send the next goal. I want to figure out how to save the goals in a YAML file (or rather, how to read in a YAML file that has the goal poses), but that’s a future project.
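For what it’s worth, the goal-list-from-YAML idea can be sketched roughly like this (ROS 1). The YAML layout (a top-level `goals:` list of `{x, y, yaw}` entries) is my own assumption, not a ROS convention, and this needs a live ROS system with move_base running.

```python
#!/usr/bin/env python
# Sketch: read goal poses from a YAML file and send them one at a time
# to the move_base action server, waiting for each to finish.
import yaml
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler

def load_goals(path):
    # Assumed layout:  goals: [{x: 1.0, y: 0.5, yaw: 1.57}, ...]
    with open(path) as f:
        return yaml.safe_load(f)["goals"]

rospy.init_node("goal_list_runner")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

for g in load_goals("goals.yaml"):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = g["x"]
    goal.target_pose.pose.position.y = g["y"]
    q = quaternion_from_euler(0.0, 0.0, g.get("yaw", 0.0))
    (goal.target_pose.pose.orientation.x,
     goal.target_pose.pose.orientation.y,
     goal.target_pose.pose.orientation.z,
     goal.target_pose.pose.orientation.w) = q
    client.send_goal(goal)
    client.wait_for_result()  # block until this goal finishes before sending the next
```

This is exactly the “send them one by one, then send the next once the action completes” pattern described above, just with the poses coming from a file instead of a hard-coded structure.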