A Robot's Life Is Complicated

I ran across this “Choose between LIDAR and Stereo Camera Robot Navigation” post today, which touches on a subject I have thought a lot about since getting my GoPiGo3 robot over three years ago.

I started trying to create an “occupancy grid” (map) for my GoPiGo3 using the DI Distance Sensor mounted on a servo. The math was oppressive, and I got mired in frame-transform mud trying to use someone else’s ultrasonic-sensor ROS code in the GoPiGo3 “frame”.
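The polar-to-grid part of that math can be sketched in a few lines. This is a hypothetical, simplified illustration: the grid size, cell size, and sample readings are all made up, and it skips the frame-transform mud entirely by assuming the robot sits fixed at the grid center.

```python
import math

GRID_SIZE = 21           # 21 x 21 cells (illustrative choice)
CELL_CM = 10             # each cell covers 10 cm (illustrative choice)
ORIGIN = GRID_SIZE // 2  # robot assumed fixed at the center cell

def mark_occupied(grid, angle_deg, distance_cm):
    """Convert one (servo angle, distance) reading into a grid cell."""
    rad = math.radians(angle_deg)
    x = distance_cm * math.cos(rad)   # cm, in the robot's own frame
    y = distance_cm * math.sin(rad)
    col = ORIGIN + int(round(x / CELL_CM))
    row = ORIGIN - int(round(y / CELL_CM))   # rows grow downward
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = 1

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
# Made-up servo sweep: (angle in degrees, measured distance in cm)
for angle, dist in [(0, 50), (45, 30), (90, 80), (135, 30), (180, 50)]:
    mark_occupied(grid, angle, dist)
```

The real work, of course, is what this sketch dodges: transforming those cells into a world frame as the robot moves, which is exactly where the mud starts.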

Another coding/learning excursion for my bot was to use the PiCamera as a light, motion, and color sensor, and also to use OpenCV for lane following and object detection (knowing the difference between my wife’s green Crocs and the green LEDs on Carl’s dock).
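The Crocs-vs-LEDs problem boils down to: both threshold as “green”, so you still have to reason about the blob you found. A hypothetical toy sketch of that idea, using a synthetic NumPy frame in place of a real camera image and made-up thresholds (the actual code would use OpenCV on PiCamera frames):

```python
import numpy as np

def green_mask(rgb):
    """Boolean mask of pixels that are dominantly green (toy thresholds)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g > 100) & (g > r + 40) & (g > b + 40)

def classify(mask, led_max_pixels=50):
    """Crude size test: a tiny green blob is LED-like, a big one Croc-like."""
    n = int(mask.sum())
    if n == 0:
        return "no green"
    return "LED-sized" if n <= led_max_pixels else "Croc-sized"

frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:43, 40:43] = (20, 200, 20)        # 3x3 bright-green dot, like a dock LED
print(classify(green_mask(frame)))          # prints "LED-sized"
```

In practice you would also use hue range, brightness, and blob shape, but blob size alone already separates a pinpoint LED from a shoe filling half the frame.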

One of the attractions of ROS for me has been the idea that someone already figured out the math and created a software-integrated-circuit component that I could “simply” connect to with a few wires / lines in my programs and voila, my GoPiGo3 would know where it is. (Understanding those “simple wires” is almost as complex as understanding the underlying math.)

For my robot(s), I often vacillate between wanting the simplicity of a bug’s life “responding” to stimulus without needing to be fully aware of the world, and wanting my bots to move up the function hierarchy toward “observing”…“remembering”…“thinking” … “understanding” … “learning.”

One of the basic principles of ROS is to allow decentralizing the computation, so many ROS robots are quite lifeless without WiFi communication with relatively supercharged off-board processing. I want “all that and more” to occur autonomously on 20-25Wh batteries, with a “must be on-board” RPi3B(+).

In parallel with folks writing more modules for ROS, there are folks packaging enormously complex code, a graphics processor (GPU), and multiple cameras together to bring a “camera with depth” sensor that can actually be used for on-board vision processing: the “discontinued or not?” Intel RealSense product, the Zed line, and one I expect in December, the Oak-D-Lite.

All this is to say: “A Robot’s Life Is Complicated.”


Just my two centavos here:

If you can get goal #1 solid, you will have accomplished much.

Goal #2 is a challenge for a @thomascoyle11859 class nitro-burning turbo-bot, let alone a Pi-3.

Even though this might count as heresy, you may wish to consider a second-generation Pi-4 if you want to keep it all on-board.  Or maybe even a Jetson Nano.



Stack two Pis and connect them via a short length of Ethernet crossover cable so they can communicate with each other, and use WiFi for off-bot communication.

This way one Pi can handle the sensors and the second can do the heavy lifting.
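A minimal sketch of the networking side of that idea. Everything here is an assumption for illustration: the interface name (`eth0`), the subnet, and the role assignment are made up, and a real setup would make the addresses persistent (e.g. via `dhcpcd.conf`) rather than setting them by hand.

```shell
# On the sensor Pi: give the crossover link a static address.
sudo ip addr add 192.168.2.1/24 dev eth0

# On the compute ("heavy lifting") Pi:
sudo ip addr add 192.168.2.2/24 dev eth0

# From either side, verify the bots-only link:
ping -c 3 192.168.2.1
```

WiFi (`wlan0`) stays untouched for off-bot communication, so ROS traffic between the two Pis never competes with the wireless link.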

Of course, this will really increase your workload…


Did just that this week, and read an article that said that to fix the supply-chain problem we all need to stop buying non-essential stuff.

I resurrected a “believed-to-be-bad” Pi3B this time, but was quite ready to push submit on a fully aspirated Pi4.


Using that brilliant piece of wisdom, we should buy peanut butter, bread, milk, toilet paper and nothing else.

If the Pi-4 is in stock, go for it.

“believed-to-be-bad” would scare me as a dev device.


The toilet paper may be a problem but as long as the bidet doesn’t break we’re going to be okay.


It has passed every test I’ve put to it, but if my memory isn’t totally askew the problem was that the SPI bus was bad - it didn’t work for RugWarriorPi or GoPiGo3, so I declared it bad. I’ve loaded all cores to 79 °C, run SSH over WiFi for better than 10 days with no drops, run SD card tests, and run ROS2 programs over 64-bit Ubuntu Server for hours as well. BTW, running the Create3 ROS examples only loads the Pi3B at 10% of one core!

A Pi4 would be a real waste, except for making powering the “Create3 Control Processor” much simpler: USB-C to USB-C cable and done.