GoPiGo3, TortoiseBot and The Quest For Better Odometry

My ROS explorations with the GoPiGo3 have been flavored by a feeling that the whole-degree, encoder-based odometry has been holding Dave back.

The Construct (the ROS education platform) has started using the RigBetel TortoiseBot, an Indian robotics platform based on the Raspberry Pi 4 that does not include encoders or an IMU!

The kit without a Pi, which runs $186, appears “cheesy” to my eyes, so I am a bit surprised at The Construct choosing to build upon it.

Their claim of “relational reasoning to give precise odometry” sounds like hype, but I have been suggesting for years that a robot knowing its location precisely becomes less important when the robot has usable vision capabilities.

Vision is hot; fusing encoders and IMU is not. Indeed, they are combining precise LIDAR distance measurements of the environment with RTABmap vSLAM - exactly my target for Dave in ROS2 Humble (they are “original” ROS Noetic only).
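
For reference, here is roughly what that combination could look like as a ROS 2 Humble launch sketch - RTABmap subscribing to both an RGB-D camera and the LIDAR scan. The topic names and frame_id are placeholders rather than Dave's actual topics, and the parameters should be double-checked against the rtabmap_ros documentation:

```python
# Illustrative ROS 2 Humble launch sketch: RTABmap subscribing to an RGB-D
# camera and a 2D LIDAR scan. Topic names and frame_id are placeholders for
# whatever the robot actually publishes, and an odometry source (e.g. visual
# or ICP odometry) is assumed to be launched separately. Check the rtabmap_ros
# documentation for the full parameter list.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_slam',
            executable='rtabmap',
            output='screen',
            parameters=[{
                'frame_id': 'base_link',   # robot base frame (assumed name)
                'subscribe_depth': True,   # use the RGB-D camera
                'subscribe_scan': True,    # fuse the 2D LIDAR scan as well
                'approx_sync': True,       # camera and LIDAR are not hw-synced
            }],
            remappings=[
                ('rgb/image', '/camera/color/image_raw'),          # assumed topic
                ('rgb/camera_info', '/camera/color/camera_info'),  # assumed topic
                ('depth/image', '/camera/depth/image_raw'),        # assumed topic
                ('scan', '/scan'),                                 # LIDAR topic
            ],
        ),
    ])
```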

I think you’re right. Encoders will always be problematic due to inevitable slippage. Inexpensive IMUs probably just don’t have sufficient accuracy. With LIDAR and good vision you should be able to get very reliable position data.
/K

Punt the LIDAR.

How many people have built-in spinning lasers on their heads?  People know their way around by vision alone.

Though the software for it is probably totally gnarly and God doesn’t have a GitHub repo with the firmware listings!

True - but people also have incredibly sophisticated video-processing firmware, with capacity well beyond that of the cameras with embedded processing that @cyclicalobsessive will be using - beyond even a laptop, if he offloads processing to one. At least in the short run, autonomous robots will need additional input to navigate the world. Elon has been pushing vision-only navigation - can't say it's worked out well for Tesla.
/K

And perhaps also because GoPi5Go-Dave does not have wheel-alignment screws, I have to add a bias to make Dave drive straight. The ROS node then sees the biased encoders as heading changes. If I let Dave do the driving, he gets the heading estimate within 3 degrees. If I drive Dave (with the bias applied), his heading estimates are off by +/- 15 degrees.
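
As a rough sketch of why that happens (with made-up numbers, not Dave's measured wheelbase or bias): standard differential-drive odometry turns any difference in left/right wheel travel directly into heading.

```python
# Sketch of why a drive-straight bias shows up as heading drift in standard
# differential-drive odometry: d_theta = (d_right - d_left) / wheelbase.
# The wheelbase and the 1% bias below are illustrative, not Dave's real numbers.
import math

WHEEL_BASE_M = 0.117   # nominal GoPiGo3 wheelbase (~117 mm)


def heading_change(d_left_m, d_right_m, wheel_base_m=WHEEL_BASE_M):
    """Heading change (radians) that odometry infers from wheel travel."""
    return (d_right_m - d_left_m) / wheel_base_m


# The robot actually drives 1.0 m straight, but a +1% bias makes the right
# wheel report 1.01 m of travel.
apparent = heading_change(1.00, 1.01)
print(f"apparent heading change: {math.degrees(apparent):.1f} deg per metre")
# -> about 4.9 degrees of phantom rotation per metre from a 1% bias
```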

The Create3 had an optical mouse sensor looking at the floor to fuse with more precise encoders, twice the wheelbase, a well-tuned IMU, and a team of smart engineers dedicated specifically to the odometry function.

The GoPiGo3 API reduces the finer raw encoder ticks to whole-degree precision. Even if I devoted my life to rewriting the GoPiGo3 encoder API, slippage would still be haunting Dave. (I actually added a get_raw_encoder() method to my Pi5 C++ EasyGoPiGo3 API, but abandoned that project when I got Python working.)
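
Some back-of-the-envelope numbers for what whole-degree precision means, using the nominal GoPiGo3 constants (66.5 mm wheels, 117 mm wheelbase - check gopigo3.py for the values your board actually uses):

```python
# What whole-degree encoder precision means in millimetres and in heading.
# Constants are the nominal GoPiGo3 values (check gopigo3.py / easygopigo3.py
# for your board): 66.5 mm wheel diameter, 117 mm wheelbase.
import math

WHEEL_DIAMETER_MM = 66.5
WHEEL_BASE_MM = 117.0

wheel_circumference_mm = math.pi * WHEEL_DIAMETER_MM
mm_per_degree = wheel_circumference_mm / 360.0   # smallest reported wheel travel

# One whole-degree rounding step on a single wheel, seen by differential-drive
# odometry as a heading change:
heading_step_deg = math.degrees(mm_per_degree / WHEEL_BASE_MM)

print(f"1 encoder degree is about {mm_per_degree:.2f} mm of wheel travel")
print(f"a 1-degree rounding step on one wheel looks like about "
      f"{heading_step_deg:.2f} deg of heading change")
```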

Using vision, LIDAR, and the IMU will probably allow me to turn off the encoder-based odometry.
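
One possible shape for that (a sketch only - robot_localization is just one option, the topic names are guesses rather than Dave's actual topics, and the *_config vectors would need tuning): an EKF that fuses visual odometry and the IMU and never sees the wheel encoders.

```python
# Illustrative ROS 2 launch sketch: a robot_localization EKF fusing visual
# odometry (e.g. from RTABmap) with the IMU, with no wheel-encoder input.
# Topic and frame names are assumptions; the *_config boolean lists cover
# (x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az).
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            output='screen',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,            # robot stays on the floor plane
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # Visual odometry instead of wheel-encoder odometry:
                'odom0': '/rtabmap/odom',      # assumed topic name
                'odom0_config': [True, True, False,
                                 False, False, True,
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
                # IMU supplies yaw rate:
                'imu0': '/imu/data',           # assumed topic name
                'imu0_config': [False, False, False,
                                False, False, False,
                                False, False, False,
                                False, False, True,
                                False, False, False],
            }],
        ),
    ])
```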

Damn!

Another project idea for my list!
