Got SLAM'd by SLAM-Toolbox

I’ve been playing with SLAM-Toolbox for months now. I did synchronous mapping and async mapping, passing my custom parameter files, with the most important change being that frame_id: base_frame needs to become frame_id: base_link to match the URDF that I built following the Hands-On ROS book.
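A minimal sketch of where that change lives (an illustrative async-mapping launch file, not the one from the book or my actual setup - the package, executable, and parameter names are slam_toolbox’s, the values are just examples):

```python
# Illustrative launch sketch: run slam_toolbox async mapping with the base frame
# pointed at base_link so it matches the URDF. Example values, not my exact file.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            parameters=[{
                'odom_frame': 'odom',
                'map_frame': 'map',
                'base_frame': 'base_link',  # the one change that matters for my URDF
            }],
        ),
    ])
```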

So naturally, after running Dave around the kitchen in async mapping mode and saving off my map, I created “my_loc_params.yaml” with my map name of “xyzzy” and the frame_id change, then ran

 ros2 launch slam_toolbox localization_launch.py 'slam_params_file:=./my_loc_params.yaml'

thinking that’s what I needed to run slam-toolbox in localization mode for GoPiGo3.
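Roughly (not my exact file, and written as an inline Python dict rather than YAML), the slam_toolbox parameters that matter for localization mode are these:

```python
# Roughly what a slam_toolbox localization parameter set needs (illustrative values,
# not my exact my_loc_params.yaml). map_file_name points at the map saved after mapping.
localization_params = {
    'mode': 'localization',    # run the pose graph in localization mode
    'map_file_name': 'xyzzy',  # the map saved from the async mapping run
    'base_frame': 'base_link', # same frame_id change as for mapping
    'odom_frame': 'odom',
    'map_frame': 'map',
}
```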

Something weird going on:

[localization_slam_toolbox_node-1] [WARN] [1700961733.572515908] [slam_toolbox]: Failed to compute odom pose
[localization_slam_toolbox_node-1] [WARN] [1700961733.630688930] [slam_toolbox]: Failed to compute odom pose
[localization_slam_toolbox_node-1] [WARN] [1700961733.745148974] [slam_toolbox]: Failed to compute odom pose
[localization_slam_toolbox_node-1] [WARN] [1700961733.852297938] [slam_toolbox]: Failed to compute odom pose

Searching the net for common causes pointed to a transform issue - not being able to find a transform from base_frame → odom.

Wait - I told it not to use base_frame - use base_link. Why didn’t it use my params file?

Wouldn’t you guess, slam-toolbox has eight launch files and one of them isn’t like the others… localization_launch.py is hard-coded to use the default params file. Ugh. I already got in trouble for trying to report an issue with navigation, so I’m not about to try again with slam-toolbox. (Update: I reported it - glutton for punishment, I guess.)
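For anyone chasing the same symptom, the effect is as if the localization node’s parameters were pinned to the packaged default file, so the slam_params_file argument never reaches it - something like this sketch (illustrative of the pattern, not the actual slam_toolbox source):

```python
# Sketch of the problem pattern (not the actual slam_toolbox source): the Node's
# parameters are hard-wired to the package's default YAML, so any
# slam_params_file:=... passed on the command line is silently ignored.
import os

from ament_index_python.packages import get_package_share_directory
from launch_ros.actions import Node

default_params = os.path.join(
    get_package_share_directory('slam_toolbox'),
    'config', 'mapper_params_localization.yaml')

localization_node = Node(
    package='slam_toolbox',
    executable='localization_slam_toolbox_node',
    name='slam_toolbox',
    output='screen',
    parameters=[default_params],  # hard-coded; my_loc_params.yaml never gets loaded
)
```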

I made localization_launch.py look like the others, and voilà! GoPiGo3 robot HumbleDave was able to figure out, to the millimeter, where he was in the kitchen map.
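The change that makes it “look like the others” amounts to declaring the slam_params_file argument and letting the Node read it - a sketch of that shape (not a verbatim diff of localization_launch.py):

```python
# Sketch of the fix (not a verbatim diff): declare slam_params_file with the packaged
# YAML as the default, then hand the LaunchConfiguration to the Node so a command-line
# override actually takes effect.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    default_params = os.path.join(
        get_package_share_directory('slam_toolbox'),
        'config', 'mapper_params_localization.yaml')

    declare_params = DeclareLaunchArgument(
        'slam_params_file',
        default_value=default_params,
        description='Full path to the slam_toolbox localization params file')

    localization_node = Node(
        package='slam_toolbox',
        executable='localization_slam_toolbox_node',
        name='slam_toolbox',
        output='screen',
        parameters=[LaunchConfiguration('slam_params_file')],  # honors the override
    )

    return LaunchDescription([declare_params, localization_node])
```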


Wow - very nice. Sounds like you’re really homing in on the ideal navigation stack for the GoPiGo3 (pun intended).
/K


The “home” actually turns out to be a problem. Every posted video shows lots of hallways connecting relatively clutter-free rectangular rooms with no shiny appliances, and no black UPSes, computer enclosures, trash cans, chairs, or table legs. My great-room-concept home seems to make the mapping and localization algorithms “uncertain” (uncomfortable?). It is a tiny home, with complexity added to make it feel larger than it is.

The other issue getting in the way is heading error in the odometry. Often the LIDAR cannot “see” far enough to find more than one wall, and the heading error builds as the GoPiGo3 crosses the room, so the map gets built with walls that are not parallel where they should be, or wall corners that are not at the correct angles. Reflex-angled corners in particular seem harder to map than obtuse corners, which in turn seem harder than right-angled corners.
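To put a number on how fast that builds up (my own back-of-envelope with assumed numbers, not measurements on Dave): even a couple of degrees of heading error over a single room crossing displaces a wall by a very visible amount.

```python
# Back-of-envelope (assumed numbers, not measurements): lateral displacement of a
# wall after crossing a room with a constant heading error in the odometry.
import math

room_crossing_m = 4.0  # assumed room width
for heading_error_deg in (1, 2, 5):
    offset_m = room_crossing_m * math.sin(math.radians(heading_error_deg))
    print(f"{heading_error_deg} deg heading error over {room_crossing_m} m "
          f"-> wall displaced ~{offset_m:.2f} m")
```

Several centimeters of smear per crossing is more than enough to pull walls away from parallel.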

Dave’s home room is our “office”, which lacks visibility to the walls in nearly every direction and has the most “black holes”, totally defeating the mapper - thus my occupation of the kitchen as Dave’s playground.

I do feel progress, but I resent having to crawl when my thoughts are racing to “use ROS”, and I would much rather be following than be forced to the GoPiGo3 frontier. (Probably the reason ROS is taught with simulations first and very little with real robots and real sensors in real rooms.)


Just figured out that Turtlebot4_navigation is Apache-licensed, which means that as long as I keep the attribution and copyrights, I can derive from it for the GoPiGo3.

Here is the first “GoPiGo3 Navigation” lifted lock, stock, and barrel from TurtleBot4 Navigation (with only a name change everywhere!):

Dave’s Home Room


Startup in Kitchen Playground


Completion of straight drive in Kitchen


And for comparison - completion of the straight drive in the kitchen using the former “slam-toolbox with my params” setup (including changing the resolution from 0.05 m to 0.01 m):

Interesting

UPDATE: The goodness is in my params - the badness (for the GoPiGo3) is in the TurtleBot4 params. Mine only updates the map every 5 seconds, and only after traveling a half meter, with a resolution of 1 cm. Theirs updates every half second with no minimum travel, at a resolution of 5 cm.

Mine must be allowing the localization to take its time to settle on a confident heading and position, while theirs is spewing out more frequent maps with low confidence in the heading and position from the GoPiGo3 wheel encoders, making their global map a mess. The TB4/Create3 has triple-fused odometry - wheel encoders, an IMU, and an optical-flow sensor pointed at the floor. Fusing all those gives well-behaved headings, while my lowly HumbleDave uses only wheel encoders to emit the odometry, which rapidly leaves Dave not knowing north from south, where he has been, or where he is heading.
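For reference, the handful of slam_toolbox parameters this comparison hinges on (the names are slam_toolbox’s; the values are just the ones described above, not copied verbatim from either file):

```python
# The comparison above as slam_toolbox parameter overrides. Values reflect the
# description in this post, not a verbatim copy of either parameter file.
humble_dave_params = {
    'map_update_interval': 5.0,      # seconds between map updates
    'minimum_travel_distance': 0.5,  # meters of travel before processing new scans
    'resolution': 0.01,              # 1 cm occupancy-grid cells
}

turtlebot4_style_params = {
    'map_update_interval': 0.5,      # update twice per second
    'minimum_travel_distance': 0.0,  # no minimum travel
    'resolution': 0.05,              # 5 cm occupancy-grid cells
}
```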


AND driving slower - I decided to retry mapping Dave’s home room with these params while driving slower - much improved:

(The path shows only the last 100 poses from the /odom topic.)


But once the boundaries are filled in, I have to stop mapping - it will not get better, only worse:

What is very interesting about the displayed path is the “magic jumps” visible after going straight for a little while. These jumps happen when the localization realizes the /odom topic has drifted and the robot is actually in a different spot on the map than the /odom topic is claiming. The localizer then starts adjusting the new /odom “estimated location” toward its own “better estimated location” once it detects the error has grown greater than some setting in the param file (I don’t know which of the two pages of settings controls that).
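One way to actually watch those jumps is to monitor the map → odom transform the localizer maintains - when it snaps to a new value, that is the correction being applied. A small sketch of a watcher node (mine, not part of slam_toolbox) that logs it once a second:

```python
# Sketch of a watcher node (not part of slam_toolbox): logs the map->odom transform.
# A sudden change in x/y here is the "magic jump" visible in the displayed path.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class OdomCorrectionWatcher(Node):
    def __init__(self):
        super().__init__('odom_correction_watcher')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.timer = self.create_timer(1.0, self.on_timer)

    def on_timer(self):
        try:
            # Transform of the odom frame expressed in the map frame
            t = self.tf_buffer.lookup_transform('map', 'odom', Time())
        except TransformException as ex:
            self.get_logger().warn(f'map->odom not available yet: {ex}')
            return
        tr = t.transform.translation
        self.get_logger().info(f'map->odom correction: x={tr.x:.3f} m  y={tr.y:.3f} m')


def main():
    rclpy.init()
    node = OdomCorrectionWatcher()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

(The tf2_echo tool in the tf2_ros package shows the same transform from the command line.)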

The GoPiGo3’s terrible heading uncertainty and the resultant x,y drift are clearly evident. Dave started on his dock, perpendicular to the wall on the right of the map, and ended on his dock perpendicular to the angled wall, displaced along the “y-axis”.
