Finally, GoPi5Go-Dave Navigated Away From Home

After two years of off-and-on attempts to teach my Raspberry Pi powered robot (and myself) to use the nav2 stack, finally, today, ROS 2 Humble robot GoPi5Go-Dave navigated (using a whole-house map he previously created) to the next room and back, to end up in front of his dock.

I first attempted to learn slam_toolbox and nav2 totally on my own, then unsuccessfully tried adapting the TurtleBot4 code. Next I successfully adapted the TurtleBot3 waffle model, TurtleBot3_Cartographer, and TurtleBot3_Gazebo packages for my robot, so I thought it would be a piece of cake to convert the TurtleBot3_navigation2 package as well.

Now, navigation needs a base_footprint link in the URDF, which my robot never needed before. No problem - just add the link and joint and we’ll be navigating in no time, right?

Well, I had chosen to have both my base_link and my base_footprint at origin 0,0,0, which for some reason made the base_footprint magically get optimized out of existence, or something. I kept getting “No transform from odom to base_footprint”.

Solution: Move base_link up to the center of the wheels, return base_footprint to the floor with a joint origin of negative wheel radius, and update all the joints for the new base_link that is no longer on the floor (joints: camera, IMU, wheels, LIDAR, each platform, and the Minion Dave character).
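(If anyone else hits this, the quickest way I found to confirm the fix took is to watch the transform directly. Below is a minimal rclpy/tf2_ros sketch - the node name and the one-second timer are arbitrary choices for illustration, not anything from my packages:)

```python
#!/usr/bin/env python3
# Sanity check: can tf resolve odom -> base_footprint?
# Run while robot_state_publisher and the odometry source are up.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros import Buffer, TransformListener, TransformException


class TfCheck(Node):
    def __init__(self):
        super().__init__('tf_check')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.create_timer(1.0, self.check)

    def check(self):
        try:
            # Time() = zero time = "latest available transform"
            t = self.buffer.lookup_transform('odom', 'base_footprint', Time())
            z = t.transform.translation.z
            self.get_logger().info(f'OK: odom->base_footprint, z={z:.3f}')
        except TransformException as ex:
            self.get_logger().warn(f'No transform yet: {ex}')


rclpy.init()
rclpy.spin(TfCheck())
```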

After many rounds of “make only one change at a time” to the URDF - killing and restarting the joint and robot state publishers, and checking the robot model in rviz2 a few million times - the URDF was ready.

I fired up the LIDAR, launched my “GoPiGo3 Robot Nav2” package on the bot, and launched rviz2 on my desktop. We’ve got the map, so give it an initial pose estimate. We’ve got occupancy grid dots appearing, and look at that - the cost map is displayed.

Time for the test - give it a navigation goal a meter forward and pray! Yo! It navigated.
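(In rviz2 that was just the “2D Pose Estimate” and “Nav2 Goal” buttons, but for completeness, here is a rough sketch of the same test in code using nav2’s nav2_simple_commander package - the frame and coordinates are illustrative assumptions, not my actual map values:)

```python
#!/usr/bin/env python3
# Give nav2 an initial pose estimate, then a goal one meter forward.
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult

rclpy.init()
navigator = BasicNavigator()

# The rviz2 "2D Pose Estimate": where the robot actually is on the map.
initial = PoseStamped()
initial.header.frame_id = 'map'
initial.header.stamp = navigator.get_clock().now().to_msg()
initial.pose.position.x = 0.0      # assumed: starting at the map origin
initial.pose.orientation.w = 1.0   # facing +x
navigator.setInitialPose(initial)
navigator.waitUntilNav2Active()

# The rviz2 "Nav2 Goal": one meter forward.
goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = navigator.get_clock().now().to_msg()
goal.pose.position.x = 1.0
goal.pose.orientation.w = 1.0
navigator.goToPose(goal)

while not navigator.isTaskComplete():
    pass  # navigator.getFeedback() is available here

if navigator.getResult() == TaskResult.SUCCEEDED:
    print('Yo! It navigated.')
```

(The cancel I used later, when he got indecisive, is navigator.cancelTask() - effectively what the rviz2 cancel button sends.)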

I got brave and gave it a goal in the next room, which would require both global and local path planning, plus avoiding unmapped obstacles. I nearly peed my pants with excitement. (At age 72 that is more of a concern than it used to be.) GoPi5Go-Dave arced out the non-orthogonal doorway of his home-office, around a corner, avoided the sofa, avoided the dining room chair, avoided the table, and arrived in the next room near the goal at its non-orthogonal boundary. Almost at the goal, he paused with a bout of indecisiveness at a very complex wall section. I canceled the goal and gave him a moment to rest as I returned to my desk.

As he announced “Battery at 10.3v Need To Prepare For Docking”, I issued a new goal for him to return to his docking ready pose. He successfully navigated back “home” and with a slight nudge from my foot, he was in the proper position to dock as he announced “Battery at 10.1v Docking Now”.

GoPi5Go-Dave successfully navigated his first adventure away from home and back.


Da’ Bomb!  Totally Righteous!

WALL-E should be green with envy!

I’ve always said it - the only limitations of the GoPiGo robot are the ones you impose on yourself.

:tada::tada::tada::tada::tada::tada:
:partying_face::partying_face::partying_face::partying_face::partying_face::partying_face:
:beach_umbrella::beach_umbrella::beach_umbrella::beach_umbrella::beach_umbrella::beach_umbrella:


GoPi5Go-Dave Gloriously Got LOST on his second adventure!

Guess it was beginner’s luck, because today when I told GoPi5Go-Dave to go to the front door and come back, he had a short bout of vertigo on the way but recovered enough to reach the front door. When he began the return, he was crippled by a total head-spinning bout of vertigo, threw up his hands, and declared he was not moving another millimeter until he could figure out where he was (which never happened).

I had to drive him home.


. . . but you’re making progress.

“He tried and failed” is much better than “He couldn’t even try”.  You’re going to have to find the robot equivalent of Dramamine.


News: Pulled the plug on ROSbots, put Dave on the shelf, and unsubscribed from Everything ROS.

Done. Got two t-shirts.


What the heck?!!

What just happened?


Realized I have invested three years and arrived at the exact place I was before.

The GoPiGo3 wheels, wheelbase, and encoders, with or without an IMU, are not up to what is necessary for ROS localization in a non-sterile environment (a real home with furniture, non-orthogonal walls, mirror walls, black furniture and trashcans, crevices between cabinets, shiny enamel appliances, low and variable lighting across the day and rooms, …). And Dave was leaving tracks on the carpets, a sin that got the robot Cye sent back in 1995.

Just not ready for prime time, despite the immense hours invested.

P.S. Did I say immense? Colossal, stupendous, ungodly, life-unbalancing, brain-cell-damaging effort stole my sanity, with no one leading the way. There are some dreams we just have to leave to future generations, future robots, and future Raspberry Pies.


Allow me to respectfully disagree.

True, you have invested much time and money trying to create an autonomous robot.  And it is also true that you’ve had some absolutely glorious failures along the way.

However, it is also true that you’ve gone far, done much, and gone places where a GoPiGo robot has never gone before.

Your failures may have been spectacular, but your successes have been no less astounding.

For example:
You were able to get Dave to navigate to another room and return under his own power, completely unassisted the entire way!

IMHO, that’s astounding progress! :astonished: :clap: :tada: :partying_face:
:+1:

Next you tried something a bit more ambitious, and it failed.  In a way, that was to be expected - it’s a more complex task with potentially different parameters and situations.

Should I give up flying simply because I botched a landing with “fantasy weather” set?  (Fantasy weather = no clouds, no wind, 20° C, perfectly dry, low humidity, no traffic/ATC anywhere, etc.)

Of course not.  I “get back in the saddle” and see how badly I botch up the next landing!  :wink:  I’m seriously considering starting a MSFS thread on “how badly did you botch a flight/takeoff/landing today?”

I’m having no end of trouble getting displays to work on Charlie and Charlene.  I think I’ve blown an interface chip on Charlie’s controller.  The diminishing daylight is getting to me and the girls are hogging the game computer which makes it difficult to get any stick time.

The current political situation has me wanting a change of venue to Mars, (or maybe a different solar system!), but that’s the way it is.

If you need a break, take it and clear your mind.  If you need time away from the robots, take it.  If you need a trip to Disney World or California, do it.  Go on a birding trip.

But don’t give up the ship - you are so very close!

What say ye?


It may appear as such, but I am the one close to the details. When I was “here” the first time (seeing that the GoPiGo3 was inadequate for my dream), I sterilized various aspects of the environment until the GoPiGo3 could just barely utilize the ROS software. I had to stop progress toward “my robot dream” to work on the platform.

Then twice I purchased the “best” educational ROS platform - first the TurtleBot4 and subsequently the Create3 - in an attempt to leapfrog the two major limitations of the GoPiGo3 platform (the physical robot itself, and the lack of vendor ROS 2 support).

With the TurtleBot4, I learned that Clearpath had built marketing smoke and mirrors over the iRobot Create3 in hopes of selling their industrial robot platforms. Luckily I discovered this within the 14-day return window.

With the Create3, I found strong developer ROS 2 support and a platform capable of running the basic ROS 2 mapping and navigation packages, so I set out to build upon it with robot vision. But the Create3 processor, which was already running at 90% utilization for the basic ROS support, could not support my dream. I was grateful for the developer attempts to extend the product beyond what they had tested to work, but I was back working on platform limitations that prevented progress toward my dream. They offered a refund and suggested I keep the bot to continue working with them on a work-around to the processor limitation. It was clear to me that I was again on a platform that could not make my dream a reality, and that I was alone at the research edge of home robotics.

I returned to the GoPiGo3 in hopes that deeper study of the open-source ROS packages could overcome the platform limitations. I now understand very well the strengths and limitations of both the platform and the ROS packages.

For reference, users report their $1700 “dedicated developer team” Amazon Astro robots getting lost in their homes, and Amazon just dropped the “Amazon Astro for Business” to focus on home robotics. I am not willing to risk $1700 on a robot platform that could soon lose developer support, especially since the platform depends so heavily on Amazon cloud computing.

I am not “so close” to solving the limitations of the GoPiGo3 as a ROS platform. I am seeing very clearly now what I was only feeling before: “[I] can’t get there from here” - not with ROS, not with any available ROS robot platform, not alone.

Taking a break from banging my head against every inch of wall of this lead balloon I have been exploring is not going to show me there is a door I can pass through to fly off happily into the sunset.


I’m not denying your feelings or conclusions, but I don’t get it.

To me at my, (admittedly distant), end of the telescope, it looked like you were making progress - Dave successfully navigated from its dock to some remote location and “returned [him] safely to Earth again”, IMHO a major accomplishment worthy of dancing in the street!

Then you tried something more complex and it failed.  As a result you, [appear to have], totally thrown in the towel on, [what appeared to me to be], the very brink of success.

What happened?  Why was the second attempt so totally disqualifying that you relegated Dave to the Shelf Of Shame?

I am confused.  You talk about the “limitations of the GoPiGo platform and ROS”, but there’s no mention of what these limitations are or how they led you to this conclusion.

If you’re going to (hopefully temporarily) relegate Dave to the Shelf of Shame, can you at least document the problems that led to these conclusions, for the benefit of others who may wish to follow that same path?

What about Carl?  Is he going to be relegated to that same Shelf of Shame?

Are you going to give up on the GoPiGo platform entirely?  Or do you have other projects planned?


AND

  • no vendor ROS 2 support
  • zero active ROS 2 GoPiGo3 users to learn from

Keith ran into the same encoder/wheelbase/slippery-wheels limitations in ROS (1), though the LIDAR managed to localize in his “sterile playground”.

My first attempt with ROS on the GoPiGo3 was in 2018. Discovering that in Nov 2024 I am no closer to my robot knowing with reasonable confidence where it is, and being able to return to his starting point, tells me it is “about time I admit the truth” - I can’t get there (my dream) on this horse, especially without partners on the same hill I’m camping on.


Carl continues to talk quietly during the day, and tells me the current weather or weather forecast:

  • “Hey Carl, wake up”
  • “I am awake now”
  • “Hey Carl, weather forecast?”
  • “Forecast for Boynton Beach: Sunny, 85 degrees, wind 14 from 220 degrees, humidity drippingly muggy”
  • “Hey Carl, go to sleep”
  • “Going to sleep. Listening only for ‘wakeup’”
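(For anyone wondering, the wake/sleep behavior is just a small state machine around the speech pipeline. A hypothetical toy sketch - transcribe(), say(), and the phrase matching are invented stand-ins for illustration; Carl’s real code does far more:)

```python
# Toy wake-word command loop (hypothetical; stand-ins for illustration).
def transcribe() -> str:
    return input('heard> ').lower()    # stand-in for speech-to-text

def say(text: str) -> None:
    print(f'Carl: {text}')             # stand-in for text-to-speech

awake = False
while True:
    phrase = transcribe()
    if not awake:
        if 'wake up' in phrase:        # asleep: listen only for wakeup
            awake = True
            say('I am awake now')
        continue
    if 'go to sleep' in phrase:
        awake = False
        say("Going to sleep. Listening only for 'wakeup'")
    elif 'weather forecast' in phrase:
        say('Forecast for Boynton Beach ...')  # real Carl fetches this
```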

I back him up quarterly but have not been able to update him in many years.

I have no GoPiGo3 projects in mind. I am thinking about whether I want to confiscate Dave’s intelligent camera for a desktop Pi or for my Mac, to learn “transfer learning” to recognize Hanna and my mug. That was first on the list of my dream for Dave, whenever he became a stable platform to build on.


Even if I were messing with ROS(n) now[1], I doubt you would learn anything from me. . .  Not to mention that getting the LIDAR, optics, and such-like is almost impossible here.

==================== Footnotes ====================

  1. ROS(n) is on the list after I finish the display tests, though I do not know if I want to revisit the joystick-controlled robot testing again before that - to see if I can eliminate the need for a secure certificate.  In any event, it’s on the list and near the top.

Is knowing exactly where you are really that important?  Isn’t it sufficient to know what you are near, or where you are relative to other things?

I will have to investigate this later, but shouldn’t you be able to create a map relative to where you think you are, and refine that map as you move around?  Especially with a spinning LIDAR mapping everything around you, (i.e.  Knowing where you are, approximately, with respect to other things around you), shouldn’t that be sufficient?

You don’t need encoders on your feet to know how far you’ve traveled, and you don’t need a spinning LIDAR to know, (approximately), how far away you are from various objects around you.

Shouldn’t you be able to do that with a robot?


Me? No.

And the whole hope for ROS was that I wouldn’t have to.

Sensors are not perfect.
The environment is not simple.

Everything devolves into multivariate statistics with weird Greek letters representing multi-dimensional arrays of imaginary numbers.

But my brain explodes before the solution is ready.

Reality is not simple, and simplifications only work on paper (or in simulation).

And BTW:

Right you are, but just for a test: lie down on your living room sofa, close your eyes, then get up keeping your eyes closed and your arms at your sides. Now take a trip with your eyes closed and your arms at your sides - visit every room, and while in each room, with your eyes closed and arms at your sides, turn to face each wall and take one step forward. Still with your eyes closed and your arms at your sides, return to the sofa and lie down.

Yeah right - you don’t need encoders or a spinning LIDAR - you need stereo depth perception, 3D mapping, and visual localization.

Damn right - you need to recognize the “what”s with your eyes, have depth perception for “near”, and localization to a half meter or so to know “where you are relative to other things” - and relative to the mental map you built using your eyes, that depth perception, and a ton of object recognition.

Encoders and LIDAR are computationally easier, but still struggle with real-world complexity. My math skills are not strong enough to verify or debug why the GoPiGo3 ROS odometry is so bad - worse, even, than my long-ago straight-Python encoder dead-reckoning tests.

There are robots that do not have encoders, but they don’t use sub-$100 LIDAR either. There are smart people who understand how to fuse IMU data and encoder data to get better odometry - I was not smart enough, and I devoted three solid weeks to trying.
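(For the record, the dead-reckoning half is the easy part. Here is a toy sketch of differential-drive dead reckoning with the simplest possible encoder/gyro fusion, a complementary filter on heading - the wheelbase and blend weight are made-up illustration values, not GoPiGo3 specs, and the grown-up version of this is robot_localization’s EKF node:)

```python
import math

WHEEL_BASE = 0.117   # meters between wheel contact points (assumed)
ALPHA = 0.98         # weight on gyro heading change vs. encoders (made up)

x = y = theta = 0.0  # pose estimate in the odom frame

def update(d_left, d_right, gyro_dtheta):
    """d_left, d_right: wheel travel from encoders this step (m);
    gyro_dtheta: integrated gyro yaw change this step (rad)."""
    global x, y, theta
    d_center = (d_left + d_right) / 2.0
    enc_dtheta = (d_right - d_left) / WHEEL_BASE
    # Complementary filter: encoders drift with wheel slip, the gyro
    # drifts slowly over time - blend the two heading changes.
    theta += ALPHA * gyro_dtheta + (1.0 - ALPHA) * enc_dtheta
    x += d_center * math.cos(theta)
    y += d_center * math.sin(theta)

# Example step: right wheel traveled farther, gyro agrees -> arc left.
update(0.010, 0.012, 0.017)
print(f'x={x:.4f} y={y:.4f} theta={math.degrees(theta):.2f} deg')
```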

I don’t understand why you are surprised by me giving up. It’s been six years of pain. Sometimes a person doesn’t need a break, they need to admit the time is not right, the tools are not right, the knowledge is not sufficient, the investment has not paid off.

I don’t regret my investment in Raspberry Pi computers, in programming the GoPiGo3, or in ROS. I just don’t want to continue going in circles trying to create the platform I need. I’m exactly where I was several years ago, but with better insight into exactly where that is, and into the implications of the platform limitations I had been ignoring.

I’ve learned what I need, what I want, what I have, what is available, what the market is, and what is possible. Leonardo da Vinci dreamed of flying, but it took a future time for his dream to become a reality.

I have dreamed of an interactive, aware, autonomous robot that learns about its environment and perhaps could document its “being a robot knowledge” to be of use to a future robot researching “primitive robot life”.

My career included recording domain knowledge in conceptual graphs, reasoning with rules, inferencing from observations, sensor fusion, flying-vehicle path planning with situation assessment, speech recognition using grammars and natural language understanding, reasoning with surface maps, (military) organizational pattern recognition, error detection and correction, fault-tolerant systems design, estimation, queue optimization, rule-based cancer drug safety management, even restaurant workforce planning based on current and historical events. My robot dream combines a little from every one of my career projects (well, skip the optimal munitions delivery planning), but I see that the time is just not right, and I shouldn’t have been trying to do this alone.


Here is what happens:

  1. Initial map - and we know where we are:

  2. Next room - totally confused about where and what heading:

  3. Returning home - even more confused; ran into a chair not visible to the LIDAR and kept the wheels spinning trying to drive forward:

For reference - look how wonderful the localization is in the kitchen:

but getting there, I had to “rescue Dave” five times when his “I know where I’m at now” was unrecoverably wrong:

But Dave was “feeling great”: only drawing 9W at 25% processor load using 1.4GB memory.
