GoPi5Go-Dave On The Shelf With Other “Reached Its Limit” Robots

WALL-E (the 2023 Create3-WaLi robot), GoPi5Go-Dave (built in 2018), and the RugWarriorPro robot (built in 2000)

GoPi5Go-Dave View Of Environment Is 45 Degrees Off

GoPi5Go-Dave Got Confused and Decided To Woo Some Chair Legs

Dave was following the planned global path well, but apparently the local planner decided he was too close to a “pop-up” chair leg. Perhaps an increased inflation value would work in this case, but it would cause more problems than it solves in other areas.
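For context on that tradeoff, here is a rough sketch of the exponential cost falloff the costmap_2d/Nav2 inflation layer applies around an obstacle (as I understand its documentation); the radii and scaling values below are made-up illustrations, not Dave’s actual configuration:

```python
# Sketch of the costmap_2d / Nav2 inflation-layer cost falloff.
# The numbers below are illustrative, not GoPi5Go-Dave's real parameters.
import math

LETHAL = 254       # cell contains an obstacle
INSCRIBED = 253    # cell closer than the robot's inscribed radius

def inflation_cost(distance_m, inscribed_radius=0.11,
                   cost_scaling_factor=3.0, inflation_radius=0.35):
    """Cost (0-254) of a cell 'distance_m' meters from the nearest obstacle."""
    if distance_m <= 0.0:
        return LETHAL
    if distance_m <= inscribed_radius:
        return INSCRIBED
    if distance_m >= inflation_radius:
        return 0
    # exponential decay used by the inflation layer
    return int((INSCRIBED - 1) *
               math.exp(-cost_scaling_factor * (distance_m - inscribed_radius)))

# A cell 40 cm from a chair leg:
print(inflation_cost(0.40, inflation_radius=0.35))   # 0 - outside the inflated band
print(inflation_cost(0.40, inflation_radius=0.55))   # ~105 - now penalized
```

The point being: raising inflation_radius does not make the chair leg itself any scarier, it just widens the band of penalized cells, which in a cluttered house also swallows doorways and table legs.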

GoPi5Go-Dave Cannot Accurately Return To His Dock

Seven Years Invested In GoPiGo3 Robots

I am done. Can’t do it anymore.

  • The odometry is just not accurate enough.
  • The PID control does not drive straight.
  • Turns are not accurate.
  • The tires are not “grippy”.
  • The encoder API has only whole-degree resolution.
  • The wheelbase is too narrow.
  • The vendor stopped maintaining drivers two years ago.
  • The vendor has no ROS 2 support.
  • No other ROS 2 GoPiGo3 robots to learn from.

As painful as it is to not have a “robot goal”, the dissonance between what I was expecting from my GoPiGo3 robot and Dave’s actual sensory abilities continues to become clearer.

One example - I was sure vision would help Dave avoid obstacles and “recognize where he is” (localization). I first purchased an Oak-D-Lite stereo depth camera for Dave, but was not yet at a point where I could use it.

Later, I attempted to use it with Create3-Wali but the additional ROS message traffic kept crashing the Create3. From this experience, I determined that the limited field of view (FOV) of the Oak-D-Lite was not optimal and purchased a wide-angle FOV camera - the Oak-D-W.

Yesterday, while reading an article about the Amazon Astro’s superior home navigation technology, I realized that even my wide-FOV stereo camera is not sufficient for navigation in my home. Notice the dots in this Astro robot visualization:

Those dots come from an IR illuminator. That feature would have raised the cost of my $350 stereo depth camera another $150, so I chose not to buy it. Bad choice - because the lighting in a home is so variable, safe and reliable home navigation needs reliable 3D vision.

There is a reason the Amazon Astro costs $1600: it has the best home navigation and the best human user interface available in a commercially available robot. But with no user API, it is reduced to being strictly a home security robot.


…which emphasizes the importance of the KISS rule:

Reduce a problem to simple, manageable chunks.

Every time a major project of mine crashed and burned it was because I “bit off more than I could chew” and tried to do too much too quickly.

As Einstein said:
“Make things as simple as possible, but no simpler.”

Maybe you should take the tools and knowledge you’ve gained and explore the limits of what IS possible.  Once you’ve carefully defined the current limitations, then you can - slowly - try to exceed them, one tiny step at a time.

And remember, some of the most important inventions and discoveries were accidental while looking for something else.

  • The guy at DuPont who was looking for a new refrigerant and accidentally invented Teflon.
  • The guy at Corning Glass who accidentally over-annealed a glass bowl and discovered the glass-ceramic Corning-ware.
  • Dr. Lee DeForest was doing research into flame physics and propagation by studying the ion-exchange within a flame, (using a heated piece of wire in a vacuum), trying to improve furnace combustion efficiency and accidentally invented the amplifying vacuum tube.

What say ye?

May get there one day.

Dave “ran” a 1k with me “driving” - it would be an advance for Dave to be able to “sidewalk lane follow” for 1k.

With Create3-Wali, I managed to get RTABmap to produce a 3D point cloud a couple times before the Create3 crashed. It would be interesting to understand what Dave is actually able to “see” with the wide camera and a “won’t die” Dave node.

Dave is a “16-tick encoder” GoPiGo3, which means his encoders actually tick off 5.33 ticks per degree of wheel rotation, even though the API only reports whole-degree resolution. It may be that Dave could have better heading odometry if I coded the ros2_gopigo3_node to use the raw encoder counts instead of the GoPiGo3 API. The wheel slip would still be there, and the not driving straight, and …
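For what it’s worth, here is a minimal sketch of what raw-tick heading odometry might look like. The wheel dimensions are approximate GoPiGo3 values, and it assumes the raw tick counts can be read from somewhere below the whole-degree API - which is exactly the part I have not written:

```python
# Sketch: heading change computed from raw encoder ticks instead of whole degrees.
# Wheel dimensions below are approximate GoPiGo3 values, used only for illustration.
import math

TICKS_PER_DEGREE = 1920.0 / 360.0   # 16-tick encoder * 120:1 gearing = 5.33 ticks/deg
WHEEL_DIAMETER_M = 0.0665           # approximate GoPiGo3 wheel diameter
WHEEL_BASE_M = 0.117                # approximate GoPiGo3 wheel separation

def ticks_to_meters(ticks: float) -> float:
    """Convert raw encoder ticks to wheel travel in meters."""
    degrees = ticks / TICKS_PER_DEGREE
    return (degrees / 360.0) * math.pi * WHEEL_DIAMETER_M

def heading_change(d_left_ticks: float, d_right_ticks: float) -> float:
    """Differential-drive heading change (radians) from per-wheel tick deltas."""
    d_left = ticks_to_meters(d_left_ticks)
    d_right = ticks_to_meters(d_right_ticks)
    return (d_right - d_left) / WHEEL_BASE_M

# One raw tick of difference between the wheels:
print(math.degrees(heading_change(0, 1)))                  # ~0.05 deg heading resolution
# versus one whole degree of wheel rotation (5.33 ticks lumped together):
print(math.degrees(heading_change(0, TICKS_PER_DEGREE)))   # ~0.28 deg
```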

There are some robots that do not have wheel encoders at all. It would be interesting to know whether Dave would navigate more reliably without the encoder odometry feeding into the localization - pure LIDAR, or, since Dave has an MPU9250 IMU that I never built a publisher node for, perhaps LIDAR plus IMU would work better than LIDAR plus encoder odometry.
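If I ever do build it, the publisher node itself is small. Here is a minimal sketch, assuming a hypothetical read_mpu9250() driver call in place of the real I2C reads (and ignoring calibration, covariance tuning, and the magnetometer):

```python
# Minimal sketch of the MPU9250 publisher node Dave never got.
# read_mpu9250() is a placeholder - a real I2C driver read would go there.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu

def read_mpu9250():
    """Placeholder for a real MPU9250 driver read. Returns (m/s^2, rad/s) tuples."""
    return (0.0, 0.0, 9.81), (0.0, 0.0, 0.0)

class ImuPublisher(Node):
    def __init__(self):
        super().__init__('mpu9250_imu')
        self.pub = self.create_publisher(Imu, 'imu/data_raw', 10)
        self.timer = self.create_timer(0.02, self.publish_imu)   # 50 Hz

    def publish_imu(self):
        accel, gyro = read_mpu9250()
        msg = Imu()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'imu_link'
        msg.linear_acceleration.x, msg.linear_acceleration.y, msg.linear_acceleration.z = accel
        msg.angular_velocity.x, msg.angular_velocity.y, msg.angular_velocity.z = gyro
        msg.orientation_covariance[0] = -1.0   # convention: no orientation estimate
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(ImuPublisher())

if __name__ == '__main__':
    main()
```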

The last two are “platform limitation workarounds” which I just cannot stomach anymore, especially after seeing how wonderful the Create3 odometry was. (And it could find its dock and dock itself, had bumpers, and and and … if only it had not died when I fired up RTABmap, and now, perhaps, if only the iRobot company weren’t sinking.)

It just feels so “for what?” in light of reading what a team of PhDs did for Astro:

This is what I had in mind for Dave to be able to do:

but in truth, I don’t understand the vocabulary, so it is unlikely I could have taught Dave “where to find objects”, let alone learned how to extend the object recognition model as needed. I’m just not able to “go where no man has gone before” alone. (Interesting that they were using a TurtleBot3 for that investigation - it also has far superior odometry, and their “home” was very sterile compared with my real, nasty home.)

Then there is the whole field of machine learning (the Donkey Car approach), where I would drive Dave to the kitchen and back to his dock twenty times, and Dave would build a model from the encoders and drive commands so he could drive himself to the kitchen and back. And then add a “drive around the house” self-learned model from me driving him around the house twenty times. It works for Donkey Cars with a couple of ultrasonic sensors, I think.
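As a toy illustration of that record-and-replay idea (not the actual Donkey Car pipeline, which trains a neural network), a nearest-neighbor lookup over recorded (pose, command) pairs captures the flavor:

```python
# Sketch of the record-then-clone idea: log (pose, drive command) pairs while
# tele-operating, then replay the command recorded at the closest visited pose.
# A toy stand-in for whatever model a real behavior-cloning pipeline trains.
import numpy as np

class DriveCloner:
    def __init__(self):
        self.poses = []      # [x, y, heading] samples recorded while I drive
        self.commands = []   # [linear, angular] command active at that pose

    def record(self, pose, command):
        self.poses.append(pose)
        self.commands.append(command)

    def predict(self, pose):
        """Return the command recorded at the closest visited pose."""
        poses = np.array(self.poses)
        dists = np.linalg.norm(poses - np.array(pose), axis=1)
        return self.commands[int(np.argmin(dists))]

# Recording one short "to the kitchen" run (made-up numbers):
cloner = DriveCloner()
cloner.record([0.0, 0.0, 0.0], [0.2, 0.0])    # straight out of the dock
cloner.record([1.0, 0.0, 0.0], [0.1, 0.5])    # slow down and turn at the hallway
cloner.record([1.5, 0.5, 1.2], [0.2, 0.0])    # straight into the kitchen

# Later, wherever Dave thinks he is, look up what I did there:
print(cloner.predict([0.9, 0.05, 0.1]))       # -> [0.1, 0.5]
```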