GoPiGo3 ROS2 Robot Dave Wanted To Run A 5K

After Dave saw “Cassie” run a 5K, he asked for a remote controller so he could start training for his own 5K.

There are several ROS joystick controller nodes to choose from. The most basic simply translates /joy messages into /cmd_vel translation and rotation messages. For some time I tried to understand a more complicated version that allows assigning any ROS command to buttons, but yesterday I decided to just configure the simple teleop_twist_joy to drive ROS2bot Dave by remote control.

ROS allows distributing nodes anywhere on a network, but if Dave wants me to take him for a 5K attempt outdoors, beyond his home network, the joystick has to be connected directly to the robot.

Most ROS robots use Xbox or PS3/PS4 controllers, which cost $40 to $50, but finding a USB Wireless SNES Gamepad for $14 seemed like a perfect mate for Dave. If you need to train your DonkeyCar to go around sharp curves, continuously variable direction and speed are a must, but Dave can make do with coarser commands like “walk: translate at 0.1 m/s”, “run: translate at 0.2 m/s”, and “turn: rotate at 1 rad/s”, which the SNES Gamepad can command easily.
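
For anyone who wants to try the same setup, a launch file along these lines should wire the gamepad to /cmd_vel with exactly those three speeds. This is a minimal sketch using the standard ROS2 joy and teleop_twist_joy packages; the button and axis indices are assumptions for a generic USB SNES gamepad - verify yours with `ros2 topic echo /joy`:

```python
# Minimal sketch: wire joy_node to teleop_twist_joy for SNES gamepad teleop.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # Reads the gamepad and publishes sensor_msgs/Joy on /joy
        Node(package='joy', executable='joy_node', name='joy_node'),
        # Translates /joy into geometry_msgs/Twist on /cmd_vel
        Node(
            package='teleop_twist_joy',
            executable='teleop_node',
            name='teleop_twist_joy_node',
            parameters=[{
                'axis_linear.x': 1,            # assumed: d-pad up/down
                'scale_linear.x': 0.1,         # "walk": 0.1 m/s
                'scale_linear_turbo.x': 0.2,   # "run": 0.2 m/s
                'axis_angular.yaw': 0,         # assumed: d-pad left/right
                'scale_angular.yaw': 1.0,      # "turn": 1 rad/s
                'enable_button': 4,            # assumed: left shoulder
                'enable_turbo_button': 5,      # assumed: right shoulder
            }],
        ),
    ])
```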

Yesterday I took Dave for his first 5K training runs. I don’t think he’s got it in him, and I found out my thumbs don’t either. My thumb started aching after the first two or three minutes holding the “up” SNES gamepad button, and about ten minutes in Dave seemed to tire and started ignoring my “TURBO ENABLED” requests. After only one “run” and one “walk” through all the rooms of our home (roughly 150 meters), Dave started begging for “juice”.

2 Likes

You need to implement Newton’s First Law:
“A command remains in effect until countermanded or changed.”
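
A minimal sketch of that law as a ROS2 node: latch the most recent command and keep republishing it until it is countermanded. The /joy_cmd_vel topic name is made up here - you would remap teleop’s output to it. One wrinkle: teleop_twist_joy publishes a zero twist when the enable button is released, so a real version would ignore that auto-zero and bind “stop” to its own button.

```python
# Sketch, not a drop-in node: keeps the last command in effect
# until a new one arrives.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class CmdVelLatch(Node):
    def __init__(self):
        super().__init__('cmd_vel_latch')
        self.last_cmd = Twist()  # zero twist: "stay stopped" at startup
        # /joy_cmd_vel is an assumed name; remap teleop's output to it.
        self.create_subscription(Twist, '/joy_cmd_vel', self.on_cmd, 10)
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        # Republish the latched command at 10 Hz.
        self.create_timer(0.1, lambda: self.pub.publish(self.last_cmd))

    def on_cmd(self, msg):
        # Every new command (including an explicit stop) replaces the latch.
        self.last_cmd = msg

def main():
    rclpy.init()
    rclpy.spin(CmdVelLatch())

if __name__ == '__main__':
    main()
```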

1 Like

Only if the robot has a complete set of “Asimov’s Laws” implemented to protect itself. Dave isn’t that smart (yet?).

2 Likes

That’s what bumpers and distance sensors are for. :wink:

P.S.

Great video.

I especially liked the “ROS” views of what the robot was doing interspersed with the live views.

2 Likes

Did you catch the mystery that ROS Rviz revealed?

Update: (ROS Rviz not ROS rqt - got my tools scattered all over the place)

Today:

[screenshot]

Yesterday:

[screenshot]

(I didn’t expect the DI Distance Sensor to be able to see through walls, and today the distance sensor display stopped updating until I restarted Rviz2. I confirmed the data is still being published, so Rviz2 is getting confused after a while.)
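
To rule Rviz2 in or out when that happens, a few lines of rclpy can watch the messages directly. The topic name here is an assumption - substitute whatever the distance sensor node actually publishes:

```python
# Minimal check that Range readings keep arriving, independent of Rviz2.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Range

class RangeWatch(Node):
    def __init__(self):
        super().__init__('range_watch')
        # '/distance_sensor/distance' is an assumed topic name.
        self.create_subscription(
            Range, '/distance_sensor/distance',
            lambda m: self.get_logger().info(f'range: {m.range:.3f} m'), 10)

rclpy.init()
rclpy.spin(RangeWatch())
```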

2 Likes

Very cool. I haven’t tried showing distance sensor data with RVIZ.

Maybe if you found a nice 5K straight downhill slope (would be even better if Dave had regenerative braking to keep his batteries topped up).
/K

2 Likes

This is South Florida - the largest slope Dave encounters is the 1/2-inch step up and back down between the living room rug and the tile floor. TurtleBot3 specs a 10 mm climbing limit, so GoPiGo3 Dave is proving the better bot in our home.

Dave squeaks like an old farmhouse water pump even when rolling on the flats. I’m not sure if all the “new GoPiGo3” motors are squeakers, but Dave must be using half his energy just in the squeaks.

Today I took him twice around the house - 130 meters in 29 minutes - squealing like a piglet the whole way. I have to work my thumbs (and patience) up to the challenge of seeing just how far he can go, with no rest stops, until the safety shutdown happens.

Someone needs to 3D print a protective undercarriage for the GoPiGo3 motors to seal out the dirt and metal flakes that rolling on the sidewalk will pick up. I’d love to see how far around the block he could go.

2 Likes

Yes for sure, except it is very enlightening to see what Dave is and isn’t “seeing” with the LIDAR and the distance sensor in ROS2 Rviz2. The very black, very large UPS right next to Dave’s “home” is completely invisible, as is the black trash can under my desk.

Neither the LIDAR nor the DI Distance Sensor sees the black UPS:

[screenshot]

I need to flesh out my “virtual bumpers from motor status” idea.
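
A first sketch of what that might look like, using the gopigo3 library’s get_motor_status(). The stall test (reported wheel speed far below commanded speed) and both thresholds are assumptions to tune on the real robot:

```python
# Sketch: treat a sustained motor stall as a "virtual bumper" hit.
import time
import gopigo3

GPG = gopigo3.GoPiGo3()
STALL_RATIO = 0.5   # actual speed below 50% of commanded looks like a stall
STALL_SECS = 0.5    # ...if it persists this long

def stalled(port, commanded_dps):
    # get_motor_status returns [flags, power, encoder, dps]
    _, _, _, dps = GPG.get_motor_status(port)
    return abs(dps) < STALL_RATIO * abs(commanded_dps)

commanded = 180  # degrees per second
GPG.set_motor_dps(GPG.MOTOR_LEFT + GPG.MOTOR_RIGHT, commanded)
time.sleep(0.5)  # let the motors spin up before watching for stalls
stall_since = None
try:
    while True:
        if stalled(GPG.MOTOR_LEFT, commanded) or stalled(GPG.MOTOR_RIGHT, commanded):
            stall_since = stall_since or time.time()
            if time.time() - stall_since > STALL_SECS:
                print('virtual bumper: stall detected - stopping')
                break
        else:
            stall_since = None
        time.sleep(0.05)
finally:
    GPG.reset_all()  # stop motors and release resources
```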

(There might be something to the “old tech” ultrasonic sensors that served for obstacle detection before laser diodes and time of flight measurement of a few photons became the “new tech.” Also makes me worry about all those autonomous vehicles sprouting LIDARs from every corner. Folks in black cars might want to put some big reflective X stickers on all sides.)

2 Likes

What is “ROS rqt”? And other than displaying a very rough approximation of Dave’s surroundings, I don’t know what other mysteries are there. Maybe that’s why they’re mysteries?

Perhaps a drop or two of light machine oil on the bearings at both ends would help?

A small syringe and needle makes an excellent pin oiler.

Is something rubbing?

2 Likes

I had wondered about that too, having noticed that the lidar doesn’t “see” black. Of course it’s possible that the top coat is reflective at the right wavelength. But that wouldn’t help the car with a cheap Maaco paint job…

/K

2 Likes

The mystery now is how I could use the ROS2 tool Rviz2 so often and still refer to it as “rqt” here and in the video. There is a ROS tool called “rqt” for topic (message) graph visualization - what publishes what, and what subscribes to what. The ROS Rviz and ROS2 Rviz2 tools are for robot state and data visualization.

The mystery revealed by looking at the Rviz2 visualization of what Dave was “seeing” was that the black UPS was totally invisible.

Except in this case, the over-reliance on a single technology across multiple distance sensors, and the lack of bumpers, are endangering Dave’s peaceful co-existence with his environment.

On a LIDAR-equipped robot, the VL53L0X-based (laser time-of-flight) sensor is a dangerous redundancy. I need to swap Dave’s servo-mounted $30 DI Distance Sensor for a GoPiGo3-compatible “$4 Grove Ultrasonic Sensor” version of the not-compatible HC-SR04 ultrasonic sensors I happen to already have in my junk box, to improve Dave’s obstacle detection abilities. (And write a ROS2-on-GoPiGo3 compatible ultrasonic sensor node, and change my URDF, and, and, and.)
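
The node itself shouldn’t be much more than this sketch, assuming easygopigo3’s init_ultrasonic_sensor() with the Grove sensor on port AD1 (both worth double-checking). It publishes sensor_msgs/Range so Rviz2 can display it just like the DI sensor:

```python
# Sketch of a ROS2 ultrasonic ranger node for the GoPiGo3.
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Range
from easygopigo3 import EasyGoPiGo3

class UltrasonicRanger(Node):
    def __init__(self):
        super().__init__('ultrasonic_ranger')
        self.sensor = EasyGoPiGo3().init_ultrasonic_sensor(port='AD1')
        self.pub = self.create_publisher(Range, 'ultrasonic/range', 10)
        self.create_timer(0.1, self.poll)  # poll at 10 Hz

    def poll(self):
        msg = Range()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'ultrasonic_frame'  # must match the URDF link
        msg.radiation_type = Range.ULTRASOUND
        msg.field_of_view = math.radians(15.0)  # rough guess at beam width
        msg.min_range = 0.02                    # meters, rough Grove spec
        msg.max_range = 4.0
        msg.range = self.sensor.read_mm() / 1000.0  # mm -> meters
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(UltrasonicRanger())

if __name__ == '__main__':
    main()
```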

This visualization of the invisible exposes the biggest concern I had when abandoning my RugWarriorPi effort to begin creating a GoPiGo3 based robot - the GoPiGo3 has no skirt or bumper.

I thought that the PiCamera would provide the obstacle detection of the missing bumper, but I never got to that competency in my OpenCV learning. I remember touting how the GoPiGo3 didn’t need a lot of hardware sensors because it could use the PiCamera for everything. I even created an EasyPiCamSensor for:

  • color detection
  • stereo light intensity measurement
  • brightest light position measurement
  • motion detection (see the sketch after this list)
  • small image capture (to feed future image processing sensor streams)
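
The sketch promised above - none of this is EasyPiCamSensor’s actual API, just a generic OpenCV frame-differencing version of the motion-detection idea:

```python
# Generic motion detection by frame differencing -- not EasyPiCamSensor code.
import cv2

cap = cv2.VideoCapture(0)  # Pi camera exposed through V4L2
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)
    # Count pixels that changed noticeably since the last frame.
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 0.01 * mask.size:  # >1% of pixels changed
        print('motion detected')
    prev = gray
```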

Big ideas with tiny follow through.

So I ran off to learn about ROS instead of digging in on something useful. Bummer.

P.S. Even though Dave is not an outdoor bot, ultrasonic sensing may work better outdoors than light-based sensors.

2 Likes

Some might think that ROS is useful :slight_smile:

I think you’ve accomplished a ton. As for the Pi camera and OpenCV - image processing is computationally intensive. You might well be able to run some trained models on the Pi while also running some control software of choice, but I’m not sure how much you can train models on the Raspberry Pi.

I need to finish the last three chapters of the Hands-On ROS book - they touch on this. I’ve been stalled at navigation/localization.

/K

1 Like

The Dexter Industries TensorFlow Lite object-classification demo I tried on Carl ran quite capably, and object detection should be much easier than classification or recognition. I remember a pre-AI-era robot that used a horizontal line in the image to detect when any blob crossed from above the line to below it, indicating a nearing obstacle. That kind of “impending doom” detection may require less image processing than TF Lite, or, if implemented as a TF Lite model, might be able to handle 10-20 frames per second on the RPi3B/3B+ of Carl/Dave.
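
That horizontal-line trick is simple enough to sketch in a few lines of OpenCV. The line position and the darkness threshold are assumptions that would need tuning for the camera tilt and floor color:

```python
# Sketch of "impending doom" detection: flag any dark blob whose
# bottom edge drops below a fixed horizontal line in the image.
import cv2

DOOM_LINE = 0.75      # line at 75% of image height (assumed camera tilt)
MIN_BLOB_AREA = 500   # pixels; ignore specks

def obstacle_near(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Dark regions against a lighter floor become candidate blobs.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    doom_row = int(frame_bgr.shape[0] * DOOM_LINE)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if cv2.contourArea(c) > MIN_BLOB_AREA and y + h > doom_row:
            return True  # blob crossed below the line: obstacle nearing
    return False
```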

Ouch. I should have qualified that as “immediately useful to my bot.”

2 Likes

I sure understand you there - I stalled for a couple of weeks out of frustration with the teleop_joy node’s lack of documentation or tutorials. Switching to teleop_twist_joy was enough of a success to make me eager to move on to localization, mapping, and navigation.

I’m not sure if I should continue working through the Hands-On ROS book for that, or start following the TurtleBot3 tutorials (available for ROS and ROS2) for localization, mapping, and navigation - especially since I am not interested (at the moment) in learning to use ROS simulation. I have the ROS2 robot up and running now, and my ROS2 desktop environment working somewhat. I want to give it some smarts - even if it is a “two-headed monster” for the moment.

2 Likes

Yeah - I’m with you. For some reason I thought the last few chapters were more about using the actual Pi camera for image recognition. Now that I look in more detail, it really is more simulation stuff. Not that simulation doesn’t have its place, but I want to play with my actual robot.

/K

2 Likes