GoPi5Go-Dave Shines With Turtlebot3 Cartographer

Last year I spent a month of days and nights learning to install, configure, and run the slam_toolbox package (the SLAM package recommended for Nav2), widely esteemed as the most accurate mapping and localization package yet. This is how slam_toolbox maps Dave’s “office” at 0.05 meter resolution:

Last week I spent one hour learning how to run turtlebot3_cartographer, and then one solid day creating the GoPiGo3 version called gpg3_cartographer. This is how gpg3_cartographer maps Dave’s “office” at 0.05 meter resolution:

Turtlebot3 Cartographer sure makes GoPi5Go-Dave shine!


Glad to hear it!

(So, which of the smudges is the cat? :rofl:)

What really impresses me in this result:

  1. It managed to “see” the black UPS, the black flightsim computer, and the black filing cabinet well enough to close the area. This is the LIDAR return with the bot standing in front of the dock:

  2. The red odometry line shows how bad my ros2_gopigo3_driver is at estimating the bot’s position, yet by combining that odometry with the LIDAR data, Cartographer localized the robot correctly in the room after the first 2 meters of travel.
  3. I did not have to tune or change anything, period! I just ran ros2 launch turtlebot3_cartographer cartographer.launch.py and it used my GoPiGo3 robot description (dave.urdf), my LIDAR (a YDLIDAR X4, not the Turtlebot3’s RPLIDAR), my lousy GoPiGo3 odometry, and didn’t complain there was no IMU data. (A sketch of what that launch boils down to follows below.)
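
For the curious, here is roughly what that launch reduces to; a minimal sketch, assuming a gpg3_cartographer package whose config directory holds the .lua tuning file copied from turtlebot3_cartographer (the package name and the gpg3_lds_2d.lua basename are placeholders of mine; the real Turtlebot3 launch also starts rviz and handles use_sim_time):

    # gpg3_cartographer launch sketch (package and .lua names are placeholders)
    import os
    from ament_index_python.packages import get_package_share_directory
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        config_dir = os.path.join(
            get_package_share_directory('gpg3_cartographer'), 'config')
        return LaunchDescription([
            # consumes /scan and /odom, publishes the map->odom transform
            Node(package='cartographer_ros',
                 executable='cartographer_node',
                 output='screen',
                 arguments=['-configuration_directory', config_dir,
                            '-configuration_basename', 'gpg3_lds_2d.lua']),
            # converts Cartographer submaps to a nav_msgs/OccupancyGrid
            # at the 0.05 m resolution shown in the maps above
            Node(package='cartographer_ros',
                 executable='cartographer_occupancy_grid_node',
                 output='screen',
                 arguments=['-resolution', '0.05']),
        ])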

I found the cat!


Luckily, Dave doesn’t get that “juiced”, but I do think I read someone built a ROS cat detector.


Doesn’t the latest GoPiGo O/S have some TensorFlow object recognition stuff in there?  I bet that would work as a “cat detector” and I’m sure you could cobble up something with Dave and/or Carl.  (Didn’t you have Carl doing lane finding a while back?  I wonder if Carl’s getting lonely. . .)


The original map looks like something you could put up in a museum alongside Van Gogh or Pollock.  (Sometimes I think they were “sauteing” some “mushrooms” for dinner.)

Maybe you don’t have self-aware robots yet, but they’re doing quite well as impressionistic artists!

That’s the sweetest part of all. . . .
Fiddler on the Roof

Like your scripts, it’s sweet when things come together like they’re supposed to.

Once I get some traction on the display stuff, I’m going to dedicate an SD card to trying out your downloadable ROS build, just to see what happens.


Yes - with the PiCamera

Yes - it detects cats at roughly 1 frame per second - (sometimes even dreams it is seeing cats)

Yes, Create3-WaLI was detecting cats with the Oak-D camera at 30fps. Dave will be able to reuse the Create3-WaLI code without a change.

Carl has not indicated he is jealous of my attention to his little brother. I do try to interact with Carl quite often, to get the weather forecast and of course to check his battery health. Carl has an annoying habit of trying to talk to the TV, so I often have to yell to him from the living room, “Hey Carl, go to sleep!”


Goal met!  He’s “interacting with his surroundings”!

Now you can take that off your bucket-list. . .

Sweet!  Nice when you DON’T have to re-invent the wheel for a change.


It also recognizes bottles and wine glasses, so I thought about teaching Dave to guard my wine collection … but I’m not planning on restocking after I finish this 592nd bottle (in 20 years). It’s an aged, terroir-select, heavy-tannin, oaked Cab (worth guarding) that I selected to be my last.

==== YOLOv4 ====

ros2 launch depthai_examples yolov4_publisher.launch.py camera_model:=OAK-D-LITE spatial_camera:=false

       "labels":
        [
            0 "person",
            1 "bicycle",
            2 "car",
            3 "motorbike",
            4 "aeroplane",
            5 "bus",
            6 "train",
            7 "truck",
            8 "boat",
            9 "traffic light",
            10 "fire hydrant",
            11 "stop sign",
            12 "parking meter",
            13 "bench",
            14 "bird",
            15 "cat",                     <<<------- CAT
            16 "dog",
            17 "horse",
            18 "sheep",
            19 "cow",
            20 "elephant",
            21 "bear",
            22 "zebra",
            23 "giraffe",
            24 "backpack",
            25 "umbrella",
            26 "handbag",
            27 "tie",
            28 "suitcase",
            29 "frisbee",
            30 "skis",
            31 "snowboard",
            32 "sports ball",
            33 "kite",
            34 "baseball bat",
            35 "baseball glove",
            36 "skateboard",
            37 "surfboard",
            38 "tennis racket",
            39 "bottle",                      <<<--- BOTTLE
            40 "wine glass",              <<<--- WINE GLASS
            41 "cup",
            42 "fork",
            43 "knife",
            44 "spoon",
            45 "bowl",
            46 "banana",
            47 "apple",
            48 "sandwich",
            49 "orange",
            50 "broccoli",
            51 "carrot",
            52 "hot dog",
            53 "pizza",
            54 "donut",
            55 "cake",
            56 "chair",
            57 "sofa",
            58 "pottedplant",
            59 "bed",
            60 "diningtable",
            61 "toilet",
            62 "tvmonitor",
            63 "laptop",
            64 "mouse",
            65 "remote",                  
            66 "keyboard",
            67 "cell phone",          <<<--- CELL PHONE
            68 "microwave",
            69 "oven",
            70 "toaster",
            71 "sink",
            72 "refrigerator",
            73 "book",
            74 "clock",
            75 "vase",
            76 "scissors",
            77 "teddy bear",
            78 "hair drier",
            79 "toothbrush"
        ]

Maybe I should teach him to answer “Hey Dave, Where’s my cell phone?”
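
If I do, the watcher side might look something like this minimal sketch. The topic name (color/yolov4_detections) and the vision_msgs field layout are assumptions from memory; verify with ros2 topic list and ros2 interface show vision_msgs/msg/Detection2DArray on your distro (Foxy puts id/score directly on the result, Humble nests them under hypothesis):

    # "Hey Dave, where's my cell phone?" watcher sketch
    # (topic name and message layout are assumptions - verify on your system)
    import rclpy
    from rclpy.node import Node
    from vision_msgs.msg import Detection2DArray

    # class ids from the label table above
    WATCH = {'15': 'cat', '39': 'bottle', '40': 'wine glass', '67': 'cell phone'}

    class Watcher(Node):
        def __init__(self):
            super().__init__('yolo_watcher')
            self.sub = self.create_subscription(
                Detection2DArray, 'color/yolov4_detections',
                self.on_detections, 10)

        def on_detections(self, msg):
            for det in msg.detections:
                for result in det.results:
                    # Humble-era vision_msgs; on Foxy use result.id / result.score
                    if result.hypothesis.class_id in WATCH and \
                       result.hypothesis.score > 0.5:
                        self.get_logger().info(
                            'I see a ' + WATCH[result.hypothesis.class_id])

    def main():
        rclpy.init()
        rclpy.spin(Watcher())

    if __name__ == '__main__':
        main()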


Here are the tile areas of my house mapped with “gpg3_cartographer” (lifted from turtlebot3_cartographer):

It is surprising to me how well it is doing considering it is ignoring the encoder odometry.

Here are the official floorplan map and the “with furniture” cartographer map:


Cool beanies!

Looks like your idea to (ahem!) “borrow” some of the Turtlebot code for Dave was a good idea.


What parts of this, if any, can you teach Carl?

Four years ago, I went down the path of trying to figure out how to do this for Carl with very limited success for the effort invested:


. . .and absent a LIDAR (et al.), it’s a black hole with a wide event horizon sucking up time, eh?


It was a good introduction for me to what is going on under the covers in ROS. I really believe starting with a non-ROS robot lays a good foundation; then, once the basics are left to ROS, the computing can be spent on using the robot and sensors for higher-level functionality.

Many ROS learners seem to skip:

  • understand the robot’s SBC
  • understand the robot controller, motors, encoders, battery, joysticks, and motion

and rush into:

  • run a tutorial to make a map (without understanding how the LIDAR data becomes an occupancy grid)
  • start a node that localizes the robot within the map (without understanding how the LIDAR data is matched to the map)
  • run a tutorial for navigation from pose to pose
    without understanding obstacle avoidance, global and local path planning and replanning,
    managing acceleration, cost maps, and 50 other configurable parts in the ROS 2 navigation package

and then claim they know ROS but they:

  • never wrote a program that “interprets” the LIDAR data
    (find a corner, find a wall, find a door, recognize an object based on scans of it from different angles; a minimal example follows this list)
  • never wrote a program that uses the localization
  • never wrote a program that navigates with a purpose
  • never wrote a function for when the robot loses localization
  • never wrote a program for when navigator returns “failed to reach goal”, or “cannot find route to goal”
  • never wrote a program to only run the LIDAR when needed, or the localization, or navigation package
  • never wrote a robot health program
  • never wrote a battery management program
  • never wrote a “safe wall following” program
  • never wrote a “safe wander” program
  • never wrote a visual mapping program
  • never wrote a visual localization program
  • never wrote a visual obstacle avoidance program
  • never wrote a “varying light level tolerant” visual localization/mapping/navigating robot
  • never wrote a program for two-way robot communication with humans
  • never wrote a behavior tree or state machine program for an autonomous robot
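
To make that first bullet concrete, here is a minimal sketch of a program that “interprets” the scan rather than just forwarding it: it reports the range and bearing of the nearest return, which is step one of find-a-wall:

    # nearest-return sketch: first step toward find-a-wall / follow-a-wall
    import math
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import LaserScan

    class NearestWall(Node):
        def __init__(self):
            super().__init__('nearest_wall')
            self.sub = self.create_subscription(LaserScan, 'scan', self.on_scan, 10)

        def on_scan(self, scan):
            # drop zero/inf/nan returns, keep (range, bearing) pairs
            hits = [(r, scan.angle_min + i * scan.angle_increment)
                    for i, r in enumerate(scan.ranges)
                    if scan.range_min < r < scan.range_max]
            if hits:
                r, bearing = min(hits)
                self.get_logger().info(
                    f'nearest return: {r:.2f} m at {math.degrees(bearing):.0f} deg')

    def main():
        rclpy.init()
        rclpy.spin(NearestWall())

    if __name__ == '__main__':
        main()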

Ah, I guess I should stop being so negative and get back to learning how to do some of these myself, before I forget why I want to learn them.


:rofl:

You sound a lot like me there.  Me, being silly enough to want to know why something works the way it does, and how to make it do what I want, before leaping off that cliff trying to do it.

Not having a fundamental grasp of the subject is like bungee-jumping without attaching the bungee-cord.

I am playing with my display test script and was thinking that setting a mutex was a good idea - it turns out that it’s easier to just test for a file and bail if it exists.
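
For anyone tempted by the same shortcut, a sketch of the test-for-a-file idea with two refinements worth having: create the file atomically (O_CREAT|O_EXCL closes the check-then-create race that a plain os.path.exists() test leaves open) and clean it up on exit. The lock path is a placeholder:

    # file-based "mutex" sketch for a single-instance script
    import os
    import sys
    import atexit

    LOCKFILE = '/tmp/display_test.lock'   # hypothetical path

    try:
        # O_CREAT|O_EXCL: create the file, fail if it already exists (atomic)
        fd = os.open(LOCKFILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, str(os.getpid()).encode())
        os.close(fd)
    except FileExistsError:
        sys.exit('display test already running - bailing')

    # remove the lock however the script exits
    atexit.register(os.remove, LOCKFILE)

    # ... rest of the display test script ...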
