New Toys for Charlie!

Greetings!

I’m here in the US - and just had the inestimable joy of a total knee replacement!

While I’m here, I bought some new toys for Charlie that I plan to install when I return.

  1. A line sensor.
  2. The IMU board.
  3. A Pi-4 (actually, a couple of them, but they’re not all for Charlie).
  4. An Adafruit Real Time Clock module for the Pi.
  5. Two touch-screen displays - a 7" display that connects through the display connector and a smaller one that connects through the GPIO ports. The idea here is to give Charlie a way to communicate with the world - and accept inputs - while autonomous.

Since I am also a confirmed masochist, I picked up a NVIDIA Jetson Nano Dev board. It’s about the same size as a Pi, (but has a humongous heat-sink), with a zillion CUDA cores, eats TensorFlow for lunch, uses the Pi’s camera natively, and is specifically designed as an AI platform. It would be interesting to see if I can get it working with the GoPiGo robot.

I am also talking to a company in the UK about getting a round-cable camera cable so I can attach the camera to the distance sensor/servo and have it track with the distance sensor.

More when I know more!

Jim “JR”

EfficientNet-B3

Summary

Raspberry Pi 3B running TensorFlow / EfficientNet-B0: 80% accuracy in 539 ms,
and EfficientNet-B3: 81% accuracy in 1.9 s.

Jetson Nano “winner” using TF-TensorRT-PyTorch ResNet-50: 2.67 seconds, 64% Top-1 accuracy.

I would expect the Jetson Nano to eat the Raspberry Pi’s lunch; what surprised me was how well the Pi did. It would be interesting to try it again with a Pi-4, 4GB version.

What did disappoint me was that there appeared to be no, (or very little), effort to normalize the data across all the systems tested. Some systems had a “smaller” dataset than the others - ergo, (IMHO) the results are meaningless because we don’t know the sample size used for each system.

Even if we did, the results are still apples-to-oranges because we don’t know anything about the data. (i.e. Was the data “cherry-picked” for the Pi? What about the data for the other systems?) We’re given so little information that it is literally impossible to attempt to duplicate and verify these results, or critique the actual methodology used.

With a consistent dataset, and a reasonably repeatable processing model, (even though it may be tweaked for each system - as I would expect), the results would actually mean something. We could then ask real quality questions like:

  • Given datasets of increasing size, what is the effect on accuracy?
  • How does processing time vary with data-set size?
    (i.e. How quickly does it rise per ‘x’ increase in dataset size? Is the progression linear, logarithmic, exponential, or what? Etc.)
  • What is the effect of different kinds of images on processing time? Clear backgrounds? Busy backgrounds? Monochrome? Color?
  • And so on, and so on, and so on. . . .
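To show what “normalized” testing could look like, here is a minimal sketch of a benchmark harness that feeds every system the *same* dataset and the same measurement procedure. The harness and the toy stand-in model are hypothetical (nothing here comes from the article); on a real Pi or Nano you would swap `toy_model` for actual TensorFlow/PyTorch inference.

```python
import time

def benchmark(model_fn, dataset, warmup=3):
    """Time model_fn over a fixed dataset; return (accuracy, seconds per image).

    Every platform under test gets the identical dataset and the identical
    procedure, so the numbers are apples-to-apples rather than apples-to-oranges.
    """
    for sample, _ in dataset[:warmup]:          # warm-up runs are not timed
        model_fn(sample)
    correct = 0
    start = time.perf_counter()
    for sample, label in dataset:
        if model_fn(sample) == label:
            correct += 1
    elapsed = time.perf_counter() - start
    return correct / len(dataset), elapsed / len(dataset)

# Toy stand-in for a real classifier (hypothetical -- replace with real
# inference code on each system being compared).
def toy_model(sample):
    return sample % 2

if __name__ == "__main__":
    # One shared dataset of (input, expected-label) pairs for every platform.
    data = [(i, i % 2) for i in range(100)]
    acc, per_image = benchmark(toy_model, data)
    print(f"accuracy={acc:.0%}, {per_image * 1e6:.1f} us/image")
```

Running the same harness with datasets of increasing size would directly answer the “how does time scale with dataset size” question above.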

In summary, it’s an interesting article that really doesn’t tell us much except that each of these systems can do at least some kind of tensor-flow/deep-learning processing. Which, of course, we already knew.

What will be interesting is when I get a chance to try some of this stuff with Charlie. I’ll probably go with the Pi-4 first and see where this leads me.

Jim “JR”

@jimrh,

I have to agree with you. The Jetson Nano blew the Rpi 3B+ away when it came to SLAM performance using the RPLidar A1M8. I couldn’t find the original article concerning the performance comparison, but the gap was substantial if I remember correctly.

Also, you have to keep in mind that DI has been promising to incorporate the RPLidar into the GoPiGo3 programming but has yet to provide anything substantial in the way of a “how to”. I think that they have found that the Rpi 3B+ is somewhat overwhelmed by the SLAM processing required for reasonable navigation.

Regards,
TCIII

@jimrh,

Here are some Deep Learning Inference Benchmarks:

| Model | Application | Framework | NVIDIA Jetson Nano | Raspberry Pi 3 | Raspberry Pi 3 + Intel Neural Compute Stick 2 | Google Edge TPU Dev Board |
|---|---|---|---|---|---|---|
| ResNet-50 (224×224) | Classification | TensorFlow | 36 FPS | 1.4 FPS | 16 FPS | DNR |
| MobileNet-v2 (300×300) | Classification | TensorFlow | 64 FPS | 2.5 FPS | 30 FPS | 130 FPS |
| SSD ResNet-18 (960×544) | Object Detection | TensorFlow | 5 FPS | DNR | DNR | DNR |
| SSD ResNet-18 (480×272) | Object Detection | TensorFlow | 16 FPS | DNR | DNR | DNR |
| SSD ResNet-18 (300×300) | Object Detection | TensorFlow | 18 FPS | DNR | DNR | DNR |
| SSD MobileNet-v2 (960×544) | Object Detection | TensorFlow | 8 FPS | DNR | 1.8 FPS | DNR |
| SSD MobileNet-v2 (480×272) | Object Detection | TensorFlow | 27 FPS | DNR | 7 FPS | DNR |
| SSD MobileNet-v2 (300×300) | Object Detection | TensorFlow | 39 FPS | 1 FPS | 11 FPS | 48 FPS |
| Inception V4 (299×299) | Classification | PyTorch | 11 FPS | DNR | DNR | 9 FPS |
| Tiny YOLO v3 (416×416) | Object Detection | Darknet | 25 FPS | 0.5 FPS | DNR | DNR |
| OpenPose (256×256) | Pose Estimation | Caffe | 14 FPS | DNR | 5 FPS | DNR |
| VGG-19 (224×224) | Classification | MXNet | 10 FPS | 0.5 FPS | 5 FPS | DNR |
| Super Resolution (481×321) | Image Processing | PyTorch | 15 FPS | DNR | 0.6 FPS | DNR |
| U-Net (1×512×512) | Segmentation | Caffe | 18 FPS | DNR | 5 FPS | DNR |

(DNR = did not run)

Regards,
TCIII

Did you see DI’s latest servo and sensor mount - it stacks the camera and distance sensor on the servo

Ok. Where do I buy one?

All I can find is the original servo:
https://shop.dexterindustries.com/shop/sensors-accessories/sensors-actuators/servo-package

Re: SLAM

  1. What’s that?
    (aside from what you want to do with a robot that just looks at you stupidly despite hours and hours of programming effort)
  2. Maybe try it on the Pi-4?

The Pi-4 should be, (at least), somewhat able to handle it.

@jimrh,

SLAM: Simultaneous Localization and Mapping.
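For a feel of what the “mapping” half involves, here is a minimal, purely illustrative sketch that builds an occupancy grid from range readings (the kind an RPLidar produces). All values are made up, and it assumes the robot’s pose is already known; real SLAM estimates the pose *simultaneously* (e.g. with particle filters or graph optimization), which is exactly the part that overwhelms a Pi 3B+.

```python
import math

GRID = 21                                     # 21x21 cells, robot at the center
grid = [[0.5] * GRID for _ in range(GRID)]    # 0.5 = unknown

def mark_hit(x, y, bearing_deg, dist):
    """Mark the cell a range beam hit as occupied, given the robot's
    (assumed known) grid position and the beam's bearing and distance."""
    cx = int(round(x + dist * math.cos(math.radians(bearing_deg))))
    cy = int(round(y + dist * math.sin(math.radians(bearing_deg))))
    if 0 <= cx < GRID and 0 <= cy < GRID:
        grid[cy][cx] = 1.0                    # occupied

# Robot at the grid center; three fake lidar returns (hypothetical values).
for bearing, dist in [(0, 5), (90, 3), (180, 7)]:
    mark_hit(10, 10, bearing, dist)

print(sum(cell == 1.0 for row in grid for cell in row))  # → 3 occupied cells
```

Doing this for hundreds of beams per revolution, many times per second, while also re-estimating where the robot is, is what makes SLAM computationally expensive.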

Regards,
TCIII

I love your hat. My dad was “old Navy” and served all over the place during WWII. We all still call the bathroom, “the head”. 🙂

@jimrh,

My great uncle, on my father’s side of the family, served on the cruiser USS Olympia during the Spanish-American War at the battle of Manila Bay.

I did six years in the USN and two tours of Vietnam before I was honorably discharged in late 1973.

Regards,
TCIII

I tried to enlist several times during the '70s. Unfortunately, surgery for a collapsed lung impressed the chief medical officer at Ft. Hamilton so much he eventually 3-F’d me. (I can’t volunteer, but I CAN be drafted!)

I still get a pang of regret when I meet folks like you.

Have a great Veterans Day!

Jim “JR”

@jimrh,

Thank you Sir, much appreciated.

I used my GI Bill to finish my BSEE degree and went on to spend 35 years in the Aerospace Industry as a System/Reliability Engineer to help make the US Military second to none.

Regards,
TCIII

@jimrh
If you buy the “original servo package” we will ship you the new one. The shop website hasn’t been edited yet.
