New Toys for Charlie!

Greetings!

I’m here in the US - and just had the inestimable joy of a total knee replacement!

While I’m here, I bought some new toys for Charlie that I plan to install when I return.

  1. A line sensor.
  2. The IMU board.
  3. A Pi-4. (actually, a couple of them, but they’re not all for Charlie)
  4. An Adafruit Real Time Clock module for the Pi.
  5. Two touch-screen displays - a 7" display that connects through the display connector and a smaller one that connects through the GPIO ports. The idea here is to give Charlie a way to communicate with the world - and accept inputs - while autonomous.

Since I am also a confirmed masochist, I picked up an NVIDIA Jetson Nano dev board. It’s about the same size as a Pi, (but has a humongous heat-sink), with a zillion CUDA cores, eats TensorFlow for lunch, uses the Pi’s camera natively, and is specifically designed as an AI platform. It would be interesting to see if I can get it working with the GoPiGo robot.

I am also talking to a company in the UK about getting a round, (rather than flat ribbon), camera cable so I can attach the camera to the distance sensor/servo and have it track with the distance sensor.

More when I know more!

Jim “JR”

EfficientNet-B3

Summary

Raspberry Pi 3B running TensorFlow / EfficientNet-B0: 80% accuracy in 539 ms,
and EfficientNet-B3: 81% accuracy in 1.9 s

Jetson Nano “winner” using TF-TensorRT-PyTorch ResNet-50: 2.67 seconds, 64% Top-1 accuracy

I would expect the Jetson Nano to eat the Raspberry Pi’s lunch; what surprised me was how well the Pi did. It would be interesting to try it again with a Pi-4, 4GB version.

What did disappoint me was that there appeared to be no, (or very little), effort to normalize the data across all the systems tested. Some systems had a “smaller” dataset than others - ergo, (IMHO), the results are meaningless because we don’t know the sample size used for each system.

Even if we did, the results are still apples-to-oranges because we don’t know anything about the data. (i.e. Was the data “cherry-picked” for the Pi? What about the data for the other systems?) We’re given so little information that it is impossible to duplicate and verify these results, or to critique the actual methodology used.

With a consistent dataset, and a reasonably repeatable processing model, (even though it may be tweaked for each system - as I would expect), the results would actually mean something. We could then ask real quality questions like:

  • Given datasets of increasing size, what is the effect on accuracy?
  • How does processing time vary with dataset size?
    (i.e. How quickly does it rise per ‘x’ increase in dataset size? Is the progression linear, logarithmic, exponential, or what? See the sketch after this list.)
  • What is the effect of different kinds of images on processing time? Clear backgrounds? Busy backgrounds? Monochrome? Color?
  • And so on, and so on, and so on. . . .
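
To make that concrete, here is a rough timing sketch, (assuming TensorFlow, with synthetic frames standing in for a real dataset - the model choice is just an example), that would let us plot processing time against dataset size and see what the curve actually does:

```python
# Rough sketch: time inference over increasingly large "datasets" and
# report per-image latency, to see whether growth is linear or worse.
# Assumes TensorFlow; random frames stand in for real images.
import time
import numpy as np
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

def time_dataset(n_images):
    data = np.random.rand(n_images, 224, 224, 3).astype("float32")
    start = time.perf_counter()
    model.predict(data, batch_size=1, verbose=0)
    return time.perf_counter() - start

# Warm up once so one-time graph building doesn't pollute the first sample.
model.predict(np.zeros((1, 224, 224, 3), dtype="float32"), verbose=0)

for n in (10, 100, 1000):
    elapsed = time_dataset(n)
    print(f"{n:5d} images: {elapsed:7.1f} s total, {elapsed / n * 1000:6.1f} ms/image")
```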

In summary, it’s an interesting article that really doesn’t tell us much except that each of these systems can do at least some kind of TensorFlow/deep-learning processing. Which, of course, we already knew.

What will be interesting is when I get a chance to try some of this stuff with Charlie. I’ll probably go with the Pi-4 first and see where this leads me.

Jim “JR”

@jimrh,

I have to agree with you. The Jetson Nano blew the RPi 3B+ away when it came to SLAM performance using the RPLidar A1M8. I couldn’t find the original article with the performance comparison, but the difference was substantial if I remember correctly.

Also, you have to keep in mind that DI has been promising to incorporate the RPLidar into the GoPiGo3 programming, but has yet to provide anything substantial in the way of a “how to”. I think they have found that the RPi 3B+ is too overwhelmed by the required SLAM processing to navigate effectively.
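
In the meantime, for anyone who wants to experiment, the usual open-source route looks something like this minimal sketch - assuming the third-party BreezySLAM package and the rplidar driver, neither of which is DI software:

```python
# Minimal SLAM sketch: feed RPLidar A1 scans into BreezySLAM.
# Assumes 'pip install breezyslam rplidar' and the lidar on /dev/ttyUSB0.
from breezyslam.algorithms import RMHC_SLAM
from breezyslam.sensors import RPLidarA1
from rplidar import RPLidar

MAP_SIZE_PIXELS = 500
MAP_SIZE_METERS = 10

lidar = RPLidar("/dev/ttyUSB0")
slam = RMHC_SLAM(RPLidarA1(), MAP_SIZE_PIXELS, MAP_SIZE_METERS)
mapbytes = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)

for scan in lidar.iter_scans():
    if len(scan) < 60:            # too sparse a scan confuses the matcher
        continue
    # Each sample is a (quality, angle_deg, distance_mm) tuple.
    distances = [sample[2] for sample in scan]
    angles = [sample[1] for sample in scan]
    slam.update(distances, scan_angles_degrees=angles)
    x_mm, y_mm, theta_deg = slam.getpos()
    slam.getmap(mapbytes)         # occupancy grid, one byte per pixel
    print(f"pose: x={x_mm:.0f} mm  y={y_mm:.0f} mm  heading={theta_deg:.0f} deg")
```

The scan-matching search in slam.update() runs on every revolution of the lidar, which is presumably exactly where the Nano’s extra horsepower pays off over the RPi 3B+.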

Regards,
TCIII

@jimrh,

Here are some Deep Learning Inference Benchmarks:

Model                          Application       Framework    Jetson Nano   RPi 3     RPi 3 + NCS2   Edge TPU
-----------------------------  ----------------  -----------  ------------  --------  -------------  ---------
ResNet-50 (224×224)            Classification    TensorFlow   36 FPS        1.4 FPS   16 FPS         DNR
MobileNet-v2 (300×300)         Classification    TensorFlow   64 FPS        2.5 FPS   30 FPS         130 FPS
SSD ResNet-18 (960×544)        Object Detection  TensorFlow   5 FPS         DNR       DNR            DNR
SSD ResNet-18 (480×272)        Object Detection  TensorFlow   16 FPS        DNR       DNR            DNR
SSD ResNet-18 (300×300)        Object Detection  TensorFlow   18 FPS        DNR       DNR            DNR
SSD MobileNet-v2 (960×544)     Object Detection  TensorFlow   8 FPS         DNR       1.8 FPS        DNR
SSD MobileNet-v2 (480×272)     Object Detection  TensorFlow   27 FPS        DNR       7 FPS          DNR
SSD MobileNet-v2 (300×300)     Object Detection  TensorFlow   39 FPS        1 FPS     11 FPS         48 FPS
Inception V4 (299×299)         Classification    PyTorch      11 FPS        DNR       DNR            9 FPS
Tiny YOLO V3 (416×416)         Object Detection  Darknet      25 FPS        0.5 FPS   DNR            DNR
OpenPose (256×256)             Pose Estimation   Caffe        14 FPS        DNR       5 FPS          DNR
VGG-19 (224×224)               Classification    MXNet        10 FPS        0.5 FPS   5 FPS          DNR
Super Resolution (481×321)     Image Processing  PyTorch      15 FPS        DNR       0.6 FPS        DNR
U-Net (1×512×512)              Segmentation      Caffe        18 FPS        DNR       5 FPS          DNR

(NCS2 = Intel Neural Compute Stick 2; Edge TPU = Google Edge TPU Dev Board; DNR = did not run)

Regards,
TCIII

Did you see DI’s latest servo and sensor mount? It stacks the camera and distance sensor on the servo.

Ok. Where do I buy one?

All I can find is the original servo:
https://shop.dexterindustries.com/shop/sensors-accessories/sensors-actuators/servo-package

Re: SLAM

  1. What’s that?
    (aside from what you want to do with a robot that just looks at you stupidly despite hours and hours of programming effort)
  2. Maybe try it on the Pi-4?

The Pi-4 should be, (at least), somewhat able to handle it.

@jimrh,

SLAM: Simultaneous Localization and Mapping - building a map of an unknown space while simultaneously keeping track of the robot’s position within that map.

Regards,
TCIII

I love your hat. My dad was “old Navy” and served all over the place during WWII. We all still call the bathroom, “the head”. 🙂

@jimrh,

My great uncle, on my father’s side of the family, served on the cruiser USS Olympia during the Spanish-American War at the Battle of Manila Bay.

I did six years in the USN and two tours of Vietnam before I was honorably discharged in late 1973.

Regards,
TCIII

I tried to enlist several times during the ’70s. Unfortunately, surgery for a collapsed lung impressed the chief medical officer at Ft. Hamilton so much that he eventually 3-F’d me. (I can’t volunteer, but I CAN be drafted!)

I still get a pang of regret when I meet folks like you.

Have a great Veterans Day!

Jim “JR”

@jimrh,

Thank you Sir, much appreciated.

I used my GI Bill to finish my BSEE degree and went on to spend 35 years in the Aerospace Industry as a System/Reliability Engineer to help make the US Military second to none.

Regards,
TCIII

@jimrh
If you buy the “original servo package” we will ship you the new one. The shop website hasn’t been edited yet.

The new Jetson Nano

Admittedly, this isn’t a GoPiGo - at least not yet! - but here is the initial assembly of the new Nano in its new plastic case.

Interesting points in its favor - especially as far as Charlie is concerned:

  1. It uses a Pi camera. There is also a whole host of “Jetson” cameras with all kinds of unique capabilities that should also fit the Pi. (i.e. A camera with a built-in rangefinder, a camera with two big IR “headlights”, wide angle, etc. etc. etc.)

  2. It has a 40-pin header on 0.1″ centers that is - by some strange coincidence - set up with the identical pin-out to the Pi’s, including the ID EEPROM pins on 27 and 28 and the i2c on pins 3 and 5. (Etc.) This means that - at least in theory - any Pi HAT will work on the Nano. More important is that the GoPiGo daughter board should also work. (See the GPIO sketch below.) Of course, I haven’t heard of a Raspbian for Robots port to the Nano - or even a plain-vanilla port of Raspbian for that matter! - so anything you, I, or anyone else wants to do will be done “the hard way”.
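
For example, NVIDIA’s Jetson.GPIO library deliberately clones the RPi.GPIO interface, so - in theory - the same “board numbered” code runs on either header. A minimal sketch, (pin 12 is just an example):

```python
# Minimal sketch: blink something on physical pin 12 using BOARD
# numbering, which is the same on the Pi and Nano 40-pin headers.
# Jetson.GPIO intentionally mirrors the RPi.GPIO API.
import time

try:
    import Jetson.GPIO as GPIO   # on the Nano
except ImportError:
    import RPi.GPIO as GPIO      # on the Pi

GPIO.setmode(GPIO.BOARD)         # physical pin numbers, not SoC names
GPIO.setup(12, GPIO.OUT)

for _ in range(5):
    GPIO.output(12, GPIO.HIGH)
    time.sleep(0.5)
    GPIO.output(12, GPIO.LOW)
    time.sleep(0.5)

GPIO.cleanup()
```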

Possibility:
There are a number of robot kits available for the Nano, all of which use the GPIO. This may be useful as a starting point for interfacing the Nano with Charlie.

Of course, anyone with half the brains God gave a squirrel will know that they (I) can’t come crying to the engineers at Dexter for anything other than the most generic of information about their own hardware. (i.e. What’s the i2c address for the [insert name of interface here]?)
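
That said, the i2c address itself is something anyone can dig out with a bus scan. A quick-and-dirty sketch, (assuming the smbus2 package and bus 1, the default on the 40-pin header):

```python
# Blunt-instrument i2c scan, roughly what 'i2cdetect -y 1' does.
# Assumes 'pip install smbus2' and i2c bus 1.
from smbus2 import SMBus

with SMBus(1) as bus:
    for addr in range(0x03, 0x78):
        try:
            bus.read_byte(addr)      # any ACK means something lives here
            print(f"device found at 0x{addr:02x}")
        except OSError:
            pass                     # no response at this address
```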

In its current form, I’m planning to fuss with it, get to know it, and research ways of upgrading Charlie to use it, if possible.

P.S. If I figure out how, I’ll be sure to let Carl know!

Thanks!

Thanks, but Carl reminded me that he doesn’t have 5-10 watts to spare in his diet, but will admire Charlie’s power lunching.

Now if you do something with that IMU, on the other hand, Carl is very interested. Carl has been fitted with the nano-watt-sipping IMU and has been playing with it a bit. (He can save and restore calibration data, and we learned that it absolutely must be run in software I2C mode, connected to AD1 or AD2, to properly implement the clock stretching.)
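
For reference, here is roughly what that looks like with the DI_Sensors library - a minimal sketch assuming the IMU is plugged into AD1 (the EasyIMUSensor class does the software-I2C and mutex housekeeping):

```python
# Minimal sketch: read the DI IMU (BNO055) over software I2C on AD1.
# Assumes the DI_Sensors library; the AD1/AD2 ports give the software
# I2C mode needed for the BNO055's clock stretching.
import time
from di_sensors.easy_inertial_measurement_unit import EasyIMUSensor

imu = EasyIMUSensor(port="AD1", use_mutex=True)

while True:
    heading, roll, pitch = imu.safe_read_euler()   # degrees
    print(f"heading={heading:6.1f}  roll={roll:6.1f}  pitch={pitch:6.1f}")
    time.sleep(0.5)
```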

Yeah, I see a Ryobi One+ lithium battery in Charlie’s future if I go that route. Or a small 6V gel-cell.

Not sure how much I can help with the IMU - mine is the Dexter “OEM” version, which has native support. One thought is to get the “OEM” IMU, experiment with that and make all the stupid newbie/dweeb mistakes first, and then - if needed - graduate to the fancier ones. This is the main reason for my going “OEM” as much as possible: it cancels out most of the silly stuff ahead of time. (i.e. It has native software support, so issues should - mostly - be keyboard-chair interface issues. 😉 )

I already know I’m going to have enough issues trying to fit Charlie with an RTC and a GPS!

Jim “JR”

P.S.
If I decide that the cross of a GoPiGo and a Nano is reasonably feasible, I may end up having to buy another 'bot to keep the two lines of experimentation separate.

There is native support for a specific GPS sensor, even within DexterOS. But nothing for an RTC, unfortunately.

Depending on how much of native Raspbian comes over into Dexter’s re-spins, it may be a cinch, since Raspbian has native support for certain i2c-based RTC modules. It’s a matter of editing a couple of scripts.
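
For the DS3231/PCF8523-style modules, (the Adafruit boards use these), the usual Raspbian recipe is something like this - a sketch from memory, so double-check it against current docs before trusting your clock to it:

```
# 1. Enable the kernel RTC overlay in /boot/config.txt
#    (use ds3231, ds1307, or pcf8523 to match the actual module):
dtoverlay=i2c-rtc,pcf8523

# 2. Remove the fake hardware clock so the real one is used at boot:
sudo apt-get -y remove fake-hwclock
sudo update-rc.d -f fake-hwclock remove

# 3. Edit /lib/udev/hwclock-set and comment out the lines that
#    "exit 0" when systemd is present, so hwclock actually runs.
```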

All of Raspbian makes it into Raspbian for Robots. We just add stuff; we don’t remove anything.
