I’m here in the US - and just had the inestimable joy of a total knee replacement!
While I’m here, I bought some new toys for Charlie that I plan to install when I return.
A line-sensor.
The IMU board.
A Pi-4. (actually, a couple of them, but they’re not all for Charlie)
An Adafruit Real Time Clock module for the Pi.
Two touch-screen displays - a 7" display that connects through the display connector and a smaller one that connects through the GPIO ports. The idea here is to give Charlie a way to communicate with the world - and accept inputs - while autonomous.
Since I am also a confirmed masochist, I picked up an NVIDIA Jetson Nano dev board. It’s about the same size as a Pi, (but has a humongous heat-sink), with a zillion CUDA cores, eats TensorFlow for lunch, uses the Pi’s camera natively, and is specifically designed as an AI platform. It would be interesting to see if I can get it working with the GoPiGo robot.
I am also talking to a company in the UK about getting a round (as opposed to flat ribbon) camera cable so I can attach the camera to the distance sensor/servo and have it track with the distance sensor.
I would expect the Jetson Nano to eat the Raspberry Pi’s lunch; what surprised me was how well the Pi did. It would be interesting to try it again with a Pi-4, 4 GB version.
What did disappoint me was that there appeared to be no, (or very little), effort to normalize the data across all the systems tested. Some systems had a “smaller” dataset than the others - ergo, (IMHO) the results are meaningless because we don’t know the sample size used for each system.
Even if we did, the results are still apples-to-oranges because we don’t know anything about the data. (i.e. Was the data “cherry-picked” for the Pi? What about the data for the other systems?) We’re given so little information that it is impossible to duplicate and verify these results, or to critique the actual methodology used.
With a consistent dataset, and a reasonably repeatable processing model, (even though it may be tweaked for each system - as I would expect), the results would actually mean something. We could then ask real quality questions like the following (I’ve sketched one way to start measuring this after the list):
Given datasets of increasing size, what is the effect on accuracy?
How does processing time vary with data-set size?
(i.e. How quickly does it rise per ‘x’ increase in dataset size? Is the progression linear, logarithmic, exponential, or what? Etc.)
What is the effect of different kinds of images on processing time? Clear backgrounds? Busy backgrounds? Monochrome? Color?
And so on, and so on, and so on. . . .
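To make that concrete, here’s a rough sketch of the kind of harness I have in mind - nothing from the article, just my own guess at a starting point. It assumes a TFLite model file (MODEL_PATH is a name I made up) and uses random data in place of real images, so nothing can be “cherry-picked”:

```python
# Rough timing harness - my sketch, not the article's methodology.
import time
import numpy as np
from tflite_runtime.interpreter import Interpreter  # on a PC: tensorflow.lite

MODEL_PATH = "model.tflite"  # hypothetical model file

interpreter = Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]

def seconds_per_image(n_images):
    """Run n_images single-image inferences; return average seconds each."""
    start = time.perf_counter()
    for _ in range(n_images):
        # Random values stand in for an image of the model's input shape.
        fake = np.random.random_sample(tuple(inp["shape"])).astype(inp["dtype"])
        interpreter.set_tensor(inp["index"], fake)
        interpreter.invoke()
    return (time.perf_counter() - start) / n_images

# "Datasets" of increasing size - does per-image time stay flat?
for n in (10, 50, 100, 500):
    print(f"{n:4d} images: {seconds_per_image(n) * 1000:.1f} ms/image")
```

If the per-image time stays flat as the count grows, the progression is linear; if it creeps upward, something (memory pressure, thermal throttling) is biting - which is exactly the kind of thing the article never tells us.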
In summary, it’s an interesting article that really doesn’t tell us much except that each of these systems can do at least some kind of TensorFlow/deep-learning processing. Which, of course, we already knew.
What will be interesting is when I get a chance to try some of this stuff with Charlie. I’ll probably go with the Pi-4 first and see where this leads me.
I have to agree with you. The Jetson Nano blew the RPi 3B+ away when it came to SLAM performance using the RPLidar A1M8. I couldn’t find the original article with the performance comparison, but the difference was substantial if I remember correctly.
Also, you have to keep in mind that DI has been promising to incorporate the RPLidar into the GoPiGo3 programming but has yet to provide anything substantial in the way of a “how to”. I think they have found that the RPi 3B+ is too overwhelmed by the required SLAM processing to be effective for reasonable navigation.
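For anyone who wants to poke at the A1M8 without waiting on DI, here’s a minimal sketch using the third-party rplidar Python package (pip install rplidar) - my own starting point, not anything DI has published, so the port name is an assumption:

```python
# Minimal A1M8 test using the third-party "rplidar" package.
from rplidar import RPLidar

PORT = "/dev/ttyUSB0"  # adjust for your USB serial adapter

lidar = RPLidar(PORT)
try:
    print(lidar.get_info())
    # Each scan is a list of (quality, angle_deg, distance_mm) tuples -
    # the raw material any SLAM package has to chew through.
    for i, scan in enumerate(lidar.iter_scans()):
        print(f"scan {i}: {len(scan)} points")
        if i >= 9:  # grab ten scans, then quit cleanly
            break
finally:
    lidar.stop()
    lidar.stop_motor()
    lidar.disconnect()
```

Turning those (quality, angle, distance) tuples into an actual map is the part where the RPi 3B+ apparently falls over.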
I tried to enlist several times during the ’70s. Unfortunately, surgery for a collapsed lung impressed the chief medical officer at Ft. Hamilton so much he eventually 3-F’d me. (I can’t volunteer, but I CAN be drafted!)
I still get a pang of regret when I meet folks like you.
I used my GI Bill to finish my BSEE degree and went on to spend 35 years in the Aerospace Industry as a System/Reliability Engineer to help make the US Military second to none.
Admittedly, this isn’t a GoPiGo - at least not yet! - but here is the initial assembly of the new Nano in its new plastic case.
Interesting points of fact in its favor - especially as far as Charlie is concerned:
It uses a Pi camera. Also, there are a whole host of “Jetson” cameras with all kinds of unique capabilities that should also fit the Pi. (i.e. Camera with built-in rangefinder, camera with two big IR “headlights”, wide angle, etc. etc. etc.)
It has a 40-pin header on 0.1″ centers that is - by some strange coincidence - set up with the identical pin-out to the Pi. Including the ID EEPROM pins on 27 and 28, and the i2c on pins 3 and 5. (Etc.) This means that - at least in theory - any Pi hat will work on the Nano. More important is that the GoPiGo daughter board should also work. Of course, I haven’t heard of a Raspbian for Robots port to the Nano - or even a plain-vanilla port of Raspbian for that matter! - so anything you, I, or anyone else wants to do will be done “the hard way”.
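As a first sanity check of that “in theory”, one could scan the Nano’s I2C bus and see which hat devices actually answer. A quick sketch with the smbus2 package - note my assumption that pins 3 and 5 show up as /dev/i2c-1 on the Nano, as they do on the Pi:

```python
# Python version of "i2cdetect" using the smbus2 package - a quick way
# to see whether a hat on the Nano's header actually answers.
from smbus2 import SMBus

BUS = 1  # assuming pins 3/5 appear as /dev/i2c-1, as on the Pi

with SMBus(BUS) as bus:
    found = []
    for addr in range(0x03, 0x78):   # the legal 7-bit address range
        try:
            bus.read_byte(addr)      # an ACK means something is home
            found.append(addr)
        except OSError:
            pass                     # nothing at this address
    print("devices found:", [hex(a) for a in found])
```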
Possibility:
There are a number of robot kits available for the Nano, all of which use the GPIO. This may be useful as a starting point for interfacing the Nano with Charlie.
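Part of what makes those kits work is that NVIDIA ships a Jetson.GPIO library that deliberately clones the RPi.GPIO API. A bare-bones sanity test - the pin choice is just my example:

```python
# Blink test with NVIDIA's Jetson.GPIO library, which mimics RPi.GPIO.
# Pin 12 is just my example - wire an LED (and resistor!) to taste.
import time
import Jetson.GPIO as GPIO

LED_PIN = 12  # physical pin numbering, same as the Pi header

GPIO.setmode(GPIO.BOARD)       # use physical pin numbers
GPIO.setup(LED_PIN, GPIO.OUT)
try:
    for _ in range(5):         # blink five times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()
```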
Of course, anyone with half the brains God gave a squirrel will know that they (I) can’t come crying to the engineers at Dexter for anything other than the most generic of information about their own hardware. (i.e. What’s the i2c address for the [insert name of interface here]?)
In its current form I’m planning to fuss with it, get to know it, and research ways of upgrading Charlie to use it, if possible.
P.S. If I figure out how, I’ll be sure to let Carl know!
Thanks, but Carl reminded me that he doesn’t have 5-10 watts to spare in his diet, but will admire Charlie’s power lunching.
Now if you do something with that IMU, on the other hand, Carl is very interested. Carl has been fitted with the nano-watt-sipping IMU and has been playing with it a bit. (He can save and restore calibration data, and learned that it absolutely must be run in software I2C mode, connected to AD1 or AD2, to properly implement the clock stretching.)
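For the record, here’s roughly what that looks like with DI’s DI_Sensors library (my sketch - “GPG3_AD1” is DI’s bus name for software I2C on the AD1 port, if I have their naming right):

```python
# Reading the DI IMU the way Carl describes - on the GoPiGo3's AD1
# port, which DI's library drives as software I2C (the BNO055's clock
# stretching is what rules out the Pi's hardware bus).
from di_sensors.inertial_measurement_unit import InertialMeasurementUnit

imu = InertialMeasurementUnit(bus="GPG3_AD1")  # AD1 port, software I2C

heading, roll, pitch = imu.read_euler()  # degrees
print(f"heading {heading:.1f}  roll {roll:.1f}  pitch {pitch:.1f}")
```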
Yeah, I see a Ryobi One+ lithium battery in Charlie’s future if I go that route. Or a small 6v gel-cell.
Not sure how much I can help with the IMU - mine is the Dexter “OEM” version which has native support. One thought is to get the “OEM” IMU, experiment with that and make all the stupid newbie/dweeb mistakes first, and then - if needed - graduate to the fancier ones. This is the main reason for my going “OEM” as much as possible. It cancels out most of the silly stuff ahead of time. (i.e. It has native software support, so issues should - mostly - be keyboard-chair interface issues.)
I already know I’m going to have enough issues trying to fit Charlie with a RTC and a GPS!
Jim “JR”
P.S.
If I decide that the cross of a GoPiGo and a Nano is reasonably feasible, I may end up having to buy another 'bot to keep the two lines of experimentation separate.
Depending on how much of the native Raspbian comes over into Dexter’s re-spins, it may be a cinch, since Raspbian has native support for certain i2c-based RTC modules. It’s a matter of editing a couple of scripts.
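For what it’s worth, here’s the shape of those edits on stock Raspbian, assuming a DS3231-style module like the Adafruit one (check which chip yours actually uses):

```
# The usual recipe for a DS3231-style RTC on stock Raspbian:

# 1. In /boot/config.txt, tell the kernel about the RTC:
#        dtoverlay=i2c-rtc,ds3231

# 2. Retire the fake hardware clock:
sudo apt-get -y remove fake-hwclock
sudo update-rc.d -f fake-hwclock remove

# 3. In /lib/udev/hwclock-set, comment out the lines that skip the
#    real RTC when systemd is present:
#        #if [ -e /run/systemd/system ] ; then
#        #    exit 0
#        #fi
```

How much of that survives into Raspbian for Robots is the open question.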