I haven’t tested all sensors, but over here, my motors do move
I’m using Ubuntu 21.04 (Hirsute), and needed to:
sudo apt install python3-lgpio
sudo apt install git
sudo apt install openssh-server (for remote access)
sudo apt install curl
I created a soft link so that python points to python3.9.
Then I ran sudo git clone http://www.github.com/DexterInd/GoPiGo3.git /home/pi/Dexter/GoPiGo3,
followed by git checkout install_on_ubuntu before running the install script.
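For what it’s worth, a quick sanity check that the library installed and the motors respond is only a few lines of Python - a minimal sketch, assuming the easygopigo3 package from that repo ended up on the Python path and the robot has room to roll (or is up on blocks):

```python
# Minimal post-install smoke test - sketch only.
# Assumes the easygopigo3 package installed by the GoPiGo3 repo is importable
# and that it is safe for the robot to drive forward for a second.
import time
from easygopigo3 import EasyGoPiGo3

gpg = EasyGoPiGo3()                     # open the connection to the GoPiGo3 board
print("Battery voltage:", gpg.volt())   # quick check that the board is talking

gpg.forward()                           # drive forward at the default speed
time.sleep(1)
gpg.stop()                              # always stop the motors when done
```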
We currently support Ubuntu Linux Focal Fossa (20.04) 64-bit x86 and 64-bit ARM.
I was working with the Ubuntu 20.04 Server 32-bit image, knowing I would have to start again with the 64-bit image once I got Ubuntu talking to the GoPiGo3 - getting that working first, before even thinking about ROS or ROS 2.
Tried take 2 on the Pi3B: Edited network-config as noted in the OP, but the system did not connect to my WiFi. (Also checked to see if it broadcast the GoPiGo3 SSID - it did not.)
This image does not broadcast an AP. I have not tried setting up WiFi on it (I used an Ethernet cable), so I haven’t touched anything related to WiFi configuration.
I have no idea how to connect to wifi with the Ubuntu command line. So I connected a monitor, keyboard and mouse.
Yes, there is a small desktop that lets you change settings.
I used that to connect to my local wifi and it worked on the first attempt.
I’m sure it can be done via the command line, but I don’t recall how.
Updated the procedure - 64-BIT - see original post.
WE’RE GO FOR FOXY FITZROY
In case anyone is wondering where I’m going with this - I believe the $229 GoPiGo3 kit will compare very capably with the $549 Turtlebot Burger, offering a lower-cost entry point to learning about ROS.
Not to put the GoPiGo3 running GoPiGo3 OS or Raspbian4Robots down - I believe that before anyone even thinks about ROS, they need to use the DI/ModRobotics OS, run every example, and learn about every function of the GoPiGo3 platform.
That said, the ROS world is where robotics research plays and where industrial robots earn their reason for existence.
Most folk build their first ROS robot with a LIDAR scanner and use off-board processing. The robot becomes a remote controlled sensor platform. The LIDAR is a bit noisy, and a bit power hungry, and a bit expensive. It does do a great job of mapping the environment, and allows precise localization.
I’m thinking the GoPiGo3, with the servo-mounted Distance Sensor running a slow sweep scan and sending distance readings to the off-board rviz, will be able to create a map, albeit not instantaneously the way the LIDAR does.
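Something like this is what I have in mind for the sweep - a minimal sketch using the easygopigo3 servo and distance-sensor helpers (the "SERVO1" port, the 20-160 degree range, and the step size are assumptions, and the ROS publishing side is left out; the readings are just printed):

```python
# Slow sweep scan with the servo-mounted distance sensor - sketch only.
# Assumptions: servo on port "SERVO1", a 20-160 degree sweep in 10-degree
# steps, and printing instead of publishing a LaserScan to ROS/rviz.
import time
from easygopigo3 import EasyGoPiGo3

gpg = EasyGoPiGo3()
servo = gpg.init_servo("SERVO1")
distance_sensor = gpg.init_distance_sensor()

for sweep in range(3):                        # a few demo sweeps
    scan = []
    for angle in range(20, 161, 10):
        servo.rotate_servo(angle)
        time.sleep(0.1)                       # give the servo time to settle
        scan.append((angle, distance_sensor.read_mm()))
    print(scan)                               # this is where a LaserScan message would be built
```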
So that’s the thinking - can the basic GoPiGo3 kit, or the basic kit plus the DI IMU, do the job for $300 less?
Maybe I should have done that. The nice thing with the Raspberry Pi is that I can always swap SD cards to go back and forth fairly easily (although perhaps not as easily as with @jimrh’s multiboot approach once that’s working).
I’ve realized that right now my ROS setup is actually not using the IMU. I’ve been doing a lot of ROS tutorials just to get to the point where I understand the navigation stack. I’m just about there. Then I can start using my GPG3 more. I’m very much at the point you described - it’s more of a remote sensor platform. But I plan to see what I can do to get the IMU integrated at least.
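For reference, just reading the DI IMU from Python looks simple enough - a minimal sketch, assuming the di_sensors package is installed and the IMU is on port AD1 (that port is an assumption); feeding the heading into a ROS Imu/odometry message would come later:

```python
# Read Euler angles from the DI IMU - sketch only.
# Assumes the di_sensors package is installed and the IMU is plugged
# into port "AD1"; change the port name if yours is on "AD2".
from di_sensors.easy_inertial_measurement_unit import EasyIMUSensor

imu = EasyIMUSensor(port="AD1")
heading, roll, pitch = imu.safe_read_euler()  # degrees
print("heading:", heading, "roll:", roll, "pitch:", pitch)
```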
I’m sure you can run ROS without LIDAR, although it may limit which navigation packages you can use. I’ve written maze navigation programs (in simulated environments) that don’t use the navigation stack. These did use LIDAR readings, but used them just as distance readings from distinct directions (left, left-forward, forward, right-forward, right). That could just as easily be done with distance sensors, or one pivoting distance-sensor. The only issue would be whether you could actually scan fast enough to keep up with how fast you want the robot to go. And all of that could run locally.
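To make that concrete, the steering decision really only needs a handful of directional distances - a hedged sketch (the 300 mm threshold and the sample readings are arbitrary choices for illustration; on the robot the dictionary would be filled by pivoting the distance sensor, or by picking those beams out of a LIDAR scan):

```python
# Rule-based steering from five directional distance readings - sketch only.
# The threshold and the sample readings below are arbitrary; on the robot the
# dict would be filled from a pivoting distance sensor or selected LIDAR beams.

def choose_action(readings_mm, threshold_mm=300):
    """Pick a motion command from distances (in mm) in five directions."""
    if readings_mm["forward"] > threshold_mm:
        return "drive_forward"
    # Forward is blocked: turn toward whichever side has more room.
    if readings_mm["left"] >= readings_mm["right"]:
        return "turn_left"
    return "turn_right"

# Example: clear on the left, blocked ahead and to the right.
sample = {"left": 800, "left_forward": 450, "forward": 120,
          "right_forward": 200, "right": 150}
print(choose_action(sample))   # -> turn_left
```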
As far as that is concerned, you are correct, though I believe I have the multi-boot process just about completely knocked.
I have learned, and am learning, a lot about how Raspbian is organized and the gnarly details of booting the Pi. This translates into knowledge of the boot process and how to fold it in ways that can be very interesting and useful.
In a very real way, this is not unlike your experiments with ROS: You can do 99.99999% of what ROS does using stock Dexter software, but you’re learning a lot about robotics in ways that the folks at MR/DI never even imagined.
Eventually, when you decide to try the Dexter software, you will be approaching it from an entirely new and fresh viewpoint that none of the rest of us can match. You will bring to the analysis of the software and its features a fresh and imaginative point of view.
In the same way that @cyclicalobsessive is my standard as a robotics software god, and I will always bet money on my own hardware skills, you will bring a third, valuable point of view - and I am learning a lot just watching you.