URDF files for dave, finmark, and gpgMin (a basic GoPiGo3 without any sensors)
robot state and joint state publishers on GoPiGo3 (use the on-platform URDF)
WiFi IP announcement at boot to the attached speaker (or headphones)
Slam-toolbox mapping and save (map also visible in rviz2 on any desktop ROS2 installation)
Workaround found to eliminate shutdown errors in all the ROS2 GoPiGo3 code
(in the code I wrote; one last shutdown error from joint_state_publisher I can’t figure out)
start_robot_xxxx (xxxx = dave, finmark, or gpgMin), and a universal stop_robot.sh script to shut down all nodes
/usr/local/bin/stop (just type stop from anywhere for a “runaway robot”)
10 frames per minute PiCamera /Image topic publisher for camera-equipped GoPiGo3 robots
odometry reset batch file demonstrating a command-line call to the ros2_gopigo3_node service listener
no default /home/pi/Dexter/gpg3_config.json is shipped, so one gets generated automatically
monitor.sh script reports CPU load and temperature, Memory usage, and Battery Voltage
“odometer” node: /odom subscriber that prints to console and logs motion start and ending x, y, heading
(./start_odometer.sh to run it)
No longer auto-starts the distance_sensor node, which had a huge CPU load impact (vs. the imu or ultrasonic ranger nodes)
Tuning ROS2 GoPiGo3 utility programs and usage document
Testing ROS2 GoPiGo3 document updated
auto update disabled - run update/upgrade at your discretion
ROS2 GoPiGo3 Desktop Visualization Guide
(For users with ROS2 installed on a desktop/laptop)
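For reference, here is a minimal sketch of the kind of readings monitor.sh reports, in Python. The file paths are standard Linux interfaces on a Raspberry Pi; battery voltage is omitted because it needs the GoPiGo3 library, and this is my sketch, not the actual script:

```python
# Sketch of a monitor.sh-style status reader (not the actual script).
# /proc/loadavg, /sys/class/thermal, and /proc/meminfo are standard
# Linux interfaces; battery voltage (GoPiGo3 library) is omitted.

def parse_meminfo(text):
    """Return MemAvailable in kB from /proc/meminfo-style text."""
    for line in text.splitlines():
        if line.startswith("MemAvailable:"):
            return int(line.split()[1])
    return None

def read_status():
    with open("/proc/loadavg") as f:
        load1 = float(f.read().split()[0])        # 1-minute load average
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        temp_c = int(f.read().strip()) / 1000.0   # millidegrees C -> C
    with open("/proc/meminfo") as f:
        avail_kb = parse_meminfo(f.read())
    return load1, temp_c, avail_kb

if __name__ == "__main__":
    load1, temp_c, avail_kb = read_status()
    print(f"load: {load1:.2f}  temp: {temp_c:.1f}C  mem avail: {avail_kb/1024:.0f} MiB")
```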
And how about a dictionary and/or a translation into English?
What’s a “URDF file”? If I build out Charlie or Charlene as ROS bots, do I need one? How do I make one? If I want to include sensors that Charlie/Charlene have, how do I do it?
Those are all good questions that I will be happy to discuss after you have downloaded my “You don’t need to know anything - ROS2 for GoPiGo3” image and can tell me what my docs lack from an “I don’t know anything about ROS” user viewpoint.
What you don’t see is how bad the map looks if the robot turns. Problem left for someone smarter than I am at this point.
(Has to be in the nasty math interpreting encoder left and right change to heading change in the ros2_gopigo3_node.py /odom topic publisher)
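For anyone curious what that encoder-to-heading math looks like, here is a generic differential-drive pose update sketch. This is not the actual ros2_gopigo3_node.py code, and the wheel/encoder constants are placeholders, not the tuned gpg3_config.json values:

```python
import math

# Generic differential-drive odometry update (a sketch, not the actual
# ros2_gopigo3_node.py implementation). Constants below are placeholders.
WHEEL_DIAMETER_MM = 66.5
WHEEL_BASE_MM = 117.0
TICKS_PER_REV = 720       # placeholder encoder resolution

def ticks_to_mm(dticks):
    """Convert an encoder tick delta to wheel travel in mm."""
    return dticks / TICKS_PER_REV * math.pi * WHEEL_DIAMETER_MM

def update_pose(x, y, heading, dticks_left, dticks_right):
    d_left = ticks_to_mm(dticks_left)
    d_right = ticks_to_mm(dticks_right)
    d_center = (d_left + d_right) / 2.0
    # Heading change comes from the wheel travel difference over the
    # wheel base, so any error in WHEEL_BASE_MM scales every turn -
    # one way small heading changes can drift while big turns look fine.
    d_heading = (d_right - d_left) / WHEEL_BASE_MM   # radians
    heading += d_heading
    x += d_center * math.cos(heading)
    y += d_center * math.sin(heading)
    return x, y, heading
```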
/odom wrong for small heading changes? - eyeballed close for 2, 5, 15, 30, 45, 90, 180, 360 deg turns
/odom/reset service performs reset but sometimes not returning success? - CND
my_mapper_params_online_async.yaml: 0.01 is better value for resolution
Fix: Edit ~/ros2ws/my_mapper_params_online_async.yaml
change line:
resolution: 0.05
to:
resolution: 0.01
odometer.py only detects forward/backward motion, not spins.
(Ouch - reminded that the /odom heading wraps from +pi to -pi in an instant when passing through the lower quadrants)
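Because the heading wraps at the rear of the robot, a naive (new - old) subtraction reports a near-2π jump for a tiny motion across the wrap. A generic wrap-safe heading delta (a sketch, not the actual odometer.py fix) looks like:

```python
import math

def angle_diff(new, old):
    """Shortest signed difference new-old, normalized to (-pi, pi].

    Handles the /odom-style wrap where heading jumps between +pi and
    -pi, so crossing the wrap reports a small turn instead of ~2*pi.
    """
    d = (new - old) % (2.0 * math.pi)   # fold into 0 .. 2*pi
    if d > math.pi:
        d -= 2.0 * math.pi              # pick the short way around
    return d
```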
Apply fix:
cd ~/ros2ws/src/ros2_gopigo3_node/ros2_gopigo3_node
cp odometer.py odometer.py.v4
wget https://raw.githubusercontent.com/slowrunner/ROS2-GoPiGo3/main/updates/odometer.py
YDLiDAR returning zero distances 25% of time:
Apply fix:
cd ~/ros2ws/src/ydlidar_ros2_driver/params
cp ydlidar.yaml ydlidar.yaml.v4
wget https://raw.githubusercontent.com/slowrunner/ROS2-GoPiGo3/main/updates/ydlidar.yaml
Image does not have a swapfile set up
Apply fix:
cd ~/utils
wget https://raw.githubusercontent.com/slowrunner/ROS2-GoPiGo3/main/utils/make1GBswapfile.sh
chmod +x make1GBswapfile.sh
./make1GBswapfile.sh
cd ~/systests
mkdir swap
cd swap
wget https://raw.githubusercontent.com/slowrunner/ROS2-GoPiGo3/main/systests/swap/use1GB.py
chmod +x use1GB.py
Test Swapfile
In another shell start monitoring memory use:
cd ~/ros2ws
./monitor.sh
In the first shell:
~/systests/swap/use1GB.py
and watch for the Mem: line to show very little available, and then return to full memory available
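Based on the use1GB.py traceback quoted later in this thread, the test is essentially a large numpy ones-array allocation. A hedged reconstruction, parameterized so it can be run at smaller sizes (the real script's exact shape and timing may differ):

```python
import time
import numpy as np

# Hedged reconstruction of the use1GB.py idea (the real script may differ):
# allocate ones so real pages get touched (possibly pushing into swap),
# hold them while you watch monitor.sh, then let them be freed on return.

def use_memory(n_mib=1024, hold_seconds=10):
    """Allocate n_mib MiB of uint8 ones, hold, then release."""
    arr = np.ones((1024, 1024, n_mib), dtype=np.uint8)
    time.sleep(hold_seconds)
    return arr.nbytes   # bytes actually touched

if __name__ == "__main__":
    print(use_memory())  # default ~1 GiB, like the original use1GB.py
```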
Place bot square to a walled corner (suggest 50cm from front wall, 50cm from side wall)
Measure distance from a front wall to center of wheels
Front Distance (wall to center of wheels) _____ mm
Measure distance from side wall to center of bot between the wheels
Side Distance (wall to center of bot between wheels) _____ mm
./start_robot_xxxx.sh on robot
Start rviz2, load RobotModel with robot_xxxx urdf
Select rviz2:Move Camera
mouse-drag down to be looking straight down on bot
shift+drag the mouse to put the grid cross under the wheels, in the center of the big yellow circle in the center of the window
Select rviz2:Measure
left click on grid crossing at center of bot
left click on front wall scan dot on the grid line
Note distance to front wall displayed in lower left
(This is base_link-to-laser_frame X-distance plus /scan distance: laser_frame to front wall)
Visualization Distance from center of bot (base_link) to front wall: ______ mm
right click to reset/clear measurement
left click on grid crossing at center of bot/grid cross
left click on side wall scan dot on the grid line
Note distance to side wall displayed in lower left
(This is base_link-to-laser_frame Y-distance plus /scan distance: laser_frame to side wall)
Visualization Distance from center of bot (base_link) to side wall: ______ mm
If the displayed measurement is larger than the physical measurement,
decrease the appropriate x or y value (x to front wall, y to side wall)
in the robot’s URDF file (~/ros2ws/src/ros2_gopigo3_node/urdf/xxxx.urdf and in ros2desk/xxxx.urdf)
If the displayed measurement is smaller than the physical measurement,
increase the appropriate x or y value (x to front wall, y to side wall)
in the robot’s URDF file (~/ros2ws/src/ros2_gopigo3_node/urdf/xxxx.urdf and in ros2desk/xxxx.urdf)
./rebuild.sh on the robot
Load the new URDF file in rviz2, then click Reset button in lower left
Retest measurements
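The adjust-and-rebuild loop above boils down to simple arithmetic: the rviz2-displayed distance is the URDF origin offset plus the /scan range, so the correction to the offset is the physical measurement minus the displayed one. A sketch of that step (my helper, not part of the image):

```python
# Helper for the URDF tuning step above (my sketch, not part of the image).
# displayed = URDF origin offset + /scan range, so the corrected offset is
# current + (physical - displayed).

def corrected_origin_mm(current_origin_mm, physical_mm, displayed_mm):
    """New base_link->laser_frame offset (x for front wall, y for side wall).

    If rviz2 displays a larger distance than the tape measure, the offset
    shrinks; if smaller, it grows. URDF values are in metres, so divide
    the result by 1000 before editing xxxx.urdf.
    """
    return current_origin_mm + (physical_mm - displayed_mm)
```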
Between tuning the gpg3_config.json wheel-diameter and wheel-base, the updated ydlidar.yaml params that give 100% reliable readings, and verifying the URDF values for dave, the mapping and localization seem to be working quite well. The bot moves, then a few seconds later the localization updates the position and orientation in the map very well.
3MB was the remainder of the 1GB array of '1’s compressed into the /swapfile (that did not fit into the physical memory) … I think. I’m not actually sure, but since reject_compress_poor is 0, I am guessing the compression is working. I did the test with a use2GB on my 1.5GB free, and it succeeded while using only 687 MiB of swap. I also ran the full dave and image pub plus use1GB and use2GB, which ran the CPU load up to 17! and caused a temporary LIDAR "[ydlidar_ros2_driver_node-1] [YDLIDAR ERROR]: Device Tremble" message, but everything kept running after the use-memory programs returned the memory. Load dropped back down, and available memory appears fully returned. Looking pretty robust on the Pi4 2GB using 3.6GB of memory.
@KeithW This fix may be especially important for Finmark with the 1GB Pi3B config.
OK- I’ve actually been back for a while, but just had time to work on this today.
Initial boot was slow. And then I did an apt update/full-upgrade, which I realize was stupid after I did it - just a holdover habit from running Ubuntu on laptops. That took forever.
But I can ssh in. And the communication and motor test (sans wheels) was successful. I had a keyboard and monitor attached just in case, but tomorrow I can try it untethered with the wheels on. But based on today’s performance I worry that it’ll be really slow.
/K
Finally had a chance to play with Finmark. I applied all of the updates above. When I try the swap fix I get an error:
Traceback (most recent call last):
File "/home/ubuntu/systests/swap/./use1GB.py", line 13, in <module>
arr = np.ones((1024,1024,1024, 1), dtype=np.uint8)
File "/usr/lib/python3/dist-packages/numpy/core/numeric.py", line 204, in ones
a = empty(shape, dtype, order)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 1.00 GiB for an array with shape (1024, 1024, 1024, 1) and data type uint8
Not sure if that has to do with me being on an RPi 3B+, not a 4?
/K
PS - on the bright side with the wheels on the motor test worked as expected
Everything worked. Driving from the keyboard was fine, and overall didn’t seem too slow (or too much latency). I ssh’d in from Ubuntu using WSL on my Windows desktop.
So - so far so good with the RPi3B+ running ROS2.
Next step - check out the lidar. But first, since we’re now using Ubuntu 22.04 Jammy/ROS2 Humble, and since I got a new laptop this fall, I took the opportunity to repurpose the laptop I was previously using for ROS and have put Jammy Jellyfish on my former personal laptop. Now I need to install ROS2 Humble on that laptop. Although I should probably look into trying WSL on my desktop computer, or even trying ROS2 on Win11. Lots of rabbit holes to dive down.
Perhaps the maximum memory allocatable to a single process is the size of physical memory; I don’t really know whether an entire process must fit in physical memory. If so, you might be able to run three processes that each used, say, half a gig: each one would fit in memory by itself, and two could be swapped out while any one was executing. But I thought I tested a 2GB and a 1GB allocation with my 2GB Pi4, so I expected that you would be able to run the use1GB.
You rebooted and saw free -h showing 1GB swap free?
I think ROS itself works well anywhere, but the visualization tools and sim have problems, from what I have read. I didn’t pay too much attention, because they say they only officially test on the recommended Ubuntu release and don’t even recommend mixing a ROS version with a later Ubuntu version than it was tested on.
Very interested to see if you get 100% readings with my config file parameters set. Did I document using the LiDAR test I created? I was going so quickly to get it released that I don’t recall what I documented. There are some things on the image I didn’t think users would need initially.