I got the GoPiGo3 specifically to learn ROS, and have been happy with it as a learning platform. But there really is a lot to learn with ROS. As I played around with SLAM and AMCL navigation I decided it would be easier to learn if I had a more controlled environment than my den/kitchen/etc. (chair legs seem to be an issue for the LIDAR since they’re small). So I used some plywood I had in the basement to make a small arena for learning.
I’m able to set navigation goals, etc. I'll be adding internal obstacles soon in order to learn how to tune the local and global costmaps. One thing I noticed while running AMCL was that the robot model seemed disproportionately large.
The physical robot certainly doesn’t take up a third of the arena’s width. Looking at the gopigo3_actual.gazebo file, the dimensions in the URDF did seem to be larger than the physical robot, so I tweaked them to give ROS a clearer idea of my robot’s actual size before I start adding obstacles.
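In case it helps anyone looking at the same thing, the overall size comes from the `<box>` dimensions inside the URDF's `<visual>` and `<collision>` geometry. Here's a sketch of the kind of edit I mean; the link name and numbers are just placeholders, not the actual GoPiGo3 values:

```xml
<!-- Sketch only: shrinking the base link's box to match the physical
     robot.  Link name and dimensions are placeholders, not the real
     GoPiGo3 values. -->
<link name="base_link">
  <visual>
    <geometry>
      <!-- box size is "x y z" (length width height) in meters -->
      <box size="0.22 0.12 0.08"/>
    </geometry>
  </visual>
  <collision>
    <geometry>
      <box size="0.22 0.12 0.08"/>
    </geometry>
  </collision>
</link>
```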
I’ve attached my current version of the .gazebo file (with an extra .txt extension so the forum will allow the upload) in case it’s of interest. It seems better, but I’m sure it’s still not quite correct. I’ve started working through the tutorials at the Robot Ignite Academy so that I have a deeper understanding of what I’m doing. If I tweak the file further I’ll provide updates here.
One other note in case it happens to you. I had launched the YDlidar launch file on the robot, but waited a while before running the launch file on my laptop. When I finally ran it, I got errors that there was no transform between my robot base and the map frame. What finally solved it was shutting down ROS on the robot (Ctrl-C in the launch terminal) and then starting the robot launch file and the laptop launch file in quick succession. Not sure why that makes a difference, but it seemed to.
/K
Finally finished tweaking the URDF file. It seems more realistic in terms of size, and maybe a bit better in navigation, though I'm still working on that part. Looking at it, I should probably make the visual box shorter than the collision box.
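Since the visual and collision geometries are declared separately in the URDF, that should be an easy change; something like this (placeholder numbers again):

```xml
<!-- Placeholder dimensions: visual box shorter than the collision box -->
<visual>
  <geometry>
    <box size="0.22 0.12 0.06"/>  <!-- what you see in RVIZ/Gazebo -->
  </geometry>
</visual>
<collision>
  <geometry>
    <box size="0.22 0.12 0.09"/>  <!-- what physics/navigation uses -->
  </geometry>
</collision>
```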
I was able to tweak the colors as well so they look better in RVIZ and Gazebo.
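For reference, the two programs take their colors from different places: RVIZ uses the `<material>` inside the URDF's `<visual>`, while Gazebo ignores that and wants its own `<gazebo>` tag, so the color ends up being set twice. A sketch (link name and colors are placeholders):

```xml
<!-- RVIZ color: defined inside the URDF visual -->
<visual>
  <geometry>
    <box size="0.22 0.12 0.06"/>
  </geometry>
  <material name="gopigo_red">
    <color rgba="0.8 0.0 0.0 1.0"/>
  </material>
</visual>

<!-- Gazebo color: set separately with a gazebo reference tag -->
<gazebo reference="base_link">
  <material>Gazebo/Red</material>
</gazebo>
```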
This looks totally righteous - especially the test-bed you created. Excellent piece of work!
Though I do have to admit, I’m kind-of depressed. Charlie hasn’t moved in months (while I figure out the multi-boot process), and here you go with a robot that looks like an Apache Reconnaissance chopper!
If you decide to work with more than one O/S at a time, I’ll be glad to help you set up a PINNified multi-boot environment.
Yep - URDF (Unified Robot Description Format) is a way to describe a robot for simulation in Gazebo, RVIZ, and other programs. The files can get pretty complicated; these are actually toward the simpler end. The URDF itself isn't something you could send to a 3D printer, but some of the more detailed visual elements (e.g. the LIDAR) are actually mesh files that could be.
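For example, a mesh gets pulled into the URDF like this (the package and file names here are made up, not the actual paths in the GoPiGo3 files):

```xml
<!-- Hypothetical mesh-based visual; package and file name are
     placeholders, not the actual lidar mesh path -->
<visual>
  <geometry>
    <mesh filename="package://my_robot_description/meshes/lidar.dae"
          scale="1.0 1.0 1.0"/>
  </geometry>
</visual>
```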
I’m still learning ROS navigation. I believe the green arrows are AMCL's pose estimates (its particle cloud; each arrow represents one guess of where the robot might be). Obviously they're very scattered. If everything were working ideally, they'd rapidly converge to a tight cluster around the robot, so I clearly have a lot of tuning to do before navigation is reliable. But then that's the fun of it.
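For what it's worth, the spread and convergence of that arrow cloud is governed by AMCL's particle-filter parameters. A sketch of the kind of knobs involved, with illustrative values rather than tuned ones:

```xml
<!-- Illustrative AMCL settings; values are examples to show which
     knobs exist, not a tuned configuration -->
<node pkg="amcl" type="amcl" name="amcl">
  <param name="min_particles" value="100"/>
  <param name="max_particles" value="3000"/>
  <!-- distance (m) / rotation (rad) the robot must move before a
       filter update -->
  <param name="update_min_d" value="0.2"/>
  <param name="update_min_a" value="0.5"/>
  <!-- odometry noise model; larger values spread the cloud more -->
  <param name="odom_alpha1" value="0.2"/>
  <param name="odom_alpha2" value="0.2"/>
</node>
```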
/K
I’ve forked their project, and am now trying to figure out how to do a pull request; that's not something I've done before with GitHub. I don't know if they'll accept it since I rotated the lidar, but I guess they can undo that if they want to merge it.
Thanks - I hadn’t seen that. I’ve used SparkFun parts and been happy with them. I know they were very excited that the Mars Ingenuity helicopter used a lidar that JPL bought from SparkFun as an off-the-shelf part.
/K