In ROS there is a URDF file (Unified Robot Description Format) which allows the robot visualization application (RViz) to show a symbolic robot, and allows the ROS robot_state_publisher to tell subscribers where important parts of the robot are located in the environment as the robot moves.
These files are simple in concept but amazingly complex to get the numbers correct, so that every part of the robot appears in the correct place and the sensor data is accurate. (A LIDAR reports a wall is 3.0 meters from the LIDAR, but the LIDAR may be 2 centimeters behind the center of the robot, so the URDF allows LIDAR data subscribers to know the wall is actually only 2.98 meters from the center of the robot, and even less from the front of the robot.)
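That offset lives in the URDF as a joint origin between the sensor frame and the robot's base frame. A minimal sketch (link and joint names are hypothetical, not from my actual file) mounting a LIDAR 2 cm behind the center of base_link:

```xml
<!-- Hypothetical fragment: mounts the lidar frame 2 cm behind the robot
     center. tf then lets any subscriber transform LIDAR ranges into
     base_link coordinates automatically. -->
<link name="lidar_link"/>

<joint name="lidar_joint" type="fixed">
  <parent link="base_link"/>
  <child link="lidar_link"/>
  <!-- x is forward in the ROS convention (REP 103),
       so x = -0.02 m places the lidar 2 cm behind center -->
  <origin xyz="-0.02 0 0.10" rpy="0 0 0"/>
</joint>
```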
Over the years I have created a number of GoPiGo3 URDF files from:
What a total pain to convert the URDF to an SDF for Gazebo!
Next I have to see if GoPi5Go-Dave in Gazebo actually produces a laser /scan topic with distance data and moves around to /cmd_vel topics. I don’t think I’m going to worry about the RGB camera simulation.
Update: he is producing scans according to Gazebo but not on the simple /scan topic so Rviz2 won’t display them, and I can’t drive him around yet - details.
It appears I will need a lot of luck. I managed to get my GoPi5Go-Dave to generate /scan topics, but Rviz2 doesn’t want to display them. I think the Gazebo sim is publishing in bursts rather than evenly spacing out the messages. ROS reports the average is 5 Hz like I asked for, but the minimum time between messages is 0.2 seconds and the maximum is 5 seconds.
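For reference, the scan rate is requested in the SDF sensor block. This is a hedged sketch from memory of the Gazebo Sim conventions (sensor name, topic, and range values are assumptions, not my actual file); note that even when the average update_rate is honored, the bridge into ROS can still deliver messages unevenly if the sim runs slower than real time:

```xml
<!-- Hypothetical SDF fragment: a lidar sensor asked to publish at 5 Hz
     on a "scan" topic inside Gazebo. -->
<sensor name="lidar" type="gpu_lidar">
  <topic>scan</topic>
  <update_rate>5</update_rate>
  <ray>
    <scan>
      <horizontal>
        <samples>360</samples>
        <min_angle>-3.14159</min_angle>
        <max_angle>3.14159</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.12</min>
      <max>3.5</max>
    </range>
  </ray>
</sensor>
```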
Considering I don’t feel that I need (and don’t particularly like) simulation, preferring to try everything on my physical robot, I am once again devoting time and effort to building reusable software that no one asked for, and that no one is likely to use anytime soon.
I keep getting distracted by interesting challenges. In the long run I always learn a lot, but I don’t need that particular knowledge at the moment.
At least I appreciate the effort you’ve put in - not just on Dave but helping me with my projects, boldly porting things where no GoPiGo has gone before, and. . .
I’m waiting for the opportunity to try some of your ROS stuff, though I have no idea what I’ll DO with it when I get it.
Like how is GoPi5Go-Dave able to float in the air? (Probably why he isn’t listening to my drive commands - his spinning wheels aren’t touching the ground.)
Longing for the days when I was running debugged Dexter GoPiGo3 examples…
UPDATE: Yes! The pose z value was 0 for my base_collision and base_inertia boxes (“box size z=0.37”), so the collision box extended below the floor and was preventing movement. (Every simulated shape has its {0,0,0} origin at its center.)
I changed the z value to 0.19 to put both boxes 5 mm off the floor, and now the bot responds correctly to /cmd_vel topics.
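In URDF terms the fix looks something like this (a hypothetical sketch; the x/y box dimensions are made up, only the 0.37 height and 0.19 offset are from my file). Since the box origin is its center, the origin z must be at least half the box height to keep the bottom face above the floor:

```xml
<!-- Hypothetical fragment: a 0.37 m tall collision box. With the origin
     at the box center, z = 0.185 would rest it exactly on the floor;
     z = 0.19 leaves 5 mm of clearance so the box never "collides" with
     the ground plane and blocks movement. -->
<collision name="base_collision">
  <origin xyz="0 0 0.19" rpy="0 0 0"/>
  <geometry>
    <box size="0.22 0.12 0.37"/>
  </geometry>
</collision>
```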
BUT now the LIDAR is obstructed by false “collisions”. The file has 611 lines of possible errors. I think I’ve found over a hundred “issues” so far.
Sure, but I would need more GPU than I can provide in my Ubuntu virtual machine on my Mac. Most of my usage for simulation will either be from a bird’s-eye view of the entire house, where Dave will look like a tiny, tiny rectangle, or looking through “Dave’s eyes” at what’s in front of him, so displaying all the data points for a realistic Dave would only slow the sim.
Well, it doesn’t have to be a fully rendered, Blender compatible, 4k HD image. Maybe a few vertical lines here and there would make it look more like a “bot”?
Then again, if the effort wouldn’t be worth the time spent - don’t bother. We all have the things we go all “OCD” over, and the visual interface/UI is my big bugaboo. (Which is why I color-code everything. . .)
Ah, be not so fast to conclude that he “responds correctly”. When I told Dave to move forward, the caster ball moved correctly with Dave. When I told Dave to spin, the caster ball disassociated itself from Dave and went out to lunch.
It turns out telling the Gazebo simulation that a joint is of type “ball” does not result in three-axis movement. I would have to make three virtual caster links, each with an inertia specification, one for each axis, and three virtual joints, one per axis, and Gazebo would combine them into one ball that can move in any direction.
Baloney - I set the joint as type “fixed” and let Dave drag his caster ball everywhere he goes without it rolling (away).
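The workaround is a one-liner in the URDF (a hedged sketch; the link names and origin values here are hypothetical placeholders):

```xml
<!-- Hypothetical fragment: pin the caster ball to the body instead of
     modeling three stacked joints for a true ball joint. Dave simply
     drags the ball; it never rolls away. -->
<joint name="caster_joint" type="fixed">
  <parent link="base_link"/>
  <child link="caster_link"/>
  <origin xyz="-0.08 0 0.01" rpy="0 0 0"/>
</joint>
```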
It has taken a while, but I have finally caught up to where @KeithW was years ago using the HandsOnROS book. That was such a great book that went out of date way too quickly.
In light of the ROS community moving to the new Ignition Gazebo Sim system, I went exploring yesterday to see if Turtlebot3 or Create3 have done the work to move to the latest ROS2 and the new sim. Discovered Turtlebot3 development/support appears to have dried up except for their forum.
Discovered the iRobot team is still going strong keeping the Create3 software stack current including complete support for the new sim.
Create3 hardware cannot help me, but I am pretty sure I will be able to “lift” their latest Create3 simulation package to create a new “gpg3_ignition” sim package for GoPi5Go-Dave.
That’s smart. Gazebo has a lot of weird configuration issues, to be sure. You can probably set the friction of the caster ball to 0. Or, if you wanted a little more authenticity, set it to some appropriately low value.
/K
It was a great book, and got me a long way. If there were something like it for ROS2 I’d probably get back into the swing of things. As it is, it doesn’t seem worth the energy required to get back to where I was with ROS and Finmark, since I’d be on an old and mostly unsupported platform.
/K