I’ve thought about this “angled sensor” concept a lot. Distinguishing rotation away from the wall (from non-straight driving) from straight driving that just isn’t at 90° to the wall is not possible, but it may not matter: the correction for either is a slight turn toward the wall.
And at a 45-degree angle it should detect the end of the wall at a corner or an opening, but it may miss a chair leg, so I’m thinking the bot needs to scan for a no-obstacle distance along the wall before initiating the 45-degree wall following. Something like the sketch below.
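A minimal sketch of that pre-scan, with hypothetical `read_mm()` and `drive_cm()` helpers standing in for the real GoPiGo calls (none of this is tested on the bot):

```python
import statistics

def read_mm() -> int:
    """Stand-in: distance from the 45-degree sensor, in mm."""
    raise NotImplementedError

def drive_cm(cm: float) -> None:
    """Stand-in: drive straight ahead the given distance."""
    raise NotImplementedError

def wall_is_clear(target_mm: int, tolerance_mm: int = 50,
                  samples: int = 5, step_cm: float = 5.0,
                  scan_length_cm: float = 50.0) -> bool:
    """Creep along the wall, checking that the angled sensor stays near
    the expected wall distance (no chair legs, no openings)."""
    traveled = 0.0
    while traveled < scan_length_cm:
        reading = statistics.median(read_mm() for _ in range(samples))
        if abs(reading - target_mm) > tolerance_mm:
            return False   # something (or nothing) where the wall should be
        drive_cm(step_cm)
        traveled += step_cm
    return True            # clear run - OK to start the 45-degree follow
```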
You are correct that it is not a simple behavior, but really, your Remote Camera Robot project is “order - moon shot” while the wall sentry is “order - two burgers at McDonald’s” (unless the bot is in Russia, I guess).
I’d rather order the burgers from Burger King - I think they taste better. And AFAIK they’re still open; KFC too.
Likewise, in the couple of shopping malls I checked this past weekend, the McDonald’s was still open - maybe the mall owners told them that if McDonald’s pulled out, they’d sell the space to someone else, or re-negotiate a “penalty” rent when they decide to come back. Of course, that was then, this is now, and who knows what’s going on tomorrow.
Sigh... I guess that’s a matter of opinion. Yes, my project has been “chewy”, but then again all this ROS stuff has my head spinning - and as far as I am concerned, if my project is a moon-shot, you’re trying to colonize Mars!
I had the advantage that most of the really hard stuff - the servers and the streaming camera - came pre-programmed. I just added fluff.
Good luck! My hat’s off to you!
It’s not possible to detect, and (as you said) it doesn’t really matter. That also solves the “curved wall problem”: you simply adjust toward or away as necessary.
Try not to over-engineer this and you’ll be happier.
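For the “adjust toward or away” part, a dumb proportional nudge is probably all it needs - a sketch, again with hypothetical `read_mm()` and `steer()` stand-ins (wall assumed on the left; tune the gain on the real bot):

```python
def read_mm() -> int:
    """Stand-in: distance from the 45-degree sensor, in mm."""
    raise NotImplementedError

def steer(bias: float) -> None:
    """Stand-in: -1.0 = hard left (toward the wall), +1.0 = hard right."""
    raise NotImplementedError

SETPOINT_MM = 250   # desired 45-degree sensor reading
KP = 0.004          # proportional gain - tune on the bot

def follow_step() -> None:
    """One control-loop tick: reading too long -> ease toward the wall,
    too short -> ease away. Curved walls come along for free."""
    error = read_mm() - SETPOINT_MM
    bias = max(-1.0, min(1.0, -KP * error))
    steer(bias)
```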
So then, what’s happening with the fancy camera and the LIDAR? All that was ROS stuff, right? Or, do you have a version of di_sensors I haven’t seen yet?
The LIDAR is ROSbot Dave only. The Oak-D-Lite is LegacyDave only while I learn how to program it; then I’ll program the robot with what I’ve learned on the Oak thingy.
And since all that sounds like more than my brain can wrap around right now, both are being ignored while Dave learns about VS Code and the US sensor, and I think about things that sound easier, like wall following.
```
Distance Readings: [259]
Average Reading: 259 mm
Minimum Reading: 259 mm
Maximum Reading: 259 mm
Std Dev Reading: 0 mm
Three SD as a percent of reading: 0.0 %

Distance Readings: [259, 259]
Average Reading: 259 mm
Minimum Reading: 259 mm
Maximum Reading: 259 mm
Std Dev Reading: 0 mm
Three SD as a percent of reading: 0.0 %

Distance Readings: [259, 259, 262]
Average Reading: 260 mm
Minimum Reading: 259 mm
Maximum Reading: 262 mm
Std Dev Reading: 1 mm
Three SD as a percent of reading: 1.6 %

Distance Readings: [259, 259, 262, 258]
Average Reading: 260 mm
Minimum Reading: 258 mm
Maximum Reading: 262 mm
Std Dev Reading: 2 mm
Three SD as a percent of reading: 1.7 %

Distance Readings: [259, 259, 262, 258, 257]
Average Reading: 259 mm
Minimum Reading: 257 mm
Maximum Reading: 262 mm
Std Dev Reading: 2 mm
Three SD as a percent of reading: 1.9 %
```
This is in a dim room looking at a piece of cardboard, so the readings are very consistent. The only downside is that I had the sensor at ~25 cm, so it’s reading about a cm long.
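For anyone who wants to reproduce the arithmetic: the printed figures match Python’s `statistics.mean` and `statistics.pstdev` (population standard deviation) over the accumulated readings, with 3σ expressed as a percent of the mean. A sketch, with `read_mm()` standing in for the actual sensor call:

```python
import statistics

def read_mm() -> int:
    """Stand-in for the actual sensor read, in mm."""
    raise NotImplementedError

readings = []
for _ in range(5):
    readings.append(read_mm())
    mean = statistics.mean(readings)
    sd = statistics.pstdev(readings)   # population std dev matches the output
    print(f"Distance Readings: {readings}")
    print(f"Average Reading: {round(mean)} mm")
    print(f"Minimum Reading: {min(readings)} mm")
    print(f"Maximum Reading: {max(readings)} mm")
    print(f"Std Dev Reading: {round(sd)} mm")
    print(f"Three SD as a percent of reading: {3 * sd / mean * 100:.1f} %")
```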
/K