I have the short-range, unidirectional-beam VOR (two LEDs with a separator), but Carl is not great at seeing them until he gets close. The have-to-turn-around part is quite limiting, as you mentioned.
The wall following is a whole separate interest as you know - I’ve been thinking on that forever.
Have you considered replacing them with two IR lasers? Or, if you want a visible light beam, two red laser pointers or a couple of bright LEDs behind collimator lenses. Or perhaps one beam?
Actually, I’ve been avoiding the whole “finish the docking challenge”; I just can’t seem to get motivated to finish it. As “Cyclical Obsessive”, I am sure I will get back to it, just not sure when.
We have a lot more in common than either of us may wish to admit. I have the same bursts of energy followed by periods of lethargy. This whole Ukrainian nonsense has made motivation even more difficult - but I’m getting there.
Doesn’t that contradict the data you just captured a few days ago? I was golden out to about 60 cm or so (I didn’t try further - I didn’t have the room).
No - the max distance in the old study was the max distance at which the sensor could provide reliable wall detection. The IR sensor can provide reliable “normal to wall” distance readings out to 48 inches, but I did not feel it could reliably provide the two off-normal readings needed to enable “wall detection”.
This was part of my initial study of GoPiGo3 “wall detection”. Before I wrote the wall following, I wanted Carl to recognize there was a wall worth following, not just an obstacle in its path.
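For what it’s worth, here’s a rough sketch of that three-reading wall check - assuming a distance sensor on a pan servo and the easygopigo3 helpers; the angles, tolerance, and range limit are illustrative guesses, not tuned values:

```python
# Rough sketch of "is there a wall worth following?" using one distance sensor
# on a pan servo: one normal reading plus two off-normal readings that must fit
# flat-surface geometry.  Assumes the easygopigo3 helpers; the angles, tolerance
# and range limit are illustrative guesses.
import math
import time
from easygopigo3 import EasyGoPiGo3

OFF_NORMAL_DEG = 45          # how far off-normal the two side readings are taken
TOLERANCE = 0.20             # allow 20% disagreement with flat-plane geometry
MAX_RELIABLE_MM = 1220       # ~48 inches, the reliable normal-reading limit

egpg = EasyGoPiGo3()
servo = egpg.init_servo("SERVO1")
ds = egpg.init_distance_sensor()

def read_at(servo_angle_deg, settle=0.25):
    """Point the sensor, let the servo settle, return one reading in mm."""
    servo.rotate_servo(servo_angle_deg)
    time.sleep(settle)
    return ds.read_mm()

def looks_like_wall(center_deg=90):
    """True if the normal and two off-normal readings fit a flat surface."""
    d_normal = read_at(center_deg)
    d_left = read_at(center_deg - OFF_NORMAL_DEG)
    d_right = read_at(center_deg + OFF_NORMAL_DEG)
    servo.rotate_servo(center_deg)            # re-center when done

    if d_normal > MAX_RELIABLE_MM:
        return False                          # too far away to trust the geometry

    # For a flat wall, an off-normal reading should be about d_normal / cos(angle).
    expected = d_normal / math.cos(math.radians(OFF_NORMAL_DEG))
    return all(abs(d - expected) / expected < TOLERANCE for d in (d_left, d_right))

if __name__ == "__main__":
    print("Wall worth following?", looks_like_wall())
```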
I’ve thought about this “angled sensor” concept a lot. Distinguishing rotation away from the wall due to non-straight driving from straight driving that isn’t at 90° to the wall is not possible, but it may not matter: the correction for either is a slight turn toward the wall.
And at a 45-degree angle, it should detect the end of the wall at a corner or an opening, but it may miss a chair leg, so I’m thinking the bot needs to scan for a no-obstacle distance along the wall before initiating the 45-degree wall following.
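And a rough sketch of that 45-degree follower with the pre-check - again assuming the easygopigo3 helpers; the set-point, gain, and “wall on the right” convention are placeholders, not anything I’ve tuned:

```python
# Rough sketch of wall following with the distance sensor fixed at 45 degrees
# toward a wall on the bot's right.  Assumes easygopigo3's EasyGoPiGo3 and its
# distance sensor/steer() helpers; the numbers are illustrative, not tuned.
import math
import time
from easygopigo3 import EasyGoPiGo3

WALL_SETPOINT_MM = 200                                 # desired normal distance to the wall
SENSOR_45_SETPOINT = WALL_SETPOINT_MM * math.sqrt(2)   # what the 45-degree sensor should read
GAIN = 0.05                                            # steer percent per mm of error
LOST_WALL_MM = 2 * SENSOR_45_SETPOINT                  # reading this long => corner or opening

egpg = EasyGoPiGo3()
ds = egpg.init_distance_sensor()

def clear_to_follow(samples=5):
    """Crude stand-in for the 'scan along the wall first' step: several readings
    in a row must look wall-like (not a chair leg up close, not empty space)."""
    return all(0.5 * SENSOR_45_SETPOINT < ds.read_mm() < LOST_WALL_MM
               for _ in range(samples))

def follow_wall():
    if not clear_to_follow():
        print("No clean wall in view - not starting")
        return
    try:
        while True:
            reading = ds.read_mm()
            if reading > LOST_WALL_MM:     # end of wall (corner/opening) - stop for now
                break
            error = reading - SENSOR_45_SETPOINT
            correction = max(-30, min(30, GAIN * error))
            if correction >= 0:
                # Too far (or angled away): slow the right wheel, ease toward the wall.
                egpg.steer(100, 100 - correction)
            else:
                # Too close: slow the left wheel, ease away from the wall.
                egpg.steer(100 + correction, 100)
            time.sleep(0.1)
    finally:
        egpg.stop()

if __name__ == "__main__":
    follow_wall()
```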
You are correct it is not a simple behavior, but really, your Remote Camera Robot project is “order - moon shot” while the wall sentry is “order - two burgers at McDonald’s” (unless the bot is in Russia I guess.)
I’d rather order the burgers from Burger King - I think they taste better. And, AFAIK, they’re still open; KFC too, as far as I know.
Likewise, in the couple of shopping malls I checked this past weekend, the McDonald’s was still open - maybe the mall owners told them that if McDonald’s pulled out, they’d sell the space to someone else - or re-negotiate a “penalty” rent when they decide to come back. Of course, that was then, this is now, and who knows what’s going on tomorrow.
Sigh. . . I guess that’s a matter of opinion. Yes, my project has been “chewy”, but then again all this ROS stuff has my head spinning - and as far as I am concerned, if my project is a moon-shot, you’re trying to colonize Mars!
I had the advantage that most of the really hard stuff - the servers and the streaming camera - came pre-programmed. I just added fluff.
Good luck! My hat’s off to you!
It’s not possible to detect and, as you said, it doesn’t really matter. That also solves the “curved wall problem” - you simply adjust toward or away as necessary.
Try not to over-engineer this and you’ll be happier.
So then, what’s happening with the fancy camera and the LIDAR? All that was ROS stuff, right? Or, do you have a version of di_sensors I haven’t seen yet?
The LIDAR is ROSbot Dave only. The Oak-D-Lite is LegacyDave only while I learn how to program it, and then I’ll learn to program the robot with what I learned on the Oak thingy.
And since that all sounds like more than my brain can wrap around right now, both are being ignored while Dave learns about VSCode and the US sensor, and I think about things that sound easier, like wall following.
Distance Readings: [259]
Average Reading: 259 mm
Minimum Reading: 259 mm
Maximum Reading: 259 mm
Std Dev Reading: 0 mm
Three SD as a percent of reading: 0.0 %
Distance Readings: [259, 259]
Average Reading: 259 mm
Minimum Reading: 259 mm
Maximum Reading: 259 mm
Std Dev Reading: 0 mm
Three SD as a percent of reading: 0.0 %
Distance Readings: [259, 259, 262]
Average Reading: 260 mm
Minimum Reading: 259 mm
Maximum Reading: 262 mm
Std Dev Reading: 1 mm
Three SD as a percent of reading: 1.6 %
Distance Readings: [259, 259, 262, 258]
Average Reading: 260 mm
Minimum Reading: 258 mm
Maximum Reading: 262 mm
Std Dev Reading: 2 mm
Three SD as a percent of reading: 1.7 %
Distance Readings: [259, 259, 262, 258, 257]
Average Reading: 259 mm
Minimum Reading: 257 mm
Maximum Reading: 262 mm
Std Dev Reading: 2 mm
Three SD as a percent of reading: 1.9 %
This is in a dim room looking at a piece of cardboard. So very consistent readings. The only downside is that I had the sensor at ~25 cm, so it’s reading about a cm long.
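For reference, a minimal sketch of how a printout like the one above could be produced - just Python’s statistics module over a list of readings; the actual test script may differ, and the readings list here is simply the values from the log:

```python
# Rough sketch of the running-stats printout above, using Python's statistics
# module.  The readings list is just the values from the log; swap in real
# sensor reads (e.g. an EasyDistanceSensor.read_mm() loop) as needed.
import statistics

readings = []
for new_reading in [259, 259, 262, 258, 257]:   # stand-in for live read_mm() calls
    readings.append(new_reading)
    avg = statistics.mean(readings)
    sd = statistics.pstdev(readings)            # population std dev; 0 for a single sample
    print(f"Distance Readings: {readings}")
    print(f"Average Reading: {round(avg)} mm")
    print(f"Minimum Reading: {min(readings)} mm")
    print(f"Maximum Reading: {max(readings)} mm")
    print(f"Std Dev Reading: {round(sd)} mm")
    print(f"Three SD as a percent of reading: {round(3 * sd / avg * 100, 1)} %")
```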
/K