Hi all,

I am interested in VSLAM for my BrickPi3 robot Charlie.

CyclicalObsessive mentioned RTAB-Map, which looks very interesting. I was wondering what people’s experiences are with this. Is it straightforward to use? Does it work well?

Charlie is presently a monocular-camera robot, although I do have spare cameras, so I could probably give him stereo vision. But they are cheapish USB cameras with no depth information. Is this likely to work?

The documentation seemed clear that a monocular camera would need depth info, and that stereo is also supported, but I wasn’t clear whether stereo removes the need for separate depth info or whether it is still needed in that case as well.



I do not have any solid info, but I have seen comments suggesting some folks have run RTABmap with a monocular RGB camera, as well as the usual grayscale stereo, and some have combined monocular RGB with active 3D time-of-flight IR ranging. It apparently can also fuse 2D LIDAR scan data into the algorithm.

I saw a very impressive demo of RTABmap running on several CUDA graphics cards with a wireless stereo camera - but, while impressive, that is far from my desired “autonomous mobile processing”.

With the three-camera Oak-D-Lite performing the depth-from-stereo neural net (and an object-recognition NN that is not used), RTABmap seems to need only 50% to 75% of my Pi5 processor, but I have not “tuned” the configuration at all. (I actually have not figured out how to generate a correct /tf for my cameras yet, either.)
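On the /tf side, a minimal sketch of what a fixed camera mount usually needs in ROS 2 - the frame names and offsets below are placeholders, not the actual geometry of my robot or Charlie’s:

```shell
# Publish a static transform from the robot base to the camera frame.
# Offsets are in metres / radians; --frame-id and --child-frame-id must
# match whatever frames the camera driver and RTABmap expect.
ros2 run tf2_ros static_transform_publisher \
  --x 0.10 --y 0.0 --z 0.15 \
  --roll 0.0 --pitch 0.0 --yaw 0.0 \
  --frame-id base_link --child-frame-id oak_camera
```

For a permanent setup this usually goes into a URDF or a launch file instead of being run by hand.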

Keegan Neave has three cameras hooked to a Luxonis image processor to feed his NE-Five (named after Johnny 5 of the movie Short Circuit).

The RTABmap forum is probably the best place to haunt for answers.


Thank you for this.

I’ve had a chance to look at more of the RTAB-Map documentation. From what I have understood, it looks as if it does need depth information. I did play around with some neural nets for depth estimation from a monocular camera. They seemed quite good at giving relative depth (i.e. this object is closer than that one), but not so good at absolute depth (i.e. this object is about 48 cm away).
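For what it’s worth, relative depth can sometimes be pulled toward metric depth if a few real distances are known (e.g. tape-measured to objects in view). A minimal sketch, with entirely hypothetical calibration numbers, fitting a scale and shift by least squares:

```python
def fit_scale_shift(relative, metric):
    """Least-squares fit of metric = scale * relative + shift."""
    n = len(relative)
    mean_r = sum(relative) / n
    mean_m = sum(metric) / n
    cov = sum((r - mean_r) * (m - mean_m) for r, m in zip(relative, metric))
    var = sum((r - mean_r) ** 2 for r in relative)
    scale = cov / var
    shift = mean_m - scale * mean_r
    return scale, shift

# Hypothetical calibration points: network's relative-depth value at a
# pixel vs. the tape-measured distance in cm (larger value = closer here).
rel = [0.20, 0.45, 0.80]
met = [150.0, 90.0, 40.0]
scale, shift = fit_scale_shift(rel, met)

# Rough metric estimate for a new pixel with relative value 0.60:
print(scale * 0.60 + shift)
```

This assumes the network’s output is roughly affine in true depth, which is often only approximately true, so it’s a rough correction rather than a real calibration.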

I like the pictures, by the way; your disparity map looks like it’s working well, from what I can tell.


His disparity map - super impressive that it is running with a giant wide-angle lens.