ModRobotics: Does EasyDistanceSensorMutexes.py Example Work?

@cleoqc, any chance you might get to try running one of the sensor examples real quick?

Please try:

$ python3 ~/Dexter/DI_Sensors/Python/Examples/EasyDistanceSensorMutexes.py 

On my system, it causes a fatal I2C bus error that requires a cold boot (sudo shutdown -h, then a power cycle):

[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
Thread ID = 1974203488 with distance value = 0
Thread ID = 1984783456 with distance value = 0
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
[Errno 5] Input/output error
...

I cannot figure out a way to have two threads access the distance sensor.

Likewise, if I try to use the I2C_Mutex.Mutex() class or di_sensors.easy_mutex.ifMutexAcquire/Release in threads, I eventually end up with the same fatal I2C bus error.
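For what it's worth, the pattern I would expect to work is a single process-wide lock guarding every bus transaction. Here is a minimal sketch using only the standard-library threading.Lock and a fake sensor stand-in (FakeDistanceSensor and read_mm are made-up names for illustration; the real di_sensors EasyDistanceSensor talks to actual I2C hardware):

```python
import threading

# Hypothetical stand-in for the real sensor; the actual di_sensors
# EasyDistanceSensor performs I2C transactions against hardware.
class FakeDistanceSensor:
    def read_mm(self):
        return 150  # canned reading for illustration

sensor = FakeDistanceSensor()
i2c_lock = threading.Lock()   # one lock shared by every thread in this process
readings = []

def reader(name, count):
    for _ in range(count):
        with i2c_lock:        # serialize each bus transaction
            value = sensor.read_mm()
        readings.append((name, value))

threads = [threading.Thread(target=reader, args=("t%d" % i, 5)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Note that a threading.Lock only serializes threads within one process, which is all two threads need; the DI hardware mutex is aimed at protecting the bus across separate processes.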

1 Like

A callback routine that immediately returns in a manner similar to request_animation_frame?

This is Python, @jimrh; you and I both have no clue.

1 Like

As you once said, part of the official Python spec is:

“. . . this has been left as an exercise for the students” :wink:

1 Like

Although I’ve used Python for a while now, I’ve never done multithreading explicitly (I know it’s happening in ROS, but that’s a black box for me). Definitely something else to put on my “to learn” list.
/K

2 Likes

Yes, the producer/subscriber messaging mechanism of ROS is possibly the most important architectural pillar of ROS. It “de-couples” everything.

One of the most memorable concepts from my limited software coursework was the discussion of coupling and cohesion: cohesion being good and coupling being bad.
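The decoupling that the publish/subscribe model buys can be sketched in a few lines of plain Python (this is just the idea, not ROS itself; the Bus class and topic name below are invented for illustration):

```python
from collections import defaultdict

class Bus:
    """Toy publish/subscribe bus: publishers and subscribers share only
    a topic name, never a direct reference to each other."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in list(self._subs[topic]):
            cb(msg)

bus = Bus()
log = []
bus.subscribe("distance_mm", lambda mm: log.append(("logger", mm)))
bus.subscribe("distance_mm", lambda mm: log.append(("safety", mm)))
bus.publish("distance_mm", 150)   # neither subscriber knows who published
```

Adding or removing a subscriber never touches the publisher's code, which is exactly the de-coupling benefit: high cohesion inside each node, low coupling between them.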

Designing “complete” robots without the ROS foundation requires architectural choices that can be very significant, with nuanced impacts that are not understood until you are far down the chosen path.

Early in my software career, I was enamored with writing directly to the hardware, with no operating system and no code I had not written myself. If something was wrong, it was wrong in my code.

Very quickly I learned I could not “write it all”, and accepted compiler generated code and compiler runtime libraries, but still avoided operating systems. (My first job was to write those compiler runtime libraries actually.)

After 40 years of directly coding robot behaviors, the Raspberry Pi with its Debian Linux based OS and built-in SPI and I2C bus availability has rocketed me beyond my “write it all” comfort zone. The massive community support for advanced computing principles in Python opens up robot architecture decisions which I am totally unprepared to make.

Rather than trying to roll my own robot software architecture in increments, I probably should accept a move to ROS, BUT I want you “ROS-on-GoPiGo3” folk to find out if the GoPiGo3 hardware and DI/ModRobotics sensor/effector code can make a robot that is both aware and responsive with small reaction times.

Obstacle detection with only the camera and distance sensor, wheel stall detection, and body deflection from the horizontal plane (such as hitting a floor/rug transition) may need autonomic protection responses at the GoPiGo3 platform level (below the ROS layer).

In that case, a ROS-on-GoPiGo3 programmer may end up exactly where I am right now: trying to couple sensors to effectors tightly in a few important cases but loosely for the majority of needs. But then again, perhaps the ROS community, having gone down the path of adding platforms, will have templates for the platform-specific interface routines for every imaginable need.
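The kind of sub-ROS autonomic protection I have in mind might look like a tight reflex loop running below everything else. A minimal sketch, with an invented Reflex class and a fake sensor feed standing in for the real GoPiGo3 distance sensor and motor calls:

```python
import itertools
import threading
import time

STOP_DISTANCE_MM = 100   # hypothetical reflex threshold

class Reflex(threading.Thread):
    """Hypothetical autonomic layer: polls a distance reading in a tight
    loop and cuts the motors before any higher (ROS-level) logic reacts."""
    def __init__(self, read_distance, stop_motors, poll_s=0.01):
        super().__init__(daemon=True)
        self.read_distance = read_distance
        self.stop_motors = stop_motors
        self.poll_s = poll_s
        self.tripped = False

    def run(self):
        while not self.tripped:
            if self.read_distance() < STOP_DISTANCE_MM:
                self.stop_motors()   # protective action, no ROS round trip
                self.tripped = True
            else:
                time.sleep(self.poll_s)

# Demo with a fake sensor: readings fall as an obstacle approaches.
readings = itertools.chain([500, 300, 90], itertools.repeat(90))
actions = []
reflex = Reflex(lambda: next(readings), lambda: actions.append("stop"), poll_s=0.001)
reflex.start()
reflex.join(timeout=2)
```

On the real robot, read_distance and stop_motors would wrap the platform calls, and the higher ROS layer would only learn about the stop after the fact.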

1 Like

Beyond one college programming course (in Pascal of all things), my programming/comp sci education has been very ad hoc. I’ve been working on improving my ROS knowledge by taking online training on The Construct. One thing I just recently noticed is that there don’t seem to be specific ROS topics for the IMU output. I’m trying to figure out if the measurements are somehow being incorporated into /tf or /odom. So much more to learn…

It seems clear that for the GoPiGo3 with a Raspberry Pi you really need to be linked to a laptop for more advanced navigation, etc. That probably includes image analysis. Basic sensors like the distance sensor and wheel encoders can probably be processed locally with reasonable response times.

I still need to finish Part 4 of the Hands on ROS book, which focuses on machine learning, but I really wanted to get basic navigation down first.
/K

2 Likes