Thinking about it, an i2c controlled power interface may not be that difficult to implement.
The big design questions are:
How do we connect this device in series with the devices to be controlled?
a. Two Grove connectors? (A pair for each channel.)
Grove connectors will increase the size of the interface considerably.
b. Smaller connectors on 0.1" or 0.05" centers?
c. Sets of pin headers on 0.1" or 0.05" centers?
Note that both of these solutions will allow the resulting PCB to be considerably smaller, at the expense of wiring. One possible solution is to use SparkFun-style Qwiic connectors on the PCB and Grove-to-Qwiic cables to connect it.
How many ports would be needed?
Should the device also function as an i2c buffer? If so, how would A/D sensors/devices be handled?
The easiest way would be to wire the connectors "straight-through", only interrupting the power lead.
Another method, though more difficult, would be to designate certain connectors as buffered i2c connectors and others as wired straight through for A/D or un-buffered i2c.
Would you want the device completely isolated? (i.e. break both power leads, or break all four leads?)
Or, would it be sufficient to break just the power lead? (IMHO, the best and safest way.)
It might be possible to find a small microcontroller board that already has i2c capabilities and modify it.
Ultimately, it would have to be a custom PCB to reduce the size.
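Whatever the hardware ends up being, the Pi-facing half could be quite small. Here is a minimal host-side sketch, assuming a hypothetical board at i2c address 0x20 with a single "channel enable" register at 0x01 (the address, register, and function names are all invented for illustration, not a real product):

```python
# Sketch of a host-side driver for the hypothetical i2c power switch.
# The address (0x20) and register (0x01) are assumptions for illustration.

def channel_mask(channels_on):
    """Pack a list of channel numbers (0-7) into one enable byte."""
    mask = 0
    for ch in channels_on:
        if not 0 <= ch <= 7:
            raise ValueError(f"channel {ch} out of range 0-7")
        mask |= 1 << ch
    return mask

def set_channels(bus, channels_on, addr=0x20, reg=0x01):
    """Write the enable mask; `bus` would be e.g. an smbus2.SMBus(1)."""
    bus.write_byte_data(addr, reg, channel_mask(channels_on))

# Example: enable channels 0 and 3 -> mask 0b00001001
# with smbus2.SMBus(1) as bus:
#     set_channels(bus, [0, 3])
```

One byte gives eight switchable channels, which is probably more ports than a GoPiGo3 build would ever need.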
Way over my head to even know what the specifics need to be. I don't know how to answer any of your questions, especially since I don't have a specific target device.
My original intention has always been to use the GoPiGo3 / DI sensors efficiently, coupled with using the microphone and pi camera efficiently.
That OAK-D looks like it would draw more than the GoPiGo3 I2C 5v lines have available, so it would need a great deal of engineering: another tap off the batteries, another 5v switching power supply, and then the power switching function.
In all reality, Jim, I am "probably never" going to add a $300 sensor to Carl that takes all that just to put power to it, only to find out that to really use vision with depth, I would have to reinvent Carl as a ROS bot, and end up wanting a Pi4 and bigger batteries, and now we're no longer talking about little Carl - it becomes Carla the "Muscle Bot".
At one point I was thinking that the HuskyLens might be a low current basic vision processor that I would only use for docking, but its functionality is so limiting that I think it will stay in my "failed toys" box.
Maybe a Pi-4 isn't a bad idea, but I totally agree that loading up a GoPiGo like a DARPA experiment on steroids ultimately will be the worst of both worlds.
Thomas wants a 'BotZilla' with dual NVIDIA Xavier Pro processors and the Lidar From Hell.
Me, I'm happy if my simple experiments don't crash-and-burn.
I'd be happy to try and design you a switchable power source, but the ultimate direction and goal is up to you.
The ultimate direction and goal: stimulating thoughts, conversations about those thoughts, and turning selected thoughts into actions which stimulate pleasurable thoughts.
More specifically, this week was filled with reading researchgate.net papers and searching github for:
Knowledge representation for robots
Intelligence without representation
Machine Consciousness
Natural Language Tool Kit, WordNet, Rule-based Chatbots
(Thank you again for teaching me about - I just used it here.)
and coding:
healthCheck enhancements: added transient I2C failure detection (and delay before attempting the yet untested "reset GoPiGo3 board without rebooting")
carl_chat.py: revisited my 2019 "Let's talk about me [Carl]" NLTK based chatbot
keyword_chat.py: created to explore using WordNet synonyms to spot user intentions in human-robot dialog
leds.wifi_blinker(): thread management to blink the WiFi LED when healthCheck finds swap space usage exceeds 60%
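The keyword_chat.py idea can be sketched roughly as follows (the actual script isn't shown here): match words of an utterance against synonym sets per intent. Here a hand-built synonym table stands in for WordNet synsets, and the intent names are invented; with NLTK installed, the sets could instead be built from wordnet.synsets(word):

```python
# Sketch of synonym-based intent spotting. The synonym table below is
# hand-rolled for illustration; the real script presumably draws the
# sets from nltk.corpus.wordnet synsets.

INTENT_SYNONYMS = {
    "dock":   {"dock", "recharge", "charge", "plug"},
    "status": {"status", "health", "battery", "report"},
    "move":   {"move", "go", "drive", "forward", "turn"},
}

def spot_intent(utterance):
    """Return the first intent whose synonym set matches a word, else None."""
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    for intent, synonyms in INTENT_SYNONYMS.items():
        if words & synonyms:
            return intent
    return None
```

The set intersection keeps the matching cheap enough to run on every utterance on a Pi.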
While I should be revisiting OpenCV and building a "Find Line and Follow It" program, I am distracted after adding speech recognition to researching ways to imbue Carl with a more general dialog and voice action engine.
This then opened up the whole "Could Carl use knowing what he knows and can do?" (meta-knowledge) line of thought. This morning I read that new research shows Crows (corvids) can think about what they can do with what they know. This is meta-level thought.
And I keep remembering the lesson from one of my "robot heroes", Rodney Brooks, whose subsumption architecture I built into my RugWarriorPro in the year 2000: Carl doesn't need (and in fact should avoid) an "all encompassing" knowledge base (robot-model and world-model) to achieve a reliable tolerance for reality.
Which brought me back down to earth with my goals for Carl.
BTW, no hard I2C failures in 8 days, and no intermittent failures in 4 days, so I have yet to be able to test the resetGoPiGo3 function I wrote.
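The transient-vs-hard failure distinction mentioned above can be sketched as a retry wrapper: retry the I2C read a few times with a short delay, count the transients, and only invoke the (still untested) reset after every retry fails. read_fn and reset_fn are placeholder names, not the actual healthCheck functions:

```python
import time

def read_with_retry(read_fn, reset_fn, retries=3, delay=0.05):
    """Retry an I2C read; failures that recover within `retries`
    attempts count as transient.  Only if every attempt fails (a
    'hard' failure) is reset_fn invoked.  Placeholder names: the
    real healthCheck functions are not shown in the post."""
    transient_count = 0
    for attempt in range(retries):
        try:
            value = read_fn()
            return value, transient_count   # success; report transients seen
        except OSError:                     # smbus raises OSError on bus errors
            transient_count += 1
            time.sleep(delay)
    reset_fn()                              # hard failure: try the board reset
    raise RuntimeError(f"I2C read failed after {retries} attempts")
```

Logging transient_count over days would give exactly the "no intermittent failures in 4 days" statistic quoted above.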
How Linux uses swap is way over my head. But from a practical standpoint, does it matter? Is it affecting performance somehow? Not trying to be confrontational - I'm asking for my own edification.
/K
At some point the whole system slows to a grind and stops responding to my ssh remote command line, so for the time being, I am just going to set the WiFi light blinking when it gets over a "still working" threshold.
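The threshold check itself is simple arithmetic over /proc/meminfo. A sketch, using the 60% figure from the healthCheck note above (the parsing assumes Linux's SwapTotal/SwapFree lines, and the leds.wifi_blinker() call in the comment is from the post, not a standard library):

```python
def swap_percent(swap_total_kb, swap_free_kb):
    """Percentage of swap in use; 0.0 when no swap is configured."""
    if swap_total_kb <= 0:
        return 0.0
    return 100.0 * (swap_total_kb - swap_free_kb) / swap_total_kb

def read_swap_percent(meminfo_path="/proc/meminfo"):
    """Parse SwapTotal/SwapFree (in kB) from /proc/meminfo on Linux."""
    fields = {}
    with open(meminfo_path) as f:
        for line in f:
            key, _, rest = line.partition(":")
            if key in ("SwapTotal", "SwapFree"):
                fields[key] = int(rest.split()[0])   # value is in kB
    return swap_percent(fields["SwapTotal"], fields["SwapFree"])

# healthCheck-style use, per the 60% threshold mentioned above:
# if read_swap_percent() > 60:
#     leds.wifi_blinker()   # call from the post, shown for context
```

Polling this once a minute is cheap, and catching it before the system "slows to a grind" is the whole point.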