New Pi 3B+ and BrickPi 3 tribot

I wrote a quick blog post on the new tribot design I’m working on using the latest Pi and BrickPi3.

The STL files for the chassis are in my GitHub for anyone who has a 3D printer :slight_smile:

That’s a really neat starting project. Really cool.

I also find AI an intriguing domain and it’s something I’m actively looking into - I love imagining large-scale projects with AI as the central element of the whole assembly … and eventually acting on them.

Also, thank you for your contribution. It’s nice to see people building stuff.

Thanks. Luckily I do R&D in my day job, and I’ve been doing AI work since 2000, starting with inference rule engines, natural language parsing, and natural language editors, and more recently deep learning. Once I get “find the flag” working I’ll post a few more blogs about it. Getting TensorFlow training to work properly is time consuming, but hopefully in a few more weeks I’ll have a basic model that works for a simple demo.

By all means, I’d love to read your blog posts as soon as they’re out the door - I’m also experimenting with TensorFlow and sklearn at the moment, and I’ll definitely be doing an AI-based project in the very near future.
You mentioned inference rule engines - have you ever used Prolog for this, or are other languages enough for that particular task?

I’ve contributed to several RETE rule engines over the years and have my own open source rule engine up on SourceForge. I’ve played with Prolog in the past and studied pattern matching algorithms extensively :slight_smile: It’s been a hobby/obsession of mine since 2000. I have other non-robotics hobby projects for inference rule engines and Hadoop. As cool as deep neural nets are, they are terribly inefficient and still don’t handle first- and second-order logic. In the AI domain and formal logic there are the concepts of currently unknown, unknown, and unknowable. Currently unknown covers situations where data related to a query (aka question) isn’t available and the system can’t give a true/false answer; in those cases it can give a probability. Unknown covers cases where the application doesn’t understand the query, but data related to the answer is available. Unknowable covers situations that no data can answer, or where the answer isn’t static.
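To make that distinction a bit more concrete, here’s a toy Python sketch (purely illustrative - the facts, queries, and probability are all made up) of how a query answerer might report the three cases:

```python
# Toy illustration of the three cases above. Nothing here comes from a
# real engine; the knowledge base and probabilities are invented.

KNOWN_FACTS = {"flag_is_red": True}          # facts with definite answers
PRIOR_ESTIMATES = {"flag_is_visible": 0.7}   # only probabilistic data available
UNANSWERABLE = {"flag_is_pretty"}            # no data can ever settle these

def answer(query: str) -> str:
    if query in KNOWN_FACTS:                 # understood and answerable
        return str(KNOWN_FACTS[query])
    if query in UNANSWERABLE:                # unknowable: no data can answer it
        return "unknowable"
    if query in PRIOR_ESTIMATES:             # currently unknown: give a probability
        return f"currently unknown, p={PRIOR_ESTIMATES[query]}"
    return "unknown"                         # the system doesn't understand the query

if __name__ == "__main__":
    for q in ["flag_is_red", "flag_is_visible", "flag_is_pretty", "asdf"]:
        print(q, "->", answer(q))
```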

When I look at the graph produced by a neural net and a discrimination network like the one the RETE algorithm builds, they are both DAGs (directed acyclic graphs). The main differences: one is machine generated and computationally expensive, the other is hand written and about two orders of magnitude more efficient. The downside of inference engines is that humans can’t write rules fast enough to keep up with machines, and skilled programmers who understand RETE are a rare resource.
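As a very rough illustration (a toy sketch, nowhere near a real RETE implementation), a hand-written discrimination network is just a DAG of test nodes that facts get pushed through, with shared tests feeding multiple rules:

```python
# Toy discrimination-network sketch: each node applies one test and
# forwards matching facts to its children; leaves represent rule firings.

class Node:
    def __init__(self, name, test, children=None):
        self.name = name
        self.test = test                # predicate applied to a fact
        self.children = children or []  # downstream nodes (the DAG edges)

    def propagate(self, fact):
        if not self.test(fact):
            return
        if not self.children:           # leaf: the rule's conditions are met
            print("rule fired:", self.name)
        for child in self.children:
            child.propagate(fact)

# Two rules sharing the "is it a robot?" test - the kind of node sharing
# a RETE-style network relies on.
low_battery = Node("low-battery", lambda f: f.get("battery", 100) < 20)
sees_flag   = Node("sees-flag",   lambda f: f.get("sees_flag", False))
is_robot    = Node("is-robot",    lambda f: f.get("type") == "robot",
                   [low_battery, sees_flag])

is_robot.propagate({"type": "robot", "battery": 10, "sees_flag": True})
```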

My bias is towards combining deep neural nets with fuzzy logic, inductive logic, and inference rules to build better systems :slight_smile: But that’s totally off topic from BrickPi and doesn’t belong here.

I’ve posted the TensorFlow retraining project to my GitHub. It has some basic instructions, a training dataset, and a few test images. Later on I’ll write a detailed step-by-step tutorial for it. https://github.com/woolfel/inception_retrain
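For anyone who wants to try a retrained graph before the tutorial is out, something like the sketch below should classify a single test image with the TF 1.x API. Note the file names, the image path, and the tensor names (retrained_graph.pb, retrained_labels.txt, final_result:0, DecodeJpeg/contents:0) are just the defaults from TensorFlow’s standard image-retraining script, not necessarily what the repo uses:

```python
# Rough sketch: run a retrained Inception graph on one image (TF 1.x).
# File/tensor names are the retraining tutorial defaults, adjust as needed.
import tensorflow as tf

graph_path = "retrained_graph.pb"
labels_path = "retrained_labels.txt"
image_path = "test_images/flag1.jpg"   # hypothetical test image path

# Label list written out during retraining, one label per line.
labels = [line.strip() for line in open(labels_path)]

# Load the frozen, retrained graph.
with tf.gfile.GFile(graph_path, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

with tf.Session(graph=graph) as sess:
    image_data = tf.gfile.GFile(image_path, "rb").read()
    # Feed raw JPEG bytes to the decode node and read the softmax output.
    predictions = sess.run("final_result:0",
                           {"DecodeJpeg/contents:0": image_data})[0]
    for i in predictions.argsort()[::-1][:3]:
        print(labels[i], predictions[i])
```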

Thanks a lot @woolfel for your contribution. I’ll also take a look at it and give it a try when I have more free time than I do right now.

A quick update. It appears the TensorFlow package for the RPi only supports version 1.2, while the latest release is 1.7. Running a retrained model on Raspbian isn’t practical at the moment without compiling TensorFlow from source. I’m exploring TensorFlow Lite instead, since that is the official roadmap for TensorFlow on the RPi.
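For reference, running a converted model with the TensorFlow Lite Python interpreter looks roughly like the sketch below. It assumes a hypothetical retrained_model.tflite has already been produced by the converter, and it uses tf.lite.Interpreter, which only exists in newer TensorFlow releases (older ones had it under tf.contrib.lite):

```python
# Minimal TFLite inference sketch. "retrained_model.tflite" is a placeholder;
# the input shape and dtype depend on how the model was converted.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="retrained_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input with whatever shape/dtype the converted model expects.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print("top class index:", int(np.argmax(scores)))
```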

@woolfel maybe you can use a package from conda - I’m not sure if they have one for the RPi, but they definitely have the latest version up there.