Figured out why Dave did not understand American Sign Language

18 months ago I admitted defeat trying to teach Dave to understand American Sign Language with his “naked Pi4”.

Of course I was trying to jump straight to the end of the “train your robot” process that

  • starts with data…lots and lots of data,
  • continues with processing all of that data to “build a model”, and
  • ends with using the model in a specialized neural network processor to recognize something “familiar” in new visual frames.
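The last step above can be sketched in a few lines. This is a minimal, hypothetical illustration of the classification stage only: it assumes a trained model has already reduced a new frame to a list of per-class scores, and shows how those scores become a recognized gesture. The label set and scores here are made up; a real deployment would load a TFLite model and get the scores from it.

```python
import math

# Hypothetical gesture vocabulary -- a real ASL model would have many more classes.
GESTURES = ["hello", "thanks", "yes", "no"]

def classify(scores):
    """Turn raw per-class scores from a model into (label, confidence).

    Uses a numerically stable softmax, then picks the highest-probability class.
    """
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return GESTURES[best], probs[best]

# Example: scores a model might emit for one frame (made-up numbers).
label, confidence = classify([0.2, 3.1, 0.5, -1.0])
print(label, round(confidence, 2))
```

The neural-net processor’s job is everything before `classify`: running the model over each frame fast enough that this last, cheap step is all the Pi itself has to do.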

I saw an article on the datasets for two Google-sponsored machine learning competitions:

  • American Sign Language gestures and
  • American Sign Language finger spelling.

The input data for the ASL competitions consists of “only” 61 million samples in 250GB.

I have resisted building a machine learning environment, so I am limited to using available models.

Hopefully these Google-sponsored ASL competitions will result in TFLite models that can run on Dave’s OAK-D-W neural-net processor, giving him a sophisticated human-interface input modality.


And he will begin ordering at Burger King and asking existential questions.
