Every now and again I think I have built up the strength to explore an Inertial Measurement Unit (IMU) for my robots, only to find the topic just too complex.
This time my experiment was with the MPU6050. Using this IMU begins by placing it level and motionless for a period while collecting ax, ay, az accelerometer and gx, gy, gz gyroscope readings, then averaging them into a mean calibration offset.
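That averaging step can be sketched as below. The `read_imu()` function is a hypothetical stand-in for a real MPU6050 driver call (here it just simulates a level, motionless sensor with a fixed bias plus noise), so the structure is what matters, not the driver details:

```python
import random

def read_imu():
    # Hypothetical stand-in for an MPU6050 driver read.
    # Simulates a level, motionless sensor: fixed bias + gaussian noise.
    # Order: ax, ay, az (g), gx, gy, gz (deg/s). az includes 1 g of gravity.
    bias = (0.02, -0.01, 1.0, 0.5, -0.3, 0.1)
    return [b + random.gauss(0, 0.005) for b in bias]

def calibrate(n_samples=240):
    """Average n_samples readings taken while the IMU sits level and still."""
    sums = [0.0] * 6
    for _ in range(n_samples):
        for i, v in enumerate(read_imu()):
            sums[i] += v
    return [s / n_samples for s in sums]

offsets = calibrate()
```

Note that the az offset still contains the 1 g of gravity, which has to be accounted for separately before using it as a zero reference.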
So I repeatedly collected 240 readings over one-minute calibration runs, only to discover the values drifting between runs:
I had heard that gyroscopes “drift” over time, so I am not totally surprised by that, but a 2% to 3% drift in acceleration makes me think single-point calibration is doomed from the start.
Sort of - it seems IMU data is known to be quite noisy, so people apply Kalman filtering and PID modeling to pull “the reality” out of the data. That is why the DI BNO055 IMU, with its onboard fusion processor, costs so much. I have seen several places say that even the fused output from the BNO055 is not sufficient, and they apply their own “Apollo Moon Flight” algorithms to the raw BNO055 data to get better accuracy.
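A full Kalman filter is a lot of machinery, but the basic fusion idea can be shown with the much simpler complementary filter: blend the gyro (fast but drifting) with the accelerometer's tilt estimate (noisy but drift-free). This is a minimal sketch, not the BNO055's actual algorithm, and the sign convention for the accelerometer pitch is one of several in common use:

```python
import math

def complementary_pitch(pitch_prev, gyro_rate_dps, ax, ay, az, dt, alpha=0.98):
    # Tilt from the accelerometer alone (valid only when the robot is
    # not accelerating); convention here: pitch up = +ax.
    accel_pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    # Blend: mostly trust the integrated gyro short-term, but let the
    # accelerometer slowly pull the estimate back, cancelling gyro drift.
    return alpha * (pitch_prev + gyro_rate_dps * dt) + (1 - alpha) * accel_pitch
```

Even with a drifting gyro bias, the accelerometer term keeps pitch and roll bounded; the catch for robot heading is that yaw has no gravity reference, which is why heading is the hard axis.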
I just never would have thought it would be so hard to have a robot make clean 90-degree turns, or to know what direction it is facing well enough to drive straight across a room without wandering a body width left or right.
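The arithmetic behind that frustration is simple: heading comes from integrating the gyro's z rate, so any uncorrected bias error accumulates linearly. A quick sketch (the numbers are illustrative, not measured from my MPU6050):

```python
def heading_error(gyro_bias_error_dps, seconds, dt=0.01):
    # Integrate a constant residual bias, the way a heading estimate
    # integrates gz: the error grows as bias * time.
    heading = 0.0
    for _ in range(int(seconds / dt)):
        heading += gyro_bias_error_dps * dt
    return heading
```

With this, a residual bias of just 0.5 deg/s (well within the drift I saw between calibration runs) accumulates about 30 degrees of heading error in one minute of driving.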
The best way might be a camera, but that too has not been pre-packaged for the PiCam.
The Create 3 uses a floor-facing camera “flow sensor” that seems to cost about as much as the DI IMU, but I don’t know what its accuracy is. The commercial versions seem to be targeted at drones, not floors 1 cm away.