I’m a school admin tasked with bringing STEM and robotics to our school. I’m going to this GoPiGo event in St. Louis on Saturday (link below). I want to make the GoPiGo avoid obstacles with the ultrasonic sensor and trace a line with the camera, using Python. I’m going to be using the B+ and the Pi2.
Has anyone done this, or have any ideas, examples, or even other “tasks” we can try out at this 14-hour event?
The line finder by itself does not do much. It’s two LEDs, an IR transmitter and an IR receiver. It simply returns digital high/low when it detects black or white.
To use it, draw a black line on the surface and instruct the GoPiGo to rotate slightly left or right to stay on the line. If it loses the line, have it reverse and search again.
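That steering logic can be prototyped without any hardware. Here’s a minimal bang-bang sketch; the function and return-value names are hypothetical stand-ins for the real gopigo motor calls, and it assumes the line finder reads 1 (high) on black.

```python
def decide(on_line, last_side):
    """Pick the next move from a single digital line-finder reading.

    on_line:   1 if the sensor sees the black line, 0 if it sees white.
    last_side: 'left' or 'right' -- the side we last corrected toward.
    Returns one of 'forward', 'rotate_left', 'rotate_right'.
    """
    if on_line:
        return "forward"                      # on the line: keep going
    # Lost the line: sweep back toward the side we last saw it on.
    return "rotate_left" if last_side == "left" else "rotate_right"


def follow(readings):
    """Run decide() over a sequence of readings, alternating the sweep
    side whenever a correction fails to re-find the line."""
    last_side, moves = "left", []
    for r in readings:
        move = decide(r, last_side)
        if move != "forward":
            # Still off the line? Try the other side next tick.
            last_side = "right" if last_side == "left" else "left"
        moves.append(move)
    return moves
```

In a real run you’d call the gopigo library’s motor functions in place of the returned strings and poll the sensor in a loop with a short delay.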
Other ideas? Obstacle avoidance is one thing. Avoiding a moving target is even harder. Why not have several GoPiGos and make them play Tip?
Elect one of them as It; its job is then to hunt down the other GoPiGos and collide with one, or send it something wireless, BLE perhaps.
After a few seconds, the hunted becomes the hunter and the cycle repeats. The game could end once a GoPiGo exceeds a tipped count, or go last man standing.
Hunted GoPiGos could even communicate with each other using IR, BLE, XBee, Wi-Fi, etc.: “He was over there 10 seconds ago.”
Add a 9DOF sensor to track XYZ orientation, acceleration and direction.
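The game rules above boil down to a tiny state machine, which could be prototyped before putting it on robots. This is purely illustrative: the class and method names are made up, and real robots would detect a tag via a collision sensor or a BLE/IR message rather than a direct method call.

```python
class TipGame:
    """Toy model of the GoPiGo 'Tip' game: one robot is It, tags swap
    roles, and a robot loses once its tipped count hits the limit."""

    def __init__(self, names, max_tips=3):
        self.tips = {n: 0 for n in names}   # tipped count per robot
        self.it = names[0]                  # first robot elected as It
        self.max_tips = max_tips

    def tag(self, victim):
        """It tags `victim`; after a few seconds the roles swap."""
        assert victim != self.it, "It cannot tag itself"
        self.tips[victim] += 1
        self.it = victim                    # the hunted becomes the hunter

    def loser(self):
        """Return the first robot whose tipped count hit the limit, else None."""
        for name, count in self.tips.items():
            if count >= self.max_tips:
                return name
        return None
```

The interesting engineering problems are all in how `tag()` gets triggered: reliably detecting a collision, or broadcasting “you’re It” over BLE/XBee without false positives.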
Thanks for this too! Yeah, I heard image processing code is involved. But that’s what we’re doing for GoPiGo camp in June, so I might as well start now. There will be techies at the St. Louis thing; perhaps some of them have written some Python examples.
If you’re looking to do demonstrations, especially under deadline, and especially if you have limited experience with the GoPiGo, you might look at starting with one of these projects that are already complete and tested.
But how about some more complex examples or tasks for the GoPiGo, so that the more advanced people at the event can have a shot at it? It’s more of a “collaborative” presentation, “an interactive build session” as they call it.
No better place to ask this stuff than this forum, I imagine.
Hey,
We tried a line-following robot with the Pi camera and OpenCV on a B+. It wasn’t a very big success because we’re not that experienced with the OpenCV part, and the B+ was a bit slow. That might be a nice project for the advanced users, especially if you have a Pi2.
Karan’s right; we did try it, but had trouble getting OpenCV to run fast enough for the robot to move at a meaningful speed. However, with the new Pi2 and more power, we could probably revisit that and crank up the speed!
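If anyone wants to revisit it, the core of the camera version is small: threshold the frame and steer toward the centroid of the dark pixels. Here’s a NumPy-only sketch of that step so it runs without a camera; in a real setup you’d grab frames from the Pi camera and could do the thresholding/centroid with OpenCV (e.g. `cv2.threshold` and `cv2.moments`) instead. The threshold value, strip height, and dead band are arbitrary assumptions to tune on real footage.

```python
import numpy as np

def line_offset(frame, dark_thresh=60):
    """Horizontal offset of the dark line from the frame centre.

    frame: 2-D array of grayscale pixel values (0 = black).
    Only the bottom rows are used, where the line is closest to the robot.
    Returns None if no dark pixels are found, negative if the line is to
    the left, positive if it is to the right.
    """
    strip = frame[-10:, :]           # bottom 10 rows of the frame
    mask = strip < dark_thresh       # True where a pixel is "line-dark"
    cols = np.nonzero(mask)[1]       # column indices of dark pixels
    if cols.size == 0:
        return None
    centre = frame.shape[1] / 2.0
    return float(cols.mean() - centre)

def steer(offset, dead_band=5.0):
    """Turn an offset into a coarse steering decision."""
    if offset is None:
        return "search"              # line lost: sweep to re-find it
    if offset < -dead_band:
        return "left"
    if offset > dead_band:
        return "right"
    return "forward"
```

Keeping the per-frame work this simple (one strip, one mean) is also what makes it feasible on a B+; the Pi2 should have headroom for a proper PID loop on the offset instead of the three-way decision.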