Charlie gets some (much needed) exercise!

Greetings!

I finally have something interesting to post about Charlie - aside from him getting rebuilt for the umpteenth time. . .

Today I got Charlie running, logged in via noVNC, and started looking at the “Projects” folder. The “Camera” robot project caught my eye so I started playing with it. (Sorry, don’t have any code snippets since it’s late and Charlie went to bed.)

I figured out how to:

  1. Invert the camera image. (camera.rotation=180)
  2. Center Charlie’s head servo so it points forward.
    (And conversely, how to turn off the servo when I’m finished with it.)
  3. Move Charlie around without banging into too many things.
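
For the record, the relevant bits look something like this - reconstructed from memory, so treat it as a sketch of what I did rather than the project's actual code:

from picamera import PiCamera
from easygopigo3 import EasyGoPiGo3

camera = PiCamera()
camera.rotation = 180             # my camera is mounted with the ribbon out the top

gpg = EasyGoPiGo3()
servo = gpg.init_servo("SERVO1")  # assuming the head servo is on port SERVO1
servo.rotate_servo(90)            # roughly centered; the exact value took some tweaking

# ... drive around ...

servo.disable_servo()             # turn the servo off when finished with it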

The result is two videos of Charlie wandering around my living room. (It’s a mess - we’re painting.)

These videos are large - 51 MB and 77 MB respectively - so I’m not posting them here. However, I am including links:

http://www.mediafire.com/file/1ha2vbllwev9uzz/Charlie_goes_walking_1.webm (50.55 MB)
http://www.mediafire.com/file/rtfev46fflz5byk/Charlie_goes_Walking_2.webm (76.72 MB)

In the first file, he goes exploring and bumps into a few things.
In the second file, he goes a little bit further exploring the living room. Of course, he’s still having trouble getting around. (The mouse control over Charlie is a bit over-sensitive. I’m going to see what I can do to damp that down a bit.)

Questions for the developers/anyone who might know what’s going on:

  1. Any attempt to set the video resolution to anything other than 320x240 or 640x480 results in no video at all, though the PiCam site lists a whole bunch of valid resolutions and mentions a “default” resolution of something like 1690x1280. (Or something like that - I don’t have the page in front of me, since I was running on a different system.) The line I’m changing is sketched below.

  2. Because the video resolution is so low, the view-port’s field of view is very constricted, providing a view that is unnecessarily magnified. This makes accurate navigation difficult.

Is it possible to correct this?
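
For reference, on the resolution issue, the only change I’m attempting is something along these lines (from memory, since I don’t have the code in front of me):

camera.resolution = (1280, 720)   # anything other than (320, 240) or (640, 480) gives no video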

Bonus Question:

  1. Why is the Raspbian for Robots VNC desktop so small? It looks like a 640x480 resolution desktop. Is it possible to make this larger? Later on, when I get another chance to play with Charlie, I’m going to have to investigate this.

I don’t have any third-person point-of-view video of Charlie’s adventures - all of this is through his eye(s).

ToDo:

  1. Improve the video resolution.

Stretch goal:

  1. Can I control Charlie’s movements with a USB joystick plugged into the computer viewing the video? It probably has to do with getting the joystick to send messages to Charlie via the web browser. How to do that is the challenging part. If I can do this, I have a Saitek joystick with a “twist” rudder control that I would love to be able to use to move Charlie’s head-servo around so he can look around while moving.
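
Reading the joystick itself looks like the easy half; here is a rough, untested sketch using pygame (the axis numbers for the Saitek are a guess):

import pygame

pygame.init()
pygame.joystick.init()

stick = pygame.joystick.Joystick(0)
stick.init()

while True:
    pygame.event.pump()
    x = stick.get_axis(0)        # left/right
    y = stick.get_axis(1)        # forward/back
    twist = stick.get_axis(2)    # the "twist" rudder - axis number is a guess
    print(x, y, twist)
    # The hard part: getting these values to Charlie through the web browser.
    pygame.time.wait(50)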

Jim “JR”

Looks like you are going to need a “Recognize a nefarious rug and plan attack” method!

The real issue is the actual mouse-based joystick code.

When you run the camera-robot code you get a picture out the front of the camera. (Because of the way my camera is mounted - with the ribbon out the top - I had to add a “camera.rotation=180” to the section of the code that sets up the camera.)

When you press down the left mouse button, you get what they call the “nipple”. The “nipple” is two concentric red circles at about 50% transparency overlaid on the image, one about 2/3 the size of the other that - (amazing, isn’t it!) - resembles a nipple. When you move the mouse in a particular direction, the GoPiGo is supposed to turn and/or move in that direction.

The angular deviation from center determines the direction of motion, and the distance from center toward the edge determines the speed.

Viz.:

  1. Straight up should move directly forward.

  2. Directly left should cause the 'bot to rotate left.

  3. Straight down and directly right should work the same way.

  4. Any combination of the four cardinal points and distance from center should cause your 'bot to curve in the direction indicated. (i.e. Up and to the left about 20 degrees should give you a rolling turn to the left.)
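
Put another way, the nipple boils down to an (angle, force) pair that has to be mapped onto the two wheels. My own reconstruction of the idea - not the project’s actual code - looks something like this:

import math

def wheel_speeds(angle_deg, force, max_speed=300):
    # angle_deg: 90 = straight up (forward), 180 = left, 0 = right, 270 = backward
    # force: 0.0 at dead center, 1.0 at the edge of the outer circle
    speed = force * max_speed
    forward = math.sin(math.radians(angle_deg)) * speed   # + is forward, - is backward
    turn = math.cos(math.radians(angle_deg)) * speed      # + is right, - is left
    return (forward + turn, forward - turn)               # (left wheel, right wheel)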

In my case, there’s a strong left-hand bias. Running the code and clicking the mouse without moving it causes the 'bot to spin left. This is with both Firefox and Chrome and it makes the 'bot almost impossible to control.

Mine does the same thing - don’t do that.

Thanks for creating the vids in that format. I was previously unaware of the .webm format, and thus I learned something unexpected today.

I think I read that the new Raspbian For Robots switched away from TightVNC, so I am not able to help directly.

I suggest:

  1. To see what vnc is running:
ps -ef | grep vnc   

Mine (Stretch base not Buster) shows:

pi         691     1  0 11:38 ?        00:00:37 Xtightvnc :1 -desktop X -auth /home/pi/.Xauthority -geometry 1280x768 -depth 24 -rfbwait 120000 -rfbauth /home/pi/.vnc/passwd -rfbport 5901 -fp /usr/share/fonts/X11/misc/,/usr/share/fonts/X11/Type1/,/usr/share/fonts/X11/75dpi/,/usr/share/fonts/X11/100dpi/ -co /etc/X11/rgb
pi         725     1  0 11:38 ?        00:00:00 /bin/sh /home/pi/.vnc/xstartup
pi         727     1  0 11:38 ?        00:00:00 bash /usr/local/share/noVNC/utils/launch.sh --vnc localhost:5901 --listen 8001

I have no idea whether I changed that geometry statement, or where to change it.
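
With TightVNC, at least, the geometry is just an argument to the server command, so something like this might do it (assuming it is safe to restart the server by hand):

vncserver -kill :1
vncserver :1 -geometry 1920x1080 -depth 24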

You learned something “unexpected”? Sounds like it does “interesting” things like crashing your system!

No, it just worked - but when I downloaded it, I did not know what it was, nor what app to use to view it.

I was suspicious it might contain some Russian cyber stuff that PiOTUS (“Presidential Idiot Of The US”) asked for, so I had to research the format before I clicked on the downloaded files.

Turns out that at least some of the problem is in the formula for calculating “derived_speed”, which is the speed the robot uses whenever it is moving.

I haven’t figured out how to cut-and-paste between Charlie and my laptop yet, but it goes something like this:

MIN_SPEED = 100
MAX_SPEED = 300   # I set Charlie's MAX_SPEED to 400, since 300 is a bit slow.
MAX_FORCE = 5
# "force" = distance from the center of the nipple, as a positive floating-point number

# [a bunch of unrelated code here]

derived_speed = MIN_SPEED + force * (MIN_SPEED - MAX_SPEED) / MAX_SPEED
# (or maybe MAX_FORCE? Charlie's not running right now, so I can't check.)

“force” = the degree to which the inner circle of the “nipple” is moved away from the center. Obviously, if there is no “force”, the 'bot should not move. However, “force” is never negative - it can become zero, but never negative - and since it is simply added on top of MIN_SPEED in the calculation, “derived_speed” is never zero.

To correct this, in the series of “if” statements that constrain “derived_speed” to “MAX_SPEED” whenever the calculation exceeds it, I added one more statement at the end of it all:

if force == 0:
    derived_speed = 0

This forces “derived_speed” to zero when no force is applied. There is still a left-hand bias that I have to track down somewhere else, but the effect is much less pronounced and the ability to control the robot is greatly enhanced.
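
Putting it all together, the corrected section works out to roughly this. (Again reconstructed from memory - and I’ve written the interpolation the way it seems like it should read, so the actual project code may differ.)

def compute_derived_speed(force):
    # Interpolate between MIN_SPEED and MAX_SPEED as the force grows
    derived_speed = MIN_SPEED + force * (MAX_SPEED - MIN_SPEED) / MAX_FORCE

    # Constrain to the legal maximum
    if derived_speed > MAX_SPEED:
        derived_speed = MAX_SPEED

    # My fix: no force means no movement
    if force == 0:
        derived_speed = 0

    return derived_speed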

Two things still need to be done:

  1. Figure out how to reduce the angular sensitivity, since even tiny deviations from the vertical axis cause relatively large left-or-right movements of the 'bot. (See the sketch after this list.)

  2. Track down and remove what remains of the left-hand bias.
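
For the angular sensitivity, what I have in mind is a dead zone around the vertical axis - just a sketch, untested:

DEAD_ZONE_DEG = 15   # how far off vertical still counts as "straight"

def snap_to_straight(angle_deg):
    # Treat small deviations from straight up (90) or straight down (270) as straight
    for straight in (90, 270):
        if abs(angle_deg - straight) < DEAD_ZONE_DEG:
            return straight
    return angle_deg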

On the subject of coding, (at least at the beginning), I do better taking code that someone else has written, analyzing it, and working on changes/improvements. Huge, complex pieces of code like this - especially class definitions - are still (somewhat) beyond me. I’ve learned a lot this way, and it’s one of the fastest ways for me to learn the ins-and-outs of any programming language.

I would LOVE to be able to use Visual Studio Code to program Charlie. For one thing, the noVNC desktop in my browser is (almost) unusably small. For another, VSC provides built-in Git integration.

One thought I just had is to run VSC on my Windows machine (with its much larger and easier-to-read desktop) and figure out how to ftp things back-and-forth to Charlie for testing. That way I can have my cake and eat it too - at least to some extent.
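
Even plain old scp from the Windows side would probably do for a first pass. (The file name and hostname here are just examples.)

scp camera_robot.py pi@charlie.local:/home/pi/Projects/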

Jim “JR”

Carl is a bit top heavy, so I try to keep him at 150.

2.6 in/sec at 120 DPS
3.25 in/sec at 150 DPS (myconfig)
6.4 in/sec at 300 DPS (default)
7.7 in/sec at 360 DPS (max for straight travel)

When I run the camera bot program, it drives nicely (slowly) forward, but seems to go backward too fast to control well. (Not to mention that Carl cannot see what he is likely to back into.)

Interesting!

IMHO, that should be easy to fix - whatever code you’re using to drive Carl around - even if self-driving - should know what direction (forward/backward) Carl is going. You can use that.

Viz.:

#  "calculated_speed" is the result of the mathematical formulae
#   you use to determine how fast Carl will move at any one point in time.
#
#  "actual_speed" is the value you use to drive the motors, which is a function of
#   calculated_speed, and can be reduced or augmented as necessary.
#
#   I assume that, when driving forward, "actual_speed" = "calculated_speed"
#
#  There are two experimentally derived constants that are used to modify
#  the actual speed of Carl when moving in a direction other than forward.
#
#  1.  "backward motion offset constant" is a value of less than 1, which is
#       used as a factor to reduce the backward speed. (i. e. ".5" would reduce it by half)
#
#  2.  "turning motion offset constant" is a different value, (if needed), used to reduce
#       Carl's speed when turning right or left.
#
#  3.  If Carl can make moving turns while driving forward or backward,
#      those cases will need to be handled separately.

# Example offset values - experimentally derived constants; these are placeholders.
BACKWARD_OFFSET = 0.5
TURNING_OFFSET = 0.7

if state == "moving":
    calculated_speed = compute_speed()   # whatever mathematics you use to derive the speed

    if direction == "forward":
        actual_speed = calculated_speed
    elif direction == "backward":
        actual_speed = calculated_speed * BACKWARD_OFFSET
    elif direction in ("left", "right"):
        actual_speed = calculated_speed * TURNING_OFFSET
    else:
        #  Something is seriously wrong here, 'eh?
        print("Carl appears to have no direction in life. . . .")

else:
    Carl.stop()

Assuming that my understanding of your basic logic is correct, you should be able to use something like this to regulate Carl’s speed when moving in directions other than forward.

Jim “JR”

Thanks for thinking on Carl’s behalf. Eventually, I plan to implement a drive mechanism that combines:

  1. Subsumption Architecture of Rodney Brooks
    • Layers: Safety, Avoidance, Goal, Explore/Wander
    • lowest level uses obstacle sensing (distance, vision) for protection commands that override higher layer commands
    • not sure if I will implement the avoidance layer
  2. Ramped acceleration/deceleration to target speeds
    • ramp allows faster speeds than instantaneous on/off commands
    • since safety layer might need (nearly?) instantaneous off, perhaps need two ramp rates
  3. Separate message and event enabled drive motor goals process
    • similar to the code Robert Lucian wrote for the Intelligent Obstacle Avoidance project.
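
The arbitration part, at least, can start out dead simple - a fixed-priority scan where the highest layer that wants control wins. (A very rough sketch, nowhere near the final thing.)

LAYERS = ["safety", "avoidance", "goal", "wander"]   # highest priority first

def arbitrate(proposals):
    # proposals: dict mapping layer name -> proposed (left_dps, right_dps), or None
    for layer in LAYERS:
        command = proposals.get(layer)
        if command is not None:      # this layer subsumes everything below it
            return command
    return (0, 0)                    # no layer wants control - stop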

This, of course, is complicated to write, so it sits in the “plans” line of thinking, and I just keep Carl at his most accurate turning and driving speed, 150 DPS.

(It also starts to look a lot like ROS…)

. . . But in the meantime, you can implement a simpler system that might make Carl’s life easier. IMHO, it’s better to use a simple solution than no solution at all (while you wait to implement your stretch goal).

With respect to your stretch goal, might I suggest reading up on fuzzy logic? I can’t speak to how difficult it would be to implement, though I would assume it’s not trivial.

Update:

Charlie got to go out and play again today. No video this time.

I decided to remove the Pi-4 board and re-install the Pi-3 I was using before, still using Buster.

I was (finally!) able to adjust the noVNC display resolution to something useful and - at the same time - change the streaming video to a resolution and color balance that made sense. Why I couldn’t do this on the Pi-4 is a mystery to me.

After that, Charlie decided to go to the kitchen and say “Hello!” to my wife.
(P.S., I need to get some kind of amplified speaker that runs off of USB power.)

After running him around, he decided he wanted to go to bed.

His batteries are in the two chargers - four each - and should be ready by tomorrow.

Jim “JR”

P.S.
Any suggestions on what to use as a speaker? I have a couple of small speakers that could be wired directly to the GPIO pins, but that sounds like a butt-ugly hack to me. Almost all of the speakers I see - even the USB ones - are stereo. I don’t need stereo.