How to organize multiple projects on GitHub?

@cyclicalobsessive, @KeithW, @cleoqc and anyone else who wishes to comment. . .

Assume the following is true:

  1. I have multiple projects that I want to work on, located on my GoPiGo.

    • I want to start program dev directly on the 'bot instead of remote dev and copying everything over.
  2. I have them organized as sub-directories under a master /home/pi/projects directory.

    • /home/pi/projects/new_remote_camera_robot
    • /home/pi/projects/new_control_panel
    • /home/pi/projects/startup_settings
    • (etc.)
  3. I either have preserved, (or want to preserve), the content and state of my projects on GitHub so that if all hell breaks loose, I can recover.

    • This also allows me to update/upgrade the operating system in toto and re-install everything back as it was.

One solution would be for me to make a “projects” project and create “branches” for each project directory.  Unfortunately, Git assumes that branches are related and connected and, like opposite charges, they REALLY WANT to recombine.  It has been my experience that GitHub stamps its feet and really wants branches merged back together.

Another solution is to create a totally separate and distinct project on GitHub for each project on the GoPiGo.  Having done this, the projects are relatively immiscible and have no desire to combine.  However, to restore (i.e., to a new installation on new media), I have to clone each repository separately.
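With the one-repo-per-project layout, the restore step could be scripted.  A rough sketch, using the project names from the list above (the GitHub username and URLs are placeholders, and the clone commands are echoed rather than executed so this reads as a dry run):

```shell
# Hypothetical restore helper: on a fresh install, re-clone every
# project repository back under /home/pi/projects.
# "YOUR_USER" is a placeholder - substitute your GitHub account.
REPOS="new_remote_camera_robot new_control_panel startup_settings"
PROJECTS="$HOME/projects"
mkdir -p "$PROJECTS"
for repo in $REPOS; do
    # echo instead of executing, so this prints the commands it would run:
    echo "git clone https://github.com/YOUR_USER/$repo.git $PROJECTS/$repo"
done
```

Dropping the `echo` (and fixing the username) would turn the dry run into an actual one-command restore.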


  • What has your experience been with Git/GitHub?

  • How have you handled the organization of multiple, possibly related, projects?

  • How do you handle cross-project dependencies?  (i.e., project “x” depends on something - a library, perhaps - contained within project “q”.)

What say ye?


Corollary question:

Assume I want to clone the Dexter gopigo and easy_gopigo library projects to my bot so that I can mess with them, (and perhaps create pull requests).

What’s the correct way to clone a project so there is a relationship with the upstream project that I can open pull requests against?

How do I force the entire robot to use MY libraries instead of the stock libraries by default?

Second corollary question:


  • I have a project based on, (for example), control_panel and I want to push changes.

  • @cyclicalobsessive is also working on this project and has valuable changes that I want to have as part of my project too as they might affect each other - GUI display alignment, etc.

How do I coordinate the two different development efforts so we are not at cross-purposes?


Sounds like you want to learn to use Python virtual environments (either with or without access to the system site-packages).

IIRC it will be something like:

  1. create a virtual environment that copies the system site-packages
  2. activate that environment
  3. perform gopigo3 update --virtual-env --no-gui (don’t remember exactly)
  4. change stuff
  5. run (or drop in) your test script with the changed stuff.

Search the forum for “virtual environment” - cleoqc posted the exact procedure in a reply to someone.

Or you can just add your versions at the front of the Python path, so yours is always used if it exists.
I also add a print statement to all of mine so I can be sure which module is in use.
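A minimal sketch of that path trick (the directory name here is an assumption - point it at wherever your modified copies actually live):

```python
import sys

# Hypothetical location of your modified library copies:
MY_LIB = "/home/pi/projects/my_lib"

if MY_LIB not in sys.path:
    sys.path.insert(0, MY_LIB)  # searched before the stock install locations

# Any module that also exists in MY_LIB now shadows the stock one; a print
# statement at the top of your copy confirms which version was imported.
print(sys.path[0])
```

Putting this at the top of a program (or in a sitecustomize/startup file) makes every subsequent `import` prefer your copies.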


Why do I want a “virtual” environment?

What’s wrong with the real one?


Did that.

Do you know that a search for “virtual environment” returns something like 34 hits, most of which are years old?  I didn’t see anything that gave me confidence - at least not at 10:30 pm, with a throbbing headache, after troubleshooting a dead monitor all afternoon and getting the occasional 220 V “nip” from grounds and mains power located too close together.

(The main processor’s oscillator isn’t running, not sure why not yet. I will check /RESET tomorrow.)

Can you give me a hint what the thread was about so I can find it more easily?



This is the link I was hoping you would find:

It was the culmination of this one:


I’ve enjoyed using git and GitHub.  There is a learning curve, to be sure.

No - this is really not how you want to use git (or GitHub). Branches really are for development and testing of the same basic project - not for separate projects.

You can create one master “project” and then create directories for sub-projects under it.  I’ve done that.  It works easily enough, and then you only have to clone one “project.”

I haven’t had much issue with dependencies, but I suspect @cyclicalobsessive is correct - that is something that lends itself to Python virtual environments.  I’ve used them before, but not in the context of development with git.



What does a Python “virtual” environment buy me that running Python direct doesn’t?


In each virtual environment you can have different sets of libraries (or versions of libraries). This lets the programmer work on different projects without having to worry about version conflicts.
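The isolation can be seen directly.  A throwaway sketch (the `/tmp` path is arbitrary, and `--without-pip` just keeps the example self-contained and offline):

```shell
# Create an isolated environment and ask its interpreter where it lives.
python3 -m venv --without-pip /tmp/demo_env
/tmp/demo_env/bin/python -c 'import sys; print(sys.prefix)'
# the printed prefix points inside /tmp/demo_env, not the system install
```

Packages installed with that environment's `pip` land under its own prefix, so they can't conflict with another environment's versions.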

I have no idea how it might work to use a different GitHub project branch with different virtual environments. I suppose it’s possible.


My idea, (and concern), is how a particular library revision will affect the entire robot as opposed to one or two specific programs.

IMHO, it doesn’t matter if program “x” works with a modified library, if it’s going to crash the rest of the 'bot. :wink:

Of course I want to test with specific apps, but I want to know how it affects the entire robot too.


Fair point. I was thinking you’d load the entire driver library into each virtual environment. If a particular library does impact the entire robot, you just remove the entire environment.

The downside to virtual environments is that they require much more disk space, since a lot of things are duplicated for each environment (although maybe some of that is handled intelligently in the background - I don’t know).


I have enough disk space available that this is not a constraint.

Unfortunately, my limited, (read “nonexistent”), experience with Python in general and with Python virtual environments in particular, makes me suspicious.

For example, what happens if I try to instantiate multiple different versions of gopigo/easygopigo in different virtual environments, and then try to do something like move the 'bot?

Since there are multiple instances of different versions of the same classes, who wins?  Which witch is which?  Who’s on first?

It seems like a recipe for mass confusion.

Don’t understand the question, actually.
My “experience with GitHub”:

  1. created repository for each robot, or execution environment (Ubuntu VM on Mac, DeskPi, Battery Test Pi)
  2. Occasionally I revert a file or folder to the master version, but many times, before starting a mod, I create a local “.old” copy (which I don’t check in) so I can check in changes along the way - instead of creating a branch, committing on the branch, and then merging when happy with the changes.  (I don’t mess with branches and releases.)

“Good” Python programmers use virtual environments, local and site packages, and even test and public PyPi packages.  Me?  I do everything above the currently released site packages, and have a system of diff-and-release to a /plib/ folder.  Every project starts with:

  • some file or files that are intended for consumption by other projects;
  • a script that compares the local copy with the one in plib/;
  • the last released copy, kept until the new version is released and then deleted (never checked in);
  • a script that copies the local file to plib/.

I check git status at the end of every dev session and add/commit/push the project for safety.  I don’t use local packages, even though that is the “correct” way to publish to other projects.  I use the files from plib by prepending plib onto the Python path:

import sys
sys.path.insert(1, "/home/pi/Carl/plib")

or, sometimes for quick tests, I simply copy the needed files, since the Python path always checks “./” first.
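That compare-then-release step might look something like this.  A rough sketch with throwaway `/tmp` paths and a made-up filename so it can be run harmlessly (the real plib lives at /home/pi/Carl/plib):

```shell
# Sketch of the diff-and-release flow described above.
PLIB=/tmp/plib_demo       # stands in for /home/pi/Carl/plib
WORK=/tmp/work_demo       # stands in for the project directory
mkdir -p "$PLIB" "$WORK"
echo "version 2" > "$WORK/my_module_demo.py"   # local working copy
echo "version 1" > "$PLIB/my_module_demo.py"   # last released copy
# release only when the local copy differs from the released one:
if ! diff -q "$WORK/my_module_demo.py" "$PLIB/my_module_demo.py" >/dev/null; then
    cp "$WORK/my_module_demo.py" "$PLIB/my_module_demo.py"
fi
```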


It’ll be the one in the environment that is currently active. That’s the whole goal - to let you change environments so that you can have different behaviors.


The Python path is the traffic cop.  If a desired version is not in the path ahead of an undesired version, the wrong one wins - so you have to be thinking “path” all the time.

Python will tell you where it got every module:

>>> import gopigo3
>>> print("gopigo3 used:", gopigo3.__file__)
('gopigo3 used:', '/usr/local/lib/python2.7/dist-packages/gopigo3-1.2.0-py2.7.egg/gopigo3.pyc')

This is the path on Carl for python2 before I prepend my “/home/pi/Carl/plib/”

>>> import sys
>>> print(sys.path)
['', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-arm-linux-gnueabihf', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/home/pi/.local/lib/python2.7/site-packages', '/usr/local/lib/python2.7/dist-packages', '/usr/local/lib/python2.7/dist-packages/wiringpi-2.60.0-py2.7-linux-armv7l.egg', '/usr/local/lib/python2.7/dist-packages/smbus_cffi-0.5.1-py2.7-linux-armv7l.egg', '/usr/local/lib/python2.7/dist-packages/python_periphery-2.1.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/cffi-1.14.3-py2.7-linux-armv7l.egg', '/usr/local/lib/python2.7/dist-packages/pycparser-2.20-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/scratchpy-0.1.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/DI_Sensors-1.0.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/Line_Follower-1.0.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/Dexter_AutoDetection_and_I2C_Mutex-0.0.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/gopigo3-1.2.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/brickpi3-0.0.0-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/grovepi-1.4.1-py2.7.egg', '/usr/local/lib/python2.7/dist-packages/pivotpi-0.0.0-py2.7.egg', '/usr/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages/gtk-2.0', '/usr/lib/python2.7/dist-packages/wx-3.0-gtk3']
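The sessions above are Python 2 (hence the tuple-style print output).  The same checks in Python 3 syntax, shown here with a standard-library module as a stand-in so they run anywhere (on the robot you would import gopigo3 instead):

```python
import json
import sys

# Ask a module where it was loaded from:
print("json used:", json.__file__)

# The first entries in sys.path win the import search:
print(sys.path[:3])
```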

" “Good” Python programmers  ". . . .
. . . . usually work on programs that stand alone while they do something, and 99.99999% of “good” Python programs have absolutely nothing to do with robotics or complex interrelated systems.

This is where the challenge of robotics raises its ugly head - everything is interrelated and codependent in some way.

Unlike a program for a pinochle game, robotic functions can, and do, interact in ways that may not be easy to predict.

This is why I am uneasy about rushing into something I know so little about.


Don’t rush… don’t skip the research phase, don’t skip the design phase, don’t lose patience with yourself.


I can’t say I’m “thinking path” any of the time.

Mostly I type in the appropriate magic words and hope they work!
:wink: :grin:


My guess is that you and I are handicapped by having learned programming in an era of no libraries.

For me, it started with standard C++ libraries and standard Ada libraries.  These libraries were collections of “programming related” tasks.

The first explosion in available libraries in my programming life was the Microsoft Foundation Classes.  This was a three-foot-wide bookshelf of libraries that became my first “application related” reusable code.  I never got comfortable with more than a tiny fraction of those classes.

(I had 20 years of Java in there with IBM - same story - it’s all in the libraries.)

When I started playing with the Raspberry Pi, I decided to “learn Python.”  I quickly discovered that it is actually more impossible for me than learning to use the MFC library, because only “Gurus” write Python - everyone else uses Python to call the packages the Gurus publish on GitHub, PyPI, and the Linux repos.

Python is a pretty amazing language but the real power is in the packages.

ROS is a fairly simple architectural concept with the real power in the packages.

Neither is as overly complex as you might think - the power lies in the complexity available.  What might be overly complex is finding the right version of the right package to reuse.

(something weird going on - I’ve lost the line wrap)


So, you are suggesting that I do some fundamental research into Python virtual environments?

Anything else I should study before I begin the design phase?

That’s part of the problem here:
With a hardware problem, I usually have a clear idea of the scope of my knowledge and which specific parts need amplification.

With a major software design, there are so many unknowns that everything is ultimately in question and I’m not sure where to start researching since I can’t research everything!


I suggested it because it is the “proper” way. Like I wrote, I don’t do that.
My real opinion is that you are perhaps thinking a bit grandiosely?

The simplest approach is no virtenv, no packages, no path worries - just copy and rename “my_” this and that. I do this and it is nearly invisible:

import my_easygopigo3 as easygopigo3

It can still use the site-packages version if my change is only in the top package.

If the my_ packages get nested, then I “import my_that as that”.

Eventually this bites when DI changes something I based a “my_” module on.