Support for Julia?

I just received an interesting e-mail, (from GitHub), talking about a virtual event, “JuliaCon”, where they describe Julia as “the fastest high-performance open source computing language for machine learning, artificial intelligence, life sciences, robotics, algorithmic trading, real-time analytics and more.”

Before we go any further, I really don’t know anything about Julia, (except that it’s my granddaughter’s name), other than what I’ve read about it on Wikipedia.

According to the Wikipedia article noted above, Julia has a much more FOSS-friendly license, (an MIT-licensed core, with some components under the GPL-2), than other similar languages, particularly when compared to the proprietary MATLAB and Wolfram languages already provided for the Raspberry Pi.

Likewise, it is claimed that Julia beats the pants off of just about everything out there - including MATLAB - being one of a very few languages, (like C), that have achieved petaflop performance on standard supercomputer benchmarks.

Later on in the article, they mention that though initial compatibility with the Pi was a bit squiffy, it’s improving rapidly.

(Side note for @thomascoyle11859 - it supposedly eats CUDA-cores for lunch, so your Nano/Xavier based boards should love it.)
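Speaking of which, here is roughly what GPU code looks like in Julia.  This is a minimal sketch, assuming the CUDA.jl package is installed and a CUDA-capable board is attached - I have not actually tried it on a Nano/Xavier, so treat it as illustrative only:

using CUDA

# Ordinary Julia arrays, copied into GPU memory with cu()...
a = cu(rand(Float32, 1_000_000))
b = cu(rand(Float32, 1_000_000))

# ...and plain broadcast syntax gets compiled into a GPU kernel on the fly.
c = a .* 2.0f0 .+ b

println(sum(c))   # the reduction runs on the GPU too; only the scalar comes back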

Admittedly there are a few rocks in the road:

  1. Any language that references any version or implementation of LISP as an ancestor is going to be chewy.  (IMHO, Mandarin Chinese vs LISP is an even toss-up as far as complexity is concerned.)  The only language chewier, (IMHO), is APL.

  2. Using Dexter/M.R. libraries with it will have to be done via special Julia calls to Python libraries, (which is supposed to be “easy” - see the sketch just after this list).  Though I am sure the Dexter libraries can be ported to Julia, that is being left “as an exercise for the student”.
    (Thomas, do you want to take a crack at it?)

  3. It seems to be tied up with Jupyter in ways I don’t understand.
    Actually, Jupyter itself is something I don’t really understand the essential need for. . . but that’s another story.
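To illustrate item 2, here is a rough sketch of what those “special Julia calls to Python libraries” look like, using the PyCall package.  The numpy part is standard PyCall usage; the GoPiGo lines are strictly hypothetical - they just mirror the Python easygopigo3 API and I have not tested them - so they are left commented out:

using PyCall

# Import a Python module exactly as Python would see it.
np = pyimport("numpy")
println(np.mean([1, 2, 3]))    # a Python call on Julia data - conversion is automatic

# Hypothetical: the Dexter library could (in theory) be pulled in the same way,
# assuming easygopigo3 is installed on the Pi.  Untested!
# egpg = pyimport("easygopigo3")
# bot  = egpg.EasyGoPiGo3()
# bot.drive_cm(10)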
     

Looking up Julia on the Raspberry Pi opened up a whole cornucopia of goodies, not the least of which was this:

Apparently they used Julia, a Raspberry Pi, and an Arduino to implement a self-driving car:  https://www.youtube.com/watch?v=bX4TXWO7dA0

Note that this implementation uses hand-made wheel encoders connected to an Arduino, and it smells strongly of a GoPiGo re-implementation.  What could they have done with The Real Thing?  It’s anyone’s guess.

It sounds interesting.

It might be worth a look.  At the very least, it’s something we should keep our eyes on, as it looks like it might be an up-and-coming thing in robotics.

Here’s where it can be found: https://julialang.org/

Interesting. But I think I’ll stick with Python until I see a clear reason to change. For some of ROS I think I’ll need to bolster my C++ skills. Not nearly as cutting edge as Julia, but more useful for now.
/K


True.

However, I think it’s important for us to know what’s going on out there.

If colleges are building GoPiGo clones because they don’t know better, then maybe MR needs to do some evangelizing on technical college and/or trade school campuses as well?

It would be wicked cool to see schools like Va. Tech, or MIT, hand out a GiggleBot to every engineering freshman just to see what they would do with them.

While researching Julia for this article, I ran into a whole lot of interesting design philosophies - things like “least resources”, which means using the least powerful tool that will do the job, because it will be simpler to debug and will demand more ingenuity than a complex tool that does 99% of the work for you.  That’s why I thought of giving out GiggleBots to the engineering students.  A GiggleBot can do a lot, but being based on a micro:bit, you have to be clever to make it happen.

Of course, if high-class engineering schools like 'Tech or MIT think GiggleBots are too juvenile - I won’t object to them giving out GoPiGo’s either. :wink:

Fair point.

Also a fair point - some of the NASA rovers have very limited resources that are used very cleverly. I think @cleoqc had mentioned that she had code on the space shuttle - I’ll bet that was very efficient as well. That said, I don’t know what’s easier - starting very lean, or starting with fewer restrictions and then learning to be lean. There are probably arguments to be made both ways. The GoPiGo is actually quite capable. You’re right - the micro:bit is pretty restricted in terms of resources.

/K


These things aren’t necessarily mutually exclusive.  Efficient coding doesn’t necessarily mean more restrictions.  It’s more like a discipline and a realization that your one program may not have infinite resources.

IMHO, programmers have become “lazy”/“careless” because of the huge amounts of memory and processing power available.  In essence, they don’t need to worry about resource constraints.  (Note that I’m beginning to sound like @cyclicalobsessive with regard to programming resources instead of battery power.  :wink:)

Smaller systems like the Pi, (and its clones), have forced a realization that resources may NOT be infinite and that more careful programming is necessary.

Hopefully this will carry over into the software for larger systems.


Maybe. Depends in part on what you’re optimizing for. I’ve read many times to avoid the temptation to prematurely optimize. In many situations not every sub-system needs to be optimally efficient. The programmer can get it all working, then profile to find where there are actual efficiency gains to be made. Making ultra-efficient code is time consuming. If you’re optimizing for getting programs out the door, wasting time cutting milliseconds off of a loop just isn’t a good investment. I think that’s one of the reasons Python is so popular - it runs fast enough for most things, but is way faster to develop in (especially with no compilation step). Of course some attention to resources is always needed.
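To make that concrete, the “make it work first, then measure” loop looks something like this - sketched in Julia since that’s the topic of the thread (the work() function is just a made-up stand-in):

using Profile

# A deliberately naive stand-in for "real" work.
work(n) = sum(sqrt(i) for i in 1:n)

work(10)                  # warm-up call; the first run includes compilation time
@time work(10_000_000)    # quick wall-clock and allocation check

@profile work(10_000_000) # sample where the time actually goes...
Profile.print()           # ...then inspect the call tree and tune only the hot spots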
/K


What I’m talking about isn’t about “premature optimization”, but more a disregard for efficient programming altogether.

Does your program really need the “huge” memory model?  (And an entire circus of DLL’s and helper libraries?)

I remember back in the 80’s when I wrote my now-famous 29 byte reset routine in 8086 assembler.  Trying to do that using Q-Basic generated a (comparatively) huge program to do the same thing.

My “famous” test:
Compile the following code samples with the Q-Basic “compiler” and record the size of the executable it creates.

First sample:
10 REM This code does nothing
20 END

Compiled size: A 10-and-some-odd-k .exe file that did nothing.

(Note: This particular “compiler” required at least one statement before the “END” statement. The size of the REM, (remark/comment), statement didn’t matter.  And yes, I tested that.)

Second sample:
10 REM This prints “Hello World!”
20 PRINT “Hello World!”
30 END

Compiled size: A 30k+ .exe file.

Adding any kind of math, even simple addition, caused the file to rapidly balloon.

My necessary patch, in “compiled Q-Basic”, was well over a hundred “k” in size.

In my case the constraint was storage space.  I had to fit my patch into a single 512 byte sector and a 100+k file wasn’t going to cut it.
 

One good thing that’s happening is the proliferation of smaller systems like the Pi.  It forces developers to be less wasteful and more aware of the system’s limits.

Here’s an example:

PiFox, a bare-metal arcade-style game written in only 6,000 lines of ARM assembler, runs fast as a thief on a Pi-1!  (Look at the video - they show the board, a 26-pin GPIO Pi!)  You could, literally, run this on batteries instead of a wall adapter.
 

Before everyone goes off half-cocked: I know you have to learn how to cook an egg before you can cook pan-seared Angus beef with free-range chicken eggs and a hand-made Hollandaise sauce.  Despite that, I don’t think anyone will dispute that even a budding cook needs to know not to turn the burner on full-blast and drop an entire stick of butter in the pan just to cook one egg. . . .

It’s the same with programming.  You need to learn to code before you can learn to code fancy stuff.  However, prudence and economy are things that can be taught even in the most basic “This-heayer is a keyboard” computer classes.
 

I guess my past is showing through the thin cloth of my worn elbows. . .

My college computer classes graded on both function and economy:

  • If the program did what it was supposed to do and wasn’t considered a butt-ugly wad of spaghetti code by the instructor, you got a “C”.

  • Higher grades were awarded for faster, smaller executables.
     

Note that a lot of this was being done on mainframes or PDP/VAX type systems where computer time and storage used was billable.  You couldn’t write sloppy code because if you did, your “computer account” would “run out of money” and you’d have to go beg your instructor for more - a good way to guarantee your grade would tank.

Also note that the instructors did not have an unlimited fountain of computer resource money to draw from and you were not always guaranteed “more time” on the system if you wasted what you had - the fast track to a failed course!

When I was cutting my teeth on things like the 8080, 8085, and 6502, there just wasn’t enough room to be wasteful - a “fully loaded” 65k system was a luxury from the Arabian Nights.  Many of my projects had to fit within the confines of an eight-k EPROM.  If you “went over budget” and needed a second EPROM, you not only had to explain it to your boss, YOU had to handle crossing over the EPROM address boundary from chip-to-chip, as chips were not always sequentially addressed.

If you were using a chip with a paged architecture, (like the 6502), God help you since the starting address for that second chip would invariably be further away than a single page.  And no, I won’t go into the horrors of the difference between a “near” call and a “far” call, and having to completely re-arrange the memory layout simply because you needed three extra bytes. . .


All fair points. Although with the QBasic compiler examples, it sounds like the issue was a bad compiler. This may not be something the programmer has much control over depending on the setting.
/K
