Makelangelo Tutorials

Makelangelo Spirograph Art

Making spirograph art is easy on the Makelangelo. Here are a few examples of how you can generate beautiful geometric patterns and spirograph art. Post your favorites to the forums!

Makelangelo Spirograph Art with Scratch

Export the output from Scratch, load the SB file in Makelangelo Software, and proceed as normal.

Makelangelo Spirograph Art with Processing

void setup() {
  float r=225;
  float a=0;
  println("G0 Z90");  // pen up
  println("G0 Z30");  // pen down
  while(r>15) {
    driveto(r,a);
    a+=150;   // turn a little each step (play with this value)
    r-=0.5;   // spiral inward a little each step (and this one)
  }
  println("G0 Z90");  // pen up when done
}

void driveto(float r,float a) {
  float x = sin(radians(a)) * r;
  float y = cos(radians(a)) * r;
  println("G0 X"+x+" Y"+y);
}

Copy/paste the output into a file called “test.ngc”, open that file in Makelangelo-Software, and proceed as normal.
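If Processing isn’t your thing, the same spiral can be generated with a short Python script. This is a quick sketch I’d expect to behave like the Processing version above; the per-step angle (150°) and radius shrink (0.5) are knobs to play with, not gospel:

```python
import math

def spirograph_gcode(r=225.0, a=0.0, step_angle=150.0, step_radius=0.5):
    """Generate an inward spiral as Makelangelo-style G-code lines."""
    lines = ["G0 Z90",   # pen up
             "G0 Z30"]   # pen down
    while r > 15:
        x = math.sin(math.radians(a)) * r
        y = math.cos(math.radians(a)) * r
        lines.append("G0 X%.3f Y%.3f" % (x, y))
        a += step_angle   # turn a little each step
        r -= step_radius  # spiral inward a little each step
    lines.append("G0 Z90")  # pen up when done
    return lines

# write the output to test.ngc for Makelangelo-Software
with open("test.ngc", "w") as f:
    f.write("\n".join(spirograph_gcode()))
```

Run it, then open test.ngc in Makelangelo-Software as before.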

Robot Arm Tutorials

Calculating Jacobian matrices for a 6 axis Robot Arm

What are Jacobian matrices (good for)?

I want to know how fast the Sixi robot has to move each joint (how much work in each muscle) to move the end effector (finger tip) at my desired velocity (direction * speed). For any given pose, the Jacobian matrix describes the relationship between the joint velocities and the end effector velocity. The inverse Jacobian matrix does the reverse, and that’s what I want.

The Jacobian matrix could be a matrix of equations, solved for any pose of the robot. That is a phenomenal amount of math and, frankly, I’m not that smart. I’m going to use a method to calculate the instantaneous approximate Jacobian at any given robot pose, and then recalculate it as often as I need. It may be off in the 5th or 6th decimal place, but it’s still good enough for my needs.

Special thanks to Queensland University of Technology. Their online explanation taught me this method. I strongly recommend you watch their series, which will help this post (a kind of cheat sheet) make more sense.

What tools do I have to find the Jacobian matrix?

  • I have the D-H parameters for my robot;
  • I have my Forward Kinematics (FK) calculations; and
  • I have my Inverse Kinematics (IK) calculations.

I have a convenience method that takes 6 joint angles and the robot description and returns the matrix of the end effector.

/**
 * @param jointAngles 6 joint angles
 * @param robot the D-H description and the FK/IK solver
 * @return the matrix of the end effector
 */
private Matrix4d computeMatrix(double [] jointAngles,Sixi2 robot) {
    robot.setRobotPose(jointAngles);  // recursively calculates all the matrices down to the finger tip.
    return new Matrix4d(robot.getLiveMatrix());
}

The method for approximating the Jacobian

Essentially I’m writing a method that returns the 6×6 Jacobian matrix for a given robot pose.

/**
 * Use Forward Kinematics to approximate the Jacobian matrix for Sixi.
 * See also the QUT series mentioned above.
 */
public double [][] approximateJacobian(Sixi2 robot,double [] jointAnglesA) {
     double [][] jacobian = new double[6][6];
     // ... (the body is built up piece by piece below) ...
     return jacobian;
}

The keyframe is a description of the current joint angles, and the robot contains the D-H parameters and the FK/IK calculators.

Each column of the Jacobian has 6 parameters: 0-2 describe the translation of the hand and 3-5 describe the rotation of the hand. Each column describes the change for a single joint: the first column is the change in the end effector isolated to only a movement in J0.

So I have my current robot pose T. One at a time I will change each joint by a very small amount (0.5 degrees) and calculate the new pose Tnew. (Tnew-T)/change gives me a matrix dT describing the rate of change. The translation component of the Jacobian can be read directly from dT.
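Before trusting this on a 6-axis arm, the finite-difference idea is easy to sanity check on a 2-link planar arm where the analytic Jacobian is known. This Python sketch (the arm and its link lengths are my toy example, not the Sixi) nudges each joint by 0.5 degrees and compares against the exact answer:

```python
import math

L1, L2 = 1.0, 1.0  # link lengths of a toy 2-link planar arm

def fk(q):
    """Forward kinematics: joint angles (radians) -> end effector (x, y)."""
    x = L1*math.cos(q[0]) + L2*math.cos(q[0]+q[1])
    y = L1*math.sin(q[0]) + L2*math.sin(q[0]+q[1])
    return (x, y)

def approximate_jacobian(q, step=math.radians(0.5)):
    """Finite difference: nudge each joint a tiny amount, see how the hand moves."""
    J = [[0.0]*len(q) for _ in range(2)]
    x0, y0 = fk(q)
    for i in range(len(q)):
        qb = list(q)          # copy the original pose
        qb[i] += step         # tiny adjustment on one axis
        x1, y1 = fk(qb)
        J[0][i] = (x1 - x0) / step   # dx/dqi
        J[1][i] = (y1 - y0) / step   # dy/dqi
    return J

q = [math.radians(30), math.radians(45)]
J = approximate_jacobian(q)

# analytic Jacobian of the same arm, for comparison
Ja = [[-L1*math.sin(q[0]) - L2*math.sin(q[0]+q[1]), -L2*math.sin(q[0]+q[1])],
      [ L1*math.cos(q[0]) + L2*math.cos(q[0]+q[1]),  L2*math.cos(q[0]+q[1])]]
```

The two matrices agree to a few decimal places, which is the “good enough” I mentioned above.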

     double ANGLE_STEP_SIZE_DEGREES=0.5;  // degrees
     double [] jointAnglesB = new double[6];

     // use anglesA to get the hand matrix
     Matrix4d T = computeMatrix(jointAnglesA,robot);

     int i,j;
     for(i=0;i<6;++i) {  // for each axis
         for(j=0;j<6;++j) jointAnglesB[j]=jointAnglesA[j];  // copy the original pose
         // use anglesB to get the hand matrix after a tiiiiny adjustment on one axis.
         jointAnglesB[i] += ANGLE_STEP_SIZE_DEGREES;
         Matrix4d Tnew = computeMatrix(jointAnglesB,robot);

         // use the finite difference in the two matrices
         // aka the approximate rate of change (aka the derivative, aka the velocity)
         // in one column of the jacobian matrix at this position.
         Matrix4d dT = new Matrix4d();
         dT.sub(Tnew,T);
         dT.mul(1.0/Math.toRadians(ANGLE_STEP_SIZE_DEGREES));

         jacobian[i][0]=dT.m03;  // change in x
         jacobian[i][1]=dT.m13;  // change in y
         jacobian[i][2]=dT.m23;  // change in z

We’re halfway there! Now the rotation part is more complex. We need to look at just the rotation part of each matrix.

         Matrix3d T3 = new Matrix3d(
                 T.m00,T.m01,T.m02,
                 T.m10,T.m11,T.m12,
                 T.m20,T.m21,T.m22);
         Matrix3d dT3 = new Matrix3d(
                 dT.m00,dT.m01,dT.m02,
                 dT.m10,dT.m11,dT.m12,
                 dT.m20,dT.m21,dT.m22);
         T3.transpose();  // inverse of a rotation matrix is its transpose
         Matrix3d skewSymmetric = new Matrix3d();
         skewSymmetric.mul(dT3,T3);  // dT3 * T3^-1 is (approximately) skew symmetric:
         //[  0 -Wz  Wy]
         //[ Wz   0 -Wx]
         //[-Wy  Wx   0]
         jacobian[i][3]=skewSymmetric.m12;  // Wx
         jacobian[i][4]=skewSymmetric.m20;  // Wy
         jacobian[i][5]=skewSymmetric.m01;  // Wz
     }
     return jacobian;
}
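To convince yourself the skew-symmetric pattern actually works, here is a standalone numeric check in Python: rotate a tiny amount around Z, form dR·Rᵀ, and the angular velocity Wz falls out of the matrix right where the pattern says. (Note the sign of each entry depends on which side of the diagonal you read, so match your convention to your multiplication order.)

```python
import math

def rotz(t):
    """3x3 rotation about Z by angle t (radians)."""
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

step = math.radians(0.5)
R    = rotz(math.radians(30))          # current rotation
Rnew = rotz(math.radians(30) + step)   # rotation after a tiny nudge

# dR = (Rnew - R)/step, then S = dR * R^T is (approximately) skew-symmetric:
# [  0 -Wz  Wy]
# [ Wz   0 -Wx]
# [-Wy  Wx   0]
dR = [[(Rnew[i][j] - R[i][j]) / step for j in range(3)] for i in range(3)]
S = matmul(dR, transpose(R))

Wx, Wy, Wz = S[2][1], S[0][2], S[1][0]
```

Since we only rotated about Z at unit rate, Wx and Wy come out (nearly) zero and Wz comes out (nearly) one.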

Testing the Jacobian (finding Joint Velocity over Time)

So remember the whole point is to be able to say “I want to move the end effector with Force F, how fast do the joints move?” I could apply this iteratively over some period of time and watch how the end effector moves.

public void angularVelocityOverTime() {
    Sixi2 robot = new Sixi2();

    BufferedWriter out=null;
    try {
        out = new BufferedWriter(new FileWriter(new File("c:/Users/Admin/Desktop/avot.csv")));
        DHKeyframe keyframe = (DHKeyframe)robot.createKeyframe();
        DHIKSolver solver = robot.getSolverIK();
        double [] force = {0,3,0,0,0,0};  // force along +Y direction

        // move the hand to some position...
        Matrix4d m = robot.getLiveMatrix();
        // get the hand position
        solver.solve(robot, m, keyframe);
        float TIME_STEP=0.030f;
        int j, safety=0;
        // until hand moves far enough along Y or something has gone wrong
        while(m.m13<20 && safety<10000) {
            ++safety;
            m = robot.getLiveMatrix();
            solver.solve(robot, m, keyframe);  // get angles
            // if this pose is in range and does not pass exactly through a singularity
            if(solver.solutionFlag == DHIKSolver.ONE_SOLUTION) {
                double [][] jacobian = approximateJacobian(robot,keyframe);
                // Java does not come by default with a 6x6 matrix class.
                double [][] inverseJacobian = MatrixHelper.invert(jacobian);
                out.write(m.m03+"\t"+m.m13+"\t"+m.m23+"\t");  // position now
                double [] jvot = new double[6];
                for(j=0;j<6;++j) {
                    for(int k=0;k<6;++k) {
                        jvot[j] += inverseJacobian[k][j] * force[k];
                    }
                    // each jvot is now a joint velocity in radians/s.
                    // rotate each joint aka P += V*T
                    keyframe.fkValues[j] += Math.toDegrees(jvot[j])*TIME_STEP;
                    out.write(jvot[j]+"\t");
                }
                out.write("\n");
                robot.setRobotPose(keyframe.fkValues);  // apply the new pose
            } else {
                // Maybe we're exactly in a singularity.  Cheat a little.
            }
        }
    }
    catch(Exception e) {
        e.printStackTrace();
    }
    finally {
        try {
            if(out!=null) out.flush();
            if(out!=null) out.close();
        } catch (IOException e1) {
            e1.printStackTrace();
        }
    }
}
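The same integrate-the-inverse-Jacobian loop can be tried in miniature. This Python sketch drives a made-up 2-link planar arm straight up in +Y using the exact 2×2 Jacobian; it is a toy, not the Sixi code, but it shows the P += V*T update in action:

```python
import math

L1, L2 = 1.0, 1.0  # toy 2-link planar arm (not the Sixi)

def fk(q):
    """Forward kinematics: joint angles -> end effector (x, y)."""
    return (L1*math.cos(q[0]) + L2*math.cos(q[0]+q[1]),
            L1*math.sin(q[0]) + L2*math.sin(q[0]+q[1]))

def jacobian(q):
    """Analytic 2x2 Jacobian of the planar arm."""
    return [[-L1*math.sin(q[0]) - L2*math.sin(q[0]+q[1]), -L2*math.sin(q[0]+q[1])],
            [ L1*math.cos(q[0]) + L2*math.cos(q[0]+q[1]),  L2*math.cos(q[0]+q[1])]]

def invert2x2(J):
    det = J[0][0]*J[1][1] - J[0][1]*J[1][0]
    return [[ J[1][1]/det, -J[0][1]/det],
            [-J[1][0]/det,  J[0][0]/det]]

q = [math.radians(10), math.radians(80)]  # starting pose
v = (0.0, 0.5)                            # desired hand velocity: straight up +Y
TIME_STEP = 0.030
for _ in range(30):                       # ~0.9 seconds of simulated motion
    Jinv = invert2x2(jacobian(q))
    qdot = [Jinv[i][0]*v[0] + Jinv[i][1]*v[1] for i in range(2)]  # joint velocities
    q = [q[i] + qdot[i]*TIME_STEP for i in range(2)]              # P += V*T

x, y = fk(q)
```

The hand climbs about 0.45 in Y while X barely drifts, which is exactly the straight-line motion we asked for.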

Viewing the results

The output of this method is tab-separated, so it pastes conveniently into common spreadsheet programs, where it can be graphed.

Position and Joint velocity over Time

I assume the small drift in Z is due to numerical error over many iterations.

Now what?

Since velocity comes from acceleration over time (v=a*t) and acceleration is proportional to force (F=m*a), I should be able to teach the arm all about forces:

  • Please push this way (squeeze my lemon)
  • Please stop if you meet a big opposite force. (aka compliant robotics aka safe working around humans)
  • You are being pushed. Please move that way. (push to teach)
  • Are any of the joints turning too fast? Warn me, please.

Final thoughts

All the code in this post is in the open source Robot Overlord app on Github. The graph above is saved to the Sixi 2 Github repository.

Please let me know if this tutorial helps. I really appreciate the motivation! If you want to support my work, there’s always my Patreon or the many fine robots on this site. If you have follow up questions or want me to explain more parts, contact me.


Send Instagram Videos to Youtube with Raspberry Pi and Bash

Making Instagram videos is quick and easy.  Making Youtube videos is a laborious editing process that sucks the sunlight out of my soul.  People kept asking me if I have a Youtube channel and I was embarrassed to admit that yes, sort of, but no, not really.  I needed a way to send my Instagram videos to Youtube… a way to make the rock that kills the first bird bounce and kill the second bird!  With a bit of Bash scripting I was able to get my Raspberry Pi to automatically do this for me.  Read on for all the deets.

I used to use Zapier but they got dumb.  First they wanted to charge me to add extra steps to my one and only zap, then they said they couldn’t work ANYWAYS because Instagram tightened the data access rules.  I’m a big fan of IFTTT but they can’t do it, either.  So!  Hand-rolled it is.

The pieces someone else made

I already had a Raspberry Pi 3 (?) set up to take timelapse pictures of my Prusa 3D prints.  I figure it’s idle most of the time, why not make it work a little harder?  Firstly I logged in and updated all the underlying framework.

sudo apt-get update
sudo apt-get upgrade
mkdir ig2yt
cd ig2yt

Then I installed instagram-scraper in a Python virtual environment to grab my Instagram content.

sudo apt-get install python3
sudo apt-get install python3-venv
python3 -m venv env
env/bin/python -m pip install instagram-scraper

and tested the instagram-scraper

env/bin/instagram-scraper yourTargetChannel -u yourIGName -p yourIGPassword -t video

Next I installed youtube-upload by following the instructions in their readme.

# (download and unzip the youtube-upload master zip first, per their readme)
cd youtube-upload-master
sudo python setup.py install
cd ..
rm -rf youtube-upload-master

There were also a number of steps (in their readme) to install the .youtube-upload-credentials.json file to /home/pi/.  The file will contain your specific Youtube upload credentials and should remain secret.

That which binds them in darkness

Finally I was ready to test a bash script that would glue these other pieces together.  What you see here is the end result of an afternoon’s work.  It doesn’t show the testing steps along the way, most of which were later removed for brevity.  Test first!  Don’t blow up your following with a crazy upload machine.


#!/bin/bash

# account details -- fill in your own (same values as the scraper test above)
TARGETACCOUNT=yourTargetChannel
MYACCOUNT=yourIGName
PASSWORD=yourIGPassword

# the folder where mp4s will be stored.
# instagram-scraper saves to a folder named after the target account by default.
DESTINATION=$TARGETACCOUNT

# instagram-scraper looks in $DESTINATION to see what videos you've got
# and only grabs newer videos. youtube-upload uploads indiscriminately.
# we need the list of old videos so we don't upload old stuff again.
# get the list of files already in $DESTINATION
shopt -s nullglob
fileList=("$DESTINATION"/*.mp4)
#echo $fileList

# grab the 10 newest instagram videos
env/bin/instagram-scraper $TARGETACCOUNT -u $MYACCOUNT -p $PASSWORD -t video --maximum 10 --template {datetime} --latest

# upload each new video to youtube
for filename in $DESTINATION/*.mp4; do
    if [[ ! "${fileList[@]}" =~ "$filename" ]]; then
        #echo $filename is new
        # title the video with the datetime in the file name, minus path and extension
        filename2=$(basename "$filename")
        filename3="${filename2%.*}"
        youtube-upload --title "$filename3" "$filename" --privacy="private"
    fi
done

# delete all but the 3 newest files so we don't fill the drive.
(cd $DESTINATION && ls -1tr | head -n -3 | xargs -d '\n' rm -f --)
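The two fiddly parts of this script, skipping files we already had before scraping and pruning all but the newest three, are easier to reason about in isolation. Here is the same logic as a small Python sketch; the file names are hypothetical, and it leans on the fact that the {datetime} naming template makes lexical order match chronological order:

```python
def pick_new_uploads(before, after):
    """Files present after scraping but not before: these get uploaded."""
    return sorted(set(after) - set(before))

def pick_deletions(files, keep=3):
    """Everything except the `keep` newest files (datetime names sort chronologically)."""
    return sorted(files)[:-keep] if len(files) > keep else []

# hypothetical file names, as instagram-scraper's {datetime} template would produce
before = ["2019-01-01 10:00:00.mp4", "2019-01-02 10:00:00.mp4"]
after  = before + ["2019-01-03 10:00:00.mp4", "2019-01-04 10:00:00.mp4"]

new = pick_new_uploads(before, after)   # only the two freshly scraped videos
old = pick_deletions(after)             # only the single oldest file
```

If the dedup step were skipped, every run would re-upload the whole folder, which is exactly the "crazy upload machine" to avoid.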

Final Thoughts

  • I didn’t setup youtube-upload in env because I’m a python newb.  I can barely ask “where is the toilet” in parseltongue.
  • I did setup this script to run as a cron job once an hour, every hour.
  • videos will be private so they don’t smash your subscribers’ data plans.  I had them originally set to unlisted but youtube still told everyone!  Many ugly emojis were received.
  • videos will be titled in youtube with the date and time of their original publication.
  • if you make the mistake of publishing a few of them and then pulling older instagrams and publishing those in a second wave your videos will be all out of sequence.  “Ha, ha,” says Youtube, “No fix for you!”
  • These scripts don’t email me if the process worked, failed, or what have you.  They could be way more robust.
  • Special thanks to the helpful people in IRC freenode channel #bash and #python for helping with the many (many) different versions of python.

As always please let me know if I missed a step, if this was helpful to you, and so on.