Robot Arm

Intuitive robot programming

While making demos of what the Sixi arm can do I discovered that I’m a terrible driver. More than a few times I wanted to drive the hand +X and instead rotated the wrist, or got the direction wrong. It might not seem like much now, but if the arm were holding a cup of liquid or working in narrow confines, that would be very bad for the end user!

A good fix would be a way to preview a move before committing to it, plus some kind of undo feature. Now, moving the joystick moves the blue “ghost” arm while the yellow “live” arm stays still. When you like the pose you’ve reached, press X on the PlayStation controller to commit that move. If you want to undo – to put the ghost back on the live arm – press the circle button. Lastly, if you want to drive the arm back to the starting position, the triangle button moves the ghost to the starting position and X commits the move.
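For anyone curious how the commit/undo flow fits together, here is a minimal sketch in Java. All the class and method names here are my own invention for illustration – this is not Robot Overlord’s actual API:

```java
// Sketch of the ghost/live commit-undo logic described above.
// Class and method names are illustrative, not Robot Overlord's real API.
public class GhostArmDemo {
    double[] livePose;       // joint angles of the real ("live") arm
    double[] ghostPose;      // joint angles of the preview ("ghost") arm
    final double[] homePose; // the starting position

    GhostArmDemo(double[] home) {
        homePose  = home.clone();
        livePose  = home.clone();
        ghostPose = home.clone();
    }

    // Joystick input moves only the ghost; the live arm never follows directly.
    void jogGhost(int joint, double delta) { ghostPose[joint] += delta; }

    // X button: send the ghost pose to the real arm in one committed move.
    void commit() { livePose = ghostPose.clone(); }

    // Circle button: discard the preview, snapping the ghost back to the live arm.
    void undo() { ghostPose = livePose.clone(); }

    // Triangle button: aim the ghost at the home position (X then commits it).
    void goHome() { ghostPose = homePose.clone(); }
}
```

The key idea is that joystick input only ever touches the ghost; the live arm changes only on an explicit commit.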

Let me know in the comments what your experience with driving the simulation is like. I will be dedicating the next few weeks to getting ready for the Vancouver Mini Maker Faire, September 14, 2019 at Science World.


Record & Playback 4

I have been building a robot arm. You may have seen it on my Instagram. I also have an open source Java app called Robot Overlord, which can simulate the arm in 3D. Robot Overlord can drive the simulation AND the real arm from the joystick. All the moves in this video were done with the PlayStation controller.

In a previous post on Hackaday.io, I briefly covered a system I wrote into Robot Overlord that would capture all the joystick data and play it back on command. Technically, that worked. Qualified success.


However! Driving this way is very inefficient. New instructions are sent to the arm 30 times a second. The arm can’t see where it is going, so it can’t plan ahead to reach high speeds. It’s like a very fast game of Marco Polo. Also, if you’re a novice driver like me, it’s really easy to go the wrong way. It would be great if I could move the simulated arm to a destination pose BUT only update the real robot when I’m sure I like that destination pose. The arm would then move in a straight line from start pose to end pose at top speed.

First I needed a way to save any pose to a file on disk and load it back. With that I could save and load single poses, then play those poses back to the real robot, the same as I did with the joystick model. It also let me repeat tests, which helps me confirm things work correctly.
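A minimal sketch of what such save/load could look like, assuming the pose is a 4×4 matrix flattened to 16 doubles and stored as plain text. The file format and class name here are illustrative, not what Robot Overlord actually uses:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class PoseFile {
    // Save a pose (a 4x4 matrix flattened to 16 doubles) as one value per line.
    public static void save(Path file, double[] pose16) {
        StringBuilder sb = new StringBuilder();
        for (double v : pose16) sb.append(v).append('\n');
        try {
            Files.writeString(file, sb.toString());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Load the pose back; the result should match what was saved, 1:1.
    public static double[] load(Path file) {
        try {
            List<String> lines = Files.readAllLines(file);
            double[] pose = new double[lines.size()];
            for (int i = 0; i < pose.length; ++i) {
                pose[i] = Double.parseDouble(lines.get(i));
            }
            return pose;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Plain text makes it easy to eyeball a recording in a text editor when something goes wrong, which matters later in this story.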

If I have a start pose and an end pose, then I can interpolate between them – I can split that line into as many sub-poses as needed. I can already send poses to the robot. So what I can do is find the happy trade-off between too many poses (no time to accelerate) and too few (less accurate movement).
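The splitting step can be sketched like this, interpolating only the position part of a pose (orientation would need similar treatment, e.g. with quaternion slerp). Names are illustrative:

```java
// Sketch of splitting a straight-line move into sub-poses.  Only the position
// part of the pose is interpolated here; orientation interpolation would be
// handled the same way in parallel.
public class Interpolator {
    // Return count+1 points evenly spaced from start to end, inclusive.
    public static double[][] split(double[] start, double[] end, int count) {
        double[][] out = new double[count + 1][start.length];
        for (int i = 0; i <= count; ++i) {
            double t = (double) i / count;  // 0..1 along the line
            for (int j = 0; j < start.length; ++j) {
                out[i][j] = start[j] * (1 - t) + end[j] * t;
            }
        }
        return out;
    }
}
```

Tuning `count` is exactly the trade-off above: more sub-poses follow the line more closely, fewer give the arm room to accelerate.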

Looking through my daily notes I see I started on the new system some time before 2019-8-13, because that was when the weirdness started: I found cases where recording to disk and coming back were out of sync. Not identically 1:1. Discombobulated. When I tried to play back a recording the hand of the robot (J5) was always turned 90 degrees from the original recording. As I began to dig into why, I opened a whole can of worms. Bigguns.

Worm: The robot sim in Robot Overlord was wrong.

When Jin Han and I built the computer model of the robot arm in Fusion360 – the design was started in November 2018 – we started out facing the wrong direction.

Arm designed pointing at -Z

When I say it was built facing the wrong direction, I mean that I imagined that both Fusion360 and Robot Overlord would have the hand pointing at +X with up as +Z. In fact, in Fusion360 the hand is pointing at -Z and up is +Y, and in Robot Overlord I reassembled the arm with the hand facing -Y and up as +Z. Copying the model over was stupid hard, and I didn’t realize that was partly because I was doing it the wrong way, turned 90 degrees on two axes. It would have been easier if it was upside down and backwards!

My method to solve it was to load one joint at a time starting at the base, get it turned to face upwards, then add the next link, and so on. Once all the bones were in their relative positions, I built D-H parameters to match.

Worm: The D-H model of the arm was wrong.

The Sixi was the first robot arm I ever coded that used Denavit–Hartenberg parameters. One of the reasons I used D-H parameters is that they’re well documented and supported by other people into robotics. I can easily use D-H to calculate Forward Kinematics (FK), where I know the angle of every joint in the arm and I want to get the pose of the hand. (A pose is a position in space plus an orientation. One common way to describe this combo is with a 4×4 matrix.) I could also use Youtube videos that explained how to calculate Inverse Kinematics (IK) for a robot arm with D-H parameters. Especially tricky is the spherical wrist.
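For reference, here is the standard D-H transform for one link, with the hand pose being the product of every link’s matrix. This is a generic textbook sketch, not the Sixi’s actual D-H table or code:

```java
// Forward kinematics from D-H parameters: each joint contributes one 4x4
// transform, and the hand pose is the product of all of them.
public class DHForwardKinematics {
    // Standard D-H transform for one link: rotate theta about Z, translate d
    // along Z, translate a along X, rotate alpha about X.
    public static double[][] dhMatrix(double theta, double d, double a, double alpha) {
        double ct = Math.cos(theta), st = Math.sin(theta);
        double ca = Math.cos(alpha), sa = Math.sin(alpha);
        return new double[][] {
            { ct, -st * ca,  st * sa, a * ct },
            { st,  ct * ca, -ct * sa, a * st },
            {  0,       sa,       ca,      d },
            {  0,        0,        0,      1 }
        };
    }

    // Chain two transforms together; fold over all links for the full arm.
    public static double[][] multiply(double[][] m0, double[][] m1) {
        double[][] r = new double[4][4];
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                for (int k = 0; k < 4; ++k)
                    r[i][j] += m0[i][k] * m1[k][j];
        return r;
    }
}
```

The resulting 4×4 matrix is exactly the “pose” described above: the top-left 3×3 is the orientation and the right column is the position.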

I found the videos on spherical wrists were incomplete and it wasn’t until I stumbled on these notes from York University in Canada that I found the missing piece.

Worm: Inverse Kinematics calculations were wrong.

Of course my code didn’t quite match the stuff I’d been taught because my model was facing -Y instead of +Z – a 90 degree turn. Every time the tutorials said use atan(y,x) I had to write atan(-x,y).
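That swap is just a 90-degree frame rotation. A tiny check makes it concrete (ignoring angle wrap-around at ±180 degrees):

```java
// atan2(-x, y) is atan2(y, x) rotated by -90 degrees, which is why the
// tutorials' formulas needed the axis swap described above.
public class AtanRemap {
    public static double original(double x, double y) { return Math.atan2(y, x); }
    public static double remapped(double x, double y) { return Math.atan2(-x, y); }
}
```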

Not yet knowing that I’d done all this stuff wrong, I had to diagnose the problem. I built a JUnit test in src.test.java.com.marginallyclever.robotOverlord.MiscTests.java:TestFK2IK(). This test sweeps the arm through a set of joint angles, keyframe0. Every instance of keyframe0 creates some possible pose m0. Some m0 can be solved with Inverse Kinematics to make some other keyframe1. keyframe1 can create a pose m1. m1 should ALWAYS match m0. I got reams of data, all of which told me yes, there’s definitely something wrong. It took about a week of nail-biting research until I figured out and unscrambled each of those worms.
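The same round-trip pattern can be shown on a toy two-link planar arm (the real test covers all six joints of the Sixi). This is my own illustrative version, not the code from MiscTests.java; note that it compares the poses, not the keyframes, because IK may legitimately pick a different elbow solution:

```java
// FK->IK round-trip test pattern on a toy two-link planar arm:
// FK of the IK solution must reproduce the original pose.
public class RoundTrip {
    static final double L1 = 10, L2 = 7;  // made-up link lengths

    // Forward kinematics: joint angles -> hand position (x, y).
    public static double[] fk(double t1, double t2) {
        return new double[] {
            L1 * Math.cos(t1) + L2 * Math.cos(t1 + t2),
            L1 * Math.sin(t1) + L2 * Math.sin(t1 + t2)
        };
    }

    // Inverse kinematics: hand position -> one valid pair of joint angles.
    public static double[] ik(double x, double y) {
        double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2);
        double t2 = Math.acos(c2);  // elbow-down solution
        double t1 = Math.atan2(y, x)
                  - Math.atan2(L2 * Math.sin(t2), L1 + L2 * Math.cos(t2));
        return new double[] { t1, t2 };
    }
}
```

Sweeping keyframe0 over a grid of angles and asserting fk(ik(fk(keyframe0))) matches fk(keyframe0) is the whole test.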

So what does all that mean? It means I can now build meaningful recordings, and now I can start to search for the right happy trade-off.

Makelangelo News Robot Arm

Weekly summary ending 2019-02-22

Hey, gang! Here is what we worked on in the last week, where we are heading next, and how you can join us.

Makelangelo Update Released!

Makelangelo Software v7.20.1 and Makelangelo firmware v9.0.0 are out now. This is a major change to the firmware which rewrites all the logic for planning movement, especially with acceleration and deceleration. If you upgrade one then you must upgrade both. We have written a guide to help you with updating the Makelangelo firmware that should make this easy for you.

One of the side effects is that the old and busted steps/min values have been replaced with new hotness mm/s and mm/s/s. I am getting good drawing results with top speed=90mm/s, drawing speed=60mm/s, and acceleration=300mm/s/s. See for yourself.

Makelangelo 5 at Science World Vancouver, Feb 20, 2019

Wednesday the power was out for a scheduled electrical upgrade and we used the time at Vancouver’s Telus World of Science to take some better photographs and video. I hope that with this I can finally make a presentation pitch video that does the machine sweet vengeful justice.

Fun fact: A large sheet of acrylic has enough static cling and surface tension to hold an A0 sheet of paper without tape.

Makelangelo on a white board

I find that the Makelangelo suction cups work great on glass… and not so great on a classroom white board. I tasked Jacob with designing a new system and he went through several rapid prototype iterations.

In house we’ve moved on a few versions from what you just saw and have an even better design. When it’s ready we’ll offer it in the store and on Thingiverse.

I should also mention Scott has been doing a great job of manufacturing Makelangelos and documenting the process. I don’t mention enough how well he’s doing (very!) at a job some people would look down on. Documentation is hard, yo! He’s even inventing better ways to make the machines faster with less futzing.

Sixi motors arrived

Jin was taking some well-earned time off this week, during which the motors and the power supply we ordered finally arrived. As I write this he’s doing electrical tests and torque tests and making sure that everything is to spec.

Meanwhile, on the other side of town,

In anticipation of our new CNC machine arriving in ~6 weeks I’ve been taking night classes to learn how to be a better machinist. Here’s a funny from my last session:

Manners really seal the deal.

Next

The next week is Jacob & Scott’s last week of internship. Pizza party? Pizza party. If everything goes well with the testing we should be installing the new motors into Sixi this week. I can wait, but I don’t want to… 😭

Join Us

Get the latest Sixi Master Assembly for Fusion360 on Patreon
OR
Buy anything in our store and mention ‘The Fusion link’ in the notes field to get a link by email.

You can use this file to build your own copy of the robot arm. Put that 3D printer you bought to use – make it your own! Share your creation with others! That’s what open source is about. If you build one we would love to share it with others. I am actively seeking talent… show me what you got.

As always, follow our daily progress on our Instagram or see our older stuff on Youtube.

You can do us a huge favor by watching just 2 minutes of video on our Youtube channel. We have 44k subscribers on IG and we need 100k minutes watched to monetize our channel. If you all watch 2 minutes of video that’s 88k minutes done, finito, in the bank, sayonara. So Like, Share, Subscribe! We thank you.

Lastly, thank you for your likes, subscribes, comments, and purchases. You keep the lights on and the mood high so we can keep working on awesome things that will help the planet.