April 19th, 2014
I’ve had a dumb idea clonking around in my head for at least a decade: a jigsaw puzzle for the blind. When I was young I would see carved stone reliefs and wish I could touch them. I thought the blind might really enjoy them. When I joined the reddit puzzle exchange my giftee asked for a jigsaw and I knew: it was time.
April 18th, 2014
Added an SSL certificate. Surf in private, shop with confidence.
Added an R&D subforum. This is for people stuck on a robotics challenge who need a bit of help. I’ve listed 4 items to start it off.
Added a “learn” item to our top-level menu that takes you to our education Wiki.
Adding a new product banner every day of the work week until they’re all done.
Hooray for small victories! One step at a time.
Special thanks to VHS member “Robbat2” for his amazing Linux know-how. You rock!
April 18th, 2014
The latest version of PancakeBot is coming to the Paris Maker Faire. Did you know it uses GcodeSender and a version of our gcodecncdemo code? I’d love to show you some of the other tasty morsels Miguel has been sharing with me, but I have to wait until he gives the OK.
April 17th, 2014
If you’ve followed the story so far from the beginning, you’ll know that I have been building a program to simulate and train my robot arms. The program contains virtual copies of my real robots. The real and virtual models are synchronized 1:1 when connected.
The virtual robots in my simulator now have collision detection: they understand when they are touching each other. That means I can make sure the real robots are incapable of slamming into each other by giving the virtual robots more personal space than the real robots have. I’ve already made sure they can’t bend in impossible ways or go through the table. Now kids can drive the robot with relative safety – as long as they don’t get in its way, there shouldn’t be any accidents.
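The “extra personal space” trick can be sketched like this. This is a minimal, hypothetical example (class and method names are my own, not from the project): each arm gets a bounding sphere, and the virtual copy inflates its radius by a safety margin so the simulator flags a collision before the real hardware could ever touch.

```java
// Hypothetical sketch: bounding spheres with an inflated safety margin.
// The real simulator is OpenGL-based and more detailed than this.
class SafetyBubble {
    static final double SAFETY_MARGIN = 2.0; // extra "personal space", assumed units

    // Treat each arm as a sphere at position (x,y,z) with radius r.
    // Returns true if the two virtual (inflated) spheres overlap.
    static boolean tooClose(double[] a, double ra, double[] b, double rb) {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        double dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
        return dist < (ra + SAFETY_MARGIN) + (rb + SAFETY_MARGIN);
    }
}
```

Because the virtual spheres are larger than the real arms, the simulation refuses a move while the physical machines still have clearance.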
Get update 14 of the open source code from GitHub.
Next step is to record & playback animations.
Frankly, I’m kind of amazed at the framerate I’m getting for brute-force OpenGL graphics in Java in Debug mode. I still remember using assembler code to draw triangles one pixel at a time in mode 13h.
April 17th, 2014
I’m hacking a vending machine to take BitCoins, and I’ve made my work open source. Read all about it and get the code.
April 17th, 2014
And now… your moment of Zen.
April 15th, 2014
I’ve just added a dimension drawing of the base for the Arm3 robot to the ROBO-0023 Arm3 product page. This way you should be able to integrate it into your automation solution more easily. I’ve also updated the description with a special section that covers the technical abilities of the Arm3.
April 14th, 2014
Reddit.com robotics user newgenome heard about my project and sent in this video of self-assembling FANUC robot arms. SFW automation porn fiesta commences now:
So it seems I’m not the first to achieve this goal… but I can still be the first in the open source community. Opportunity Awaits!
April 13th, 2014
I’ve just updated the software for the Arm3 project. Everything between here and the San Mateo Maker Faire is about putting my best foot forward. I’ve just rearranged the office to make a workspace for the arms. Now I have to lay out the components and teach them to assemble one of their own. I wonder how many tries it will take? You’ll have a chance to try it yourself at the faire.
The next step for me is to record and playback actions for several robots working in concert. This way the various stages of assembly can be tackled in sections.
I also rewrote the HTML and CSS for the site so that our Facebook fans will no longer get the PayPal logo where there should be a relevant robot picture.
Anyways, on with the software update.
I hope this program will be reused in the coming months to simulate all Marginally Clever robots.
- OpenGL graphics that simulate arms
- Synchronizes real arm(s) and simulation
- WASDQE+mouse camera controls
- UIOJKL arm movement w/ IK solving (moves along Cartesian grid)
- RTYFGH arm movement w/o IK solving (moves motors only)
- Spacebar to switch between arms
- added EEPROM version and GUID records
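The keyboard scheme above boils down to a simple dispatch. Here’s a hypothetical sketch of that mapping (action names are my own; the real code routes Swing key events into the OpenGL scene):

```java
// Hypothetical sketch of the key-to-action dispatch behind the control list.
class Controls {
    static String actionFor(char key) {
        String k = String.valueOf(Character.toUpperCase(key));
        if ("WASDQE".contains(k)) return "camera";     // fly the camera
        if ("UIOJKL".contains(k)) return "arm-IK";     // Cartesian moves, IK solved
        if ("RTYFGH".contains(k)) return "arm-direct"; // move motors only
        if (k.equals(" "))        return "switch-arm"; // spacebar cycles arms
        return "none";
    }
}
```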
April 10th, 2014
Arm3 robot trainer is coming along. There were several false starts until I figured out the right way to get Swing and OpenGL to play nice together in Java.
In this proof-of-concept the real machine moves to match the virtual model. The robot already understands Inverse Kinematics so there are two ways to drive the machine: the first is to move along XYZ lines and the IK system figures out how to match your request. The second is to move the motors directly. A trainer can use either one or both at once.
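To make the two drive modes concrete, here’s a minimal sketch on a 2-link planar arm. This is an illustration only – the names, link lengths, and the closed-form 2-link IK are my own simplification, not the Arm3’s actual kinematics:

```java
// Hypothetical sketch: the two drive modes on a simplified 2-link planar arm.
class TwoLinkArm {
    double l1 = 1.0, l2 = 1.0; // link lengths (assumed)
    double t1 = 0, t2 = 0;     // joint angles in radians

    // Mode 2: drive the motors directly, no IK involved.
    void moveMotors(double dt1, double dt2) { t1 += dt1; t2 += dt2; }

    // Mode 1: request an XY target; IK figures out the joint angles.
    boolean moveToXY(double x, double y) {
        double d2 = x * x + y * y;
        double c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2);
        if (c2 < -1 || c2 > 1) return false;  // target out of reach
        t2 = Math.acos(c2);                   // "elbow down" solution
        t1 = Math.atan2(y, x) - Math.atan2(l2 * Math.sin(t2), l1 + l2 * Math.cos(t2));
        return true;
    }

    // Forward kinematics: where the end effector currently sits.
    double[] endEffector() {
        return new double[] {
            l1 * Math.cos(t1) + l2 * Math.cos(t1 + t2),
            l1 * Math.sin(t1) + l2 * Math.sin(t1 + t2)
        };
    }
}
```

A trainer could call `moveToXY` for Cartesian jogging or `moveMotors` for raw joint moves, exactly the either/or choice described above.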
Next step is to record & play back sessions. Once that’s been achieved people can share training sessions with each other online – remix, tweak, collaborate, use github, and so on.
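Session sharing could look something like this sketch: record joint poses as keyframes, replay them into whatever drives the arm, and serialize them as plain text so they diff nicely on GitHub. All names here are hypothetical, not the project’s actual format:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of session record & playback with a shareable text form.
class Session {
    final List<double[]> keyframes = new ArrayList<>();

    // Capture the current joint angles as one keyframe.
    void record(double... jointAngles) { keyframes.add(jointAngles.clone()); }

    // Replay each keyframe into a consumer, e.g. the real arm driver.
    void play(Consumer<double[]> driver) {
        for (double[] pose : keyframes) driver.accept(pose);
    }

    // One keyframe per line, comma-separated: trivially diffable and remixable.
    String serialize() {
        StringBuilder sb = new StringBuilder();
        for (double[] pose : keyframes) {
            for (int i = 0; i < pose.length; i++) {
                if (i > 0) sb.append(',');
                sb.append(pose[i]);
            }
            sb.append('\n');
        }
        return sb.toString();
    }
}
```

A plain-text format like this is what makes the “remix, tweak, collaborate” workflow possible: anyone can edit a session in a text editor and send a pull request.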
I’m looking for resellers and people who are interested in using this robot to solve real-world problems. I’m also getting ready for the San Mateo Maker Faire & MakerCon. Will I see you there? We should meet up and talk shop.