Delta3 Precision and Speed


Viewing 3 posts - 1 through 3 (of 3 total)
  • #6023
    Anonymous

    I have assembled a Delta 3 and have played around with it quite a bit. I have put a stylus into the collet and am using it to control a smartphone. Dan did a great job with this kit; it’s really neat. But …

    I would like it to be more precise: when moving from a distance to a specified x,y,z location, it should hit exactly the same spot every time, but it doesn’t. I am looking for tips.

    I would also like it to go faster. The PDF referenced below, which used a Delta 3 (not sure if it differs from mine aside from the material), claims it can get 5 screen taps per second, but I am getting about 1 tap per second while keeping reasonable precision.

    What is the maximum microsteps per minute people are using for the MOTO-0007 motors? The instructions make it clear that a value of 3000 is acceptable, but can I go higher?

    In terms of precision, here is what I have already done: tightened all nuts and zip ties, triple-checked the exact lengths of the forearms, and made the bicep/rotor connection as snug as I can. Because I got the b-team rotors and had to file them myself, that joint is not perfect and there is some slop: if I push up on the collet with the rotors energized, the collet rises and the biceps move without the rotors moving. This does not happen with a downward push. Moving slower helps avoid “kickback” on arrival and stopping (I have requested deceleration support from Dan Royer). Moving closer to center is more accurate; at the periphery things get weirder. Sometimes my stylus clearly touches the smartphone screen but no tap is registered. I have had less trouble with this since adding a ground wire to the stylus, but any ideas around this would be appreciated too. I am using a $20 Belkin stylus I bought at Target.

    In terms of speed I have played with higher step values and reduced the vertical play when lifting off the screen before another press.

    To summarize, if I put a pencil in the collet and paper on the platform, I would like to go between two spots several inches apart and tap a pencil dot in each spot, getting 10 taps in 2 seconds. After 2 seconds the pencil taps for each of the two dots should be very close to each other. Today, I am seeing less precision than this, and much less speed.

    Thanks for any advice!

    Since I might be hearing from Justin Engler, I’ll ask an unrelated question as well: in your related work you use a camera and computer vision. Can you tell me what camera and software you used? And if some of it is open source, could you point me to it? Thanks!

    (PDF referenced above: https://www.defcon.org/images/defcon-21/dc-21-presentations/Engler-Vines/DEFCON-21-Engler-Vines-Electromechanical-PIN-Cracking.pdf … and quote: “The motion control software was modified to speed up movement, up to 5 presses/second” )

    #6747
    Anonymous

    Hey Mark,

    Dan pointed me to your post.

    I’m not sure I can help you with precision. I’m running an acrylic Delta v3 for R2B2 and the precision seemed great to me, but I only need to hit buttons on a smartphone. It worked well for about 4 hours of continuous use until one of the screws came loose, which could probably be fixed with Loctite, glue, or maybe a lock nut. I happened to be sitting and watching it at the time, so I just stopped it and tightened it.

    As for speed, I wasn’t able to get the Delta v3 to go very fast. The PDF you’re quoting from was using Dan’s Delta V1, which was built with an Arduino and some nice servos instead of a Rumba + steppers. V3 definitely seemed more precise than V1, though. I assume we could probably pick up some speed with some firmware changes and/or a faster controller board, but I don’t know for sure. Dan would be the man to ask about that part.

    All of our code for the R2B2 project is up on github:
    http://www.github.com/isecpartners/r2b2
    I had to make a few minor changes to make the Python code work with the V3, but I haven’t posted those. It was pretty minor stuff.

    We used Python and OpenCV for computer vision. We originally tried to autodetect buttons regardless of what device it was, but that turned out to be too difficult. We instead ended up just looking to see whether the screen changed in a user-defined location, which would indicate that we’ve passed the lock screen. OpenCV is very powerful, but a little wonky to work with at times.
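    That region-change check can be sketched in a few lines. This is just an illustration of the idea, not the actual R2B2 code: plain Python lists stand in for grayscale camera frames (the real thing would use OpenCV frames and something like absdiff), and the function name and thresholds here are made up.

```python
# Sketch of "did the screen change in a user-defined region?"
# Frames are 2D lists of grayscale values (0-255). The real R2B2 code
# works on OpenCV camera frames, but the logic is the same.

def region_changed(prev_frame, curr_frame, region, threshold=30, min_fraction=0.05):
    """Return True if enough pixels inside `region` changed between frames.

    region: (x, y, w, h) rectangle in pixel coordinates.
    threshold: per-pixel absolute difference that counts as "changed".
    min_fraction: fraction of region pixels that must change to report True.
    """
    x, y, w, h = region
    changed = 0
    for row in range(y, y + h):
        for col in range(x, x + w):
            if abs(curr_frame[row][col] - prev_frame[row][col]) > threshold:
                changed += 1
    return changed >= min_fraction * w * h

# Example: a 4x4 "screen" where the top-left 2x2 region lights up.
before = [[0] * 4 for _ in range(4)]
after = [[0] * 4 for _ in range(4)]
after[0][0] = after[0][1] = after[1][0] = after[1][1] = 255

print(region_changed(before, after, (0, 0, 2, 2)))  # True: region lit up
print(region_changed(before, after, (2, 2, 2, 2)))  # False: region untouched
```

    In practice you’d feed this consecutive camera frames and watch the region where the unlocked home screen differs from the lock screen.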

    For missing taps, there are a few troubleshooting steps:
    1. Connect to ground (looks like you’ve already done this).
    2. Ideally, your ground would be connected to the phone’s ground (perhaps by sacrificing a sync cable). I haven’t found a need for that with R2B2.
    3. Make the stylus touch a little more firmly. When manually calibrating R2B2, I try to set the depth to be a bit past when the screen registers a touch. I like to make the stylus be good and squashed.
    4. Check your level. This isn’t a problem for me since I manually calibrate the depth to push at each point, but if you’re trying to do something more dynamic, you should maybe calibrate the level of the screen by measuring at what depth the screen registers a touch at each of the 4 corners or something.
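    For point 4, once you’ve measured the touch depth at the four corners, a bilinear interpolation lets you estimate the depth anywhere on the screen. A minimal sketch of that idea, with hypothetical calibration numbers (not from R2B2):

```python
# Sketch of point 4: estimate touch depth anywhere on the screen by
# bilinear interpolation between depths measured at the 4 corners.
# All corner depths below are made-up calibration values.

def touch_depth(x, y, width, height, z_tl, z_tr, z_bl, z_br):
    """Bilinearly interpolate z at (x, y) from the four corner depths.

    (0, 0) is the top-left corner; (width, height) is the bottom-right.
    """
    u = x / width   # 0 at the left edge, 1 at the right edge
    v = y / height  # 0 at the top edge, 1 at the bottom edge
    top = z_tl * (1 - u) + z_tr * u        # depth along the top edge
    bottom = z_bl * (1 - u) + z_br * u     # depth along the bottom edge
    return top * (1 - v) + bottom * v      # blend top and bottom

# Example with hypothetical corner depths (mm below home position):
z = touch_depth(50, 40, width=100, height=80,
                z_tl=10.0, z_tr=10.4, z_bl=10.2, z_br=10.6)
print(z)  # screen center: roughly the average of the four corners
```

    This only corrects for a tilted but flat screen; a bowed screen would need more sample points.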

    We bought the cheapest webcams we could find on Amazon (~$10 US each). One of the three we got worked adequately. I’m currently running a Logitech camera (~$30?) that works much better. We also had some success with built-in laptop cameras, but it’s difficult to get them into a good position. Depending on your application, you might find better success with mirroring the device screen to the computer and then doing your image recognition there, just so you don’t have to worry about things like camera angle correction, lighting, the robot effector getting in the way, etc.

    You might also google around for tapsterbot. It’s a similar robot designed for user interface testing. I don’t think he sells the hardware, but it should be pretty easy to adapt his software.

    Good luck on the project!

    #6748
    Anonymous

    The challenge with high speed is acceleration. Currently all MC robots have one of two options:

    a) acceleration but no look-ahead. Arm3 has this. It can speed up and slow down in a move but there’s a small pause between moves, so it can’t plan to keep going fast between two moves.

    b) look-ahead at constant speed. Makelangelo has this. It can plan ahead so there’s no pause between movements but it doesn’t yet understand how to accelerate and decelerate.

    Now I’m talking theory here, because I’m still working on this next bit:

    In the code each move is a segment. Each segment currently has a start speed, end speed, and max speed. Curved lines are made of lots of tiny segments. Changes in direction happen between two segments.
    Because the robot can change direction rather sharply, I need to find the maximum allowable speed at the start of each segment. Once I have those, I have to scan all the segments in the list and adjust their start and end speeds to take advantage of straight lines. Easy, right?
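    That backward/forward scan over the segment list can be sketched like this. It’s theory-as-code with made-up numbers, in the spirit of GRBL-style planners, not the actual Makelangelo firmware: the ACCEL value, segment lengths, and junction speeds are all assumed for illustration.

```python
import math

# Sketch of look-ahead planning over a list of segments. Each segment has
# a length and a max speed; each junction between segments has a maximum
# allowable entry speed (low for sharp corners, high for straight lines).

ACCEL = 100.0  # max acceleration, mm/s^2 (assumed)

def plan(segments, junction_speeds):
    """Compute the achievable speed at the start of each segment.

    segments: list of (length_mm, max_speed_mm_s).
    junction_speeds: max allowable speed at the start of each segment.
    Returns one speed per junction, plus a final speed of 0 (must stop).
    """
    n = len(segments)
    v = list(junction_speeds) + [0.0]  # come to rest at the end of the path

    # Backward pass: each entry speed must still allow decelerating to the
    # next junction's speed within the segment's length (v^2 = u^2 + 2as).
    for i in range(n - 1, -1, -1):
        length, _ = segments[i]
        v[i] = min(v[i], math.sqrt(v[i + 1] ** 2 + 2 * ACCEL * length))

    # Forward pass: each entry speed must be reachable by accelerating from
    # the previous junction, and never exceed the segment's own max speed.
    for i in range(n):
        length, max_speed = segments[i]
        v[i] = min(v[i], max_speed)
        v[i + 1] = min(v[i + 1], math.sqrt(v[i] ** 2 + 2 * ACCEL * length))
    return v

# Three 10 mm segments: two straight continuations, then a sharp corner.
speeds = plan(
    segments=[(10.0, 80.0), (10.0, 80.0), (10.0, 80.0)],
    junction_speeds=[0.0, 80.0, 5.0],  # start at rest; corner caps the last entry
)
print([round(s, 1) for s in speeds])  # at rest, fast on the straight, slow into the corner
```

    The two passes are what make straight runs fast: the backward pass guarantees you can always brake in time for the next corner, and the forward pass guarantees you never assume speed you couldn’t have built up.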
