Making an Arduino-controlled Delta Robot

A delta robot is a parallel robot that’s designed for precise, high-speed movement of light payloads.  They’re generally used for pick-and-pack operations.  I’ve wanted to build a delta robot since I saw this video, so I took a weekend last summer to put something together as a technology demonstration for the high school robotics club that I coach.

The complete robot

The servos and upper assembly

The upper link of an arm

A delta robot is composed of three two-segment ‘arms’ mounted at 120-degree intervals.  The arms use parallelograms to restrict the motion of the ‘wrist’ and keep it in a static orientation.  For a much better description, check out Wikipedia, and for the math geeks there’s an excellent writeup of delta robot kinematics over at Trossen Robotics.

To build my delta robot I started with three HS-645MG servos, and mounted them on some particle board.  I fabricated upper arms out of Tetrix beam and used some JB-Weld to attach long standoffs perpendicularly at the ends of the arms.  The standoffs hold small ball joints (I used some ball ends that were intended for RC car suspensions) that will provide the free movement that is required of these joints.  The rods that make up the parallel linkage for the second segment of the arms are aluminum. I bored holes in the ends of these on the lathe and pressed small bolts into the holes to create a mounting point for the ‘cap’ portion of the ball ends.  The lower triangle is just a quick mock-up made of Lego with some more long standoffs zip-tied to it to provide mounting points for the ball ends.

On the software side, there are two components.  I modified an excellent Arduino sketch originally created by Bryan at Principia Labs to drive the hardware.  The Arduino sketch uses the Servo.h library to control the three servos.  It listens over a serial connection for three-byte control commands in the format start_byte, servo_number, desired_position where the start_byte is always 255.  Servo number 99 is used as a control channel to enable and disable the servos.  Sending a desired position of 180 to servo 99 will turn all the servos on, and sending position 0 to servo 99 will disable all servos and power them off.
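
To give a feel for how simple the protocol is, here’s a stripped-down sketch of the Arduino side. It’s only an illustration of the command format described above, not the actual Principia Labs code; the pin assignments and the 9600 baud rate are placeholder choices of my own:

    #include <Servo.h>

    // Illustration only: pin numbers and baud rate are placeholders.
    const int SERVO_PINS[3] = {9, 10, 11};
    Servo servos[3];

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // wait for a complete three-byte packet: start_byte, servo_number, desired_position
      if (Serial.available() >= 3) {
        if (Serial.read() != 255) return;   // not a start byte; resync on the next pass
        int servoNum = Serial.read();
        int position = Serial.read();

        if (servoNum == 99) {               // control channel: enable/disable all servos
          for (int i = 0; i < 3; i++) {
            if (position == 180) servos[i].attach(SERVO_PINS[i]);
            else if (position == 0) servos[i].detach();
          }
        } else if (servoNum >= 1 && servoNum <= 3) {
          servos[servoNum - 1].write(position);  // position in degrees, 0-180
        }
      }
    }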

Processing GUI

To control the robot I wrote some software in the Processing language.  I used the really nice ControlP5 GUI library to make text boxes and sliders to control servo positions.  The kinematics of converting three servo angles to a position on the XYZ axes (and vice versa) is… interesting, to say the least!  Luckily great minds have gone before and completed this math for me, even including C code.  Again, the Trossen Robotics forums are your friend.
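
For the curious, the heart of that math is a helper that finds one shoulder angle for a given wrist position; the full routine calls a helper like this three times, rotating the target point by 120 degrees for each arm. Here it is as a self-contained C sketch, lightly adapted from the Trossen Robotics writeup, so treat it as illustrative rather than a drop-in copy of my GUI code:

    #include <math.h>

    // robot geometry (any consistent unit)
    static float e  = 14;   // end effector triangle side
    static float f  = 29;   // base triangle side
    static float re = 50;   // length of long (lower) arm
    static float rf = 14;   // length of short (upper) arm

    static const float pi = 3.14159265f;

    // Find the shoulder angle (degrees) that puts arm 1's wrist at (x0, y0, z0).
    // Returns 0 on success, -1 if the point is unreachable.
    int delta_calcAngleYZ(float x0, float y0, float z0, float *theta) {
      float y1 = -0.5f * 0.57735f * f;   // f/2 * tan(30): y of the shoulder pivot
      y0 -= 0.5f * 0.57735f * e;         // shift the wrist coordinate to the effector edge
      // the elbow lies on the line z = a + b*y ...
      float a = (x0*x0 + y0*y0 + z0*z0 + rf*rf - re*re - y1*y1) / (2.0f * z0);
      float b = (y1 - y0) / z0;
      // ... and on a circle of radius rf around the shoulder; d is the discriminant
      float d = -(a + b*y1)*(a + b*y1) + rf*(b*b*rf + rf);
      if (d < 0) return -1;              // no intersection: point is out of reach
      float yj = (y1 - a*b - sqrtf(d)) / (b*b + 1.0f);  // choose the outer elbow position
      float zj = a + b*yj;
      *theta = 180.0f * atanf(-zj / (y1 - yj)) / pi + ((yj > y1) ? 180.0f : 0.0f);
      return 0;
    }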

I’m making all the code for the Arduino sketch and the Processing GUI available for download from my Box.net account (Edit: and on GitHub).  The code is well-commented, but I did not originally intend to publish it, so it’s a little rough around the edges.  Feel free to ask any questions in the post comments and I’ll try and clear things up if it’s confusing.

I’ll wrap things up with a quick video of the delta robot in action. Enjoy!

About Matt Greensmith
Sysadmin, robot-builder, linux geek, etc.

26 Responses to Making an Arduino-controlled Delta Robot

  1. joe says:

    Hi Matt, I had a go at using your code in the GUI but can’t access the libraries in version 1.0. Any chance of sending the whole file, including the GUI libraries, as an attachment?

    • Hi Joe,
      You say that you are using ‘version 1.0’, which I assume is Arduino 1.0, but the GUI is written for Processing, not Arduino. It’s confusing because the two applications are similar in look and feel, and the code files for both apps use the same extension (.pde, which has since been changed to .ino for Arduino 1.0 files). To clarify: the DeltaBot_Arduino.pde should be compiled and downloaded to your Arduino board via the Arduino app (Wiring), and the DeltaBot_GUI.pde should be run in Processing.

      I don’t want to mirror the ControlP5 library files on my box.net account, but the library version hasn’t changed since I wrote the app, so the zip download (direct link here) from their website should still be current.

      If you are using the correct application to run the GUI app, then your library access errors are probably due to an incorrect path. Processing is picky about how to find libraries. If you check out the preferences window in Processing, you can find out your Sketchbook folder location. For example, mine is
      /Users/matt/Documents/Processing

      When you extract your controlP5 zip, it needs to go into a folder called “libraries” (case sensitive) inside your sketchbook folder. So, on my machine, I have:

      /Users/matt/Documents/Processing/DeltaBot/DeltaBot_GUI.pde (my source code)
      /Users/matt/Documents/Processing/libraries/controlP5/[examples,library,reference,src] (controlP5 library location)

      Hope this helps, send me more details if you continue to have problems; I’m happy to assist.

  2. Mansoor Ghazi says:

    Loved it!!! I am working along exactly similar lines to develop a delta robot as part of my final year project. Great job with such a neat implementation of servo control.

  3. Mark says:

    I have tried many examples and this is by far the smoothest I have experienced. I am attempting to modify your code to output step/dir for steppers. Do you have any hints on how I can convert the servo signals to step/dir? I think having the PC do the grunt work with Processing is great.

    • Hi Mark, interesting question!

      So the big difference between using servos and steppers is that with servos I have the luxury of being able to specify a target position directly, ie. “Go to 45 degrees”. Using steppers, you’ll only be able to specify a particular distance to move ie. “Move down 45 degrees” (actually you’ll have to convert steps to degrees as well – microstepping will be important if you aren’t gearing your output, as common stepper motors are 1.8deg/step which is a pretty coarse resolution for our purposes.) This means that you’ll need to calculate your distance to target before each move, which implies that you’ll need to track the absolute position of each motor throughout. That will require some sort of calibration process every time you turn the system on so that you have a known position for each motor to start from. You could use upper limit switches to calibrate the stepper positions, ie. drive each motor ‘up’ until the limit switch is tripped, then set your current_position variable to that known angle. Then each position request in code is just ‘Move (target_position – current_position) steps’ and then update your current_position variable.

      I would probably do all of the stepper stuff on the Arduino and leave the Processing code as it is. This abstracts the robot-specific stuff away from the higher level delta controller.

      Hope this helps, feel free to keep this conversation going if you think I can help further!

      • Mark says:

        Thanks Matt!!
        That’s some great info. I’m going to try a bit of code from the LinuxCNC homing process to attempt that, once I fully understand calibrating the code to fit my specific geometry. I think that since the stepper won’t travel a full 360° I will try potentiometers and analog input for position. I have another question…

        // robot geometry
        static float e = 14; // end effector triangle
        static float f = 29; // base triangle
        static float re = 50; //length of long arm
        static float rf = 14; //length of short arm

        What units are these in? cm or mm?

      • Hey Mark, wow – LinuxCNC homing code and a pot for each motor? This sounds like it’s getting complicated.

        I always try and keep things as simple as possible (less for me to screw up!), so I personally wouldn’t use the pots for position feedback. This is because you already have an ongoing position feedback mechanism in the stepper motors themselves, by virtue of the fact that each ‘step’ rotates the shaft a known number of degrees. Think about CNCs and 3-D printers (ie. the reprap) – they cut/print entire parts without using any analog position feedback at all. Unless your inertial load is REALLY big, you can count on the stepper motor being at the angle that you tell it to be at.

        As far as homing code, I think it’s simpler than you’re expecting. Here’s pseudo-code for each motor:

        // we just turned it on, and we don't know the arm position
        current_arm_position = ??
        while (upper_limit_switch is not_pressed) {
            step the motor up one step
        }
        current_arm_position = 0_degrees  // arbitrary, depends on your robot geometry

        Now we have a known position for our arm, and as long as we keep updating this variable every time we move the motor, we’ll always know our position. For example:

        // 1 step is 1.8 degrees, so a quarter-step is 0.45 degrees,
        // this should give us more than twice the precision of a servo!
        QSTEPS_PER_DEGREE = (1 / 0.45);
        degrees_to_travel = target_position - current_position;
        if ( degrees_to_travel < 0 ) { // distance is negative, we're moving up
            // we're specifying quarter-steps here; negate so the count is positive
            step_motor(UP, (-degrees_to_travel * QSTEPS_PER_DEGREE))
        } else {
            // we're specifying quarter-steps here
            step_motor(DOWN, (degrees_to_travel * QSTEPS_PER_DEGREE))
        }
        current_position = target_position  // update our current position variable

        And that’s basically the whole program for one stepper motor.
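
        If you want to see that pseudo-code fleshed out for a step/dir driver, here’s a minimal Arduino sketch for one arm. The pin numbers, the quarter-stepping assumption, and which DIR level means ‘up’ are placeholders you’d adjust for your own hardware:

        // one arm on a step/dir driver; pins and directions are placeholders
        const int STEP_PIN  = 2;
        const int DIR_PIN   = 3;
        const int LIMIT_PIN = 4;                     // upper limit switch, closes to ground

        const float QSTEPS_PER_DEGREE = 1.0 / 0.45;  // 1.8 deg/step, quarter-stepping
        float current_position = 0;                  // unknown until we home

        void pulseStep() {
          digitalWrite(STEP_PIN, HIGH);
          delayMicroseconds(500);
          digitalWrite(STEP_PIN, LOW);
          delayMicroseconds(500);
        }

        void home() {
          digitalWrite(DIR_PIN, HIGH);               // assume HIGH means 'up'
          while (digitalRead(LIMIT_PIN) == HIGH) {   // switch not pressed yet (INPUT_PULLUP)
            pulseStep();
          }
          current_position = 0;                      // arbitrary; depends on your geometry
        }

        void moveTo(float target_position) {
          float degrees_to_travel = target_position - current_position;
          digitalWrite(DIR_PIN, degrees_to_travel < 0 ? HIGH : LOW);   // negative means 'up'
          long qsteps = (long)(fabs(degrees_to_travel) * QSTEPS_PER_DEGREE + 0.5);
          for (long i = 0; i < qsteps; i++) pulseStep();
          current_position = target_position;        // keep tracking our absolute position
        }

        void setup() {
          pinMode(STEP_PIN, OUTPUT);
          pinMode(DIR_PIN, OUTPUT);
          pinMode(LIMIT_PIN, INPUT_PULLUP);
          home();
        }

        void loop() { }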

        Regarding the robot geometry values, mine are in cm, but it’s arbitrary – as long as all the values use the same unit, you can use any unit that you want. The code just needs to be able to calculate the ratios of the different values. One last caution – the e and f values are the side lengths of the upper and lower triangles, not the distances between the pivot points; the pivot points sit at the mid-span of each triangle leg. See this image for clarification:
        Delta Robot Diagram
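
        For example, with my base triangle (f = 29 cm), each shoulder pivot sits at f/2 × tan(30°) = 14.5 × 0.577 ≈ 8.4 cm from the centre of the base, and that distance, not the 29 cm side length, is what the kinematics effectively works with.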

        Hope this helps!

  4. Alan Nelson says:

    Hi Matt
    In your design of delta robots, have you found an optimum relationship among the two triangle dimensions and the two leg lengths?

    • Hey Alan, interesting question. I’m by no means an expert on the physics of the delta, and I haven’t gone beyond the proof-of-concept stage of delta design, so I don’t really have a good answer. I would say that your robot geometry should be application-specific, ie. that the design should be determined based on the characteristics of the load that you are moving/manipulating and the nature of the work that you have to perform. For example, the ratio of upper to lower arm length will impact:

      • the speed at which you can move the end effector (a longer lower arm means more movement of the wrist for a given upper arm movement)
      • the load capacity (a longer lower arm moving a greater distance for a given upper arm movement decreases available torque)
      • the precision with which you can place the end effector (a higher ratio means coarser control of end effector position, subject to the slop of your joints and the positional accuracy of your stepper motors/servos)

        So, a higher arm length ratio would allow you to move a lighter load, faster, with less precision.

        That’s an example of just one geometry decision (arm length ratio). Clearly there are a number of geometry options to consider (triangle sizes, arm lengths, arm ratios, shoulder movement range, etc.) and each would impact the robot’s performance in some if not many aspects. If I were to design a delta, I would design it with the specific task in mind and optimize for the expected load and its expected movements.

        Hope this is helpful!

      • Dex says:

        Good work!
        I’d like to build a delta robot too.
        Stepper motors seem like a good compromise…
        I’m not sure an Arduino can do the job: the Stepper library has blocking functions.
        I don’t know Processing. Can it drive the LPT (parallel) port with step/dir commands?
        I have pics of my design if you’re interested :)

      • Gregor says:

        I would ask about accuracy degradation (if any) across the working area (close to the edges), for example with arms that are too short.
        In general, is there a way to estimate the working area with ‘guaranteed’ accuracy?

        gj, I wish mine would work like this

  5. Pingback: Delta Progress : The Base and Arms | Robotic Arts

  6. Pingback: Bits and Bots: Musings on LabVIEW, LEGO and Learning » Blog Archive

  7. Pingback: Enable Builds a Delta Robot, Part Two: Three Men And A Robot

  8. Daniel says:

    Hi Matt,

    First of all, I want to thank you for this wonderful and very useful tutorial.

    I have several questions concerning the “DeltaBot_GUI”.

    There are some constants inside (s1, s2, s3 offsets; max/min x, y, z, t; zp) which I suspect would be different for different dimensions (re, rf, e and f). Am I right?
    If so, how do I get the value of each constant?

    I am looking forward to your answer.
    Thanks in advance.

  9. Matt,
    Thanks for the tutorial and the code – really nice work!
    I think I’ve got a similar question to Daniel – I’ve got a 3-arm delta bot I made using plans from
    Marginally Clever (http://www.thingiverse.com/thing:44235). I’ve ported your code to it, and it works OK with your parameters (although the XY movement is nowhere near planar). When I change the re, rf, e and f dimensions I get no response at all from the Processing controls.
    Here are my geometry parameters for Processing:

    // robot geometry
    // modified for marginally clever deltabot
    static float e = 5.5169; // end effector triangle
    static float f = 19.929; // base triangle

    static float re = 18.5; //length of long arm
    static float rf = 5; //length of short arm

    Any suggestions would be really helpful – thanks again

    • It sounds like your modified robot geometry might be making the calculated x,y,z-positions fall outside of the envelope boundaries (lines 36 through 43). The envelope boundaries are arbitrary – I determined them by trial and error. The range of values that you’ll see for x,y,z positions will be specific to your robot’s particular geometry, as the geometry values are used in the transformation calculations. You may want to set the envelope boundary values arbitrarily high, and then enable debug output to get a feel for the range of x,y,z-values that you’ll see in normal usage.
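
      Conceptually, the boundary check is nothing fancy; it just rejects any calculated target that falls outside the min/max constants. Here’s a rough sketch (not the actual GUI code; the names match the constants Daniel lists further down, and the values are placeholders set absurdly wide for exploring):

      // reject any calculated target outside the allowed envelope
      float min_x = -1000, max_x = 1000;
      float min_y = -1000, max_y = 1000;
      float min_z = -1000, max_z = 0;

      bool inEnvelope(float x, float y, float z) {
        return x >= min_x && x <= max_x
            && y >= min_y && y <= max_y
            && z >= min_z && z <= max_z;
      }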

      • Thanks – yeah, I’ve been playing with watching the values that are generated and seeing when they create an invalid position. I’ll try that too.
        I kind of prefer a project not “working out of the box” because then it forces me to learn what’s happening inside the code – so this is great!

  10. Daniel says:

    Thanks for the reply Matt!

    I did trial and error just like you said.

    And here are some changes I made to your program to fit my robot’s dimensions:

    e = 5.32;
    f = 11;
    re = 22;
    rf = 5.5;

    min_x = -10;
    max_x = 10;
    min_y = -10;
    max_y = 10;
    min_z = -27.5;
    max_z = -19;
    max_t = 45; // I found that the servos go to a negative angle when my robot extends its legs
    min_t = -80; // so I just turned the original values around (max = 80, min = -45)

    zp = -21;

    // I don’t really understand the line below, but I guess the values are min, max, starting value, …
    // z-axis slider
    z = controlP5.addSlider("z-axis", -27.5, -19, -21, 230, 200, 15, 200);

    //servo sliders
    servo1 = controlP5.addSlider("servo1", -80, 45, 0, 20, 460, 200, 15);
    // the same goes for the other two servos.

    //set the first angle
    float theta1 = delta_calcAngleYZ(xp, yp, -zp);
    // I added a minus (-) in front of zp because I want my robot to extend its legs when I pull the z slider down; the original code (without the minus) does the opposite.

    Anyway, thanks a lot Matt for sharing the program!

  11. Carlos says:

    I have a doubt about the offset part of the code. What does it mean?

    • Hi Carlos, the offset is a small correction value used to align all three arms to the same position. The servos have a splined shaft and it’s not always possible to attach the arms to the servo at an exact position, so the offset value can be used to correct the small differences in the attachment points. My alignment process is as follows: I set all the servos to a known position with the arms detached (I use upper arms level as my reference position), then I attach the arms as close to level as possible. Then I adjust each arm’s offset until the arm is exactly level.
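
      In code terms, the offset is just a per-servo trim added to the angle the kinematics asks for, right before the command is sent to that servo. Roughly (this is only the idea, not the exact GUI code; the names and values are illustrative):

      // per-servo trim values, found by levelling each arm against a reference
      int s1_offset = 2;    // arm 1 needed +2 degrees to sit level
      int s2_offset = -1;   // arm 2 needed -1 degree
      int s3_offset = 0;    // arm 3 happened to line up exactly

      float theta1 = 45.0;  // angle the inverse kinematics computed for arm 1
      // the trim is added to the commanded angle for that servo only
      int servo1_command = (int)(theta1 + 0.5) + s1_offset;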

  12. Uri says:

    Hi Matt, you did great work!

    I wonder if I can use this type of robot to build a 3D printer.
    What would you consider the limitations and advantages to be?

    Thank you!:)

  13. Sabih Toor says:

    Hello! It’s great work! I am also making a pick-and-place delta robot that can sort objects using image processing techniques. I would be grateful if you could tell me about the exact dimensions of the arms and bases. I mean, if I make them with arbitrary dimensions, how can I locate the exact positions? I need some accurate calculations, please help! Thanks!

  14. Johnny White says:

    Hi Matt,

    I’m currently working on building a delta robot like yours. Unfortunately it’s impossible to see the sketch from Principia Labs because the domain has expired. Would you upload your modified sketch so we could take a look at it?

    Thank you!
