Project 5: Robotic Arm (Part 2)

Documentation (description, images, video, code, etc.) due to the website by Wed (2/21) at 9pm.

UPDATE: students negotiated 24-hour extension. Documentation due to website by Friday (2/23) at 9pm.

Click here for the full PDF of the project description.

Note that Project 5 is a build-on of Project 4: Robotic Arm (Part 1).

View the following Help Tutorial for calculations/tips and tricks: Inverse Kinematics.



Robotic Arm 2

Project 5: Annalisa and Raven

Description: We added a third motor to the end of our robotic arm from Project 4. This motor rotates 90 degrees once the end effector reaches its first desired position, and the arm then moves to the specified coordinates while the pen is drawing. We used the trigonometry from Ipek's notes to calculate what we call the "new motor positions" in our code.

Image Image


Challenge 1:

#!/usr/bin/env python3
import math
from time import sleep

def main():
    length1 = 12
    length2 = 13
    x_array = [1, 17]
    y_array = [20, 7]
    new_mot_pos1_array = [0, 0, 0, 0, 0, 0]
    new_mot_pos2_array = [0, 0, 0, 0, 0, 0]
    i = 0
    speed1 = 10 * 10.5
    speed2 = 20 * 10.5
    j = 0

    # set up the motors (Device comes from the course's EV3 wrapper)
    ev3 = Device('this')
    m = ev3.LargeMotor('outA')
    n = ev3.LargeMotor('outB')
    p = ev3.LargeMotor('outC')

    for count in range(len(x_array)):
        x_b = x_array[i]
        y_b = y_array[i]
        print('End Effector X: ' + str(x_b))
        print('End Effector Y: ' + str(y_b))
        sleep(2)

        r_squared = x_b**2 + y_b**2
        print('r^2: ' + str(r_squared))

        # calculate motor 1 position
        cos_alpha = (length2**2 - (length1**2 + r_squared)) / ((-2) * length1 * math.sqrt(r_squared))
        print('cos_alpha: ' + str(cos_alpha))
        alpha = math.degrees(math.atan(math.sqrt(1 - cos_alpha**2) / cos_alpha))
        print('alpha: ' + str(alpha))
        beta = math.degrees(math.atan(y_b / x_b))
        print('beta: ' + str(beta))
        new_motor_pos1 = alpha + beta
        new_mot_pos1_array[i] = new_motor_pos1
        print('New Motor 1 Position: ' + str(new_motor_pos1))

        # calculate motor 2 position
        cos_2 = (r_squared - (length1**2 + length2**2)) / (-2 * length1 * length2)
        print('cos_2: ' + str(cos_2))
        sqrt_var = math.sqrt(1 - cos_2**2) / cos_2
        print('sqrt_var: ' + str(sqrt_var))
        new_motor_pos2 = math.degrees(math.atan(sqrt_var))
        new_mot_pos2_array[i] = new_motor_pos2
        print('New Motor 2 Position: ' + str(new_motor_pos2))

        # recompute the end effector position from the new angles as a check
        x_a = length1 * math.cos(math.radians(new_motor_pos1))
        y_a = length1 * math.sin(math.radians(new_motor_pos1))
        x_b = x_a + length2 * math.cos(math.radians(new_motor_pos1 + new_motor_pos2))
        y_b = y_a + length2 * math.sin(math.radians(new_motor_pos1 + new_motor_pos2))
        print('New End Effector X: ' + str(x_b))
        print('New End Effector Y: ' + str(y_b))

        i = i + 1
        print('i = ' + str(i))

if __name__ == '__main__':
    main()

Challenge 2: We added this to the bottom of our Challenge 1 code:

for count in range(len(x_array)):
    # run to position 1
    print('Going to Position 1: ' + str(new_mot_pos1_array[j]))
    m.run_to_abs_pos(position_sp=new_mot_pos1_array[j], speed_sp=speed1, stop_action="hold")
    # run to position 2
    print('Going to Position 2: ' + str(new_mot_pos2_array[j]))
    n.run_to_abs_pos(position_sp=new_mot_pos2_array[j], speed_sp=speed1, stop_action="hold")
    # wait until done moving (keep checking)
    while n.is_running:
        sleep(0.1)
    j = j + 1
    sleep(1)
    # after moving to the first position, put the pen down 90 degrees and keep it there
    p.run_to_abs_pos(position_sp=-90, speed_sp=speed2, stop_action="hold")
    sleep(0.5)

# outside of the loop, pull the pen back up
print('pen back up')
p.run_to_abs_pos(position_sp=0, speed_sp=speed1, stop_action="hold")

# "relax" the motors at the end of the program
m.stop(stop_action="coast")
n.stop(stop_action="coast")
p.stop(stop_action="coast")

if __name__ == '__main__':
    main()

Feb 27 2018

Project 5: Inverse kinematics
Brian Reaney, Julia Noble

We reprogrammed the robotic arm we built in project 4 to draw a picture given inputted coordinates.

Arm:

Image

Base:

Image

Challenge 1: Print out the inputted values, calculated end effector position, and newly calculated motor positions.

We inputted coordinates by calculating the points along arcs of certain radii. These calculated arc coordinates were all appended into arrays of values (shown below). We also inputted another point at the end of the position arrays to demonstrate the robot's ability to pick up the pen and draw separate dots and lines. The resulting motor angles used to move to those points were output as shown below.
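As a sketch of that arc-generation step (the center, radius, and point count here are illustrative values, not our actual inputs), points along an arc can be appended to coordinate arrays like this:

```python
import math

def arc_points(x_center, y_center, radius, start_deg, end_deg, n):
    """Return n points evenly spaced along an arc, as parallel x and y lists."""
    xs, ys = [], []
    for i in range(n):
        deg = start_deg + (end_deg - start_deg) * i / (n - 1)
        xs.append(x_center + radius * math.cos(math.radians(deg)))
        ys.append(y_center + radius * math.sin(math.radians(deg)))
    return xs, ys

# quarter arc of radius 10 about the origin, plus one extra point for the dot
x_array, y_array = arc_points(0, 0, 10, 0, 90, 5)
x_array.append(4)
y_array.append(4)
```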

Image Image

Challenge 2: Use a sequence of end effector positions (x,y locations), calculate the appropriate motor positions and “play back” that sequence (draw). The robot should sketch more than one line (thus, having to pick the pen up and down).

Our robot draws arcs by moving to 20 different points. The robot picks up the pen halfway to go back and draw a dot in quadrant I on the robot's x,y axis. This was done to demonstrate how the robot can also draw dots and discontinuous drawings.

Feb 24 2018

Project 5: Robotic Arm Part II by Martin and Omar

Challenge 1: Calculating Motor Positions (based on End Effector Positions)

In the first challenge, we were assigned to calculate the appropriate motor positions based on the end effector positions calculated by last week's code. We had to perform inverse kinematics, a routine method in robotics for solving for the joint parameters that produce a desired end effector position. Our code is accurate to within ±2 degrees of the actual reading. Our calculated values are defined as the relative displacement that the motor rotates with respect to its zero. The calculations are most reliable in the first quadrant.

Below are pictures of the algebra needed to solve for each angle based on motor position (sorry Ipek, your work was greatly appreciated but we wanted to do them again for our own learning):

Image Image

Below are the pictures displaying the angles read by the motor, the coordinates of the end effector, and the calculated angles of the motor based on those coordinates.

Image Image Image

Here is the code:

#!/usr/bin/python3
import math
import time
from time import sleep

def main():
    print('CalculatingPositions.py')

    # Constant definitions
    RUN_TIME = 30
    INITBAR = 17    # centimeters
    SECONDBAR = 9   # centimeters

    # Connect to the EV3 and motors (Device comes from the course's EV3 wrapper)
    ev3 = Device('this')
    endMotor = ev3.MediumMotor('outD')
    middleMotor = ev3.LargeMotor('outC')
    firstMotor = ev3.LargeMotor('outB')

    # Record starting positions
    initMidAngle = middleMotor.position
    initFirstAngle = firstMotor.position

    # Run for a certain period of time
    stop_time = time.time() + RUN_TIME
    while time.time() < stop_time:
        # Determine motor positions
        theta = firstMotor.position
        phi = theta - middleMotor.position

        # Print values in degrees
        print("Read Angle")
        print(theta)
        print(phi - theta)
        print("\n")

        # Calculate coordinate positions
        xPos = (INITBAR * math.cos(math.radians(theta))) + (SECONDBAR * math.cos(math.radians(phi)))
        yPos = (INITBAR * math.sin(math.radians(theta))) + (SECONDBAR * math.sin(math.radians(phi)))
        xPos = round(xPos, 3)
        yPos = round(yPos, 3)

        # Print out values in centimeters
        print("Position Values (x, y)")
        print(xPos)
        print(yPos)
        print("\n")
        sleep(0.5)

        # Calculate motor angles based on coordinate positions
        beta = math.atan2(xPos, yPos)
        R = math.sqrt(xPos**2 + yPos**2)
        theta2 = math.acos((R**2 - INITBAR**2 - SECONDBAR**2) / (2 * INITBAR * SECONDBAR))
        alpha = math.acos((INITBAR + SECONDBAR * math.cos(theta2)) / R)
        theta1 = beta + alpha

        # Convert radians to degrees
        theta1 = math.degrees(theta1)
        theta1 = (theta1 - 90) * -1
        theta2 = math.degrees(theta2)
        theta1 = round(theta1, 3)
        theta2 = round(theta2, 3)

        # Print values in degrees
        print("Calculated Angle")
        print(theta1)
        print(theta2)
        print("\n\n")

if __name__ == '__main__':
    main()

Challenge 2: Playing Back Sequence of End Effector Positions

For challenge 2, we had to take a sequence of end effector positions, which are simply (x, y) points, and play those positions back. We decided to sequence the points in a circle. We mathematically indexed points on the circle using equations for its x and y components and looped over those values. In one loop, the marker is lifted and then dropped when it reaches each designated position, creating a circle composed of dots. In the other loop, we skipped the lift step and simply drew out the circle.

Below is the finished circle, drawn with the pen put down before moving and raised only after finishing its path:

Image

Here is the dotted circle:

Here is the code:

#!/usr/bin/python3
import math
from time import sleep

def dropPen(motor):
    # Define constants for motor movement
    MOTOR_SP = 450
    T_SLEEP = 2.55
    # Lower pen
    motor.run_forever(speed_sp=MOTOR_SP)
    sleep(T_SLEEP)
    motor.stop()

def raisePen(motor):
    # Define constants for motor movement
    MOTOR_SP = -450
    T_SLEEP = 2.55
    # Raise pen
    motor.run_forever(speed_sp=MOTOR_SP)
    sleep(T_SLEEP)
    motor.stop()

def circleCalc(xCenter, yCenter, r):
    x = []
    y = []
    # Calculate x and y coordinates of 20 points around a circle
    for i in range(0, 20):
        deg = i * 360 / 20
        x.append(r * math.cos(math.radians(deg)) + xCenter)
        y.append(r * math.sin(math.radians(deg)) + yCenter)
    # Return coordinates
    return x, y

def main():
    print('Test')

    # Constant definitions
    RUN_TIME = 30
    INITBAR = 17    # centimeters
    SECONDBAR = 9   # centimeters
    MOTOR_SP = 100

    # EV3 definitions (Device comes from the course's EV3 wrapper)
    ev3 = Device('this')
    endMotor = ev3.MediumMotor('outD')
    middleMotor = ev3.LargeMotor('outC')
    firstMotor = ev3.LargeMotor('outB')

    x, y = circleCalc(18, 0, 3)
    dropPen(endMotor)

    for i in range(0, len(x)):
        # Unpack variables
        xPos = x[i]
        yPos = y[i]

        # Calculate motor angles based on coordinate positions
        beta = math.atan2(xPos, yPos)
        R = math.sqrt(xPos**2 + yPos**2)
        theta2 = math.acos((R**2 - INITBAR**2 - SECONDBAR**2) / (2 * INITBAR * SECONDBAR))
        alpha = math.acos((INITBAR + SECONDBAR * math.cos(theta2)) / R)
        theta1 = beta + alpha

        # Convert radians to degrees
        theta1 = math.degrees(theta1)
        theta1 = (theta1 - 90) * -1
        theta2 = math.degrees(theta2)
        theta1 = round(theta1, 3)
        theta2 = round(theta2, 3)

        # Go to each position
        middleMotor.run_to_abs_pos(position_sp=theta2, speed_sp=MOTOR_SP, stop_action="hold")
        while middleMotor.is_running:
            sleep(0.1)
        firstMotor.run_to_abs_pos(position_sp=theta1, speed_sp=MOTOR_SP, stop_action="hold")
        while firstMotor.is_running:
            sleep(0.1)
        # uncomment for the dotted circle:
        # dropPen(endMotor)
        # sleep(0.1)
        # raisePen(endMotor)
        # sleep(0.1)

    # Release motors so they don't burn out
    raisePen(endMotor)
    middleMotor.stop(stop_action="coast")
    firstMotor.stop(stop_action="coast")

if __name__ == '__main__':
    main()

Feb 24 2018

Project 5: Inverse Kinematics

Rui Shi & Ziyi Zhang

Description:

Project 5 asks us to use inverse kinematics to determine end effector positions and control the motors to draw through a sequence of points. To add "lift" and "drop" functionality to the Project 4 robotic arm, we mounted a gear with two linked rods that can hold a pen, so the pen moves up and down as the gear rotates. We also added some bricks to reinforce the whole arm so that it moves more stably.

Image Image Image Image

Challenge 1:

As we show in the video, our program outputs the end effector position from the sensors and then plays it back. By measuring the actual position and comparing it with the sensor output, we found that the output value is accurate in most cases, but sometimes has a small error, because the upper motor can drift back a little after we set its position.

Code:

For the first challenge, we kept the old code and added the calculation step to it. We then print both the angles from the calculation and the angles from the sensors. The two sets of results are essentially the same, with small deviations.

#!/usr/bin/env python3
import math
from time import sleep

def main():
    # Connect motors and gyro sensors (Device comes from the course's EV3 wrapper)
    ev3 = Device('this')
    m1 = ev3.LargeMotor('outC')
    m2 = ev3.LargeMotor('outA')
    sensor1 = ev3.GyroSensor('in1')
    sensor2 = ev3.GyroSensor('in3')
    sensor1.mode = 'GYRO-ANG'
    sensor2.mode = 'GYRO-ANG'

    angle1_origin = sensor1.value()
    angle2_origin = sensor2.value()
    l1 = 8
    l2 = 9
    loop = 1
    prev1 = sensor1.value()
    prev2 = sensor2.value() - angle2_origin

    print("Start")
    while loop == 1:
        angle1 = sensor1.value()
        angle2 = sensor2.value() - angle2_origin
        print("angle1 =%d, angle2 =%d" % (angle1, angle2))
        if (abs(angle1 - angle1_origin) > 10 and abs(angle2 - angle2_origin) > 10
                and abs(prev1 - angle1) < 10 and abs(prev2 - angle2) < 10):
            prev1 = angle1
            prev2 = angle2
            break
        sleep(1)
        prev1 = angle1
        prev2 = angle2
    print("End")

    # calculation
    theta1 = -prev1
    theta2 = prev2 - prev1
    x = l1 * math.cos(math.radians(theta1)) + l2 * math.cos(math.radians(-prev2))
    y = l1 * math.sin(math.radians(theta1)) + l2 * math.sin(math.radians(-prev2))
    theta1o = -prev1
    theta2o = -prev2 + prev1
    r2 = x**2 + y**2
    cosa = (l2**2 - (l1**2 + r2)) / ((-2) * l1 * math.sqrt(r2))
    a = math.atan(math.sqrt(1 - cosa**2) / cosa)
    b = math.atan(y / x)
    z = (r2 - l1**2 - l2**2) / (l1 * l2 * (-2))

    # get angle1 and angle2
    if abs(prev2) <= abs(prev1):
        theta1 = a + b
        theta2 = -abs(math.atan(math.sqrt(1 - z**2) / z))
    else:
        theta1 = b - a
        theta2 = abs(math.atan(math.sqrt(1 - z**2) / z))
    theta1_deg = math.degrees(theta1)
    theta2_deg = math.degrees(theta2)

    # print original angle and calculated angle
    print("prev1 =%d, prev2 =%d" % (prev1, prev2))
    print("x =%d, y =%d" % (x, y))
    print("theta1cal =%d, theta2cal =%d" % (theta1_deg, theta2_deg))
    print("theta1orin =%d, theta2orin =%d" % (theta1o, theta2o))

    # run back
    m1.run_to_rel_pos(position_sp=-theta1_deg, speed_sp=200, stop_action="hold")
    m2.run_to_rel_pos(position_sp=-theta2_deg, speed_sp=200, stop_action="hold")
    sleep(1)

    # stop
    m1.stop(stop_action="coast")
    m2.stop(stop_action="coast")

if __name__ == '__main__':
    main()

Challenge 2:

For the second challenge, we built a for loop that records three different positions by pushing them onto a stack. In each iteration, we sense the movement of both parts of the arm, calculate the angles, and record them. After finishing all three measurements, we pop the positions off the stack and send them to the motors, again using a for loop. Each time the stack pops a position, we drive the motor to rotate back by that angle, returning it to its origin. To get the right angle, we re-zero the sensors each time we start recording a rotation.

To make the pen rise before drawing a line and drop back when the arm is ready to draw, we added another motor to the arm. Before we start recording positions, we drive this motor through a certain angle to lift the pen. After finishing recording, we drive the motor in the opposite direction to put the pen back down, ready to draw.

Code:

#!/usr/bin/env python3
import math
from time import sleep

def main():
    # Connect motors and gyro sensors (Device comes from the course's EV3 wrapper)
    ev3 = Device('this')
    m1 = ev3.LargeMotor('outC')
    m2 = ev3.LargeMotor('outA')
    m3 = ev3.MediumMotor('outB')
    sensor1 = ev3.GyroSensor('in1')
    sensor2 = ev3.GyroSensor('in3')
    sensor1.mode = 'GYRO-ANG'
    sensor2.mode = 'GYRO-ANG'
    m1.stop(stop_action="coast")
    m2.stop(stop_action="coast")
    m3.stop(stop_action="coast")

    l1 = 8
    l2 = 9
    loop = 1
    positions = []

    # lift the pen before recording
    m3.run_to_rel_pos(position_sp=50, speed_sp=100, stop_action="hold")
    print("Start")

    for i in range(3):
        print(i)
        angle1_origin = sensor1.value()
        angle2_origin = sensor2.value()
        prev1 = sensor1.value()
        prev2 = sensor2.value() - angle2_origin
        while loop == 1:
            angle1 = sensor1.value() - angle1_origin
            angle2 = sensor2.value() - angle2_origin
            print("angle1 =%d, angle2 =%d" % (angle1, angle2))
            if (abs(angle1 - angle1_origin) > 10 and abs(angle2 - angle2_origin) > 10
                    and abs(prev1 - angle1) < 10 and abs(prev2 - angle2) < 10):
                prev1 = angle1
                prev2 = angle2
                break
            sleep(1)
            prev1 = angle1
            prev2 = angle2
        print("End")

        # calculation
        theta1 = -prev1
        theta2 = prev2 - prev1
        x = l1 * math.cos(math.radians(theta1)) + l2 * math.cos(math.radians(-prev2))
        y = l1 * math.sin(math.radians(theta1)) + l2 * math.sin(math.radians(-prev2))
        theta1o = -prev1
        theta2o = -prev2 + prev1
        r2 = x**2 + y**2
        cosa = (l2**2 - (l1**2 + r2)) / ((-2) * l1 * math.sqrt(r2))
        a = math.atan(math.sqrt(1 - cosa**2) / cosa)
        b = math.atan(y / x)
        z = (r2 - l1**2 - l2**2) / (l1 * l2 * (-2))

        # get angle1 and angle2
        if abs(prev2) <= abs(prev1):
            theta1 = a + b
            theta2 = -abs(math.atan(math.sqrt(1 - z**2) / z))
        else:
            theta1 = b - a
            theta2 = abs(math.atan(math.sqrt(1 - z**2) / z))
        theta1_deg = math.degrees(theta1)
        theta2_deg = math.degrees(theta2)

        # print original angle and calculated angle
        print("prev1 =%d, prev2 =%d" % (prev1, prev2))
        print("x =%d, y =%d" % (x, y))
        print("theta1cal =%d, theta2cal =%d" % (theta1_deg, theta2_deg))
        print("theta1orin =%d, theta2orin =%d" % (theta1o, theta2o))

        positions.append((theta1_deg, theta2_deg))

    # put the pen back down, then play the positions back in reverse
    m3.run_to_rel_pos(position_sp=-50, speed_sp=100, stop_action="hold")
    sleep(1)
    for i in range(3):
        angle1, angle2 = positions.pop()
        m1.run_to_rel_pos(position_sp=-angle1, speed_sp=100, stop_action="hold")
        m2.run_to_rel_pos(position_sp=-angle2, speed_sp=100, stop_action="hold")
        sleep(1)

    m1.stop(stop_action="coast")
    m2.stop(stop_action="coast")
    m3.stop(stop_action="coast")

if __name__ == '__main__':
    main()

Feb 23 2018

Inverse Kinematics Project Report

Shunta Muto

Osvaldo Calzada

Challenge 1: Calculating Motor Positions

For this project, we used the same robot arm from the previous project. Inverse kinematics calculation was performed to determine the required rotations of the base motor and the link motor from given xy positions. Lengths of the links were measured with a ruler.

Below are the calculated values for the given xy positions (5, 10), (10, 10), and (-2.5, -13):

Image

At the end of the code, we included equations that recalculate the xy positions from the angles we just obtained, in order to confirm the calculation. There was no discrepancy between the input and recalculated xy position values, because the calculation is purely theoretical.
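A sketch of that confirmation step (the function names are ours; the formulas are the standard two-link inverse kinematics, using the 8 cm and 12 cm link lengths from Challenge 2 and the test point (5, 10) from above): compute the joint angles for a target point, then plug them back into the forward kinematics and verify the original point comes back.

```python
import math

L1, L2 = 8.0, 12.0  # link lengths in cm

def inverse_kinematics(x, y):
    """Joint angles (radians) that put the end effector at (x, y), elbow-down."""
    r2 = x**2 + y**2
    theta2 = math.acos((r2 - L1**2 - L2**2) / (2 * L1 * L2))
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

def forward_kinematics(theta1, theta2):
    """Recompute the end effector position from the joint angles."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

t1, t2 = inverse_kinematics(5, 10)
x, y = forward_kinematics(t1, t2)
# (x, y) matches the input (5, 10) up to floating-point error
```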

Challenge 2:

A major task for Challenge 2 was determining which positions the robot could successfully reach. Our link lengths were 8 cm for the base link and 12 cm for the attached link, so the arm's maximum reach was a circle of radius 20 cm. To make our task more manageable, we focused on only two quadrants by limiting the base angle to between -90 and 90 degrees. The construction of the arm also limited the motion of the outer joint, so we restricted that angle to a range where it would not hit any other pieces.

Video:

CODE:

Image Image Image Image

Feb 23 2018

Challenge 1

Here you'll see us moving the end effector to different x,y positions on our coordinate system and our code returning both the actual motor 1 positions and the calculated positions.

Code

Image

Challenge 2

Here we have the code storing end effector positions (x, y) as well as motor 3 positions in order to play back the positions.

Code

Image

Feb 23 2018

Project 5: Robotic Arm (Part 2)

By Isaac Collins, James Liao, and Andrew Sack

Overview:

The goal of this project was to program the robot arm we designed and built in our last project to use inverse kinematics, so the robot could reach a desired end effector location and travel through a sequence of pre-planned points. Our project was successful: a user gives the robot a drawing through a GUI, and the robot copies that drawing by moving a marker through the series of desired locations.

Image

Construction Updates:

This project primarily uses the same robot arm as Project 4, with some small modifications based on lessons learned from that project. The main changes were removing the button from the last project, since it was not needed for these challenges, and mounting the arm more firmly onto the EV3. This did not fully fix the issue (the EV3 still moved about while the arm was in operation), but it became easier to steady.

Image

Software:

Challenge 1: Calculate Motor Positions

This builds on the code from Robot Arm Part 1, in which the EV3 calculates the end effector position as a user moves the arm. This code takes those calculated end effector positions and uses inverse kinematics to recover the motor angles originally used to achieve each position. The one difficulty we ran into: when the user manipulates the motors, there are always two possible motor configurations that achieve the same end effector position. The computer has no way of telling which of the two is correct, so this program was not always accurate in determining the motor angles. In the second challenge this is not an issue, because the angles are always calculated by the same method and therefore achieve the desired end effector locations the same way.
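The ambiguity described above can be made concrete: a two-link arm generally has an "elbow-up" and an "elbow-down" solution for each reachable point. A minimal sketch using the standard two-link equations (the link lengths and test point are placeholders, not our robot's):

```python
import math

def ik_both(x, y, l1, l2):
    """Return both (theta1, theta2) IK solutions, in radians, for a 2-link arm."""
    cos_t2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    solutions = []
    for t2 in (math.acos(cos_t2), -math.acos(cos_t2)):  # elbow-down, elbow-up
        t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
        solutions.append((t1, t2))
    return solutions

def fk(t1, t2, l1, l2):
    """Forward kinematics: end effector (x, y) from the two joint angles."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))

sol_a, sol_b = ik_both(6, 8, 7, 7)
xa, ya = fk(sol_a[0], sol_a[1], 7, 7)
xb, yb = fk(sol_b[0], sol_b[1], 7, 7)
# both configurations land on the same point (6, 8)
```

From angle measurements alone there is no way to know which branch the user chose, which matches the behavior we observed.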

Challenge 2: Playing Back Sequence of End Effector Positions

This code takes a text file as input. The data in the file is read into 3 lists, which correspond to shoulder, elbow, and wrist position data. After first recording the initial relative motor angles, the robot uses the inverse kinematics equations to calculate the new angles for the shoulder and elbow motors, and a boolean pen array is used to determine whether the pen should be up or down. The robot iterates this process until it has moved through all of the positions it was given.
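A sketch of the file-reading step (the function name and sample values are ours; the tab-delimited x, y, pen-state format is the one our GUI writes out):

```python
def read_positions(lines):
    """Parse tab-delimited 'x<TAB>y<TAB>pen' lines into three parallel lists."""
    xs, ys, pens = [], [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        x, y, pen = line.split('\t')
        xs.append(float(x))
        ys.append(float(y))
        pens.append(bool(int(pen)))
    return xs, ys, pens

# inline sample data standing in for the contents of the text file
sample = ["5.0\t10.0\t1", "10.0\t10.0\t1", "-2.5\t13.0\t0"]
xs, ys, pens = read_positions(sample)
```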

Bonus Challenge: Dynamically Inputting Sequence of End Effector Positions

For this challenge, we used MATLAB to create a simple GUI program for plotting arm motion. The boundaries of the plot are the limits of the arm motion, with a blue semicircle to denote the right boundary, since it is not a straight line. The plotting is done by mouse clicks, with a left-click to select a new point, a middle-click to toggle the pen, and a right-click to end the program. When a point is clicked, the x-position, y-position, and pen state are saved as a tab-delimited line in a text file. When the program is finished, the text file is saved and can be copied over to the EV3 where the information is read and used to move the arm.

Image

Code Files:

calc_motor_position.py

play_back_pos.py

dynamicInput.m

square.txt (sample position file)

Feb 23 2018

Introduction:

This week we were tasked with using our already constructed robotic arm to demonstrate inverse kinematics. More specifically, we had the following three objectives. First, we had to modify our code from the week prior so that it could generate both the expected end effector positions and the corresponding motor positions for that point, allowing us to compare actual motor positions to calculated motor positions. Second, we needed to develop code that could pass an end effector position to the robot and have the robot move the arm to that position. Finally, our bonus challenge was to use an alternative input method to pass the position array to the robot.

The Robot:

This week’s assignment lends itself perfectly to last week’s robot (intentionally, we assume), thus we made no changes to our robot.

Part 1:

Part 1 was fairly simple, since the necessary equations were provided to us. With just a few tweaks to last week's code, we were able to have our robot record various motor positions (moved through manual manipulation), calculate the corresponding end effector positions, and then use the inverse kinematics equations to calculate the motor positions. We then compared those calculated motor positions to the recorded motor positions to assess our error.

We found in part 1 that the calculated motor positions were accurate to +/- 0.5, which is likely attributable to wiggle room in the motors.

Part 2:

Part 2 did not involve much additional code: we simply added a command to run the motors to the positions specified by the code. The robot was constrained to run in only the right two quadrants of the coordinate plane, with the positive x-axis defined as the arm's starting position. We found that the robot was only able to achieve certain positions within its range of motion. In some cases, the robot could not move to a specified position, but changing that position by 1 cm would fix the problem. In other cases, we tried to compensate for a seemingly unattainable position by hard-coding a change in the sign of the alpha/beta angles, but this change had no effect. We are still unsure what caused this problem.
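One likely culprit for "unattainable" positions is the workspace boundary: a point is reachable only when its distance from the base lies between |L1 - L2| and L1 + L2; outside that band the acos in the inverse kinematics receives an argument outside [-1, 1]. A quick check, with placeholder link lengths:

```python
import math

def reachable(x, y, l1, l2):
    """True if a 2-link arm with link lengths l1 and l2 can reach (x, y)."""
    r = math.hypot(x, y)
    return abs(l1 - l2) <= r <= l1 + l2

ok = reachable(10, 10, 8, 12)        # inside the annular workspace
too_far = reachable(25, 0, 8, 12)    # beyond full extension
too_close = reachable(1, 0, 8, 12)   # inside the inner dead zone
```

Pre-filtering target points this way would at least distinguish geometric impossibility from motor or code issues.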

Ultimately, we were able to make our robotic arm play back a series of input positions with 1-2cm error, which is easily attributed to wiggle room in the motors.

Part 3:

Part 3 was a bonus challenge that stipulated an alternate method of input must be used to generate an array of end effector positions with which to drive the robotic arm. To accomplish this, we created a simple interactive grid in Matlab, the limits of which matched the numerical limits on our physical axes. To add a point to the array, we simply clicked somewhere on the grid, and the new point was automatically appended to a set of arrays, which we transferred into the robot code through copy/paste.

Conclusion:

Overall, this project was relatively successful. When the arm worked, it worked beautifully, exhibiting minimal error. However, the few positions we attempted that the robot was incapable of achieving were quite frustrating, and we remain perplexed as to what caused this error.

Video

Feb 23 2018

Righty II: The Revengeance

Description

Righty has returned from the future, now armed with knowledge of inverse kinematics, to avenge its past self, and to put a stop to the evil time travelling villain known simply as Danahy.

Structure:

Construction of Righty is exactly the same as in Project 4 (but edgier, because it's from the future). First, a motor was fastened to the top of the brick, nice and snug, so that only the actual motor moves/rotates and the motor just peeks out of the top of the brick. Two beams then connect this motor to the base of another motor, which is connected to the end effector. We made the end effector from a motor that changes the orientation of the pencil. The pencil is held tight, sandwiched between two gears, one of which is driven by the end effector motor. This gives it a greater range of motion.

Challenge 1:

Righty picked up inverse kinematics easily. It was straightforward to implement code based on the supplemental materials provided to us, which included the equations for theta 1 and theta 2. As seen in the video, the program now prints a much more specific description of the motor orientations than in Project 4. In addition, theta 1 and theta 2 (our calculated values) closely match init_pos 1 and init_pos 2 (the measured motor angles), confirming that our calculations were right. We also coded a case system that changes the values of alpha and beta, allowing the code to account for how two different sets of motor orientations can define the same end effector position, thus removing any possible discrepancy between the inputted and calculated motor positions. Righty was very successful at this challenge, as seen in the video.

Challenge 2:

We then defined coordinates such that Righty would be able to draw a square. These coordinates are stored in xloc and yloc. We used the same code that worked in Challenge 1 to calculate the motor positions.

We then ran these coordinates through a simple for loop for the first four points. After that, the pen lifts up, moves by two coordinate points, is put back down, and then starts drawing the same shape again (i.e., the shape is translated). Although the robot played back the same shape in the two loops, it is not the square we expected. To determine the end effector coordinates used as inputs, we physically moved the arm to each position (using code from Project 4). When we input these points, we got the same type of drawing as shown in the video below: it moves, but not in the orientation we expected. We believe the logic of our code is right (also partly because it worked yesterday....), but the movement of the motor may not be correct, as the errors from the first position keep building up.
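For illustration, the square's coordinate lists might be built like this (the side length, start corner, and two-unit shift are made-up values, not our actual xloc/yloc):

```python
def square_points(x0, y0, side):
    """Corners of a square as parallel x and y lists, closed back to the start."""
    xloc = [x0, x0 + side, x0 + side, x0, x0]
    yloc = [y0, y0, y0 + side, y0 + side, y0]
    return xloc, yloc

xloc, yloc = square_points(10, 5, 4)
# translated copy: lift the pen, shift two units in x, drop the pen, redraw
xloc2 = [x + 2 for x in xloc]
yloc2 = list(yloc)
```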

Pictures

Image

The two motors and the end effector connected to Righty II

Image

Reference x-y plane for how Righty II defines the position of the end effector

Image

Isometric view of the entirety of Righty II.

Videos

Challenge 1

Challenge 2

Code

Image Image

Challenge 1

Image Image Image Image

Feb 22 2018

Robotic Arm 2: Judgement Day

For part 2 of the Robotic Arm Lab, we began by building test code to make sure our end effector positions and conversions were correct. We did this by storing initial angle values [angle1, angle2] in an array, converting from polar to cartesian coordinates, and then converting from cartesian back to polar in a new array [angle1Out, angle2Out]. We then compared the input angles with the output angles to make sure they were the same. Next, we moved the end effector to a new location and worked backwards to get the angles each motor needed in order to move the arm where we wanted. Initially, there was a discrepancy between the input angle values and what the device returned the second time around. We discovered the cause: it takes a nonzero amount of time to move a motor to a new position, so certain movement commands were cut short. The larger the change in angle (e.g., moving the arm from fully extended to fully bent), the more drastically the movement was cut short. This was remedied by adding a sleep command at the end of our for loop, long enough to accommodate the time it takes to move from one angle to another. We knew our code was correct when the arm started and ended in the same location.
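The timing fix described above can be sketched as a helper that sizes the sleep to the commanded angle change (the speed units and safety margin here are assumptions, not our measured values):

```python
def wait_time(delta_deg, speed_deg_per_s, margin=1.2):
    """Seconds to sleep so a run-to-position command finishes before the next."""
    return abs(delta_deg) / speed_deg_per_s * margin

t = wait_time(90, 100)  # a 90-degree move at 100 deg/s gets about 1.1 s
```

Scaling the wait to the move means short moves no longer pay the worst-case delay, while long moves are never cut short.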

Image

Testing our part one code in quadrant I

For the second part of the assignment, we made a few modifications to our previous code to allow it to respond to (x, y, pen) inputs rather than (theta1, theta2). Unfortunately, we ran into issues getting correct output angles despite using the same equations (and code) as before. We spent a LONG TIME trying to troubleshoot this and were unable to resolve it. We went back and forth between challenges 1 and 2, and for some unknown reason a bug in the challenge 2 code prevented correct conversion. Despite these errors, we still finished the rest of the code, so that if the bugs were solved the robot would be able to play back the positions correctly, including the pen up and down commands. Here is a video of our robot stabbing itself in the SD card slot with a dry erase marker. Enjoy!

Image

Feb 19 2018

Project 5

The main focus of this week’s project was inverse kinematics. We were assigned to use the previous week’s robotic arm and certain elements from the previous week’s code to work on inverse kinematics; i.e., using given positions of the end effector to calculate where the robotic motors should move to.

Challenge 1

The first challenge was to use the previous week’s code and essentially reverse it. While the previous week had the robot calculate its end effector position from given motor positions, this week’s code took that resulting position, calculated appropriate angles from given formulae, and moved the motors to those positions; in essence, this challenge was an exercise in performing mathematical operations in Python and confirming that they matched the expected inputs.

Challenge 2

The second challenge is similar to the first in its actions, the only difference being the number of executions. Furthermore, a new variable is introduced: whether or not the “wrist” is down and the writing utensil is touching the surface. The input values are stored in three separate arrays: one for X positions, one for Y positions, and one for wrist positions (i.e., up vs down). This allows the robot to draw disconnected lines, allowing more flexibility in execution.
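The playback structure described above can be sketched like this (array contents and callback names are illustrative, not our actual code): iterate over the three parallel arrays, setting the wrist before each move.

```python
def playback(xs, ys, pen_down, move_fn, pen_fn):
    """Step through parallel position/pen arrays, calling the given callbacks."""
    actions = []
    for x, y, pen in zip(xs, ys, pen_down):
        pen_fn(pen)      # raise or lower the wrist first
        move_fn(x, y)    # then move the shoulder/elbow to the target
        actions.append((x, y, pen))
    return actions

# record the calls instead of driving real motors
log = playback([5, 10], [10, 10], [True, False],
               move_fn=lambda x, y: None,
               pen_fn=lambda down: None)
```

Keeping the pen flag in its own array is what allows disconnected lines: any entry with the flag off becomes a travel move rather than a drawn stroke.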

We had some issues completing Challenge 2. We used print statements to know when to position the robotic arm, to see what the program had calculated for the positions, and to see what the error was. The first issue we ran into was that the print statements would not execute, so we weren't sure when the robot was doing each task. Even without the print statements, the for loops would often stop before completing entirely, and without the print statements we couldn't figure out why. Challenge 1 executes well, but the video for Challenge 2 demonstrates these issues.

Code

Challenge 1

Image Image Image Image

Challenge 2

Image Image Image Image Image

Photos and videos

Image Image Image

Feb 19 2018