Project 4: Robotic Arm (Part 1)

Documentation (description, images, video, code, etc) due to website by Wed (2/14) at 9pm.

Click here for the project description.

Be sure also to see relevant notes under the February 8th, 2018 lecture.

And here is another writeup of the calculations (from last year, although we were using LabVIEW instead of Python); image credit: Ipek.

Image


Robotic Arm 1

Julia Noble and Brian Reaney

Robotic Arm Part 1

Image

Sturdy Arm Base:

Image

Challenge 1: Demonstrate range of motion and end effector: The robotic arm has three motors that form its three joints. The two large motors move the arm to a position, and the medium motor raises and lowers the marker.

End Effector:

Image

Challenge 2 and 3: Manually move the arm and record its positions through Python code, then have the robot play the positions back.

The robotic arm is connected to a button so that the user can input when to record the position. The button is also used to calibrate the position of the robotic arm before manually moving the robot to the desired positions. The robot draws dots where the positions were recorded.

Code:

Image Image
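The screenshots above capture the full program; as a rough, hardware-free sketch of the record-on-button-press idea (the stub motor class and the sample angles below are stand-ins, not the actual code on the brick):

```python
# Hardware-free sketch of the record-on-button-press idea described above;
# StubMotor and the sample angles are stand-ins for the EV3 hardware.
class StubMotor:
    def __init__(self):
        self.position = 0  # degrees, as reported by the motor encoder

def record_pose(motor1, motor2, poses):
    # called once per button press: store the current joint angles
    poses.append((motor1.position, motor2.position))

def playback(poses):
    # the real robot would drive each motor back to these angles
    # and lower the marker to draw a dot at each one
    return list(poses)

m1, m2 = StubMotor(), StubMotor()
poses = []
for angle1, angle2 in [(30, -15), (45, 10)]:  # pretend the user moved the arm
    m1.position, m2.position = angle1, angle2
    record_pose(m1, m2, poses)                # pretend the button was pressed
print(playback(poses))                        # [(30, -15), (45, 10)]
```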

Feb 24 2018

Robotic Arm

Image

Challenge 1/2

Here you see the arm connected to the EV3 brick base. The two large motors make up the link with the end effector connected to a medium servo motor. Some pieces are underneath the arm to provide stability. The code moves the arm to different positions by changing angle1 and angle2 for the two motors.

Code

Image

Challenge 3

Here we move the arm to different positions on our coordinate system and compare the expected and true values.

Code

#!/usr/bin/python3
from time import sleep
import math

ev3 = Device("this")

def main():
    motor1 = ev3.LargeMotor('outB')
    motor2 = ev3.LargeMotor('outC')
    motor3 = ev3.MediumMotor('outD')

    motor3.run_to_abs_pos(position_sp=-8, speed_sp=50, stop_action="hold"); sleep(0.1)
    motor1.run_to_abs_pos(position_sp=0, speed_sp=75, stop_action="brake"); sleep(0.1)
    motor2.run_to_abs_pos(position_sp=0, speed_sp=75, stop_action="brake"); sleep(0.1)

    while True:
        # this prints the (x, y) coordinates of the end effector
        angle1 = motor1.position
        angle2 = motor2.position
        x = 13*math.cos(math.radians(angle1)) + 6.5*math.cos(math.radians(angle1 + angle2))
        y = 13*math.sin(math.radians(angle1)) + 6.5*math.sin(math.radians(angle1 + angle2))
        print("X position is " + str(x))
        print("Y position is " + str(y))
        sleep(5)

if __name__ == '__main__':
    main()

Feb 14 2018

Project 4: Robotic Arm Part 1

Image Image Image

Challenge 1: Building the Robotic Arm

In the first part, we were assigned to assemble the robotic arm, composed of two arm links and two joints: one connecting the first link to the base and the other connecting the two links to each other. The final component is an end effector that holds a marker.

Range of Motion:

Max X-position = 0.26 meters

Max Y-position = 0.26 meters

Min Y-position = -0.26 meters

Min X-position = -0.26 meters

Effective End Effector Trial:

Challenge 2: Calculating End Effector Positions

The second part of the project involved calculating the position of the marker at any point along the range of motion. This was done by taking each motor's angular displacement, multiplying by the corresponding arm length, and adding those values up, using the cosine function for the x-position and the sine function for the y-position. Originally I had trouble because I had forgotten that Python's trig functions expect radians.

Image
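The radians pitfall mentioned above is easy to demonstrate. Here is a minimal forward-kinematics sketch; the link lengths (13 and 6.5 units) are illustrative assumptions, not our measured values:

```python
import math

# Forward kinematics for a two-link arm; math.cos and math.sin expect
# radians (the conversion originally forgotten), so motor degrees are
# converted first. Link lengths here are illustrative assumptions.
def end_effector(angle1_deg, angle2_deg, l1=13.0, l2=6.5):
    a1 = math.radians(angle1_deg)
    a12 = math.radians(angle1_deg + angle2_deg)
    x = l1 * math.cos(a1) + l2 * math.cos(a12)
    y = l1 * math.sin(a1) + l2 * math.sin(a12)
    return x, y

x, y = end_effector(0, 0)
print(x, y)   # straightened arm: 19.5 0.0
```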

Challenge 3: Recording and Playing Back Positions

To implement this challenge, we had the brick communicate with the console, so we would know when to move the arm, wait for a period of time, and move the arm again. Once we got all the positions we wanted, it would play back the positions in reverse so that it would end up in the starting position.

Image

Feb 14 2018

Project 4: Robotic Arm

Rui Shi & Ziyi Zhang

Description

The goal of the project is to make a robotic arm that can hold a pen (or something else) and write on a sheet of paper. To construct such a system, we used two large motors, two gyros, and a medium motor with a gear mounted on it to adjust the height of the "pen". To ensure each rotating part has the largest degree of freedom, so it can cover as much of the sheet as possible, we connected the three motors at different horizontal levels so they can each rotate almost 360°. Also, to avoid collisions, one gyro was installed on the side of its motor rather than on top of it. Finally, we mounted a rod in the hole of the medium motor to make the arm capable of covering a larger area.

Image Image Image Image

Challenge 1: Building the Robotic Arm

As shown in the pictures and video below, our arm is able to reach a much larger area than an 8.5"×11" sheet.

Image

Challenge 2: Calculate End Effector Positions

For challenge two, the code needs to use the gyro sensors to report the rotation angles of both parts of the arm, and we need to make sure the reported angles are the final angles of the two parts. To achieve this, I wrote an if statement in the while loop so that the code reports values only after both parts of the arm have rotated and then stopped rotating. When the change in rotation angle is smaller than 10 degrees, the code reports the two rotation angles for further calculation.
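The settling check can be expressed as a small predicate (the 10-degree thresholds come from the description above; the function names are ours, not from the actual code):

```python
# Settling check: report only when a joint has moved away from its origin
# (> 10 degrees) and its reading has stopped changing (< 10 degrees since
# the last sample). Thresholds are the ones given in the text.
def settled(angle, origin, prev, moved_thresh=10, still_thresh=10):
    return abs(angle - origin) > moved_thresh and abs(prev - angle) < still_thresh

def both_settled(a1, o1, p1, a2, o2, p2):
    return settled(a1, o1, p1) and settled(a2, o2, p2)

print(both_settled(45, 0, 44, -30, 0, -29))   # both moved and now still: True
print(both_settled(45, 0, 20, -30, 0, -29))   # joint 1 still moving: False
```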

In the video, we used a green pad with grids as a reference; the distance between two bold lines is one inch. The following picture shows the coordinate system we used to measure positions of the pen.

Image

Challenge 3: Recording and Playing Back Positions

For challenge three, after getting the two rotation angles, we need to run the motors to return the arm to its original pose. To achieve this, we first calculate the angle each motor needs to rotate, then command the two motors to rotate by that much at a set speed.

#!/usr/bin/env python3
import math
from time import sleep

def main():
    ev3 = Device('this')
    # Connect the #1 motor to port A and the #2 motor to port B
    m1 = ev3.LargeMotor('outA')
    m2 = ev3.LargeMotor('outB')
    # Connect the #1 gyro sensor to port 1 and the #2 gyro sensor to port 3
    sensor1 = ev3.GyroSensor('in1')
    sensor2 = ev3.GyroSensor('in3')
    sensor1.mode = 'GYRO-ANG'
    sensor2.mode = 'GYRO-ANG'
    angle1_origin = sensor1.value()
    angle2_origin = sensor2.value()
    # set the length of each part of the arm
    l1 = 8
    l2 = 9
    # record the origin values of both sensors to prevent error
    prev1 = sensor1.value()
    prev2 = sensor2.value() - angle2_origin
    # use the sensors to record the rotation angles of both parts of the arm
    while True:
        angle1 = sensor1.value()
        angle2 = sensor2.value() - angle2_origin
        print("angle1 =%d, angle2 =%d" % (angle1, angle2))
        # if both parts of the arm have moved and have already stopped moving,
        # break the loop and report the values of angle1 and angle2
        if (abs(angle1 - angle1_origin) > 10 and abs(angle2 - angle2_origin) > 10
                and abs(prev1 - angle1) < 10 and abs(prev2 - angle2) < 10):
            prev1 = angle1
            prev2 = angle2
            break
        sleep(1)
        prev1 = angle1
        prev2 = angle2
    # calculate the angles motor1 and motor2 should rotate;
    # the angle motor2 should rotate equals angle2 - angle1
    theta1 = -prev1
    theta2 = -(prev2 - prev1)
    print("theta1 =%d, theta2 =%d" % (theta1, theta2))
    # calculate the x, y position
    x = l1 * math.cos(math.radians(prev1)) + l2 * math.cos(math.radians(prev2))
    y = l1 * math.sin(math.radians(prev1)) + l2 * math.sin(math.radians(prev2))
    print("x =%d, y =%d" % (x, y))
    # run the motors back to the origin position
    m1.run_to_rel_pos(position_sp=theta1, speed_sp=100, stop_action="hold")
    m2.run_to_rel_pos(position_sp=theta2, speed_sp=100, stop_action="hold")
    sleep(2)
    m1.stop(stop_action="coast")
    m2.stop(stop_action="coast")

if __name__ == '__main__':
    main()

Feb 14 2018

Larry

The goal of this assignment was to design a robotic arm that could sense and record how it was being moved in the x and y direction and then repeat the same motions on its own with a marker on a 8.5x11 sheet of paper.

The Design

Our robotic arm, Larry, is a two-linkage robotic arm with an end effector that is an Expo marker. Larry’s base contains the EV3 brain and three overturned wheels for support. The actual arm utilizes two large motors for sensing and controlling the angle of each link. The Expo marker at the end of the arm is controlled by a medium motor.

Image Image

Challenge 1

The minimum and maximum positions of the robotic arm are shown in the video below.

Larry's end effector works by rotating the medium motor 90 degrees in either direction to engage or disengage with the writing surface. It engages before playing back the recorded motion and disengages immediately afterwards. The end effector is attached to the motor by a gear coupled to the rotating axle of the motor, and it is connected to the Expo marker by LEGOs held in tension and two rubber LEGO stoppers.

Image
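The 90-degree engage/disengage can be sketched with a tiny state tracker so the wrist motor never rotates past its travel if "engage" is requested twice; the class name, travel amount, and structure are our assumptions, not Larry's actual code:

```python
# Hypothetical sketch of the 90-degree pen engage/disengage described above;
# tracking the pen state keeps the wrist motor from over-rotating.
class PenActuator:
    def __init__(self, travel_deg=90):
        self.travel = travel_deg
        self.engaged = False

    def move_for(self, engage):
        # relative degrees the medium motor should turn (0 if no change);
        # on the brick this value would go to run_to_rel_pos(position_sp=...)
        if engage == self.engaged:
            return 0
        self.engaged = engage
        return self.travel if engage else -self.travel

pen = PenActuator()
print(pen.move_for(True))   # 90  -> lower the marker
print(pen.move_for(True))   # 0   -> already engaged, don't over-rotate
print(pen.move_for(False))  # -90 -> lift the marker
```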

Challenge 2

For Challenge 2, we implemented the equations discussed in class to calculate the position of a two-bar linkage. To do so, we needed to find the lengths of the links and convert from units of LEGO studs to inches. We then created empty arrays to store the values of the positions (3 total for our initial code and 500 for the continuous recording). We then iterated over these arrays using a loop. Within each cycle of the loop, we read the angles from the motors' sensors, calculated the Cartesian position of the marker, and then added all the results to the associated arrays at the proper index. After running the loop over the full length of the arrays, we had a complete recording of the marker's position and the angles necessary to trace it.

We confirmed the accuracy of our system by engaging the end effector before recording the position. When we then played it back, it was easy to see if the output matched the input by comparing the two paths.

Challenge 3

Challenge 3 took the information gained in step 2 to mimic the motions inputted by the user. We made an array that we stored location data inside and created a variable that would move through the array as time progressed. At first the array was only 3 long but we later modified it to be much larger so that we could continuously record position and angle information so that the arm could move more smoothly.
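The record loop described above can be sketched without the hardware: read both joint encoders, compute the Cartesian position, and append everything to the arrays. The link lengths and simulated poses below are placeholders, not our measured stud-to-inch values:

```python
import math

# Hardware-free sketch of the record loop: one encoder reading per joint,
# forward kinematics per sample, results appended to arrays for playback.
L1, L2 = 8.0, 8.0  # hypothetical link lengths, inches

def record(read_angles, n_points):
    angles, points = [], []
    for _ in range(n_points):
        t1, t2 = read_angles()  # one reading per joint encoder
        x = L1 * math.cos(math.radians(t1)) + L2 * math.cos(math.radians(t1 + t2))
        y = L1 * math.sin(math.radians(t1)) + L2 * math.sin(math.radians(t1 + t2))
        angles.append((t1, t2))
        points.append((x, y))
    return angles, points

poses = iter([(0, 0), (45, 0), (45, 45)])  # simulate three user-set poses
angles, points = record(lambda: next(poses), 3)
print(angles)      # [(0, 0), (45, 0), (45, 45)]
print(points[0])   # (16.0, 0.0) for the straightened arm
```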

Playback of 3 points:

Image

Continuous recording and playback:

This is essentially the same code, but with larger arrays and less wait time to compensate for the greater number of points desired. Additionally, the interface instructions are slightly different.

Image

Feb 14 2018


Project 4: Robotic Arm

By Isaac Collins, James Liao, and Andrew Sack

Overview:

The goal of this project was to build a robotic arm with a 2-dimensional range of motion that could hold a pen, make markings on a sheet of paper, and record and "playback" its position in 2-dimensional space. The project succeeded both in calculating the arm's position in space and in using the motor controllers to 'remember' the arm's position.

Image

Construction:

The arm was constructed using 3 motors, which served as the 'elbow', 'shoulder', and 'wrist' joints. The wrist joint held a red Expo marker as a writing tool using only LEGO pieces. The wrist joint motor was an EV3 medium motor, and the shoulder and elbow motors were large motors. A caster wheel attached to the wrist joint allowed for smooth movement as the arm moved across the page, and a button was connected to the EV3 brick as a mechanical way to control arm recording and playback. Long pieces were used for the 'bones' of the arm to give the largest freedom of movement to the arm's motion.

Image

Design Challenges:

One big difficulty with controlling the arm's two-dimensional motion was friction causing the arm to move at the shoulder as well as at the wrist. This could be fixed by creating a wider base (at the expense of the arm's current range of extension) or by printing/laser cutting a tool to secure the pieces of the EV3 together with the arm. Our attempted solution of adding the caster wheel did decrease friction, but there was still some uncertainty when it came to control.

Image

Software:

Challenge 2: Calculating End Effector Positions

This challenge uses the built-in encoders in each motor to calculate the arm position. First, the encoder values for the straightened-out arm are recorded to serve as a reference for all other values and to effectively zero the encoders. Then, each time the position is calculated, the angle of the shoulder motor (theta) is found by subtracting the initial position from the shoulder position and converting to radians. The angle of the wrist (alpha) is calculated similarly, but the wrist angle is added to theta, so alpha is relative to the x-axis. The trigonometric equations below are then used to calculate the positions, and the results are printed to the console. The lengths are constant and were determined by measuring the joint-to-joint distances on the arm.

x_pos = length_1 * m.cos(theta) + length_2 * m.cos(alpha)

y_pos = length_1 * m.sin(theta) + length_2 * m.sin(alpha)

Image

Challenge 3: Recording and Playing Back Positions

This challenge involves recording the position of the arm while it is manually moved, and then the robot repeating that motion independently. The storage of arm positions is accomplished by saving the relative positions of each joint in separate python lists. This is because the position-based movement of the EV3 motors only accepts relative positions, not absolute. While the arm is manually moved, the code loops through reading the position of the three motors and then saving the position minus the previous position to the list. It then sleeps for a set amount of time and repeats. After the recording is complete, the program waits for a button made from a touch sensor to be pressed to give the user time to reset the robot position. The program then iterates through the lists and commands each motor to move the recorded distance at a given speed and then sleep for the same amount of time as in recording.
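The delta-based recording described above can be sketched with simulated encoder readings; each stored value is the difference from the previous sample, since `run_to_rel_pos` on the brick takes relative moves only (sample counts and values below are made up, not from record_and_play.py):

```python
# Sketch of relative-position recording: store each sample as a delta from
# the previous one, so playback can replay the deltas with run_to_rel_pos.
def record_relative(read_position, n_samples):
    deltas = []
    prev = read_position()  # reference reading before recording starts
    for _ in range(n_samples):
        cur = read_position()
        deltas.append(cur - prev)
        prev = cur
        # on the brick: sleep(dt) here, then repeat
    return deltas

readings = iter([0, 10, 25, 25, 40])  # simulated encoder positions
deltas = record_relative(lambda: next(readings), 4)
print(deltas)       # [10, 15, 0, 15]
print(sum(deltas))  # 40 -- replaying every delta reproduces the net motion
```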

Code Files:

calc_position.py

record_and_play.py

Feb 14 2018

Robotic Arm

The goal of this project was to build and program a 2-dimensional robotic arm that could hold a pencil, write over an 8.5”x11” paper area, measure its position, and record and playback input positions.

Challenge 1

Image

The robot is built with the EV3 brick stationary, and the arm constructed out of two motors connected end-to-end. A wheel or ball is positioned under each arm to allow for smooth movement without interference from the weight of the motors. The end effector is a rotary motor with a pencil holder attached: the pencil is kept in place by a rubber band, and the end effector can lift the pencil off the page by rotation.

The robot is placed in the middle of the long side of the paper area, and from here can reach nearly all of the paper. However, due to the motor size, the arm is not able to fold completely, so it cannot reach positions very close to the robot brick. In programming, the motor closest to the end effector is referred to as the radius, and the other motor as the humerus.

Challenge 2

Image

The code for Challenge 2 records a manually input position of the robot arm, and calculates the x and y values of that position in relation to the robot’s position. The robot is placed at the middle of the long side of the paper area, such that the paper area is encompassed by Cartesian quadrants 1 and 4 (all positive x-values and all positive and negative y-values). Since one of the motors is oriented differently from the other, one motor’s positions are multiplied by -1. Once the position has been manually input, the code calculates position using trigonometry, with the known lengths of each linkage as 5 inches. The position is printed as separate x and y coordinates.

Challenge 3

The code for Challenge 3 records 3 manually input positions of the robotic arm, returns to the original position, and then returns to those 3 positions. The robotic arm is positioned 3 times, allowing 5 seconds each time for positioning, and then returns to the original position. Each time, the code records the absolute positions of the motors into two arrays. Afterwards, the robotic arm returns to each of the three positions by returning to the corresponding motor positions in the arrays. A low speed and longer pausing time is used to increase accuracy.
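The three-point playback described above can be sketched hardware-free; the StubMotor stands in for the EV3 LargeMotor (which exposes `.position` and `run_to_abs_pos` on the brick), and speeds and pauses are omitted:

```python
# Hardware-free sketch of three-point playback using absolute positions.
class StubMotor:
    def __init__(self):
        self.position = 0

    def run_to_abs_pos(self, position_sp):
        self.position = position_sp  # the real motor drives to this angle

def play_back(motor_a, motor_b, pos_a, pos_b):
    # revisit each recorded pose using absolute motor positions
    for a, b in zip(pos_a, pos_b):
        motor_a.run_to_abs_pos(position_sp=a)
        motor_b.run_to_abs_pos(position_sp=b)
        # on the brick: pause here; a low speed and longer pause aid accuracy

m1, m2 = StubMotor(), StubMotor()
play_back(m1, m2, [10, 20, 30], [-5, 0, 5])  # three recorded poses
print(m1.position, m2.position)              # 30 5 -- ends at the last pose
```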

Code

Challenge 2

Image

Challenge 3

Image

Feb 14 2018

ROBOT ARM: Shunta Muto, Osvaldo Calzada

Challenge 1:

The robot arm consists of two links, three motors, a base, and a gripper. Since each motor was already embedded in its link, we only needed to connect the motors through rods to build the arm. One end of the arm was connected to the rectangular base, on which the processor brick was mounted to provide a counterweight for the arm. If the processor brick and the arm were not mounted on the same base, the weight of the arm could cause it to tip over; the heavy processor brick thus helps stabilize the robot arm so that it rotates smoothly on the 2D plane. The gripper is directly attached to the motor at the tip, where the pen is held in place by rods connected to each other to form a square.

Image Image Image

Video below shows the range of motion for the robot arm. Recorded minimum and maximum positions are as follows:

Minimum Position

theta_B:-89, theta_L:-78

x:-14.48 y:-11.37

Maximum Position

theta_B:86, theta_L:72

x:-13.35 y:13.6

Video showing the range

Challenge 2:

The position of the end effector was calculated using simple trigonometric relationships, using Python's "math" library to evaluate each trig function. In order to calculate the xy position of the end effector, we measured the lengths of the links beforehand. After some trials, we realized that the positive rotation directions of the two motors were opposite from each other, so we aligned them by inserting a negative sign for one of the motors. We also checked the numerical values against measurements to make sure our calculation provides a good approximation. Results are as follows:

Image

[video or pic]

Video below shows the movement of the end effector

Challenge 3:

Using built-in functions, we wrote code that records 3 positions of each motor in degrees and moves the motors to the recorded positions. Each time the program records a position, it waits for 2 seconds so the user can move the arm to a new position. Once it has recorded all positions, it must go back to the original position, because we can only move the motors relative to their previous positions.

Code and video-recorded results are shown below.

[code]

Image Image

Feb 14 2018

Riley Kolus

Robert Hrabchak

Intro:

For this project, we had three objectives. The first was to build a robotic arm, capable of 2 degrees of freedom and actuation of an end effector, in this case a writing utensil (raised or lowered), and simply demonstrate its range of motion. The second was to move that robotic arm through multiple different positions and record the positions as points on the plane of motion. Our final objective was to write a program that could record manually input positions of the robotic arm and then play them back through motor drive.

The Robot:

As seen in the images below, we built our robotic arm as a series of two motors mounted in a linear fashion, mimicking a human arm with a shoulder that rotates about only a single axis. Construction of the arm was actually very simple, requiring only a few linkages to firmly connect each motor. There were two lengths of the arm controlled by motors, similar to the humerus and radius/ulna in a human arm. Each length was measured with a ruler for later use in programming.

The arm, being cantilevered so far from the CPU, created a stability problem, since the CPU is simply not all that heavy. In order to firmly affix our robot to the work surface, we lined each leg of the CPU with rubber bands, increasing friction and preventing unwanted rotation, and we built a small platform on which to rest heavy objects like textbooks, providing a counterweight to the heavy arm. We also added a roller ball to the end of the arm: when the pen is placed on the page, it takes the weight of the arm, but when it is flipped up, the roller ball takes the weight instead. These interventions greatly improved stability.

At the end of the arm, we placed a Medium Motor, and connected it to a specially outfitted ball point pen. Held into its Lego mount by many rubber bands, the pen could be raised and lowered by rotation of the medium motor a predetermined amount. This worked quite well to actuate the writing action of the robotic arm.

Image

Fig. 1: Overhead view of robotic arm on drawing plane

Image

Fig. 2: Off axis view

Image

Fig. 3: Side View

Range of Motion:

As shown in the video below, the robot is able to move through a large, but abstract range of motion. When fully extended, the arm can move in a 35cm radius semi-circle, however, due to constraints imposed by the physical size of each motor, the robot can only complete a portion of the small semi-circle. This is difficult to describe, and best illustrated in the video below. Though it may be hard to tell, the usable range of motion is enough to cover an 8.5”x11” sheet of paper if placed in the right position relative to the CPU position.

Image

Fig. 4: Code to move arm through largest and smallest range of motion

End Effector Position

The second objective listed above is the need to calculate end effector positions. To do this, we used a simple equation to calculate the x and y positions, shown below in the code for this section. We mapped out x and y axes on the work surface our robot was mounted on, measured in centimeters, with the first pivot point of the arm at (0,0). Moving the arm manually through multiple positions, we were able to output relatively accurate position information in our coordinate frame. We saw error of 1-3 centimeters in each measurement, but we attributed this to “wiggle” in the motors, only amplified with both motors mounted in series.

Image

Fig. 5: Code to calculate end effector positions

Playback Positions

The final objective was to record manually input positions of the robot arm and to play them back in sequence. In order to achieve this, we recorded position information in the form of motor angle data for both main joints every second for 5 seconds. During these 5 seconds, we moved the arm through various positions. This position information was recorded to a 2D array, concatenated with the new position every second. After a brief pause, the robot passed these values two by two to the motors in sequence, with enough of a pause for the observer to confirm correct playback location. Again, we saw some error here, but it was still on the order of 1-3cm in each dimension, likely attributable to wiggle room in the motors.

Image

Fig. 6: Code to playback arm positions
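The record-then-playback structure described above can be sketched without the hardware: one [angle1, angle2] row is appended to a 2D array each second for five seconds, then the rows are handed back to the motors two values at a time (readings below are simulated; on the brick each step would sleep and drive the motors):

```python
# Sketch of the every-second recording and pairwise playback described above.
def record(read_joints, n_samples=5):
    samples = []  # 2D array: one [angle1, angle2] row per second
    for _ in range(n_samples):
        samples.append(list(read_joints()))
        # on the brick: sleep(1) between samples
    return samples

def playback_pairs(samples):
    # hand each row's two angles to the two joint motors in sequence
    return [(row[0], row[1]) for row in samples]

joints = iter([(0, 0), (10, 5), (20, 10), (30, 15), (40, 20)])
rows = record(lambda: next(joints))
print(playback_pairs(rows))  # [(0, 0), (10, 5), (20, 10), (30, 15), (40, 20)]
```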

Conclusion

Overall, this project was quite a success. Our robot was pretty sturdy, and able to perform all necessary functions without any deficits. The wiggle room in each motor contributed to a fair amount of error, but this was difficult to fix without gearing the whole robot. Given that we did not discover this would be a source of error until we had the robot functioning, it was much too late to implement a gearing system. In future robotic arm platforms, it would be wise to gear down the motors so that error like this is less apparent.

Video

Feb 13 2018

RIGHTY THE ARM

Description

Structure:

Construction of Righty, the robotic arm, was quite simple. First, a motor was fastened to the top of the brick, nice and snug, so that only the actual motor moves/rotates and the motor just peeks out of the top of the brick. Two beams then connect that motor to the base of another motor, which is connected to the end effector. We made the end effector from a motor that changes the orientation of the pencil. The pencil is held tight, sandwiched by two gears, one of which is connected to the end effector. This gives it a greater range of motion.

Challenge 1:

Righty has a wide range of motion, and is easily able to cover the entirety of an 8.5" by 11" sheet of paper. Its full range of motion and use of the end effector can be seen in the videos (1 and 2) below.

Challenge 2:

Challenge 2 was to write code that would calculate the position of the end effector as the two parts of the arm were manually moved. This was done using simple trigonometry: the x displacement is the length of the first link times the cosine of how much its motor has rotated, plus the length of the second link times the cosine of how much both motors have rotated. The y displacement was calculated in a similar fashion, using sine instead of cosine. We had to ensure that there was a baseline "0"-degree straight line to use as a reference frame for the other angle measurements, and to set up an x-y grid to better understand the position. To do this, we oriented the position of each motor until it read close to 0, and then reassembled the arm in a straight line, using the tan pegs as a reference for 0 degrees. As seen in the video (video 3), this code was very effective at giving a reading of the end effector's position, based on the little x-y grid we had drawn in. While there are slight errors due to the motors (there is some room for the link to shift in place while the actual orientation of the motor doesn't change), the basic position of the end effector is still accurate.
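The zero-reference idea can be captured as a small helper: record each encoder's reading with the arm laid out along the 0-degree line, then report every later angle relative to that baseline (readings below are simulated; the helper name is ours):

```python
# Zero-reference helper: capture a baseline reading while the arm is posed
# straight, then measure all later angles relative to that baseline.
def make_zeroed_reader(read_raw):
    zero = read_raw()  # baseline captured while the arm is posed straight
    return lambda: read_raw() - zero

raw = iter([3, 3, 48, 93])  # encoders rarely read exactly 0 at startup
read_angle = make_zeroed_reader(lambda: next(raw))
print(read_angle())  # 0
print(read_angle())  # 45
print(read_angle())  # 90
```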

Challenge 3:

Challenge 3 is where things got a little hairy for Righty. The plan was to use a user-input variable that would record how many positions the robot was moved through manually. This was done using a while loop with a 5-second pause: every 5 seconds, the robot records the angle of each motor based on where the user has moved it. These angles are recorded in 2 different lists, motor1_pos and motor2_pos. For example, a list would look like [p1 p2 p3 p4] at the end of 4 iterations. This implementation worked out fine.

In the next step, we used another while loop that would go back through the list saved and have the robot move to that position, using simple python commands. For whatever reason, Righty was simply unable to do this. As seen in our video, it just crumbles in on itself. We even tried controlling the direction of rotation, thinking that the error was Righty's - that he was trying to rotate parts through the brick's space, causing them to collide. But even that did not fix it. We believe the logic of our code is correct and have scoured our code but cannot think of a reason as to why this is occurring.

Pictures

Righty's full armspan

Image

Close up view of the links connecting each of the motors

Image

Close up view of how the first motor is fixed to the mindstorms brick

Image

Videos

Challenge 1: To demonstrate range of motion

Challenge 1: To show end effector movement

Challenge 2: Move end effector and calculate positions

Challenge 3: To record and playback

Code

Challenge 2:

Image

Challenge 3:

Image

Feb 13 2018

Robotic Arm

Annalisa DeBari and Raven Fournier

Our robot consists of two motors and beams of length 14 cm and 11 cm (first and second beam respectively).

Image Image

Challenge 1: Build the arm and demonstrate the range of motion:

Maximum X position: 25 cm

Maximum Y position: 25 cm

Minimum X position: -11.3 cm

Minimum Y position: -25 cm

Challenge 2: Calculate end effector position based on manual movement of arm.

Image

Challenge 2 was done to ensure that position of the end effector is being properly tracked by the computer.  The video demonstrates the motion and then shows the position of the end effector, denoted x_b & y_b.

Challenge 3: Create program that records positions and “plays” back.

As the first half of Challenge 3, we wanted to demonstrate that our robotic arm moves rigidly to specific points to be played back.  It can be noted that the sampling and playback rates can be altered for a more accurate playback, but this video demonstrates that the robot successfully moves to the predetermined positions.

For the second half, we wanted to demonstrate that our robotic arm can move and follow fluid motions.  This was done by altering the code slightly.  Depending on the desired application, we know what alterations to make in our code.

Code:

#!/usr/bin/env python3
import math
from time import sleep

ev3 = Device("this")

def manageRange(n):
    return max(min(n, 1050), -1050)

def main():
    i = 0
    length1 = 14  # length of first beam (cm)
    length2 = 11  # length of 2nd beam (cm)
    # create vectors to hold the motor location data
    theta1_array = [0] * 15
    theta2_array = [0] * 15
    while True:
        # set theta1 to the position of the motor in port B (motor 1);
        # multiplied by -1 for the math to work out
        theta1 = -1 * ev3.LargeMotor('outB').position
        theta1_array[i] = theta1
        theta2 = -1 * ev3.LargeMotor('outD').position
        theta2_array[i] = theta2
        # from class we calculated these formulas; (x_a, y_a) is the end of the
        # beam connected to motor 1, and (x_b, y_b) is the end effector
        x_a = length1 * math.cos(theta1 / 180.0 * math.pi)
        y_a = length1 * math.sin(theta1 / 180.0 * math.pi)
        print('x_a:' + str(x_a))
        print('y_a:' + str(y_a))
        print('motor1 position:' + str(theta1))
        x_b = x_a + length2 * math.cos((theta1 + theta2) / 180.0 * math.pi)
        y_b = y_a + length2 * math.sin((theta1 + theta2) / 180.0 * math.pi)
        print('x_b:' + str(x_b))
        print('y_b:' + str(y_b))
        print('motor2 position:' + str(theta2))
        sleep(1)
        i = i + 1  # increase the counter variable
        # once the code has recorded the location 14 times, "play back"
        if i >= 14:
            print('theta1_array:')
            print(theta1_array)
            print('theta2_array:')
            print(theta2_array)
            # need another counter variable to access the stored values
            j = 0
            for count in range(14):
                current_posA = theta1_array[j] * -1
                current_posB = theta2_array[j] * -1
                # keep track of the previous position in order to move relative
                # to the original zero location instead of the current position
                prev_posA = 0
                prev_posB = 0
                if j >= 1:
                    prev_posA = theta1_array[j - 1] * -1
                    prev_posB = theta2_array[j - 1] * -1
                outB = ev3.LargeMotor('outB')
                outB.run_to_rel_pos(speed_sp=manageRange(15 * 10.50),
                                    position_sp=current_posA - prev_posA,
                                    stop_action="hold")
                outD = ev3.LargeMotor('outD')
                outD.run_to_rel_pos(speed_sp=manageRange(15 * 10.50),
                                    position_sp=current_posB - prev_posB,
                                    stop_action="hold")
                j = j + 1
                sleep(1)
            print('done')
            break

if __name__ == '__main__':
    main()

Disclaimer: Our code logic is correct, but since the EV3 motors are not very precise the locations are not always precise either. The mechanical components have weight to them, and the motors can become misaligned with zero fairly easily. There is also a decent amount of “wiggle room” with the motors where you can mechanically move back and forth a couple of millimeters and the motor reads the same position throughout.

Changing the sleep times also changes the rate of collection. So, to collect more fluidly, changing the sleep time to .5 or less allows the code to store the data more rapidly and collect more fluid motion. We also changed the motor command stop_action set to “coast” instead of “hold” to make the motion even more fluid as seen in the second video of challenge 3.

Feb 13 2018