Fig 2-17: Solid model of robot arm and hand assembly.

This model omits some of the hardware needed to complete the assembly, as well as the hand. Figure 2-18 shows the complete TETRIX assembly and hand.



Fig 2-18: Final TETRIX robot arm and hand assembly.
In addition to the stock TETRIX components, a welding rod was used to fashion the fingers of the hand. The plate that the fingers attach to was also fabricated. A list of the modified TETRIX components is in the Arm Appendix.
2.3.3 Electrical Design and Components


The electronic components used to operate the arm consisted of two DC motors, four servo motors, one motor controller, one servo controller, and the NXT brick. The motors were used on the elbow and shoulder joints to provide more torque and stability, while servo motors were used to control the hand, wrist, and arm rotation. All of these components are part of the LEGO TETRIX robotics line. Using the TETRIX parts along with the NXT brick reduced the time spent integrating and developing drivers, because when the system is programmed with RobotC the drivers and control functions are already built in, providing a more plug-and-play environment. This saved time in developing code for controlling the arm.


The main control of our arm is handled by the NXT brick. This control unit is run by a 32-bit ARM7 microprocessor and an 8-bit AVR microcontroller. It has four six-wire input ports and three six-wire output ports, and it includes a USB port for programming and debugging. It is typically programmed using the NXT graphical programming language, LabVIEW, RobotC, or NXT++. We chose RobotC, a subset of the C programming language, because it is what our group was most familiar with; this is discussed further later in the report. The RobotC interface allowed us to download and run programs on the NXT unit, and once downloaded, a program can be run directly from the NXT without being connected to a computer. Because our application uses TETRIX products to interface with the NXT, we ran all of our components from Sensor Port 1 of the NXT. The NXT allows up to four controllers to be daisy-chained to each sensor port; these controllers can be any combination of servo controllers and motor controllers, which are discussed later. Any sensor added for additional arm control will also be plugged into the NXT.
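As a rough illustration of this plug-and-play setup, the sketch below shows the style of RobotC code involved. The motor and servo names are assumptions made for illustration only; in practice the controller and port mapping is generated through RobotC's Motors and Sensors Setup wizard rather than written by hand.

task main()
{
  motor[shoulderMotor] = 40;   // run the shoulder DC motor at 40% of full power
  servo[wristServo] = 128;     // command the wrist servo to mid-range (scale is 0-255)
  wait1Msec(2000);             // hold for two seconds
  motor[shoulderMotor] = 0;    // stop the motor before the program ends
}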
The motors we used were TETRIX DC motors available from LEGO Robotics. The motors run at 152 RPM at full power, provide 300 oz-in of torque, and require 12 V to operate. Within the software, the speed can be controlled by setting a percentage of full motor power to lower the RPM of the shaft. This gives the motors more versatility in projects that need more torque than a servo can provide but still call for the slower speed of a servo. This was useful in our application: a servo would not have been able to hold up the weight of our robotic arm, but we still needed slower movement for a more realistic appearance and finer control for the user. The disadvantage of using motors in this situation is that they are heavier and more difficult to mount than a servo would be. We installed encoders for position control, but we did not use them for this part of the project; the operation of the encoders is discussed later in the report.
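A sketch of how this speed control was useful in practice: ramping the commanded power in small steps gives the slow, smooth joint motion described above instead of jumping straight to full speed. The motor name and step values here are illustrative assumptions, not the project's actual code.

void rampMotor(int targetPower)
{
  int power = 0;
  while (power < targetPower)
  {
    power += 5;                // raise the power 5% at a time
    motor[elbowMotor] = power; // elbowMotor is an assumed name for one arm joint
    wait1Msec(100);            // short pause so the speed change is gradual
  }
}

task main()
{
  rampMotor(40);               // bring the joint up to 40% power smoothly
  wait1Msec(2000);
  motor[elbowMotor] = 0;       // stop the joint
}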
The motors are powered and controlled by a HiTechnic DC motor controller. This controller interfaces the motors with the NXT brick and also supplies power to the motors themselves. Each motor controller can operate two 12 V TETRIX motors and can interface with motor encoders, which are discussed below. It is this motor controller that adjusts motor speed by changing the power level supplied to the motor, using an internal PID algorithm.
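The PID loop itself runs inside the HiTechnic controller rather than in user code, but a generic sketch of such a speed loop (with illustrative gains and variable names, not taken from HiTechnic documentation) shows the idea: the output power is adjusted until the measured encoder speed matches the requested speed.

float pidIntegral = 0;
float pidPrevError = 0;

// One update of a generic PID speed loop: returns the corrected power level.
// Kp, Ki, and Kd are assumed gains chosen only for illustration.
float pidStep(float targetSpeed, float measuredSpeed, float dt)
{
  float Kp = 1.0;
  float Ki = 0.1;
  float Kd = 0.05;

  float error = targetSpeed - measuredSpeed;
  pidIntegral = pidIntegral + error * dt;
  float derivative = (error - pidPrevError) / dt;
  pidPrevError = error;

  return Kp * error + Ki * pidIntegral + Kd * derivative;
}

task main()
{
  float newPower;
  // Dummy numbers just to show a call; real inputs would be the requested and
  // measured shaft speeds and the loop period.
  newPower = pidStep(100.0, 80.0, 0.02);
}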
Encoders are installed on the two motors used on the robot. These encoders, made by US Digital, allow position control of the motors so they can perform similarly to servos. The encoders are optical quadrature encoders, which use two output channels (A and B) to sense position. Using two code tracks with sectors positioned 90 degrees out of phase, the two output channels indicate both position and direction of rotation. If A leads B, for example, the disk is rotating in a clockwise direction; if B leads A, the disk is rotating counter-clockwise. The encoder also allows the system to use PID control to adjust the speed of the shaft.
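The direction logic can be summarized in a few lines of code. The helper below is only an illustration of the A/B phase relationship described above, not part of the project code; on the robot itself the HiTechnic controller counts the pulses and RobotC exposes the running count through nMotorEncoder[].

// Returns +1 for one quarter-cycle step clockwise (channel A leading B),
// -1 for a counter-clockwise step (B leading A), and 0 if neither channel changed.
int quadratureStep(int prevA, int prevB, int currA, int currB)
{
  if (prevA == currA && prevB == currB)
    return 0;                        // no edge on either channel
  return (prevA != currB) ? -1 : +1; // A leading B -> +1, B leading A -> -1
}

task main()
{
  nMotorEncoder[shoulderMotor] = 0;  // reset the accumulated count (assumed motor name)
  motor[shoulderMotor] = 30;         // turn the shaft slowly
  wait1Msec(1000);
  motor[shoulderMotor] = 0;
  // nMotorEncoder[shoulderMotor] now holds how far the shaft turned, in encoder counts
}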


The servo motors used were three HS-475HB servos and one HS-755HB, all made by Hitec. Both servo types are three-pole with Karbonite gears and can be run at 4.8 V or 6 V. The HS-475HB provides about 80 oz-in of torque and the HS-755HB provides 183 oz-in. The HS-755HB is a larger servo than is normally used with the TETRIX system, but the servo wire is the same for both types, so both can be used with the servo controller. The downside of this servo not being a standard TETRIX part is that no mounting hardware is available for it, so a mount had to be fabricated to attach the servo to the stock TETRIX parts. The servos accept position commands in a range of 0 to 255, which gives excellent position control. The motors inside a servo only hold position while powered, so when power is removed any weight-bearing servo releases. The wrist on the robot is an example of this: while the program is running, the wrist servo supports the hand, but as soon as power is removed or the program ends, the hand falls to one of the servo extremes.
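A short sketch of this behavior, using an assumed servo name: the position is commanded on the 0-255 scale, and the commanded angle is only held while the program is running and the servo controller is powered.

task main()
{
  servo[wristServo] = 200;   // raise the wrist (positions are on the 0-255 scale)
  wait1Msec(3000);
  servo[wristServo] = 55;    // lower it again
  wait1Msec(3000);
}                            // once the program ends, the wrist is no longer held up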




Like the motors, the servos must attach to a HiTechnic servo controller in order to interact with the NXT. The servo controller requires a 12 V supply, which it divides down to 6 V to operate the individual servos. Each servo controller can drive up to six servos, and like the motor controllers, servo controllers can be daisy-chained to allow the use of more servos than one controller could handle.


2.3.4 System Modeling
As explained previously, this phase of the project is limited to manually controlling each degree of freedom. The operator moves each joint to a new angle, which places the arm in a new configuration. For each configuration the hand is moved to a specific location and orientation. The equations that relate the arm's configuration to the hand's location and orientation are called the forward kinematic equations for position. What is more useful, however, is the ability to determine the arm configuration that will achieve a desired hand location and orientation. In other words, the joint angles must be defined in terms of the position and orientation of the hand. This is called inverse kinematics. The forward kinematic equations for the arm are developed below, followed by some possible solution techniques for the inverse kinematic problem. Developing these equations is the first step toward implementing a more sophisticated method of motion control. Although this development is not an exhaustive description of the mathematics involved, it highlights the basic concepts. References are given in the appendix.
Before developing the forward kinematic equations, it is necessary to describe how a frame in space can be represented by a matrix. It is also necessary to understand how a transformation matrix can map a frame with a particular position and orientation to another. The following 4x4 matrix represents a frame in Cartesian space.

\[
F =
\begin{bmatrix}
n_x & o_x & a_x & P_x \\
n_y & o_y & a_y & P_y \\
n_z & o_z & a_z & P_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]
Here, the P elements represent components of a position vector that defines the location of the frame relative to a fixed frame.



The n, o, and a elements are components of unit vectors that define the x, y, and z axes of the frame, respectively. These vectors determine the frame's orientation relative to the fixed frame. The bottom row is necessary to keep the matrix square.
A transformation matrix, in this context, defines the necessary translations and rotations to move from one such reference frame to another. These transformations can be combined for a series of reference frames such that the resulting relationship defines the last frame relative to the first. In the case of the robot arm, the first frame is the fixed origin and the last is the hand. This is done by simply post-multiplying each transformation matrix by the next. For example, if T12 represents the transformation between frames 1 and 2 and T23 represents the transformation between frames 2 and 3, the total transformation between frames 1 and 3 can be calculated as follows.

\[
T_{13} = T_{12}\, T_{23}
\]

Using this methodology, a reference frame can be assigned to each joint on the robot arm. Through successive transformations between each frame, the total transformation can be determined starting at the fixed base of the arm and ending at the hand. This will define the absolute position and orientation of the hand and be the basis for the forward kinematic equations.
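Written out, with frames numbered 0 at the fixed base through n at the hand (the actual frame assignments are those of Figure 2-19), the total transformation is the product of the successive joint transformations:

\[
T_{0n} \;=\; T_{01}\, T_{12}\, T_{23} \cdots T_{(n-1)\,n}
\]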
The Denavit-Hartenberg representation specifies a systematic method for assigning these reference frames such that the form of the transformation matrix between successive frames is the same. The details of this method are not described here, but the assignments of each frame according to this convention are shown in Figure 2-19. It is important to note that, although this robot has only revolute joints, the Denavit-Hartenberg method works for prismatic joints or a combination of the two. It will not, however, model robots with motions in the Y-direction.
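For reference, the transformation between successive Denavit-Hartenberg frames takes the same standard form for every joint. In the usual textbook notation (which may differ slightly from the symbols used in Figure 2-19), with joint angle θ, link offset d, link length a, and link twist α for joint i:

\[
T_{(i-1)\,i} =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & a_i \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & a_i \sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}
\]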

