Combination Report Einstein Robot




3.3.1.2 NXT HID Device

This code is meant to provide an interface between the NXT Brick and the laptop. It operates very similarly to a keyboard and allows applications to be called so that data can be stored in spreadsheets or another medium. This code also allows for troubleshooting as we implement sensors.



3.3.1.2.1 Code Setup

This code is a bit more involved to understand. A couple of third-party .h files are necessary in order to make it function. These files can be found online at sourceforge.net [13].


Specifically, we needed 'Common.h' and 'MSHID-driver.h.' The 'Common.h' file is a library of hardware-description definitions that are used fairly commonly (hence the name). The 'MSHID-driver.h' file allows for data transfer from the NXT Brick to the laptop.
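For reference, both headers are pulled into the RobotC program with #include directives near the top of the file. The lines below are only a sketch; the exact paths depend on where the downloaded driver files are stored relative to the program:

#include "Common.h"        // common hardware-description definitions used by the driver suite
#include "MSHID-driver.h"  // HID driver providing MSHIDsendCommand(), MSHIDsendKeyboardData(), and MSHIDsendString()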
In the code below, you will see an example of what we implemented. For the complete code, see Appendix 2, "Final Code including arm, head and base."


//Set state to long happy if left arrow is pressed on the d-pad
if ((int)currState.a == 0)
{
    string msg1 = "c:\\VSA\\Longhappy\r";

    MSHIDsendCommand(MSHID, MSHID_DDATA);
    //MSHID_MOD_LGUI = Windows key, the next argument is the key input 'r'
    //'WINDOWS-r' opens the Run command box
    MSHIDsendKeyboardData(MSHID, MSHID_MOD_LGUI, 0x15);
    MSHIDsendCommand(MSHID, MSHID_XMIT);
    wait1Msec(1000);

    MSHIDsendCommand(MSHID, MSHID_ASCII);
    MSHIDsendString(MSHID, msg1);
    //Wait 2 seconds to ensure no accidental double press
    wait1Msec(2000);



In the snippet above, 'currState.a' polls for a press on the directional pad (d-pad). We opted to have a long happy emotion and a long sad emotion. In this long happy example, we set a string (msg1) to the file path of a .bat file that executes the VSA command. The next several lines of code press the 'Windows' and 'r' keys, bringing up the Run dialog. From there, we send the string in msg1 followed by a carriage return (the \r at the end of the string). Then we wait a few seconds to eliminate any potential for double activation if the button is held or pressed repeatedly.


In the short versions, there was no need to reinitiate a neutral state since it was built in. The long emotions, however, required a transition back into the neutral state. We implemented this as seen in the following code:

    string msg2 = "c:\\VSA\\Neutral\r";

    MSHIDsendCommand(MSHID, MSHID_DDATA);
    //MSHID_MOD_LGUI = Windows key, the next argument is the key input 'r'
    //'WINDOWS-r' opens the Run command box
    MSHIDsendKeyboardData(MSHID, MSHID_MOD_LGUI, 0x15);
    MSHIDsendCommand(MSHID, MSHID_XMIT);
    wait1Msec(1000);

    MSHIDsendCommand(MSHID, MSHID_ASCII);
    MSHIDsendString(MSHID, msg2);
    //Wait 2 seconds to ensure no accidental double press
    wait1Msec(2000);
}

In this second section, we set a second string to call the Neutral.bat file and implement it in the same way.

3.3.1.2.2 Troubleshooting

You may notice that the timing is off when sending gesture commands to VSA. Take the following into account when calling different or custom emotions:




  • You need to allow enough time for emotions to display before calling another one. That is what some of the delay statements accomplish in the code example above.

  • The .bat files that get created are not always consistent. You will need to work closely with whatever team is generating them and determine the code you will need to implement.

The table in the 'NXT HID device' section of Appendix 2 is incredibly handy for determining what commands are available. You will need to utilize it to truly take advantage of this device. If you find that the message strings are generating incorrect inputs, check your values there.

If you have other unexpected problems, simply watch your computer while the code is executing and see what it does. This should be a clear indicator of what the program thinks it should be doing. If it does nothing, watch the LEDs on the device itself.
3.3.2 Arm Software
The programming of the arm movement was done using the RobotC language. The PSP-Nx-lib.c library was used in order to use a PS2 controller to operate the arm.

The software to control the arm can be broken up into three sections: controlling the DC motors, controlling the servos, and integrating the PS2 controller. The code for each was tested prior to being compiled into the final program. We will start by describing how the DC motors are programmed, followed by the servos, the controller integration, and finally how the finished control program works.

The software allows the DC motors to be turned on and off, sets the power level, and allows encoders to be used. In order to use the motors, the configuration code should be entered at the top of the program. The line “#pragma config(Hubs, S1, HTMotor, HTMotor, HTServo, none)” sets sensor port 1 (S1) and configures it to have two motor controllers and one servo controller chained together. After that, each hub must be set. To configure a motor you use the line “#pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop, reversed, encoder)”; this line sets the first controller (C1) on sensor port 1 (S1) as a DC motor plugged into the motor 1 slot. The parameter motorD sets the name of the motor to be used in the program (motorA, motorB, and motorC are reserved for NXT motors), and tmotorNormal sets the motor in normal mode. The motor can be set to openLoop or PID to use the internal PID controller; PID mode can only be used if an encoder is attached to the motor and activated. The motor can also be switched between forward and reversed modes in this line. Once these lines are entered, the motor control commands become available. The following code is a sample motor program:
#pragma config(Hubs, S1, HTMotor, HTServo, none, none)
#pragma config(Motor, mtr_S1_C1_1, motorD, tmotorNormal, openLoop)

task main()
{
    motor[motorD] = 75;   // Motor D is run forward at a 75 power level.
    wait1Msec(4000);      // The program waits 4000 milliseconds.

    motor[motorD] = -75;  // Motor D is run in reverse at a 75 power level.
    wait1Msec(750);       // The program waits 750 milliseconds.
}
The code runs the motor forward for 4 seconds and in reverse for 0.75 seconds.

Servos are programmed in a similar way. The hub must be configured for a servo controller in one of the spots. The line “#pragma config(Servo, srvo_S1_C3_1, , tServoNormal)” sets the third controller (C3) on sensor port 1 (S1) as a servo plugged into the servo 1 slot. Unlike motors, tServoNormal is the only mode that needs to be entered, but an empty placeholder spot may still have to be left for the name.

The following code is a sample servo program.

#pragma config(Hubs, S1, HTServo, HTServo, none, none)
#pragma config(Servo, srvo_S1_C1_1, , tServoNormal)

task main()
{
    while(true)
    {
        if(ServoValue[servo1] < 128)          // If servo1 is closer to 0 (than 255):
        {
            while(ServoValue[servo1] < 255)   // While the ServoValue of servo1 is less than 255:
            {
                servo[servo1] = 255;          // Move servo1 to position 255.
            }
        }
        wait1Msec(1000);                      // Wait 1 second.

        if(ServoValue[servo1] >= 128)         // If servo1 is closer to 255 (than 0):
        {
            while(ServoValue[servo1] > 0)     // While the ServoValue of servo1 is greater than 0:
            {
                servo[servo1] = 0;            // Move servo1 to position 0.
            }
        }
        wait1Msec(1000);                      // Wait 1 second.
    }
}

This program reads the servo position and drives the servo to the opposite end stop, so it toggles between positions 0 and 255 every second.



The controller we used required the add-on library "PSP-Nx-lib.c" to make the buttons respond properly. A wireless PSP controller was used to control the robot, with one button controlling each degree of freedom. The layout of the controller buttons and their names in the library is as follows:

L1                          R1
L2                          R2

            d (triangle)
a (square)              c (circle)
            b (cross)

l_j_b                       r_j_b
l_j_x                       r_j_x
l_j_y                       r_j_y
The line “PSP_ReadButtonState(SensorPort, Addr, currState)” checks whether any of the buttons have been pressed using a Boolean state: 0 for pressed, 1 for not pressed. The joysticks return 0 at center and have a range from -100 to 100.
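As a quick check of these conventions, a short test program can poll the controller and print the raw values to the NXT screen. This is only a sketch: the psp state structure and PSP_ReadButtonState come from PSP-Nx-lib.c, while the sensor port (S2), the 0x02 I2C address, and the sensorI2CCustom configuration are assumptions that should be verified against the library's own example programs and the actual wiring.

#pragma config(Sensor, S2, PSPNX, sensorI2CCustom)    // assumed: PSP-Nx receiver wired to sensor port 2

#include "PSP-Nx-lib.c"

task main()
{
    psp currState;                                    // button/joystick state structure defined in PSP-Nx-lib.c

    while (true)
    {
        PSP_ReadButtonState(S2, 0x02, currState);     // 0x02 is the assumed default I2C address

        // Buttons read 0 when pressed and 1 when released
        nxtDisplayTextLine(1, "square (a): %d", (int)currState.a);

        // Joysticks read 0 at center and roughly -100 to 100 at the extremes
        nxtDisplayTextLine(2, "r_j_x: %d", (int)currState.r_j_x);
        nxtDisplayTextLine(3, "r_j_y: %d", (int)currState.r_j_y);

        wait1Msec(100);
    }
}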

Combining the above knowledge, we were able to create a program to run all of the above components of the arm. Motor 1 controls the shoulder, motor 2 controls the elbow, servo 1 controls the wrist up and down, servo 2 controls the wrist left and right, servo 3 opens and closes the hand, and servo 4 moves the entire arm left and right. The pseudo code for the control program is as follows (a brief RobotC sketch of the loop appears after the list):


If triangle is pressed move shoulder up

If square pressed move shoulder down

If circle pressed move elbow up

If x pressed move elbow down

If joystick2 pushed up move wrist up

If joystick2 pushed down move wrist down

If joystick2 pushed left move wrist left

If joystick2 pushed right move wrist right

If R1 pushed close hand

If L1 pushed open hand

If R2 pushed move arm right

If L2 pushed move arm left
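Below is a minimal sketch of how two of these branches (the shoulder and the hand) translate into RobotC; the remaining branches follow the same pattern. The hub layout, sensor port, I2C address, power level, servo end positions, the handServo name, and the l1/r1 field names are illustrative assumptions; the complete, working program is listed in Appendix 2.

#pragma config(Hubs,   S1, HTMotor, HTMotor, HTServo, none)
#pragma config(Sensor, S2, PSPNX, sensorI2CCustom)                  // assumed: PSP-Nx on sensor port 2
#pragma config(Motor,  mtr_S1_C1_1, motorD, tmotorNormal, openLoop) // motor 1: shoulder
#pragma config(Servo,  srvo_S1_C3_3, handServo, tServoNormal)       // servo 3: hand (placeholder name)

#include "PSP-Nx-lib.c"

task main()
{
    psp currState;

    while (true)
    {
        PSP_ReadButtonState(S2, 0x02, currState);   // buttons read 0 when pressed

        // Shoulder (DC motor 1): triangle = up, square = down
        if ((int)currState.d == 0)                  // triangle pressed
            motor[motorD] = 50;
        else if ((int)currState.a == 0)             // square pressed
            motor[motorD] = -50;
        else
            motor[motorD] = 0;                      // stop the shoulder when neither is pressed

        // Hand (servo 3): R1 = close, L1 = open (field names l1/r1 assumed)
        if ((int)currState.r1 == 0)
            servo[handServo] = 255;                 // drive the hand servo toward closed
        if ((int)currState.l1 == 0)
            servo[handServo] = 0;                   // drive the hand servo toward open

        // The elbow (motor 2, circle/cross), wrist (servos 1 and 2, right joystick),
        // and arm rotation (servo 4, R2/L2) are handled the same way.

        wait1Msec(50);
    }
}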


3.3.3 Head Code

The programming of servo motion for creating facial expressions and head movement was done using the Visual Show Automation (VSA) software, Version 3.012, from Brookshire Software LLC (www.brookshiresoftware.com). The software's main screen (Figure 3-7) allows sequences of motions to be pre-programmed for playback at any time. The main program screen is dominated by tracks (one per servo), each with a title at the left and a timeline stretching towards the right. The titles can be customized in the tools->settings dialog, which greatly simplifies motion programming (e.g. "head tilt" instead of "channel 3"). The default positions of each servo are always the starting point of all motion, and these positions are also set in the tools->settings dialog.

Figure 3-8 shows the settings dialog which contains important settings for control of the servos.

The device settings tab controls which tracks are active and what name each track is given. Click on the name of the device you wish to rename and the program will let you type in a new name; again, this is highly recommended for ease of programming. The communications (COM) port is also set here, and this setting (for each servo) will have to match the physical COM port that the ESRA controller is attached to.

The minimum, maximum and default values for each servo can be set here also and this is critical to getting a proper known starting position for motion.



Double-clicking the +value, -value, or default of any servo will bring up a small dialog box (Figure 3-8) which allows all three values to be set. The corresponding servo will also be 'live' and move, assuming that the COM port is active, that the ESRA controller has power (9 volts for the controller and 5 volts for the servos), and that the servo is plugged in.
Setting minimum, maximum, and default values for all channels is critical to have a known starting position for attachment of the mask and for emotion / position automation setup.
Once the software and hardware are set up and communicating, and the minimum / maximum / default positions are set, programming a sequence of motions can proceed. The VSA interface allows you to drag a bar into existence on the timeline for a particular channel and then double-click the bar to set the end position for the time period. Note that you cannot change the beginning position because that was either the default (if it is the first bar in the timeline for this channel) or it was the position that terminated the last motion command for the chosen channel.
Emotion / motion programming is then broken into a sequence of trial-and-error steps: choosing a servo, creating a motion bar for that servo, programming in the end position for that time period, and playing it back to see if the desired motion for the servo was achieved. Once the results are as desired, the next channel is programmed, until all required channels for the sequence are programmed and play back properly.
A sound track can be added to a programmed sequence by selecting tools->load audio file from the menu in VSA. The audio file will show as a waveform along the bottom of the screen and you can shrink/stretch groups of signals to match up motions with sounds as heard during playback.
Once an emotion sequence is correct, the file is saved as a .vsa binary file into the playback directory. We chose to have all of our files reside in a directory at C:\VSA for simplicity in access.
The VSA directory then contained all of the .vsa files, the .wav audio files, and the .bat files we created to call the VSA program and run the corresponding .vsa file. A sample batch file is as follows:


vsa "happy.vsa" /play /minimize /close
Note that this is the only line needed in the batch file. The batch file was created with Notepad and then saved as "C:\VSA\HAPPY.BAT". Now when the batch file is run (from the Start->Run menu in Windows XP, or from a command prompt), the emotion is played along with the corresponding sound. The VSA program runs minimized and closes automatically when playback is complete, due to the command-line switches given in the batch file. Figure 3-9 shows a single frame captured during execution of happy.bat.


Note that the system path (Figure 3-10) must be set for the operating system to find the VSA.EXE file and execute it properly. This can be set by right-clicking "My Computer" and choosing 'Properties', then clicking the 'Advanced' tab and then the 'Environment Variables' button near the bottom of the screen. In the resulting dialog box, click on the variable 'Path' in the 'System variables' area and choose 'Edit'. A window will pop up with your current system variable setting for Path. Don't change any existing part of the path; just cursor over to the end of it, place a semicolon (;) at the end of the line, and fill in the path to the VSA software. By default it is "C:\Program Files\Brookshire Software\Visual Show Automation" (without the quotes). Then click OK and close all the open dialog boxes. The system variable change takes effect immediately, so the batch files will now run properly. This can be tested by opening a command prompt, navigating to the VSA directory with the command "cd c:\vsa", typing "vsa", and pressing Enter. The VSA program should open up on your desktop. If this works, then the system path is set up correctly.
