Real Life Engine




Prototype Implementation


The prototype was implemented on a laptop with two additional required pieces of hardware (other than the computer itself). Additionally, the prototype included several pieces of both specialized and COTS software. The list of each is as follows:

Hardware:

  1. Dell XPS 14Z laptop, which included the following specs: Intel i5-2450M CPU @ 2.5 GHz; 8 GB RAM; NVIDIA GeForce GT 520M; Win 7 SP1, 64-bit. (This specific laptop is not required, but the laptop's specs are listed here to give an idea of the components on which the prototype ran. New users can acquire a laptop of similar capability in order to set up and use the RLE in their own homes.)

  2. The “Oculus Rift” virtual reality headset (Official site: http://www.oculusvr.com/)

  3. The Microsoft Kinect (The Xbox 360 version was used, but all data indicates that the new Kinect 2.0 [name unreleased at this time] for the upcoming Xbox One will not only work with the RLE, but will enhance the capability of a newer version of the RLE.) (Official site: http://www.xbox.com/en-US/KINECT)

  4. Nyko Power Adapter for Kinect (Converts the Kinect's proprietary plug into a standard USB interface plus a wall-socket power plug, since the sensor's power draw exceeds what USB alone supplies. Amazon sells these. Example: http://www.amazon.com/dp/B004UPPAE2/ref=tsm_1_fb_lk.)

Software:

  1. The “Flexible Action and Articulated Skeleton Toolkit (FAAST)” (Downloadable here: http://projects.ict.usc.edu/mxr/faast/)

  2. Two XML gesture files that I created with FAAST (the files select which gestures to watch for in the Kinect's skeletal motion-capture feed and translate those gestures, via the FAAST software, into keystrokes). They can be downloaded here:
    https://docs.google.com/file/d/0B2Pqzv_oQNktUjFRR2xlLUMwWGc/edit?usp=sharing
    https://docs.google.com/file/d/0B2Pqzv_oQNktR1pMZkFSWk52TzQ/edit?usp=sharing

  3. Kinect SDK and Drivers (http://www.microsoft.com/en-us/kinectforwindowsdev/)
    * (Alternatively, “OpenNI” can be used instead of the official MS drivers. However, I did not have any experience with this software in my prototype testing. The OpenNI software can be downloaded from their website: http://www.openni.org/.)

  4. Portal 2 (The game I chose for demoing the prototype. Given the game's mind-bending nature, its use of advanced wormhole physics, and its backstory in which each level is a 'test' in the game's laboratory, it was a natural fit for my prototyping. The official site is: http://www.thinkwithportals.com/.)

  5. Vireio Perception (A third-party stereoscopic driver used to display Portal 2 on the Oculus Rift; see the “Running the prototype” steps below.)
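The gesture files above pair body-movement conditions with emulated keyboard or mouse inputs. As a conceptual illustration only (this is NOT FAAST's actual file format or API, and every name below is invented), the core idea can be sketched as:

```python
# Illustrative sketch: each "rule" pairs a predicate on the tracked
# skeleton with a virtual input to emulate while the predicate holds.
# Joint positions are mock values in meters (x, y, z), with z increasing
# away from the player toward the sensor.

skeleton = {
    "left_hand":     (-0.2, 1.1, 1.6),
    "left_shoulder": (-0.2, 1.3, 2.1),
}

# Each rule: (gesture name, predicate, input to hold while it is true).
rules = [
    # "Left hand in front of left shoulder at least 1 foot" -> primary fire
    ("fire_portal",
     lambda s: s["left_shoulder"][2] - s["left_hand"][2] >= 0.3048,  # 1 ft in m
     "mouse_left"),
]

def active_inputs(s):
    """Return the emulated inputs for the current skeleton frame."""
    return [key for name, pred, key in rules if pred(s)]

print(active_inputs(skeleton))  # hand is ~0.5 m in front of the shoulder
```

In FAAST itself, these bindings are configured through its GUI and saved to the XML gesture file rather than written as code.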


Running the prototype – Initial Setup and First Run:

  1. Acquire the required hardware and plug each device into any available USB 2.0+ port.

  2. Install the required software.

  3. Run FAAST.

  4. Create your own XML gesture files in FAAST, or download and use mine.

  5. Run Notepad and type some very simple text for pre-testing the FAAST gesture file. Example: three lines of “aaaaaaaaaaaaaaaaaaaaa”.

  6. Click "Connect" in FAAST to allow the FAAST software to use your Kinect device.

  7. While out of range of the Kinect sensor or with something (such as your hand or a piece of cloth) covering the camera, click "Start Emulator" in FAAST.

  8. Switch to Notepad and move within range, or uncover the Kinect's camera.

  9. Perform some of the gestures that are set up in the FAAST gesture file.

  10. Watch what is typed and/or clicked in the Notepad window.

    1. If the Kinect and FAAST are responding as expected, move out of range of the Kinect or cover the camera again in preparation for running Portal 2. Proceed to step 11.

    2. If the Kinect and FAAST are not responding as expected, troubleshoot as follows:

      • Adjust lighting and/or move to a different spot in the room such that the Kinect “sees” you more clearly. Stand in a “T” pose and watch the motion capture feed in FAAST until you can clearly see that you are being properly tracked.

      • Edit the gestures so that the movement you are trying to detect is more accurately captured. For example, instead of triggering in-game forward movement when your left leg is flexed by only 30 degrees (which in my tests activated too often), set the requirement to 45 degrees. Or, instead of setting the primary fire command to “Left hand forward at least 15 ft/sec” (which again caused false-positive activations), try using “Left hand in front of left shoulder at least 1 foot.”

      • Confirm that the Kinect itself is working properly by using it with a made-for-Kinect game such as “Kinect Adventures” on the Xbox 360. If problems appear while playing, test the sensor on another Xbox 360; if they persist, your sensor may be malfunctioning, so contact Microsoft for support. Otherwise, your Kinect sensor should be fine. Plug it back into the PC and try again.

      • Close all other open programs or restart your computer and try again. Confirm that you meet at least the minimum specs to run Portal 2 and have sufficient CPU power and available RAM remaining after your computer's start-up sequence has completed. Re-run FAAST and Notepad (or Portal 2), then try again.

  11. Run Vireio Perception. Use the default settings: “Oculus Rift” with “OculusTrack.”

  12. Run Portal 2. (Recommended settings: “Windowed (No Border)” for fast alt-tab switching between Portal 2 and FAAST, and 1280x720 resolution to most closely match the Rift's native resolution of 1280x800.) Load a saved game or start a new one.

  13. Move within range or uncover the Kinect's camera, then “calibrate” the Kinect sensor (i.e., teach it where you are within 3D space) with a standing or sitting “T” pose, depending on which configuration you opted to use.

  14. Finally, test out a few gestures and then play the game!
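The gesture-threshold tuning described in the troubleshooting step above comes down to simple geometry on the tracked joints. For instance, a "leg flexed at least 45 degrees" rule can be checked by measuring the bend between the thigh and shin segments. A minimal sketch, with made-up joint coordinates rather than real Kinect output:

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(dot / (nu * nv)))

def knee_flexion(hip, knee, ankle):
    """Flexion = deviation of hip->knee and knee->ankle from a straight line."""
    thigh = tuple(k - h for h, k in zip(hip, knee))    # hip -> knee
    shin = tuple(a - k for k, a in zip(knee, ankle))   # knee -> ankle
    return angle_between(thigh, shin)  # 0 degrees = fully straight leg

# Straight leg: ~0 degrees. Raised knee: comfortably past a 45-degree threshold,
# so a 45-degree rule fires on a deliberate step but not on small shifts.
straight = knee_flexion((0, 1.0, 2.0), (0, 0.5, 2.0), (0, 0.0, 2.0))
flexed = knee_flexion((0, 1.0, 2.0), (0, 0.6, 1.6), (0, 0.1, 1.7))
print(round(straight), round(flexed))  # roughly 0 and 56
```

Raising the threshold from 30 to 45 degrees, as suggested above, simply widens the gap between incidental leg movement and an intentional gesture.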


Running the prototype – Successive runs

After initial setup and the first run are completed, additional runs with the RLE prototype proceed as follows:

  1. Plug in the Oculus Rift and Kinect hardware (if not already plugged in).

  2. Run FAAST and load the FAAST XML gesture file.

  3. Click "Connect" in FAAST to allow the FAAST software to use your Kinect device.

  4. While out of range of the Kinect sensor or with something (such as your hand or a piece of cloth) covering the camera, click "Start Emulator" in FAAST.

  5. Run Vireio Perception and then run Portal 2

  6. Move within range or uncover the Kinect's camera, and “calibrate” the Kinect sensor

  7. Play the game!
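The "T" pose calibration in step 6 gives the tracker a known reference posture before gameplay input begins. As a rough sketch of how such a pose could be recognized from skeleton data (the tolerances, joint names, and coordinates here are illustrative guesses, not Kinect SDK values), one can simply check that both hands sit near shoulder height and well out to the sides:

```python
def is_t_pose(s, height_tol=0.10, min_span=0.45):
    """True if both arms are held out roughly level (a 'T' pose).
    Coordinates are (x, y, z) in meters; tolerances are illustrative."""
    for side in ("left", "right"):
        hand, shoulder = s[side + "_hand"], s[side + "_shoulder"]
        if abs(hand[1] - shoulder[1]) > height_tol:   # arm not level
            return False
        if abs(hand[0] - shoulder[0]) < min_span:     # arm not extended
            return False
    return True

# Mock skeleton frame with both arms extended horizontally.
skeleton = {
    "left_shoulder": (-0.20, 1.40, 2.0), "left_hand": (-0.75, 1.42, 2.0),
    "right_shoulder": (0.20, 1.40, 2.0), "right_hand": (0.75, 1.38, 2.0),
}
print(is_t_pose(skeleton))  # True
```

Holding the pose for a few seconds, as described above, lets the tracker average several such frames before locking in your position.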

In addition to the plethora of research I gathered during the very early stages of my project, I gained a precise understanding of the current state of motion capture and virtual reality technology. During my research and prototyping, I played in various virtual reality environments and used the Kinect in various games. These included the game levels (“test chambers”) in Portal 2 as well as games originally made for the Kinect by their developers, rather than retroactively adapted as I have done with Portal 2. Doing so taught me how enjoyable an RLE experience can be, as well as some of the hurdles that need to be cleared before the experience can be as immersive as I hope it will one day be.

First, while the Oculus Rift official demo app (“Tuscany Demo”) is extremely responsive to the movement of the Oculus, third-party support of the device via the “Vireio Perception” app is not quite as responsive. It's close, but better support could be implemented by using the first-party .dll libraries bundled with the Oculus Rift SDK. This would require direct support from the developer, which Valve has provided for several other games (Half-Life 2 and Team Fortress 2) but has not (yet?) provided for Portal 2.

Second, the Kinect gesture tracking via FAAST made for an amazing gaming experience in which I was able to use my body for 99% of all movement within the game. However, the FAAST software and Kinect often produced false-positive activations within the game, causing me to move in undesired ways or fire a portal at the wrong times. I was able to “correct” much of this by learning how to “mesh” with the hardware and software through repeated use of the RLE within Portal 2, but the experience was still far from perfect. I have played games that were made for the Kinect and had a much better experience with the body tracking and gesture-to-input translation. Therefore, I believe that most of the issues I encountered could be corrected with a greater understanding of the Kinect SDK and/or first-party support of the Kinect within Portal 2 directly from Portal 2's developer, Valve.
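One generic way to reduce false-positive activations like these (a common debouncing technique, not necessarily FAAST's actual mechanism) is to require a gesture's condition to hold across several consecutive frames before it fires, so one-frame tracking jitter is ignored:

```python
class DebouncedGesture:
    """Fire a gesture only after its condition holds for N consecutive frames."""

    def __init__(self, predicate, hold_frames=5):
        self.predicate = predicate
        self.hold_frames = hold_frames
        self.count = 0  # consecutive frames the condition has held

    def update(self, skeleton):
        """Feed one skeleton frame; True once the condition has held long enough."""
        self.count = self.count + 1 if self.predicate(skeleton) else 0
        return self.count >= self.hold_frames

# Mock hand-speed readings in m/s, with a one-frame jittery dip: the
# gesture only fires after three clean frames in a row.
fire = DebouncedGesture(lambda s: s["hand_speed"] > 4.5, hold_frames=3)
frames = [5.0, 1.0, 5.0, 5.0, 5.0]
print([fire.update({"hand_speed": v}) for v in frames])
```

The trade-off is a small added input delay (a few frames), which is usually preferable to firing a portal at the wrong time.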

Third, because the game was not originally designed for the Kinect, I often needed to reorient myself: positioning my legs against the couch, facing what should be forward using proprioceptive awareness of my head's position and the approximate direction of true forward relative to the Kinect, and then standing in a “T” pose again for a few seconds before resuming regular gameplay. Although this eventually became second nature, it still broke my immersion in the game and therefore degraded the experience with the RLE. Some of this was due to the unique nature of the puzzles within Portal 2. But other parts of the problem stem from either my incomplete understanding of the complexities of the Oculus Rift and the Kinect, or a lack of first-party support for both devices from the developers.

For additional visual aids and example videos of my project, please check out the “Phase 0.x” sections (e.g., “Phase 0.1 - Testing the Kinect alone in Portal 2”, “Phase 0.2 - First Test that Combined Kinect and Oculus Rift”, etc.) of my Prezi: http://prezi.com/hsr4_kfunvgk/?utm_campaign=share&utm_medium=copy.



