Real Life Engine




Project Description and Innovation Claim


The goal of this SIP is to create a system that allows players to travel visually and audibly into the infinite worlds of fictional universes as if they had traveled there in real life. Players should see the game world as if through their own eyes and hear it as if through their own ears. The built-in head-tracking technology in devices such as the Oculus Rift or the Vuzix iWear VR920 headset even allows players to look around and move the camera automatically, much as one can do with the Wii Remote or the Wii U GamePad (Nintendo, 2011). One of the end-game goals of the RLE (after several post-graduation phases) is to engage even more of the senses, including proprioception, and to engage them far more realistically during these other-worldly tours.
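To make the head-tracking idea concrete, the sketch below shows, in Python, how orientation data from a headset tracker could steer the in-game camera each frame. The read_head_orientation() helper is a hypothetical stand-in for whatever the headset SDK actually provides; it is an assumption for illustration, not the RLE's implementation.

    import math

    def read_head_orientation():
        # Hypothetical stand-in for a headset SDK call; returns (yaw, pitch, roll)
        # in radians.  A real build would query the Rift's tracker every frame.
        return (0.35, -0.10, 0.0)

    def forward_vector(yaw, pitch):
        # Convert yaw/pitch into a unit "look" direction for the in-game camera.
        cos_p = math.cos(pitch)
        return (math.sin(yaw) * cos_p,   # x: left/right
                math.sin(pitch),         # y: up/down
                math.cos(yaw) * cos_p)   # z: forward

    def update_camera(camera_position):
        # Aim the game camera wherever the player's head is pointing.
        yaw, pitch, _roll = read_head_orientation()
        fx, fy, fz = forward_vector(yaw, pitch)
        return (camera_position[0] + fx,   # the engine's look-at target
                camera_position[1] + fy,
                camera_position[2] + fz)

    print(update_camera((0.0, 1.7, 0.0)))

In practice an update like this would run once per rendered frame so that the view follows the player's head with as little latency as possible.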

The intended user population is everyone: not just gamers, but the entire world. Because of its start in gaming, the RLE will initially draw mostly the gaming crowd. However, once the additional applications in business and communications have been tangibly realized, the RLE could become as widespread as the mobile phone. I envision the RLE becoming a transformational success if I can complete the remaining post-graduation phases of development and produce the ultimate, unrivaled, ultra-realistic environmental immersion rig that I believe it can one day be.

[Post-graduation task: Add more details about new tasks that are supported as additional phases are completed. These will include the three facets mentioned earlier in this document: gaming (to a larger extent), the RLE as a tool for businesses (e.g., business-centric telepresence needs), and the RLE as a new form of communication.]
  1. Usage Scenario


This project will blur the line between fiction and reality through its ultra-realistic technology. Future versions of this technology will enable real-time telepresence interaction with the real world. Progressive iterations will explore the RLE's potential in the areas of communication and business while further developing even greater gaming applications. To do this, the software client will connect to a massive server-side database that polls multiple cameras, GPS devices, and other technology, allowing the player to move within a virtual copy of the real world that is updated in real time. Interaction with real-world people will be driven both by computer algorithms that synthesize responses using fuzzy-logic-enhanced finite-state machines and by augmented reality, and the system will allow multiple clients to interact with each other whether one, both, or neither of them is physically located at the real-world location. A still further version of the RLE strives to achieve technology such as holographically projecting a player into the real world (allowing users to interact in real time with much less augmentation of reality), the coveted "smell-o-vision," and even full-duplex touch feedback of real-world objects via ultra-high-fidelity force feedback and further augmented reality.
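As a rough illustration of the fuzzy-logic-enhanced finite-state machine idea mentioned above, the Python sketch below weights state transitions for a simulated character by a fuzzy "friendliness" value instead of hard true/false conditions. The state names, membership function, and thresholds are illustrative assumptions rather than part of the RLE design.

    # A toy character controller: a finite-state machine whose transitions are
    # weighted by fuzzy membership values instead of hard booleans.

    def friendliness(distance_m, smiled):
        # Fuzzy membership in [0, 1]: how "friendly" an approach looks.
        closeness = max(0.0, min(1.0, (3.0 - distance_m) / 3.0))  # 1.0 at 0 m, 0.0 beyond 3 m
        return min(1.0, 0.6 * closeness + (0.4 if smiled else 0.0))

    TRANSITIONS = {
        # current state -> list of (candidate next state, rule on the fuzzy value)
        "idle":     [("greeting", lambda f: f), ("idle", lambda f: 1.0 - f)],
        "greeting": [("chatting", lambda f: f), ("idle", lambda f: 1.0 - f)],
        "chatting": [("chatting", lambda f: f), ("farewell", lambda f: 1.0 - f)],
        "farewell": [("idle", lambda f: 1.0)],
    }

    def step(state, distance_m, smiled):
        # Pick the next state with the strongest fuzzy activation.
        f = friendliness(distance_m, smiled)
        return max(TRANSITIONS[state], key=lambda rule: rule[1](f))[0]

    state = "idle"
    for observation in [(2.5, False), (1.0, True), (0.8, True), (4.0, False)]:
        state = step(state, *observation)
        print(state)   # idle -> greeting -> chatting -> farewell

The appeal of the fuzzy weighting is that small changes in the input nudge the character's behavior gradually instead of snapping abruptly between states.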

However, in order to wisely divide and conquer this monumental undertaking into small and manageable portions, the general goal of this SIP is "simply" to create the foundations for a commercially viable immersion system. Players should be able to travel into the infinite worlds of fictional universes as if those realms were no different from the physical places they might visit in reality. Users can already take virtual tours through parts of the real world using current telepresence technology; the foundations of this SIP will simply begin to go one step further. (The rest of the RLE's lofty goals will be postponed until post-graduation phases.) Gamers must get the feeling that they are, for example, back in Rome during the Renaissance, in a wonderful world created by Disney's Imagineers, traversing the lands of Vana'diel, conquering the realms of Azeroth, or perhaps battling the Protoss, Terran, or Zerg on a distant planet. Furthermore, potential development goals may even include retroactive integration into existing games that can support it.

Future iterations will also include augmented reality technology such as that found in Sony's recently released EyePet[1] and inviZimals[2] games, but with greater utilization of this technology, so that developers can place their characters in a gamer's living room just as easily as in their own levels. This frontier could allow more than just virtual animals to join us in our gaming sessions, and this portal into our living rooms forms the basis of some of the more advanced features mentioned above.
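A minimal sketch of the "living room as a level" idea follows: a single depth-camera pixel is back-projected into 3D space so a virtual character can be anchored to a real surface. The pinhole-camera intrinsics and the chosen pixel are illustrative assumptions, not calibrated Kinect values.

    # Back-project a depth-camera pixel into 3D so a virtual character can be
    # anchored to a real surface (e.g., the living-room floor).

    FX, FY = 525.0, 525.0    # assumed focal lengths in pixels
    CX, CY = 319.5, 239.5    # assumed principal point for a 640x480 depth image

    def pixel_to_camera_space(u, v, depth_m):
        # Pinhole back-projection: depth-image pixel (u, v) -> camera-space metres.
        x = (u - CX) * depth_m / FX
        y = (v - CY) * depth_m / FY
        z = depth_m
        return (x, y, z)

    # Example: spawn a virtual pet where the depth camera sees the floor
    # about 2.1 metres away, near the bottom of the frame.
    anchor = pixel_to_camera_space(u=320, v=430, depth_m=2.1)
    print("spawn virtual character at", anchor)

A real pipeline would also need the camera's pose relative to the game world and occlusion handling, which are beyond this sketch.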

For examples of how Phase 1 of my project can be used, please see the "Phase 0.x" sections (e.g., "Phase 0.1 - Testing the Kinect alone in Portal 2", "Phase 0.2 - First Test that Combined Kinect and Oculus Rift", etc.) of my Prezi: http://prezi.com/hsr4_kfunvgk/?utm_campaign=share&utm_medium=copy.


  2. Evaluation Criteria


  1. When you wear the Oculus Rift, how well does it visually immerse you in the game? Were you able to suspend disbelief and mentally place yourself in the game? What enhanced or detracted from your attempt to do so? Was there any stutter or jitter in the display that jolted you back to reality and prevented fuller immersion?



  2. How well does it audibly immerse you in the game? Were you able to suspend disbelief and mentally place yourself in the game through the sound alone? What enhanced or detracted from your attempt to do so? Was there any stutter or jitter in the audio that jolted you back to reality and prevented fuller immersion?



  3. How well does it track your head movements? How realistic did the tracking feel, and to what degree? How well did the head tracking synchronize with the video it displayed? Did you experience a jarring lag, or even motion sickness? If so, how much?



  4. How well does it track your body movements? Did you experience any lag, or did your movements feel roughly in sync with the game?

All criteria will be tested by one of two methods:



  1. Equipping the elements of the RLE prototype, activating the applicable background software, and traversing test chambers in Valve's Portal 2.

  2. In the event that method 1 is unavailable, evaluators will instead draw on their own experiences with the Oculus Rift, the Kinect, or games in general, and analyze the tests I have posted to YouTube.