RLE, VER. 1.0 – TURNING FICTION INTO “REALITY”



Real Life Engine

A device to intentionally blur the dividing line between fiction and reality via ultra-realistic telepresence technology

Nathaniel Swanson, Reality Engineer

University of Advancing Technology

Circa March 2008 – October 2013




  1. Abstract


The end goal of the “Real Life Engine©” (aka, the “RLE©”) is to intentionally blur the dividing line between fiction and reality via ultra-realistic telepresence technology. Initially, the primary objective of this system will be to achieve a new level of immersive interaction with games. However, post-graduation iterations will explore business and communication avenues.

The “Phase 1” (undergraduate level) goal of this SIP is to design and implement the audio/visual immersion headset portion of the RLE, including some form of motion tracking. A specialized software package should also be designed and utilized to create the beginnings of the advanced telepresence technology. Finally, this phase should produce an alpha-level release candidate of the combined hardware/software setup and should give players a small taste of what it will be like to travel into the infinite worlds of fictional universes as if those realms were no different from the physical places they might visit.
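
To make the motion-tracking portion of Phase 1 a bit more concrete, below is a minimal sketch (in Python) of how head-orientation readings could drive a virtual camera. The HeadTracker class and its read_orientation() method are hypothetical placeholders for whatever API the chosen headset or tracker actually exposes; this illustrates the general polling-and-smoothing loop, not the actual RLE implementation.

```python
import math
import time

class HeadTracker:
    """Placeholder for the real tracker driver (hypothetical API)."""
    def read_orientation(self):
        # A real implementation would poll the device here.
        # Returns (yaw, pitch, roll) in degrees.
        return (0.0, 0.0, 0.0)

class VirtualCamera:
    def __init__(self):
        self.yaw = 0.0
        self.pitch = 0.0

    def look_vector(self):
        """Convert yaw/pitch into a unit forward vector for the renderer."""
        yaw_r = math.radians(self.yaw)
        pitch_r = math.radians(self.pitch)
        return (math.cos(pitch_r) * math.sin(yaw_r),
                math.sin(pitch_r),
                math.cos(pitch_r) * math.cos(yaw_r))

def tracking_loop(tracker, camera, frames=600, smoothing=0.2):
    """Poll the tracker and ease the camera toward the measured orientation."""
    for _ in range(frames):
        yaw, pitch, _roll = tracker.read_orientation()
        # Low-pass filter so sensor jitter does not shake the player's view.
        camera.yaw += smoothing * (yaw - camera.yaw)
        camera.pitch += smoothing * (pitch - camera.pitch)
        # The renderer would consume camera.look_vector() each frame.
        time.sleep(1 / 60)  # ~60 Hz update rate

if __name__ == "__main__":
    tracking_loop(HeadTracker(), VirtualCamera(), frames=60)
```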

The instructions for this alpha release include what hardware to purchase, how to combine it, and how to demo it in a specialized environment designed to showcase the finer points of the system. Phase 1 of the project was completed by utilizing the test platform created during the design phases and successfully navigating through at least three environments. That platform is the audio/visual immersion headset portion of the RLE, including motion tracking. Complete details can be found in the SIP section of the portfolio website.
Keywords

immersion, realism, simulation, controller, virtual reality, augmented reality, video game, Oculus Rift, Kinect, Omni, Vuzix, Portal 2


  2. Background Information and Prior Art


Sub-Section 1: Intro

Circa June 2011, technology required players to look into a video monitor and imagine that they were in the game. It takes a strong effort to ignore the distractions of reality and truly immerse oneself in the game. Devices like the Microsoft Kinect have done wonders for releasing gamers’ hands from the immersion-diluting shackles of a physical controller, but games still haven’t reached the pinnacle of technology wherein players no longer have to “pretend” they are in the game by use of “childish” make-believe skills. The Oculus Rift, a virtual reality head-mounted display that has since been released, has brought immersion levels exponentially closer to that goal of reality-blurring immersion, but there is still a long way to go.



The project is a set of stepping stones toward answering the questions raised by that imperfect immersion. The current generation of games, for example, still requires players to hold a controller and/or look at a screen, thus reinforcing the idea that the game is just a story observed from afar, not an immersive experience that thrusts the user into the heart of each situation. Most of the research was done via internet searches, while other information came simply from my own personal thoughts as I dreamt up the ideas. Still other experiences were gained at GDC 2011.

An end-game goal for the RLE (much later, in the post-graduation phases) is to have one version of the RLE produced that uses entirely commercial off-the-shelf products if possible, and another version that uses a single proprietary device that combines all of the functionality into a cohesive and practical product.

One of my grand over-arching concepts is that around 90% of the technology to do this is already there -- someone or some group just needs to have the skillset / wisdom / forward-thinking to put it all together and bring the world the next giant leap into immersion. I certainly don't have all of the answers, but I really enjoy this field of study and I'm hoping that I can at least be on the team that helps make it (the next great technological leap) happen.
Sub-Section 2: Technology That Will Make the RLE Possible and Potential Competition

Nortel’s new web.alive (2010) technology looks like it could start in the lead (prior to RLE) as the business collaboration tool of choice because of its ability to let users “see where people are looking,” “make eye contact,” and “gain a sense of presence.” This ability to immerse people in their business environment looks promising and should be researched for the entertainment scope in order to beat them at their own game.

Biggs’s 2010 article on Crunchgear.com demonstrated one of the most amazing physics engines I’ve ever seen. The technology demonstrated at that site truly does take one’s breath away. It “essentially simulates real-life physics in a completely realistic way,” and such realism will be absolutely mandatory in the ambitious RLE project.

Bonsor’s 2010 article from HowStuffWorks.com gives lots of details about precisely how augmented reality works. It details how still images and live video captures of real-life scenes can be “augmented” (hence the name) with advanced image processing, manipulation, and rendering techniques in order to inject virtual objects into the real world. This article will be especially useful not for its great augmented reality info (which will be used in later post-graduation iterations of the RLE) but for the insight it provides into which areas I’ll need to focus on for the virtual reality I will be providing via the Vuzix iWear 920VR eyewear.
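
As a rough, hedged illustration of the capture-process-render loop Bonsor describes, the short Python/OpenCV sketch below grabs live video and draws a stand-in “virtual object” at a fixed position on each frame. A real AR system would instead track markers or image features to decide where that object belongs; the fixed coordinates here are purely for demonstration.

```python
import cv2

def run_overlay_demo(camera_index=0):
    """Capture live video and 'inject' a simple virtual object into each frame."""
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Render step: overlay a labeled box anchored at a fixed spot.
        cv2.rectangle(frame, (100, 100), (260, 200), (0, 255, 0), 2)
        cv2.putText(frame, "virtual object", (105, 95),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        cv2.imshow("AR overlay sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run_overlay_demo()
```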

Another great device is the ePI Lab (2004). The ePI Lab, or EyeTap Personal Imaging Lab, is a device that can enable “mediated reality” to be streamed directly into a user’s vision. Although AR will not be included in the SIP iteration of my RLE, some of the technology that makes it possible (such as head-tracking) can be used to create enhanced virtual reality viewing of games.
Sub-Section 3: Movies and Games That Inspired the RLE

The RLE was inspired in large part by the classics. Understanding games like Atari’s Pong (1972) will provide a foundational basis for one area where gaming began and how the very concept of “games” made its way into society. By emulating a modernized version of Atari’s marketing techniques, the RLE may have a better chance at being readily accepted as a new entertainment medium.

James Cameron’s Avatar (2009) showcased amazing (yet still fictional) technology used by the humans as they control their Avatar. This is not unlike what I am attempting to do with the RLE. The difference is that I do not intend to have the RLE control a physical body but rather a virtual one that is (initially) within a game.

Linden Labs’ Second Life (2010) game has done extremely well at living up to its name. I could talk for several pages about all that the RLE stands to learn from Second Life… Suffice it to say, studying how Second Life draws in its community and created an entirely new world of sorts will greatly benefit the RLE and help it become the ultimate system.

Director J. Mostow’s 2009 movie Surrogates presents another interesting possibility of what the RLE technology could become in the future. However, as with Avatar, the technology would be used to connect to a virtual presence, not (in this case) a physical robot.

Sony’s inviZimals™ game for the PSP (2010) is a truly revolutionary game (or certainly one of the first of its kind) that uses augmented reality features to allow in-game characters to interact with reality. In order for the RLE to reach its full potential, I will need to understand and implement this technology in the RLE system. Again, while the SIP will not contain AR tech, it will use a lot of the same camera-tracking technology found in the game.

Even Square-Enix’s 2003 (US release) game Final Fantasy 11 still has much to teach me long after I have left it behind. My many years of experience playing FF11 provided me with lots of insight into one of the most social gaming genres, the MMORPG. This information may help refine the structure and organization of the RLE so that it is more familiar and user-friendly. Their latest MMO, Final Fantasy 14: A Realm Reborn (2013), is on track to perform even better, and with its newer, more capable graphics engine, the immersion strength is even stronger. It is one of the most stunningly realistic games to date. Further play testing (for research, of course) should continue to provide even more valuable insight.

By the same token, what would a good technology be without a reference back to a classic such as Robert Zemeckis’ 1989 hit Back to the Future? Part 2 of the trilogy contained a great and almost prophetic line. In order to play most games today, the user still has to use a physical controller. In the immortal words of the unnamed child at the diner in Back to the Future II, “You mean you have to use your hands? That’s like a baby’s toy!” Baby’s toy indeed – there is a small but growing sub-genre of motion-controlled games, but that medium is still in its infancy. The RLE endeavors to enable this hands-free future, perhaps even by the year 2015, the “future” time period in BTTF. (Maybe someone else will invent flying cars too?)


Sub-Section 4: Particularly Useful Repositories of Data

Shepherd’s 2010 Second Life Grid Survey provided the following data: Second Life had (in 2010) 510,272 acres (2,065 km²) of land to explore. Imagine being able to explore this with even more realism provided by the RLE. Or, imagine having the ability to explore every other fictional world that ever was or ever will be created! The RLE can make this possible.
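
As a quick sanity check on the quoted figure (assuming the standard conversion of 1 acre ≈ 0.0040469 km²):

```latex
510{,}272 \text{ acres} \times 0.0040469 \ \tfrac{\text{km}^2}{\text{acre}} \approx 2{,}065 \ \text{km}^2
```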

Next up, Rudi Volti’s book Society & Technological Change, 5th edition (2006) had several good things to say. He said that “electronic [entertainment] has advanced in conjunction with other historical changes, making it very difficult to come to an exact reckoning of [its] unique contributions to culture and society” (pg. 217). The book provides insight into the social and philosophical questions raised by the very creation of the RLE.

Another great set of data was Nick Yee’s 2005-2009 undertaking, The Daedalus Project. It is a “long-running survey study of MMO players” (approximately “10 years” worth of data) that contains an incredibly vast amount of information that should help me understand exactly what draws players into the MMO genre and use that information so that RLE-based games can conquer their competitors.



[For additional information, please check out the “Other Innovators” section of my Prezi: http://prezi.com/hsr4_kfunvgk/?utm_campaign=share&utm_medium=copy.]