Real Life Engine




Evaluation


Because of the unique nature of my SIP and the technology required to test it, evaluating my project was an interesting task. I was, of course, able to do so myself because I own the required hardware (the Oculus Rift and Kinect), but it took some creative thinking to enable the external evaluations. Thankfully, both of my intended evaluators had the opportunity to test the Oculus Rift by itself, and both either already had or were easily able to gain some experience with the Microsoft Kinect. (There are testable display models at thousands of retail outlets across the country, so it was simply a matter of visiting one that demos it.)

Furthermore, I posted YouTube links on my project's website (http://reallifeengine.azurewebsites.net/thesis-demo-student-innovation-project/) so my evaluators could watch a detailed analysis. These links lead to the main project demonstration video on Prezi. Additionally, for my technology feasibility evaluation, I sent links to the extended version in order to paint a more complete picture of the four tests I performed within Phase 1 of the RLE. I even included (rather than edited out) my troubleshooting efforts so that this evaluator would have a complete picture of the process. Finally, the latest draft of my Innovation Brief is uploaded and available at my SIP website in case either of them wants to dig deep into the raw specifics of the concept. With those pieces of information, they should be able to answer a few basic questions about the experience:

1. When you wear the Oculus Rift, how well does it visually immerse you in the game? Optically, are you able to suspend your mind's disbelief and mentally place yourself in the game? What enhances or detracts from your attempts to do so? Is there any stutter or jitter in the display that jolts you back to reality and prevents fuller immersion?

2. How well does it audibly immerse you in the game? Audibly, are you able to suspend your mind's disbelief and mentally place yourself in the game? What enhances or detracts from your attempts to do so? Is there any stutter or jitter in the sound that jolts you back to reality and prevents fuller immersion?

3. How well does it track your head movements? Does it feel realistic? If so or if not, how much so or how much not? How well does the head tracking synchronize with the video it displays? Do you experience a jarring lag, or even motion sickness? And if so how much?

4. How well does it track your body movements? Do you experience any lag or do your movements feel roughly in sync with the game?



Finally, each evaluator specializes in a separate industry. Getting additional insight from the viewpoint of a professional programmer (for the technical feasibility study) and from a successful real estate agent (for the marketing and commercial viability study) will further aid me in evaluating the success or failure of the SIP.
  1. Project Completion Assessment


After submitting my requests for external and faculty evaluations, I received some very valuable feedback. So before my own concluding thoughts, I'll go over the responses (edited for formatting and brevity) as follows:
External Eval #1 (James Baxter):

Ever since the movie Back To The Future 2, the dream of having "toys" where you do not use your hands has eluded us. Personally, I still want my hoverboard... Turning blue here.
Having given up my gaming life for family and work, I will focus my part more on future vision and direction.
[You said that] you want to build the holodeck. Most other Star Trek tech we have [already] seen come to life, [such as] hand held communicators - cell phones, [or] Lt. Uhura and her "Bluetooth" ear pieces. Needless to say Gene Roddenberry was ahead of his time, or maybe science fiction becomes science fact. [He designed] a road map of what we need our engineers to build…
The reality of the matter that I have found [is that] if you have insane creativity, you tend not to have the business sense to know what to do with it. Not personally being the most up to date on motion-sensitive tech, my perception is that the tech is not all there yet. Playing with the hands-free stuff, it is still clunky and slow - not allowing for free "normal" movement; it seems (and [I saw] in your videos) that you have to have a "stiff," focused movement.
I kind of view it like how early voice recognition software worked back around the 2000s, whereas today, it is rather seamless. [Or] IBM's Via Voice, [which] I bought for my iMac back in '99 for $200, and now [compare] it to Siri on my iPhone 12 years later… There's no comparison. Or think about when you call in [to an] automated service ([e.g.] credit card co, airlines, etc) – [they] are really seamless.
As with Via Voice, you had to have a clunky headset and way too much effort to make it work. (Honestly, I never used the stupid thing -- I could type faster.) But now the hardware is much more streamlined and works a million times better… I [use] voice to text all the time.
You will need to do the same with your holodeck. Nobody wants to hold anything. It will need to work seamlessly. (Think about how Siri works on the iPhone -- it just does.)
And really, in thinking about a holodeck, unless you are of the likes of Bill Gates or Larry Ellison, having a room dedicated to a holodeck is not going to be practical for the masses. Please note [that] I used the word “masses.” There will always be the fanatics that will jump on the tech bandwagon, like me and my Via Voice purchase. But as with the Segway, they too will pass if not made mainstream.
The Question is how do you make a holodeck mainstream and appealing to the masses? I will be the last to say it is not going to happen, [but] you have to find its place and make people believe they need it.
[Another thought:] As with most of the world tech of the last 100 years, the US military has been the birthing grounds for the majority of tech. [This] in turns allows/pays for the science to find the product's real life application, i.e. microwaves, cell phones, etc.
I see the proving grounds [being] with flight sims -- it is [practically made] for this. But can you take it a step further and use it to train infantry? [The question is,] could you get the tech small enough to give a real-life experience while walking in place?
Right now you have to work in the confines of the tech - TV monitors (which work great for flight sims) and head gear. But I think to make it work you need to lose the head gear. If you could put a controller in the player's hands that looked like an M16 and reacted the same {Ed note: We can. See http://www.virtuix.com/videos/}, I think you would be on to something. You would then have a military market and a gamer market for the first-person shooters.
For now [if you wanted to try to eliminate head gear] you have the confines of the TV screens, but how could we eliminate that? The Rockband franchise has proven this… Maybe an [alternate input device] and a 360 projector {Ed note: Such as Microsoft's new IllumiRoom, perhaps: http://research.microsoft.com/en-us/projects/illumiroom/} would be the immediate answer. Or what about mounting an Oculus into an army helmet? {Ed note: Project Holodeck at USC has essentially done it this way already...}
Cheers,

James Baxter

REALTOR

2011, 2012 & 2013 Five Star Agent

Richard Realty Group, INC

Carlsbad, CA

www.JamesBaxterHomes.com

http://www.youtube.com/jamesbaxterhomes

Find us on Facebook "New Encinitas"

"Average is what anybody can do. Excellence is what I strive for."

lic# 01815256
External Eval #2 (Paul Swanson):

Wow Nathaniel, your project goals are nothing short of spectacular and your results thus far are impressive. From watching your video presentation of both the Oculus Rift and Microsoft Kinect, I was astounded at how connected to the game you became. Your movements were tracked and caused actionable input for the game. Your video portion of the visual feedback from the Oculus Rift was easy to follow and I felt part of the gaming experience with you.
Being a business programmer, I haven't been exposed to this type of technology. Some of the holographic displays from TV seem like more than just a cool trick now. I can see this type of technology [being] used well beyond the gaming environment. Business presentations, face time with remote employees, and group meetings could all use these types of features to enhance the experience and blur the physical limitations of location.
I agree with your summation of the project status in that you've achieved the Phase 1 goals. The gaming experience from both the Oculus Rift and MS Kinect brought the immersion level to a much more intimate degree.
Can't wait to see what phase 2 holds!
Carry on -- you have much to be proud of, [in both] achievement and effort.
Sincerely,
Paul Swanson

Web Developer III

Inventory Locator Service

www.ILSMart.com

Additionally, I received the following comments from my faculty evaluations:


Faculty Evaluation #1 (Greg Miranda): “Creating a true virtual reality simulator is a difficult task, but you've made great progress on it. All the components are completed, functional, and well-integrated.”
Faculty Evaluation #2 (Todd Spencer): “Great work getting your very ambitious SIP project implemented to its current state!”

So how close did I come to reaching my goal: “Create a system that will allow players to travel into the infinite worlds of fictional universes as if those realms were no different than physical places they might virtually visit through telepresence technology”? After working out a plethora of 'bugs' (of sorts) and learning how to mesh myself with this new system, the experience really was unlike anything I've ever done before. The combination of the Oculus Rift with movement tracking by the Kinect made for a truly transporting experience. That said, it was hard to fully immerse myself because of the constant need to reorient in order to provide the Kinect with the input it was expecting.
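The reorientation problem comes down to keeping two reference frames aligned: the Rift reports head yaw relative to wherever tracking started, while the Kinect expects the player's body to face the sensor. One common mitigation is a "recenter" key that stores the current yaw as the new forward direction. The sketch below is purely illustrative (the class and its API are my own invention for this write-up, not part of the Rift SDK or the RLE code):

```python
class YawRecenter:
    """Keeps headset yaw aligned with the Kinect's fixed 'forward' frame.

    Hypothetical sketch: store a reference yaw when the player faces the
    sensor, then report all later yaw readings relative to that reference.
    Angles are in degrees.
    """

    def __init__(self):
        self.reference_yaw = 0.0

    def recenter(self, current_yaw):
        # Called when the player faces the Kinect and hits a reset key.
        self.reference_yaw = current_yaw

    def corrected_yaw(self, current_yaw):
        # Difference from the reference, wrapped into (-180, 180].
        return (current_yaw - self.reference_yaw + 180.0) % 360.0 - 180.0


tracker = YawRecenter()
tracker.recenter(350.0)              # player faces the sensor at yaw 350
print(tracker.corrected_yaw(10.0))   # 20.0 -- turned 20 degrees right
print(tracker.corrected_yaw(340.0))  # -10.0 -- turned 10 degrees left
```

The wrap-around step matters: without it, crossing the 0/360 boundary would produce a jarring 360-degree camera spin instead of a small correction.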

Also, a game like Portal really puts the Oculus to the test in terms of teaching one's mind which way is up, which way is down, which way is forward, and so on. This wasn't even an issue in the Tuscany Demo, so I surmise that it wouldn't be nearly as much of a problem in a game with more 'standard' physics, such as Skyrim. A game like Portal might still be a bit difficult to play on a virtual reality rig, but other games should play just fine.

Moreover, once I have my Virtuix Omni device, many of the problems of reorientation – both for proper Kinect input and for matching on-screen to real-life head orientation – will be mitigated or even eliminated. This might even (as it appears in the Virtuix videos linked above) eliminate the need for a controller entirely by enabling 360° movement.



Still, even with those problems mitigated, basic input through the Kinect is indeed a bit “stiff,” as James put it. This would have to be refined – with lower latency between input and response, and better accuracy – in order to become a more viable consumer product.
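For context on that "stiffness": FAAST works by mapping skeleton poses to emulated keyboard events, and a pose must cross a configured threshold before any key fires. The sketch below (the class name and thresholds are illustrative, not FAAST's actual configuration format) shows the basic mechanism, and why the hysteresis needed to suppress jitter at the threshold adds further delay on release:

```python
class LeanToKeyMapper:
    """FAAST-style pose-to-key mapping, as an illustrative sketch.

    A real binding would emit key-down/key-up events to the game; here we
    just track the pressed state. The player must lean past `press_at`
    degrees before anything happens at all (the perceived 'stiffness'),
    and must come back below `release_at` before the key releases
    (hysteresis, which prevents jitter right at the threshold).
    """

    def __init__(self, press_at=15.0, release_at=10.0):
        self.press_at = press_at      # degrees of lean to trigger the key
        self.release_at = release_at  # must drop below this to release
        self.pressed = False

    def update(self, lean_degrees):
        if not self.pressed and lean_degrees >= self.press_at:
            self.pressed = True       # emit key-down in a real binding
        elif self.pressed and lean_degrees < self.release_at:
            self.pressed = False      # emit key-up
        return self.pressed


mapper = LeanToKeyMapper()
states = [mapper.update(a) for a in (5, 12, 16, 12, 9)]
print(states)  # [False, False, True, True, False]
```

Note that at a lean of 12 degrees the key is pressed or not depending on history; this is exactly the trade-off between responsiveness and stability that makes threshold-based body input feel less fluid than a physical controller.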
That said, I'll continue with a brief Six Hats analysis:

  • White: The Oculus Rift and the Kinect each have been thoroughly tested on their own and have been vetted by consumers as able to provide solid gameplay experiences. The current iteration of the Oculus Rift (Dev Kit 1) can enable a virtual reality experience and the Kinect can enable players to use their bodies as input 'devices' when skillfully programmed to do so.

  • Red: The Rift provided an amazingly immersive experience and the Kinect made it that much more immersive. It really felt like I was there!

  • Black: The current Rift is a bit grainy – at 'only' 720p, its pixel size and grid pattern are not optimal for completely immersing me in the game. The Kinect (via FAAST) does not appear able to keep up with commercial production-quality games.

  • Yellow: Even with its imperfections and developer-quality (i.e., not consumer-quality) hardware/software, the RLE's Oculus Rift and Kinect components provide a more immersive experience than anything currently on the market.

  • Green: Adding the Virtuix Omni in the near future and the other phases of the RLE in the far future could bridge the current gaps and get that much closer to a full-fledged holodeck. Another creative approach could be to focus on what it is like to interact in the real world, compare that to how one is able to interact in a game world, identify the gaps between the two, and attempt to create solutions that bridge them.

  • Blue: What areas could I continue to focus on – more creative solutions with my green hat, more pros and cons with my yellow/black hats, more feelings with my red hat, or more facts with my white hat? I should definitely reassess this analysis periodically throughout the upcoming post-graduation stages of the RLE.


Therefore, I would conclude simply that while I've done very well with the project thus far, I have a long way to go. Both objectives – for players to have a sense of sight and sound as if they were seeing and hearing through their own eyes and ears, respectively – have been well met, and thus the initial goal was achieved. But again, there is still much work to do. This only marks the completion of Phase 1 of the RLE. Phase 2 and beyond will pose even greater challenges and will likely require even greater cumulative investments of time and money.
Given all of the above, I believe these are the steps that I need to complete next:

  1. Design precisely what the multitude of post-graduation phases of the RLE will be

  2. Produce iterative prototypes, modify and redesign the RLE as necessary, produce and test more facets of the projects, and continue this cycle through to a set point (TBD) in the RLE's development cycle wherein step 3 and/or 4 can begin.

  3. At a given point in the middle of (but not the end of!) development, test pre-existing games, using software wrappers to retrofit the partially completed RLE to previous audio, visual, and control packages, and create a demo for use in step 4.

  4. Market RLE first to developers (and early consumers?) via Kickstarter.

    1. Release beta SDK during this stage

  5. Test pre-existing games again, using software wrappers to retrofit the now-completed RLE to previous audio, visual, and control packages. Produce further demos for step 6.

  6. Market to consumers using demos of platform itself and gameplay on platform.

  7. Release RLE!

    1. Release updated SDK to developers

    2. Partner with developers to set release date deadline, then wait (while assisting as needed) for release line-up of games for RLE to be completed.

    3. At or shortly after deadline, offer RLE for sale to public with strong launch titles




