Tagged: Digital Signage

Oculus Rift — The Nausea Machine

Hey Guys!

Our Oculus Rift SDK2 kits are in! I spent a great deal of time with one of them over the weekend, so I'd like to give a quick overview of what I've discovered:

1) It's pretty difficult to set up. It took about five hours of fiddling with settings and digging through forums to get it working properly. To be fair, I was using my laptop, and the SDK2 is not particularly fond of laptops with dual GPUs like mine: rather than use the Nvidia GPU, it defaults to the integrated Intel GPU. That's a problem on Oculus's end. I did find a workaround to get it going, but the downside is that I can't mirror the goggles' view to my desktop, so until the dual-GPU issue is fixed you can't watch what someone else is playing. Beyond that, learning the settings for adjusting pupil distance and so on is not particularly intuitive either, which is a real problem given the nausea that sets in if the settings aren't tweaked properly.

2) Nausea, nausea, and more nausea. No matter how I tweak the goggles' settings, some demos and games consistently make me ill after about five minutes. Half-Life 2 is one of them: it's stunning to explore, but I get queasy quickly, have to put the goggles down, and continue to feel ill for 15 to 20 minutes afterwards. It's hard to pass up, though. Nothing is quite as unnerving as walking up to a character in the Half-Life 2 engine and looking at them in true 3D. They look like a moving wax statue, convincingly real yet somehow missing a soul. Maybe it's just me being metaphysical and weird, but it kind of creeps me out; they look real, and my brain gets confused. It's a pretty awesome experience.

3) The head tracking is pretty awesome. The demo that comes with the kit lets you rotate your head around a branch to see its underside, which is insane. You really do feel immersed, and you get frustrated when you can't grab things. Believe me, I've unconsciously reached out for something and nearly hit my friend sitting next to me in the face.

4) It's very lightweight compared to the older model. I think it weighs around 550 g, which is pretty nice.

5) The wires are crazy, although better than the first kit's. They require a lot of careful routing, but no matter how carefully you run them, they inevitably end up tangled in a complex assortment of attachments. That makes moving the kit around a real hassle, and the wires come out of every move a little worse off.

Overall, I'm pretty excited about using it with the Decatur St. environment and the digital signage around campus. I don't think we'll get to it this semester, unfortunately, although who knows. Soon enough, once we get the kinks worked out (like figuring out a way to measure people and create calibration profiles before they use it, to reduce possible sickness), we'll set up kiosks around campus where students can explore our SIF environments in awesome detail!
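For the profile idea, I'm imagining something as simple as measuring each student's interpupillary distance once and storing it with their name, so the goggles can be re-tuned in seconds instead of minutes. Here's a rough Python sketch of what I mean; the field names and the JSON file are placeholders of my own, not anything from the Oculus configuration tools.

```python
import json
from pathlib import Path

# Hypothetical per-user calibration store for the kiosk idea: record each
# student's interpupillary distance (IPD) and height once, then reload the
# values before a session instead of fiddling with settings every time.
PROFILE_FILE = Path("rift_profiles.json")

def save_profile(name, ipd_mm, player_height_m):
    """Add or update one user's calibration values."""
    profiles = json.loads(PROFILE_FILE.read_text()) if PROFILE_FILE.exists() else {}
    profiles[name] = {"ipd_mm": ipd_mm, "player_height_m": player_height_m}
    PROFILE_FILE.write_text(json.dumps(profiles, indent=2))

def load_profile(name):
    """Return a stored profile, or None if the user hasn't been measured yet."""
    if not PROFILE_FILE.exists():
        return None
    return json.loads(PROFILE_FILE.read_text()).get(name)

if __name__ == "__main__":
    save_profile("robert", ipd_mm=63.5, player_height_m=1.78)
    print(load_profile("robert"))
```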

Cheers,
Robert

Fun With Digital Signage

Hey all!

Just wanted to give a quick update on what we're doing with digital signage. In addition to trying out the iPad portal idea, we're looking into building LeapMotion-controlled screens where students can interact with 3D-scanned objects from the MARTA collection housed in the Anthropology Department.

It's a little rough at the moment, but through Unity I've built a test run that's working pretty well. Two hands enter the screen and manipulate the object with realistic physics. We're using a 1920s whiskey jug at the moment; luckily I can't break it in the virtual world, because I've dropped it multiple times. Using the LeapMotion is a bit of a skill in itself, albeit a fun one to learn.
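The actual test runs as a Unity script, but the core grab-and-release logic boils down to something like the standalone Python sketch below. The Hand and Jug classes and the threshold numbers are stand-ins of my own for the LeapMotion frame data and the Unity rigidbody, so treat this as the idea rather than the real code.

```python
from dataclasses import dataclass

# A standalone sketch of the grab/release idea, not the actual Unity script.
GRAB_THRESHOLD = 0.8     # close the hand at least this far to pick the jug up
RELEASE_THRESHOLD = 0.4  # open it back past this point to drop it
GRAB_RADIUS = 0.15       # metres; the hand must be this close to grab

@dataclass
class Hand:
    position: tuple        # (x, y, z) in metres
    grab_strength: float   # 0.0 = open palm, 1.0 = fist, as the tracker reports it

@dataclass
class Jug:
    position: tuple
    held_by: Hand = None

def dist(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def update(jug, hands):
    """Run once per tracking frame."""
    if jug.held_by is not None:
        if jug.held_by.grab_strength < RELEASE_THRESHOLD:
            jug.held_by = None                    # release: physics takes over
        else:
            jug.position = jug.held_by.position   # follow the closed hand
        return
    for hand in hands:
        if dist(hand.position, jug.position) < GRAB_RADIUS and \
           hand.grab_strength > GRAB_THRESHOLD:
            jug.held_by = hand                    # grab
            return

if __name__ == "__main__":
    jug = Jug(position=(0.0, 0.2, 0.0))
    hand = Hand(position=(0.05, 0.2, 0.0), grab_strength=0.9)
    update(jug, [hand])
    print("held:", jug.held_by is not None)
```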

Later this week I'll share more detail and some screenshots of a more finalized version!

-Robert

The Weeks Just Keep on Getting Busier!

Hey guys!

This was another pretty productive week! Andrew and I ran two workshops on how to use Agisoft PhotoScan. The first workshop had no turnout, unfortunately, but our second one this past Friday drew a few very interested and excited people. I explained how the software works and walked through the workflow of building a 3D model from a set of photographs. It's a pretty awesome software package, but it takes some finesse in understanding the settings to get good results. Those settings are key: the same set of photographs can align beautifully or fail to align at all depending on which settings you use. I beseech you to come out to the next set of workshops we hold this semester! We now have the software installed on all our computers, which means we can do a lot of on-the-fly modeling in the workshops, with different groups working at different workstations!
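For anyone curious, PhotoScan also exposes the same workflow through its built-in Python console, so the whole run can be scripted. The sketch below walks the align, dense cloud, mesh, and texture steps we demoed in the GUI; the paths are placeholders and the keyword arguments differ a bit between PhotoScan versions, so take it as illustrative rather than copy-paste.

```python
# Runs inside PhotoScan's built-in Python console (the PhotoScan module is
# only available there). Roughly the same steps we click through in the GUI;
# exact keyword arguments vary between versions.
import glob
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.addChunk()

# 1. Add the photo set (placeholder path)
chunk.addPhotos(glob.glob("/path/to/photos/*.JPG"))

# 2. Align photos -- these accuracy/preselection settings are the ones that
#    make or break the alignment for a given photo set
chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy,
                  preselection=PhotoScan.GenericPreselection)
chunk.alignCameras()

# 3. Build the dense point cloud, then the mesh
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)
chunk.buildModel(surface=PhotoScan.Arbitrary,
                 source=PhotoScan.DenseCloudData)

# 4. Texture the model and save the project
chunk.buildUV(mapping=PhotoScan.GenericMapping)
chunk.buildTexture(blending=PhotoScan.MosaicBlending, size=4096)
doc.save("/path/to/project.psz")
```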

Next week my goal is to finish figuring out how to create a 3D cube from a list of points rather than just a 2D plane. I've tried and failed a few times already, so I have to go back and read up on the workflow surrounding triangle stripping. I'll be excited to share what I figure out next week! This will help get our buildings accurate in the 3D reconstruction of Decatur St., because I can start inputting real measurements for buildings that don't follow a strict right-angle cubic footprint, which is essentially all of the buildings and sidewalks. We also have another meeting scheduled with Michael Page from Emory to start working out how GSU and Emory can team up to get this project moving faster.
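To make the geometry problem concrete, the plan is to take a building footprint as an ordered list of (x, y) corner measurements and extrude it up by the building height, stitching the walls together out of triangles. Below is a rough Python sketch of that extrusion; it's my own simplification (it emits a plain triangle list rather than strips, and the roof/floor caps only work for convex outlines), not the code I'm actually wrestling with.

```python
def extrude_footprint(footprint, height):
    """Turn an ordered 2D footprint into a simple prism mesh.

    footprint: list of (x, y) corner points, in order around the outline
    height:    building height
    Returns (vertices, triangles), where each triangle indexes into vertices.
    The roof/floor caps use a simple triangle fan, which is only correct for
    convex footprints; concave outlines need a real polygon triangulation.
    """
    n = len(footprint)
    vertices = [(x, y, 0.0) for x, y in footprint]       # ground ring: 0 .. n-1
    vertices += [(x, y, height) for x, y in footprint]   # roof ring:   n .. 2n-1

    triangles = []
    # Walls: each footprint edge becomes a quad, split into two triangles
    for i in range(n):
        j = (i + 1) % n
        a, b = i, j            # bottom edge
        c, d = i + n, j + n    # top edge
        triangles.append((a, b, d))
        triangles.append((a, d, c))

    # Roof and floor caps as triangle fans (convex footprints only)
    for i in range(1, n - 1):
        triangles.append((0, i, i + 1))           # floor
        triangles.append((n, n + i + 1, n + i))   # roof, reversed winding

    return vertices, triangles

if __name__ == "__main__":
    # A skewed quadrilateral footprint in metres, e.g. a building corner lot
    footprint = [(0.0, 0.0), (12.0, 0.5), (11.2, 9.8), (-0.4, 9.0)]
    verts, tris = extrude_footprint(footprint, height=6.5)
    print(len(verts), "vertices,", len(tris), "triangles")
```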

The last thing I want to mention is digital signage and some cool ideas we have for it on our campus. We're in the process of linking two iPads together through a live video stream, with the goal of creating a 'portal' around campus. One screen might be set up in the student center while another sits in the plaza, letting students see one another and interact across campus through a novel little portal-like window. If the venture goes well, we may add more, so be on the lookout!
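To give a sense of how simple the plumbing behind a one-way stream can be, here's a desktop-only Python sketch: webcam frames encoded as JPEGs and pushed over a TCP socket to a window on another machine. The actual portal will run on iPads, not a script like this, so treat it purely as an illustration of the send-frame/show-frame loop; the port number and script name are placeholders.

```python
# Desktop stand-in for one direction of the 'portal' idea, not the iPad setup.
# Requires: pip install opencv-python
import socket
import struct
import sys

import cv2
import numpy as np

PORT = 9999  # placeholder port

def recv_exact(conn, size):
    """Read exactly `size` bytes from the connection."""
    buf = b""
    while len(buf) < size:
        chunk = conn.recv(size - len(buf))
        if not chunk:
            raise ConnectionError("stream ended")
        buf += chunk
    return buf

def send_frames(receiver_host):
    """Capture the local webcam and stream length-prefixed JPEG frames."""
    cap = cv2.VideoCapture(0)
    with socket.create_connection((receiver_host, PORT)) as sock:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 70])
            if not ok:
                continue
            data = jpg.tobytes()
            sock.sendall(struct.pack(">I", len(data)) + data)

def show_frames():
    """Accept one sender and display whatever it streams, until Esc is hit."""
    with socket.create_server(("", PORT)) as server:
        conn, _ = server.accept()
        with conn:
            while True:
                size = struct.unpack(">I", recv_exact(conn, 4))[0]
                frame = cv2.imdecode(np.frombuffer(recv_exact(conn, size), np.uint8),
                                     cv2.IMREAD_COLOR)
                cv2.imshow("portal", frame)
                if cv2.waitKey(1) == 27:
                    break

if __name__ == "__main__":
    # `python portal.py show` on one machine, `python portal.py send <host>` on the other
    if sys.argv[1] == "show":
        show_frames()
    else:
        send_frames(sys.argv[2])
```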

Cheers,

Robert