Sunday, February 26, 2012

Back On Track

After all of the breakdowns in the lab over the past few weeks, I have finally gotten back on track. The last equipment issue we had, with the computer we use to stream Autodesk MotionBuilder, turned out to be a simple faulty wire, so once it was replaced I was ready to debug and get my experiment ready to go.
The other undergraduate in the lab, Stephen Bailey, helped me out by serving as a subject, and I got my throwing experiment ready to the point where I can show my adviser, Professor Bodenheimer, tomorrow and hopefully start running subjects later in the week.

I will also work this week to get the experiment ready for the object interaction task. I think we will be measuring real-world and virtual-world performance on a sorting task, with task performance measured as the time it takes to sort the objects (most likely cups with numbers on them). There are, however, a number of other measures that we will use to evaluate user performance. For example, a human hand's acceleration profile while reaching for and grasping an object is a well-documented behavior, and it would be interesting to compare it between the virtual and real worlds; a rough sketch of how I plan to compute those profiles is below. I expect the acceleration profiles to differ; what will be interesting is how they differ and whether the difference is consistent across subjects. I will also administer a questionnaire to gather some information that I cannot quantify: I want to measure the participants' feeling of presence in the environment and get their feedback about what they liked and didn't like about the interface. It will also be interesting to see whether the type of gesture (a fist or a pointed index finger) affects their performance.
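The acceleration-profile computation itself should be straightforward: differentiate the tracked hand positions twice. Here is a minimal sketch, assuming we get hand positions as an (N, 3) array sampled at a fixed rate; the function name and the 100 Hz rate are placeholders, not our actual capture settings.

```python
# Estimate a hand's acceleration profile from tracked positions.
# Assumes positions is an (N, 3) array sampled at a fixed rate.
import numpy as np

def acceleration_profile(positions, rate_hz=100.0):
    """Return per-sample acceleration magnitudes via finite differences."""
    dt = 1.0 / rate_hz
    velocity = np.gradient(positions, dt, axis=0)      # (N, 3), m/s
    acceleration = np.gradient(velocity, dt, axis=0)   # (N, 3), m/s^2
    return np.linalg.norm(acceleration, axis=1)        # (N,), |a| per sample

# Comparing a real-world reach against a virtual one:
# real = acceleration_profile(real_reach_positions)
# virtual = acceleration_profile(virtual_reach_positions)
```

Running the same function over every reach in both conditions means any real-versus-virtual comparison goes through identical processing.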

I hope to get started running subjects this coming week so that I can focus on analyzing and writing up my results after spring break. Wish me luck!

Monday, February 20, 2012

Back online!...almost

Good news! The VR lab is back up and running, so I have been able to make some really good progress on my projects recently.

Last week, after the lab was ready to get running again, I ran my object interaction script for the first time and was pleasantly surprised that it worked right away! There were a few kinks to work out, but after fixing those I was able to reach out, 'grab' virtual balls, and move them to other positions; a sketch of the grab logic is below. I was very happy with the results and had a lot of fun testing out the interface myself. After moving the objects around for a few minutes, I lifted the HMD off of my head to see the balls in the real world--it was at that moment that I realized how immersed I had been in the environment. I have been around VR for a few years now and have always felt that I had a clear distinction in my mind between the real and virtual worlds; this is the first time that line has been blurred. Later that day, when I tried the interface again, the same thing happened. I am very excited about what this could mean for the effectiveness of VEs in the future.
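For anyone curious how the grabbing works, it is essentially proximity plus gesture: when the tracked hand closes within a small radius of a ball, the ball attaches to the hand until the gesture releases. Here is a minimal sketch of that logic in plain Python; the names (ball.position, gesture_closed) and the grab radius are illustrative stand-ins, not our actual Vizard script.

```python
import math

GRAB_RADIUS = 0.08  # meters; assumed threshold, tuned by feel in practice

def update_grab(hand_position, gesture_closed, balls, held):
    """Run once per frame; returns the ball currently held (or None)."""
    if held is not None:
        if gesture_closed:
            held.position = hand_position   # held ball follows the hand
        else:
            held = None                     # gesture opened: release it
    elif gesture_closed and balls:
        nearest = min(balls, key=lambda b: math.dist(b.position, hand_position))
        if math.dist(nearest.position, hand_position) < GRAB_RADIUS:
            held = nearest                  # close enough: pick it up
    return held
```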

The other project I am working on is a third experiment to add to the two throwing experiments that I presented at APGV last August. This third experiment will be put together into a journal article that we will submit to Transactions on Applied Perception (TAP) for publication. We are hoping to gain a better perspective on the role that motor and visual feedback play in our perception of our own actions. This experiment may have to be halted, however, as another one of our computers has stopped working, and we will have to install Autodesk MotionBuilder on another machine if I want to move forward with testing, piloting, and then running subjects.

Finally, I finished putting together one journal article and am waiting for two of the co-authors in Germany to review it and get back to me with suggestions. We will submit it in two weeks and hopefully hear back about a month after submission.

Tuesday, February 7, 2012

Houston, we have a problem

The functionality of the LIVE lab depends on a complex system of software and hardware, all interfacing over the Vanderbilt network. As I mentioned in my previous post, when I went into the lab for the first time this semester I realized that there were a few bugs that would have to be fixed before I could get a good start on my research. We solved the Cyberglove issue by calling the Vizard help desk: it turns out we had to specify in our code the correct port on which the Cyberglove was streaming data. It was a simple fix, needed only because of an updated Vizard plug-in for the glove; the general idea is sketched below.
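The actual change was just a port setting in the Vizard glove plug-in, but the general shape of the fix looks something like this pyserial sketch. The port name and baud rate here are assumptions for illustration, not our lab's real values.

```python
import serial  # pyserial

GLOVE_PORT = "COM3"   # the port the glove actually streams on (assumed)
GLOVE_BAUD = 115200   # a typical serial rate; check the device settings

def open_glove(port=GLOVE_PORT, baud=GLOVE_BAUD):
    """Open the serial connection to the data glove."""
    return serial.Serial(port, baudrate=baud, timeout=1.0)

# glove = open_glove()
# frame = glove.readline()  # one packet of raw sensor data
# glove.close()
```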

This would have been the end of my original list of problems, but one of our machines, which handles all of the motion tracking in the lab, got a virus. The other undergraduate in the lab discovered it and ran a system restore to get rid of it, but when the restore finished we discovered that the computer's networking had been corrupted and no longer worked. In our lab, that means any data from our Vicon motion tracking system could not be sent to the computer running Vizard, the machine on which we create and run our virtual environments. As of now, the head of our IT department has addressed the issue and will wipe the machine, re-install the lost programs, and get the lab up and running again. Unfortunately, this will not be finished for a few more days, so in the meantime I will concentrate on writing a journal article and getting it submitted as soon as possible.