Sunday, April 8, 2012

Analysis

This week was devoted to analysis of the data that I collected last week. First, we took simple means and standard deviations of the subjects' completion times. The hard part has been analyzing the hand motion data. We want to look at a user's acceleration profile while grabbing both real and virtual objects, and to do this we need to look at the hand's position data and infer when a subject grabbed an object. I did this yesterday by looking at the x, y, and z coordinates separately; I categorized a grasp as a fairly steady position in the z direction, a downward movement in the y direction, and a move forward and then backward in the x direction. This was all done by hand (no pun intended!), and 6 grasps were separated from the data for each participant, 3 in the virtual environment and 3 in the real world.
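If I ever want to automate that same heuristic instead of eyeballing the plots, a first pass could look something like the sketch below. It assumes the hand positions are numpy arrays sampled at a fixed rate; the window size and thresholds are hypothetical placeholders, not values from the actual analysis.

    import numpy as np

    def find_grasp_candidates(x, y, z, dt=1 / 60.0, window=30,
                              z_still=0.02, y_down=-0.05):
        """Flag windows that look like a grasp: steady z, net downward y,
        and a forward-then-backward excursion in x. Candidates may overlap
        and would still need a manual sanity check."""
        candidates = []
        for start in range(len(x) - window):
            end = start + window
            steady_z = np.ptp(z[start:end]) < z_still        # little z movement
            moving_down = (y[end - 1] - y[start]) < y_down   # hand drops in y
            peak = start + np.argmax(x[start:end])           # forward-then-back x
            out_and_back = (start < peak < end - 1 and
                            x[peak] > x[start] and x[peak] > x[end - 1])
            if steady_z and moving_down and out_and_back:
                candidates.append((start * dt, end * dt))    # times in seconds
        return candidates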

The next step will be to take this data and plot the acceleration profiles of each participant. The acceleration profile of a human grabbing an object in the real world is well documented and resembles a bell curve: the hand speeds up at the beginning and then slows down as it approaches the object. It will be interesting to see how closely the acceleration profiles from the virtual world match those in the real world. If they are similar, we can conclude that the interface is perhaps a good first step towards effective virtual object manipulation. We want virtual world performance to mirror real world behavior as closely as possible, and making sure that a virtual environment does not alter our natural movements is important to achieving this goal. If the acceleration profiles differ greatly, then we can begin to look at why and try to fix the problem. For example, enabling a physics engine in our Python-based environment would ensure that users can't simply stick a hand through an object and must be more accurate when approaching the object's location.
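As a sketch of the plotting step, the acceleration can be recovered from the cropped position data by differentiating twice; the sampling rate and variable names below are illustrative assumptions, not the lab's actual script.

    import numpy as np
    import matplotlib.pyplot as plt

    def acceleration_magnitude(pos, dt=1 / 60.0):
        """pos: (N, 3) array of x, y, z samples for one grasp.
        Returns the magnitude of acceleration at each sample."""
        vel = np.gradient(pos, dt, axis=0)   # first derivative: velocity
        acc = np.gradient(vel, dt, axis=0)   # second derivative: acceleration
        return np.linalg.norm(acc, axis=1)

    # Overlaying one real and one virtual grasp for a participant
    # (real_grasp and virtual_grasp are hypothetical (N, 3) arrays):
    # t = np.arange(len(real_grasp)) / 60.0
    # plt.plot(t, acceleration_magnitude(real_grasp), label='real')
    # plt.plot(t, acceleration_magnitude(virtual_grasp), label='virtual')
    # plt.xlabel('time (s)'); plt.ylabel('|a| (m/s^2)'); plt.legend(); plt.show()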

Before any conjecturing can be done, however, I have to finish the analysis, and that means filtering the noise out of the data and looking at the results.
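One common choice for smoothing motion-capture position data is a zero-phase low-pass Butterworth filter; the sketch below assumes that approach, with a placeholder cutoff and sample rate that would still need tuning.

    from scipy.signal import butter, filtfilt

    def lowpass(signal, cutoff_hz=10.0, fs=60.0, order=4):
        """Zero-phase Butterworth low-pass filter for one coordinate."""
        b, a = butter(order, cutoff_hz / (fs / 2))  # normalized cutoff, low-pass
        return filtfilt(b, a, signal)               # filters forward and backward

    # x_clean = lowpass(x_raw)  # repeat for y and z before differentiating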

Saturday, March 31, 2012

Another Experiment

In previous posts, I have mentioned an experiment involving object interaction that I am conducting both as part of my independent research and for my human-computer interaction class; this week was go-time. I finally got the interface and task description ready and ran 12 subjects this week.

The task was to take 10 blocks, each labeled with a digit from 0 to 9, and place them in order as quickly as possible. Participants performed the task 3 times in the real world and 6 times in a virtual environment, and I recorded their time to completion on each trial. Aside from a slight mishap with a broken battery while running subject 1, running the subjects went very smoothly. I will look at the results in the coming week, possibly writing this experiment up as a short paper for submission to APGV '12.
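For the curious, the timing itself doesn't need anything fancy; a minimal sketch of the kind of stopwatch-and-log script that could record the trials is below (the CSV layout and prompts are hypothetical, not my actual setup).

    import csv
    import time

    def run_trial(subject_id, condition, trial_num, writer):
        input("Press Enter to start the trial...")
        start = time.perf_counter()
        input("Press Enter when the blocks are in order...")
        elapsed = time.perf_counter() - start
        writer.writerow([subject_id, condition, trial_num, round(elapsed, 3)])

    with open("completion_times.csv", "a", newline="") as f:
        writer = csv.writer(f)
        for trial in range(1, 4):        # 3 real-world trials
            run_trial(1, "real", trial, writer)
        for trial in range(1, 7):        # 6 virtual trials
            run_trial(1, "virtual", trial, writer)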

Write Up

With the subjects run and data analyzed, it was time to write up the results. This most recent experiment, along with 2 experiments that I presented at the Conference on Applied Perception in Graphics and Visualization last August, will make up a well-rounded study investigating perception and action in virtual environments, which we hope to write up as a journal article and publish in ACM's Transactions on Applied Perception (TAP).

My week, therefore, was spent writing up the method and results of the current experiment, as well as altering my conference paper and drawing some conclusions. The rest of the semester will be devoted mostly to writing and editing the journal article and finishing up class projects.

Running Subjects

The week after Spring Break was a marathon of running subjects. We ran 16 subjects in 2 days and had the results the next day.
The purpose of this experiment was to investigate the importance of visual and motor feedback when making judgments about an action. In order to learn about these types of feedback, we placed our participants in the role of both an actor and an observer. In both roles, the subject stood 4 meters behind an avatar in the virtual environment; as an actor, the avatar's movements were mapped to the subject's own movements, and as an observer, the avatar's movements were mapped to those of a helper who was in the lab with me. In the actor role, the subject watched themselves throw a ball, which disappeared after release and then reappeared upon landing, along with a second ball displaced to appear either closer to or further from the subject than the actual landing point. The subject then responded 'further' or 'closer', indicating which ball they believed to be the one that had been thrown. The observer role was very similar, except that the lab assistant threw the ball instead of the subject.
As an actor, the participant had both the visual information from watching themselves throw and the motor feedback from performing the action. The observer, on the other hand, had only the visual information from the assistant's body movements.
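A sketch of the probe logic for one trial might look like the following, assuming the true landing distance is already known from the simulation; the displacement values and helper names are hypothetical, not the actual experiment script.

    import random

    DISPLACEMENTS = [-0.6, -0.3, 0.3, 0.6]  # placeholder offsets in meters

    def make_decoy(landing):
        """Displace a second ball toward or away from the subject."""
        return landing + random.choice(DISPLACEMENTS)

    def correct_answer(landing, decoy):
        """The thrown ball reappears at its true landing point, so the right
        answer names where it sits relative to the displaced decoy."""
        return 'closer' if decoy > landing else 'further'

    # Each trial: show both balls, collect a 'closer'/'further' response,
    # and score it against correct_answer(landing, decoy) for each role.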
After looking at the results, we found no significant effect of viewpoint, meaning that our subjects performed the task as well without motor information as they did with it. This result leads us to believe that visual information alone is sufficient for this judgment, and that the additional motor information does not improve our perception further.
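One simple way to test for such an effect, as a sketch (the numbers below are placeholders, not our data), is a paired comparison of each subject's accuracy in the two roles:

    import numpy as np
    from scipy import stats

    # Placeholder per-subject proportion-correct scores, NOT real data.
    actor_acc = np.array([0.75, 0.81, 0.69, 0.78])     # accuracy as actor
    observer_acc = np.array([0.72, 0.79, 0.73, 0.77])  # same subjects as observer

    t, p = stats.ttest_rel(actor_acc, observer_acc)    # paired t-test
    print(f"t = {t:.2f}, p = {p:.3f}")  # a large p is consistent with no effect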

Saturday, March 10, 2012

Spring Break

This past week was my senior year Spring Break, so logically the first Saturday I woke up, ate breakfast and went to the lab!
With our lab finally back in action I was able to get the script for my final experiment up and running the week before break. I demonstrated it for my adviser, Professor Bodenheimer, on Friday and we decided that it was time to start running subjects. Our Saturday meeting was set up to show Professor Rieser of the Psychology department the final experiment and get his approval.
I will start running subjects tomorrow and hope to have 16 run by the end of Wednesday so I can analyze the data and write up the journal article in the following week and a half. I hope to have the entire project finished by Monday the 26th, just so that it doesn't drag out any longer than it should. After that I will have to deal with reviews of the journal articles and finishing up my senior year.
Don't worry though: Sunday morning I packed up my car, and my 10 closest friends and I went to Gatlinburg, Tennessee for the week. We hiked, rafted, ziplined and fished and had a great week, and I even saved my boyfriend from a river.

Sunday, February 26, 2012

Back On Track

After all of the breakdowns in the lab over the past weeks, I have finally gotten back on track. The final equipment issue, with the computer we use to stream Autodesk MotionBuilder, turned out to be a simple faulty wire, so once that was replaced I was ready to debug and get my experiment ready to go.
The other undergraduate in the lab, Stephen Bailey, helped me out by being a subject, and I got my throwing experiment ready to the point where I can show my adviser, Professor Bodenheimer, tomorrow and hopefully start running subjects later in the week.

I will also work this week to get the experiment ready for the object interaction task. I think we will be measuring real world and virtual world performance on a sorting task. The measure of task performance will be the time it takes to sort the objects (most likely cups with numbers on them). There are, however, a number of other measures that we will use to evaluate user performance. For example, a human hand's acceleration profile while reaching for and grasping an object is a well-documented behavior, and it would be interesting to compare it between the virtual and real worlds. I expect the acceleration profiles to differ; the interesting questions are how they differ and whether the difference is consistent across subjects. I will also administer a questionnaire to gather information that I cannot quantify: I want to measure the participants' feeling of presence in the environment and get their feedback about what they liked and didn't like about the interface. It will also be interesting to see whether the type of gesture (a fist or a pointed index finger) affects performance.
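Once the trials are logged, the per-condition summaries should be straightforward; here is a sketch under the assumption that each trial is kept as a small record (the field names and values are illustrative placeholders):

    from collections import defaultdict
    from statistics import mean, stdev

    trials = [
        # e.g. {'subject': 1, 'world': 'real', 'gesture': 'fist', 'seconds': 41.2}
    ]

    def summarize(trials, factor):
        """Mean and SD of completion time grouped by one factor."""
        groups = defaultdict(list)
        for t in trials:
            groups[t[factor]].append(t['seconds'])
        return {k: (mean(v), stdev(v)) for k, v in groups.items() if len(v) > 1}

    # summarize(trials, 'world')    -> real vs. virtual completion times
    # summarize(trials, 'gesture')  -> fist vs. pointed index finger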

I hope that this coming week will get me started with running subjects so that I can focus on analyzing and writing up my results after spring break. Wish me luck!

Monday, February 20, 2012

Back online!...almost

Good news! The VR lab is back up and running so I have been able to make some really good progress recently on my projects. 

Last week, after the lab was ready to get running again, I ran my object interaction script for the first time and was pleasantly surprised that it worked right away! There were a few kinks to work out, but after fixing those I was able to reach out and 'grab' virtual balls and move them to other positions. I was very happy with the results and had a lot of fun testing out the interface myself. After moving the objects around for a few minutes I lifted the HMD off of my head, expecting to see the balls in the real world; it was at that moment that I realized how immersed I had been in the environment. I have been around VR for a few years now and have always felt that I have a clear distinction in my mind between the real and the virtual worlds, and this is the first time that line has been blurred. Later that day, when I tried the interface again, the same thing happened. I am very excited about what this could mean for the effectiveness of VEs in the future.
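For anyone wondering what 'grabbing' a virtual ball amounts to, below is a stripped-down sketch of the kind of proximity-based logic an interface like this could use; it is written against a made-up hand-tracking interface, not our lab's actual toolkit.

    import numpy as np

    GRAB_RADIUS = 0.08  # meters; hypothetical grab threshold

    class Ball:
        def __init__(self, position):
            self.position = np.asarray(position, dtype=float)
            self.held = False

    def update(balls, hand_position, hand_closed):
        """Called every frame: a closed hand picks up any ball within reach;
        opening the hand releases the ball in place."""
        hand = np.asarray(hand_position, dtype=float)
        for ball in balls:
            if ball.held:
                if hand_closed:
                    ball.position = hand.copy()  # ball follows the hand
                else:
                    ball.held = False            # release where it is
            elif hand_closed and np.linalg.norm(ball.position - hand) < GRAB_RADIUS:
                ball.held = True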

The other project I am working on is a third experiment to add to the two throwing experiments that I presented at APGV last August. This third experiment will be put together with them into a journal article that we will submit to Transactions on Applied Perception (TAP) for publication. We are hoping to gain a better perspective on the roles that motor and visual feedback play in our perception of our own actions. This experiment may have to be halted, however, as another one of our computers has decided to stop working, and we will have to install Autodesk MotionBuilder on another machine if I want to move forward with testing, piloting, and then running subjects.

Finally, I finished putting together one journal article and am waiting for two of the co-authors, in Germany, to review it and get back to me with suggestions. We will submit it in two weeks and hopefully hear back about a month after submission.