Sunday, April 8, 2012

Analysis

This week was devoted to analyzing the data that I collected last week. First, we took simple means and standard deviations of the subjects' completion times. The hard part is cropping up in the analysis of the hands' motion data. We want to look at a user's acceleration profile while grabbing both real and virtual objects, and to do this we need to look at the hand's position data and infer when a subject grabbed an object. I did this yesterday by looking at the x, y, and z coordinates separately; I categorized a grasp as a roughly steady position in the z direction, a downward movement in the y direction, and a movement forward and then backward in the x direction. This was all done by hand (no pun intended!), and 6 grasps were separated from the data for each participant: 3 in the virtual environment and 3 in the real world.
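
For anyone curious, here is a minimal sketch of how that categorization could eventually be automated. The sampling rate, window length, and thresholds are illustrative assumptions, not values from our actual setup:

```python
import numpy as np

def find_grasp_windows(x, y, z, fs=120.0, win_s=0.5, z_tol=0.01, y_drop=0.02):
    """Flag windows matching the grasp signature described above:
    roughly steady z, downward movement in y, and an out-and-back
    excursion in x. All thresholds (in meters) are illustrative guesses."""
    n = int(win_s * fs)                          # samples per window
    candidates = []
    for i in range(0, len(x) - n, n // 2):       # 50%-overlapping windows
        xs, ys, zs = x[i:i+n], y[i:i+n], z[i:i+n]
        steady_z = zs.std() < z_tol              # z holds roughly steady
        moving_down = (ys[0] - ys[-1]) > y_drop  # hand drops in y
        peak = int(xs.argmax())                  # forward-then-back in x:
        out_and_back = 0 < peak < n - 1          # x peaks inside the window
        if steady_z and moving_down and out_and_back:
            candidates.append((i, i + n))
    return candidates
```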

The next step will be to take this data and plot the acceleration profiles of each participant. The acceleration profile of a human grabbing an object in the real world is well documented and resembles a bell curve: the hand speeds up in the beginning and then slows down as it approaches the object. It will be interesting to see how closely the acceleration profiles from the virtual world match those in the real world. If they are similar, we can conclude that the interface is perhaps a good first step toward effective virtual object manipulation. We always want virtual world performance to mirror real world behavior as closely as possible, and making sure that our natural movements are not being altered by a virtual environment is important to achieving this goal. If the acceleration profiles differ greatly, then we can begin to look at why and try to fix the problem. For example, enabling a physics engine in our environment would ensure that users couldn't simply stick their hand through an object and would have to be more accurate when approaching the object's location.
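
As a sketch of what I mean by a profile, here is one way to derive it from a segmented grasp, assuming positions sampled at a fixed rate (the 120 Hz figure is an assumption, not necessarily our tracker's rate):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_profile(pos, fs=120.0):
    """pos: (N, 3) array of hand positions (meters) for one grasp.
    Differentiates position to get speed and acceleration, then plots
    them so the rise-and-fall shape is easy to eyeball."""
    dt = 1.0 / fs
    vel = np.gradient(pos, dt, axis=0)     # per-axis velocity
    speed = np.linalg.norm(vel, axis=1)    # hand speed over time
    acc = np.gradient(speed, dt)           # rate of change of speed
    t = np.arange(len(pos)) * dt
    plt.plot(t, speed, label='speed (m/s)')
    plt.plot(t, acc, label='acceleration (m/s^2)')
    plt.xlabel('time (s)')
    plt.legend()
    plt.show()
```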

Before any conjecturing can be done, however, I have to finish the analysis, and that means filtering the noise out of the data and looking at the results.
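
A common way to do that filtering is a zero-phase low-pass filter, sketched below; the sample rate and cutoff are assumptions (5-10 Hz is typical for hand movement data, but we haven't settled on values yet):

```python
from scipy.signal import butter, filtfilt

def lowpass(signal, fs=120.0, cutoff=8.0, order=4):
    """Zero-phase Butterworth low-pass filter for tracker jitter.
    filtfilt runs the filter forward and backward, so the timing of
    the grasp isn't shifted by the filtering itself."""
    b, a = butter(order, cutoff / (fs / 2.0), btype='low')
    return filtfilt(b, a, signal)
```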

Saturday, March 31, 2012

Another Experiment

In previous posts, I have mentioned an experiment involving object interaction that I am conducting both as part of my independent research and for my human-computer interaction class; this week was go-time. I finally got the interface and task description ready and ran 12 subjects this week.

The task was to take 10 blocks, each labeled with a digit from 0 to 9, and place them in order as quickly as possible. Participants performed the task 3 times in the real world and 6 times in a virtual environment, and I recorded their time to completion on each trial. Running the subjects went very smoothly after a slight mishap involving a broken battery with subject 1, and I will look at the results in the coming week, possibly writing this experiment up as a short paper for submission to APGV '12.
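
To pin down what the timing analysis starts with, here is a trivial sketch of the per-condition aggregation (the trial layout is an assumption about how I'll store the data, not our actual format):

```python
import numpy as np

def summarize_times(trials):
    """trials: list of (condition, seconds) tuples, e.g.
    [('real', 41.2), ('virtual', 63.5), ...] -- layout is assumed.
    Returns mean and sample standard deviation per condition."""
    by_cond = {}
    for cond, secs in trials:
        by_cond.setdefault(cond, []).append(secs)
    return {c: (np.mean(v), np.std(v, ddof=1)) for c, v in by_cond.items()}
```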

Write Up

With the subjects run and data analyzed, it was time to write up the results. This most recent experiment, along with the 2 experiments that I presented at the Symposium on Applied Perception in Graphics and Visualization last August, will make up a well-rounded study investigating perception and action in virtual environments, which we hope to write up as a journal article and publish in ACM's Transactions on Applied Perception (TAP).

My week, therefore, was spent writing up the method and results of the current experiment, as well as altering my conference paper and drawing some conclusions. The rest of the semester will be devoted mostly to writing and editing the journal article and finishing up class projects.

Running Subjects

The week after Spring Break was a marathon of running subjects. We ran 16 subjects in 2 days and had the results the next day.
The purpose of this experiment was to investigate the importance of visual and motor feedback when making judgments about an action. In order to learn about these types of feedback, we placed our participants into the roles of both an actor and an observer. In both roles, the subject was 4 meters behind an avatar in the virtual environment; as an actor, the avatar's movements were mapped to the subject's own movements, and as an observer, the avatar's movements were mapped to those of a helper who was in the lab with me. In the actor role, the subject watched themselves throw a ball which disappeared after release and then reappeared upon landing, along with another ball displaced to appear either closer to or further from the subject than the actual landing point of the thrown ball. The subject then responded 'further' or 'closer', indicating which ball they believed to be the one that had been thrown. The observer role was very similar, except that the lab assistant threw the ball rather than the subject.
As an actor, the participant had both the visual information from watching themselves throw and the motor feedback from performing the action. The observer, on the other hand, had only the visual information from the assistant's body movements.
After looking at the results, we found no significant effect of viewpoint, meaning that our subjects performed the task just as well without motor information as with it. This result leads us to believe that visual information is sufficient for this judgment, and that the additional motor information does not further improve perception.
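
Since the same subjects served in both roles, a paired comparison across subjects is the natural way to check for an effect like this. A minimal sketch follows; the paired t-test is my illustrative choice here, not necessarily the exact test we reported:

```python
from scipy import stats

def viewpoint_effect(actor_acc, observer_acc, alpha=0.05):
    """actor_acc, observer_acc: per-subject proportion correct in each
    role (paired samples, since each subject did both). Returns the
    test statistic, p-value, and whether the effect is significant."""
    t, p = stats.ttest_rel(actor_acc, observer_acc)
    return t, p, p < alpha
```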

Saturday, March 10, 2012

Spring Break

This past week was my senior year Spring Break, so logically the first Saturday I woke up, ate breakfast and went to the lab!
With our lab finally back in action, I was able to get the script for my final experiment up and running the week before break. I demonstrated it for my adviser, Professor Bodenheimer, on Friday, and we decided that it was time to start running subjects. Our Saturday meeting was set up to show Professor Rieser of the Psychology department the final experiment and get his approval.
I will start running subjects tomorrow and hope to have 16 run by the end of Wednesday so I can analyze the data and write up the journal article in the following week and a half. I hope to have the entire project finished by Monday the 26th, just so that it doesn't drag out any longer than it should. After that I will have to deal with reviews of the journal articles and finish up my senior year.
Don't worry though: Sunday morning I packed up my car, and my 10 closest friends and I went to Gatlinburg, Tennessee for the week. We hiked, rafted, ziplined, and fished, and had a great week; I even saved my boyfriend from a river.

Sunday, February 26, 2012

Back On Track

After all of the breakdowns in the lab over the past weeks, I have finally gotten back on track. The final equipment issue we had, with the computer we use to stream Autodesk MotionBuilder, turned out to be a simple faulty wire, so after it was replaced I was ready to debug and get my experiment ready to go.
The other undergraduate in the lab, Stephen Bailey, helped me out by being a test subject, and I got my throwing experiment ready to the point where I will show my adviser, Professor Bodenheimer, tomorrow and hopefully start running subjects later in the week.

I will also work this week to get the experiment ready for the object interaction task. I think we will be measuring real world and virtual world performance on a sorting task. The measure of task performance will be the time it takes to sort the objects (most likely cups with numbers on them). There are, however, a number of other measures that we will use to evaluate user performance. For example, a human hand's acceleration profile while reaching for and grasping an object is a well-documented behavior, and it would be interesting to compare it between the virtual and real worlds. I think that the acceleration profiles will obviously differ; it will be interesting to see how they differ and whether the difference is consistent across subjects. I will also be administering a questionnaire to the subjects in order to gather some information that I cannot quantify. I want to measure the participants' feeling of presence in the environment and also get their feedback about what they liked and didn't like about the interface. It will also be interesting to see whether the type of gesture (a fist or pointing the index finger) affects their performance.

I hope that this coming week will get me started with running subjects so that I can focus on analyzing and writing up my results after spring break. Wish me luck!

Monday, February 20, 2012

Back online!...almost

Good news! The VR lab is back up and running so I have been able to make some really good progress recently on my projects. 

Last week, after the lab was ready to get running again, I ran my object interaction script for the first time and was pleasantly surprised that it worked right away! There were a few kinks to work out, but after fixing those I was able to reach out, 'grab' virtual balls, and move them to other positions. I was very happy with the results and had a lot of fun testing out the interface myself. After moving the objects around for a few minutes, I lifted the HMD off my head to look for the balls in the real world; it was at that moment that I realized how immersed I had been in the environment. I have been around VR for a few years now and have always felt that I have a clear distinction in my mind between the real and the virtual worlds, and this is the first time that line has been blurred. Later that day, when I was trying the interface again, the same thing happened. I am very excited about what this could mean for the effectiveness of VEs in the future.

The other project I am working on is a third experiment to add to the two throwing experiments that I presented at APGV last August. This third experiment will be put together with them into a journal article that we will submit to Transactions on Applied Perception (TAP) for publication. We are hoping to gain a better perspective on the role that motor and visual feedback play in our perception of our own actions. This experiment may have to be put on hold, however, as another one of our computers has stopped working, and we will have to install Autodesk MotionBuilder on another machine if I want to move forward with testing, piloting, and then running subjects.

Finally, I finished putting together one journal article and am waiting for two of the co-authors, in Germany, to review it and get back to me with suggestions. We will submit it in two weeks and hopefully hear back about a month after submission. 

Tuesday, February 7, 2012

Houston, we have a problem

The functionality of the LIVE lab is based on a complex system of software and hardware, all interfacing over the Vanderbilt network. As I mentioned in my previous post, when I went into the lab for the first time this semester I realized that there were a few bugs to fix before I could get a good start on my research. We solved the Cyberglove issue by calling the Vizard help desk; it turns out we had to specify, in our code, the correct port on which the Cyberglove was streaming data. It was a simple fix, needed only because of an updated Vizard plug-in for the glove.
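
The actual fix happened inside the configuration of Vizard's Cyberglove plug-in, but conceptually it amounts to opening the right serial port. Here is a hedged sketch using pyserial rather than the plug-in's own API; the port name and baud rate are assumptions:

```python
import serial  # pyserial

GLOVE_PORT = 'COM4'   # assumption: the port the glove actually streams on
BAUD_RATE = 115200    # assumption: a typical serial rate, not verified

def open_glove(port=GLOVE_PORT, baud=BAUD_RATE):
    """Open the glove's data stream on an explicitly specified port.
    Our bug was the equivalent of opening the wrong port: the data
    simply never arrives, and everything downstream fails."""
    return serial.Serial(port, baudrate=baud, timeout=1.0)
```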

This would have been the end of my original list of problems, but one of the machines, which handles the entirety of the motion tracking functionality in the lab, got a virus. The other undergraduate in the lab discovered it and performed a system restore, but when the restore finished we discovered that the computer's networking capabilities had been corrupted and no longer worked. In our lab, this means that data from our Vicon motion tracking system could not be sent to the computer with Vizard on it, the machine on which we create and run our virtual environments. The head of our IT department has now taken over the issue and will wipe the machine, re-install the lost programs, and get the lab up and running again. Unfortunately, this will not be finished for a few more days, so in the meantime I will concentrate on writing a journal article and getting it submitted as soon as possible.

Monday, January 30, 2012

When things go wrong...

My goals for this semester:

  • Turn both of my conference papers into journal articles. 
  • Create and execute the experiment that I outlined in my last post. 


In order to turn a conference paper into a journal article, you need to demonstrate that you have added about 30% more work to the conference paper. This is up to the interpretation of the individual and can come in many different forms. For example, for one of my papers we performed an additional small experiment so that we could make comparisons between task performance in the real world and the virtual one. We also plan on adding a lot of detailed analysis that we had to leave out of the original paper due to length; we analyzed the eye gaze of each of our participants and made some conjectures about what it meant for our results. For my other paper, we have performed another experiment and have one more planned so that we can make some claims about the relationship between motor and visual feedback and their importance.

This was a week that reminded me of the patience needed in order to conduct research; equipment breaks, timelines are flexible and new obstacles seem to appear at every turn.

I started the week off by talking to one of my co-authors on the first paper I published. She has worked closely with Professor Bodenheimer and me on all parts of the paper. She also conducted the real-world Stepping Stone task, and I need her help analyzing the data so that we can begin the journal paper. We set up a Skype call for Friday so that we could get the analysis finished, but she had to leave the office early, and the call was rescheduled for Monday, January 30th.

The rest of the week I spent in the lab trying to get a good start on conducting the third throwing experiment and beginning to implement object interaction using the Cyberglove. I had done some work with the Cyberglove when we first purchased the pair of gloves 2 years ago, so I pulled up that code and wrote a short script in Python, hoping to have a quick and simple initial implementation of my system. Unfortunately, it seems that using the Cyberglove crashes the Vizard executable, and I will have to contact the Vizard support team to see if they have a fix for this problem.

Putting the Cyberglove work to the side, I brought in a subject to perform a pilot study of the throwing experiment. This experiment will involve asking users to judge alterations in the trajectories of their own throws and other people's throws. I had my subject all suited up (so that I could provide him with an avatar character) and had begun the experiment when I realized that our head-tracking system had an error and was not receiving or sending the appropriate orientation data. Without this data the virtual environment is useless, as the user cannot effectively interact with it. I was unable to come up with a quick fix for this issue and will now have to work with the graduate students in the lab to solve the problem, as it will affect all of our research.

I was able to make some progress in developing code, and there is always something to learn when solving a problem in the lab; this week's biggest lesson, however, was a reinforcement of Murphy's Law: If anything can go wrong, it will. And this only gives me more motivation to spend time in the lab this coming week and get my projects back on track!

Monday, January 16, 2012

A New Direction

Classes started one week ago, beginning my last semester as a Vanderbilt student. I am taking two classes, Topics in Software Engineering and Human Computer Interaction (HCI). Both classes are very interesting and I'm looking forward to the work that I will be doing this semester.

As I have mentioned in the past few posts, I am switching the focus of my research from dyadic interaction to user interfaces in virtual environments, and this ties in very well with the HCI class. The class is based around helping each student develop a semester-long project. The project is split into several design phases, beginning with initial design descriptions, followed by prototype development, and ending with user testing. We are all encouraged to tie our projects into the research that we are currently working on in other areas of our studies, so I will be using the work for this CREU project to build my HCI assignment.

I had a meeting with Professor Bodenheimer on Friday so that we could talk about what we would like to accomplish this semester, and we came up with a plan that I am very happy with. I will be turning both of my previous papers into journal articles, which means expanding on the material presented in each one; the difference between a conference paper and a journal article is that the journal article usually contains about 30% more material. And of course, we also discussed the work I will be doing in the LIVE lab. I have mentioned that I am interested in object interaction within the virtual environment, and my project will be targeted at this aspect of virtual reality.

I want to investigate interaction with purely virtual objects. In my past research projects, the objects have had virtual representations paired with physical counterparts. This meant that when my subjects were completing the object interaction task in the first paper, or throwing the ball in the second paper, they were actually holding the stamp tool or the ball and were therefore presented with haptic information, which more than likely aided in the completion of both tasks. As I mentioned in an earlier blog post, it is not always going to be possible to provide users with physical representations of all of the objects they need to interact with, for a number of reasons, and therefore developing an intuitive way for them to interact with purely virtual objects is important. The interface that I currently have in mind is one in which the user wears a Cyberglove and must first 'touch' the virtual object and then make a fist in order to move it to a different location. We will be looking at how variations of the grip (making a tight fist, pointing one finger, a loose grip, etc.) affect the user's feeling of ownership of the virtual object: do they believe that the object is actually there?
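
To make the idea concrete, here is a minimal per-frame sketch of the touch-then-fist logic. The tracker and glove inputs are hypothetical stand-ins, and the grab radius is an illustrative guess, not a tuned value:

```python
import numpy as np

GRAB_RADIUS = 0.08  # meters; illustrative 'touching' threshold

def update_grab(hand_pos, fist_closed, objects, held=None):
    """One frame of interaction.
    hand_pos: (3,) hand position from the tracker (hypothetical source)
    fist_closed: bool derived from the glove's joint angles (hypothetical)
    objects: dict mapping object name -> (3,) position array
    held: name of the object currently held, or None
    """
    hand = np.asarray(hand_pos)
    if held is not None:
        if fist_closed:
            objects[held] = hand          # object follows the closed hand
        else:
            held = None                   # opening the hand releases it
    elif fist_closed:
        for name, pos in objects.items():
            if np.linalg.norm(hand - pos) < GRAB_RADIUS:
                held = name               # fist closed while touching: grab
                break
    return held
```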


I think that this project will be very interesting and that virtual reality has reached a point where this kind of project is feasible and necessary.