Volunteer Survey Report – 07/20/2012
When the participant was presented with the volunteer question—whether or not she was willing to answer the boilerplate questions at the end of the survey—her immediate reaction was to ask how long the questions would take to complete. Her concern was that she was signed up for another study in the Gates/Hillman building 30 minutes after this study was supposed to end. Further probing indicated that she would be willing to answer the questions because she liked to "help people." It would be interesting to see how the manipulation plays out if it were placed in the middle of the survey, after participants have already committed some time to answering questions.
I think it would be best if we used a timer on the recruitment documents to prevent people from mindlessly clicking "Yes" to continue with the survey. This participant was not the only one who rushed through the recruitment document; a few others have acted the same way before. After "reading" the documentation, the participant had little to no grasp of the information presented on the page. Most participants, when asked to tell me what information they were able to extract from the document, had to re-read the paragraphs out loud in order to answer my questions. In this case, the participant did not even do that. She mumbled about telephones and receiving $2 for participating. She did not even know that she was going to receive a device with electricity feedback information for a year. The only piece of information she recalled correctly was that the study was being conducted by her utility company. Adding a timer to the page would ensure that participants spend a set amount of time with the recruitment document. However, keeping participants on the page does not mean they would spend that time reading it. It might be a good idea to tell participants that they will need to correctly answer questions based on the document to receive credit for the study.
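If we do implement the timer-plus-quiz idea, the logic is simple to sketch. Below is a minimal, hedged example for a generic web survey page; the function names (`canProceed`, `gradeComprehension`) and the question keys are hypothetical, not part of any platform we currently use.

```javascript
// Hypothetical sketch: gate the "Continue" button behind a minimum
// time on the recruitment page, and grade comprehension answers.

// Continue is only enabled once the participant has spent at least
// minSeconds on the recruitment document.
function canProceed(secondsOnPage, minSeconds) {
  return secondsOnPage >= minSeconds;
}

// Participants receive credit only if every comprehension answer
// matches the answer key (e.g., who is conducting the study).
function gradeComprehension(answers, answerKey) {
  return Object.keys(answerKey).every(
    (question) => answers[question] === answerKey[question]
  );
}
```

A page script would call `canProceed` on a one-second interval to enable the button, and `gradeComprehension` on submit; exact wiring would depend on whatever survey platform hosts the recruitment document.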
We need to clarify the question "how many hours do you spend at home during the day?" The main ambiguity for this participant (and others) was whether the count should include the hours she was sleeping. Her original interpretation was how many daytime hours she was at home.
Another issue was the final block of social-contact questions. While the first two, which ask how many people you see or talk to on a regular basis, could easily be answered on the 1-7 scale, the last one, "how many times have you attended a party or other social gathering in the past 2 months?", could not, because her answer was 0 and 0 was not on the scale. I took the liberty of adding 0 to that scale in our copy of the survey, but not in the version posted on MTurk.
Finally, this was the first time I've encountered anyone who feared that smart meters cause radiation and make people sick. Although she mentioned she had only read articles on the internet, her answers to the technology questions were colored by her perception of the damage these technologies cause. Her answers to the IHD questions were driven by not wanting the device in her home out of fear for her health. This completely contradicted her stated willingness to participate in the program at the beginning of the survey. While this issue may be partly addressed by having participants genuinely understand the recruitment document, I have a hard time believing they really understand what kinds of information an IHD will present. When she was asked what she thought electricity feedback entailed, she was unable to give me an answer. It seems that most people who merely scan the recruitment document do not know what they will see in exchange for agreeing to participate. I would suggest that we include an image of a Ceiva frame, with some sort of electricity feedback displayed on it, to complement the recruitment document.