These are the first two pretests. I have a vague concern about the research: I don't know exactly what the focus or story is. I am considering xx. I created a qualification in MTurk for one participant so that he can be in future problem pit studies. I need to create an acceptance sampling method to keep the number of pretest participants bounded.
- The questions they wanted to ask seemed to be very specific to the problem, much more specific than we were thinking. Their main concerns were logistic issues, issues of confidentiality, security, what to do with the frame, how it works.
- They seemed to base their questions on what would concern them personally, consistent with the self-projection theory.
- It is going to be important to draw the citizen science sample from the same population as those who are offered the program. If people use self-projection, then those projections will be most valid from the actual sample, rather than MTurk masters participants who may be idiosyncratic.
- I feel like their ability to generate questions is not so good. They seem to pick up on the intuitive things that most people would think of: do they want the device, do they think the money is enough. What methods can we use to help them generate effective questions using their knowledge? Are we already tapping that?
- There seems to be a consistent self-projection element. People think other people would do or not do for the same reasons as themselves.
- Here is the idea for the acceptance sampling method. The arrival of new ideas (new errors) can be modeled with a failure distribution; we want to estimate its rate parameter. What sample size is needed to do that?
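One way to sketch this, under the assumption that each pretest participant independently surfaces a new issue with some probability p: stop pretesting after a run of k consecutive participants who raise nothing new, where k is chosen so that such a run would be unlikely if the discovery rate were still above a threshold. The function name and thresholds below are hypothetical, not part of the study protocol.

```python
import math

def stopping_run_length(p_min: float, alpha: float = 0.05) -> int:
    """Number of consecutive pretest participants with no new issue
    needed before concluding (at level alpha) that the per-participant
    discovery rate has fallen below p_min.

    If each participant independently surfaces a new issue with
    probability p, the chance of a run of k "clean" participants is
    (1 - p) ** k; we pick the smallest k with (1 - p_min) ** k <= alpha.
    """
    return math.ceil(math.log(alpha) / math.log(1.0 - p_min))

# e.g. to be 95% confident the discovery rate is below 0.2, stop after
# this many consecutive participants raise nothing new:
k = stopping_run_length(0.2, 0.05)  # -> 14
```

This gives a bounded stopping rule rather than a fixed sample size, which matches the goal of keeping the number of pretest participants from growing without limit.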
- Both participants were U.S. and had MTurk masters qualification.
- Both said they would not enroll. Seems like they were being more honest than the last sample.
- "The survey itself was nicely designed, but it was frustrating to have to advance the page so often. More than one question per page is preferable, in my opinion. I didn't really like the Recruitment Document being in a separate window, I think it could be included in the survey itself, as an example, and it would be easier to reference. I thought it was really too long, with too many questions that seemed like variations on the same theme. Those are pretty much the only suggestions that I have to offer, outside of what I answered in the survey itself."
- Had too many unanswered questions to make a decision about enrolling:
- How does the frame work?
- How does it affect privacy?
- Who has access to photos/data? Who owns the rights?
- What happens if they withdraw from the study?
- Do they have to give the frame back at the end of the study?
- People would enroll if:
- They have the right information.
- They really need money.
- They really want the photo frame.
- They think the program could help them save energy.
- Suggested giving $5 to people who contact us rather than $2 to everyone (I disagree here; pre-payment works, and $2 is a special amount because of the bill).
- If contacted by mail, wouldn't make the "extra effort" to figure out the program details.
- Independent questions seemed to revolve around the desires of the participant:
- "Are you interested in finding out how to save money on your electricity costs?"
- "Would you like to participate in a study that will benefit science?"
- "Would you like the use of a free photo frame?"
- "Would you be agreeable to having the frame in your home?"
- Similarly felt like there was not enough information about the content and process of the program.
- Said the program was "a bit invasive," but might consider doing it if offered more money.
- Believed others would not enroll for the same reasons: invasiveness, too little compensation, and that it "might be difficult to remember what to do and when."
- Others would enroll if they didn't care about the invasiveness.
- The most important thing to explain is how the frame will track usage (this needs to be corrected: the smart meter already tracks usage, so they need to know it is already being tracked; the frame just lets them see it).
- Make the expectations of them clear (what is expected of me?).
- Have an image of the frame rather than a link.
- Explain why we want this info, how will it be used.
- People throw the mail away without reading it.
- Might use a telephone interview.
- Advancing page too often
- Include recruitment document in the survey itself, rather than a separate window
- Too long
- Redundant questions
- Broke the questionnaire up into multiple pages, with a random subset of two questions on each page.
- Cut the length by having participants do two-thirds (55) of the predictions.
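The redesign above (each participant sees a random two-thirds of the predictions, two per page) can be sketched as follows. The pool size of 82 is an assumption inferred from "two-thirds (55)"; the item labels and function name are hypothetical.

```python
import random

# Hypothetical pool of prediction items; 55 is roughly two-thirds of 82.
PREDICTIONS = [f"prediction_{i}" for i in range(1, 83)]

def assign_predictions(pool, fraction=2 / 3, per_page=2, rng=None):
    """Draw a random subset of the prediction items for one participant
    and chunk it into pages of `per_page` questions each."""
    rng = rng or random.Random()
    k = round(len(pool) * fraction)
    subset = rng.sample(pool, k)  # random subset, no repeats
    return [subset[i:i + per_page] for i in range(0, len(subset), per_page)]

pages = assign_predictions(PREDICTIONS, rng=random.Random(1))
```

Seeding the generator per participant would also make each assignment reproducible for later analysis.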