User:Alexander L. Davis/Notebook/In the Problem Pit/2013/03/23


Second round of testing


  • I'm generally unsure which procedures could help participants generate better questions, or come up with better critiques of the project. Their critiques of the recruitment document have been helpful, although that is not what we are testing.
  • Again, the questions seem less than helpful; they appear to be based on the intuitive theories one would expect.
  • Again, the questions were based on what would concern the participants themselves, confirming the self-projection theory.
    • It is going to be important to draw the citizen science sample from the same population as those who are offered the program. If people use self-projection, then those projections will be most valid when they come from the actual sample, rather than from MTurk Masters participants, who may be idiosyncratic.
  • Again, what methods can we use to help them generate effective questions using their knowledge? Are we already tapping that knowledge? This reminds me of the diSessa work with student A on learning and epistemology.

Unexpected Observations

  • This participant suggested something fairly unexpected: offering a general prize to a wide audience for coming up with the best question or set of questions, similar to the following paper: "Lakhani, K. R., Boudreau, K. J., Loh, P. R., Backstrom, L., Baldwin, C., Lonstein, E., ... & Guinan, E. C. (2013). Prize-based contests can provide solutions to computational biology problems. Nature Biotechnology, 31(2), 108-111."

New Hypotheses


Current Protocol


Current Materials

New Data

  • Participant was U.S.-based with the MTurk Masters qualification.

Participant 3

  • Participant wanted explanations next to or under each question.
  • Participant wanted about 10 questions per page.
  • Recommended using traditional media (advertisements, tech journals, radio) as well as social media.
  • Focused on savings and appliance tracking as reasons for enrolling.
  • Would want to use it to check the accuracy of the bill.
  • Wants more info on who has the data.
  • Wants details on installation and implementation.
  • People will likely not open it, or will throw it away, seeing it as an advertisement.
  • Social media can be used to get information from people for prediction. It might be worthwhile to create a Facebook page to help collect data.
  • Suggested a contest with a prize if a participant's question is used.
  • The questions were open-ended (not independent or configural). They seem to be ones that everyone would answer yes to, and thus would not predict well.
    • Are you concerned about energy usage and the effect that electricity production has on the environment?
      • Concern for the environment; those people would be likely to participate.
    • Would you like to know how much electrical energy you use in an easy-to-read format?
      • A daily, weekly, and monthly readout rather than just a monthly one from the electric company.
    • Would you like an easy way to cut down on your electrical bill every month?
      • Softball question, but gauges whether they actually have an interest.
    • Do you know how much electricity your household uses every month?
      • Prior knowledge of usage and awareness of same.
    • Do you know what days your household energy usage is highest?
      • Weekend usage tends to be higher because people are at home more.
    • Do you unplug electronic devices when they are not in use?
      • Shows responsibility and effort.
    • Do you use power strips to plug electronic devices into the electrical supply?
    • Is your electric bill accurate every month?
      • Demonstrates checking and not accepting what the company says.
    • How many people live in your household?
      • Provides certain demographics.
    • What is your age?
      • Age will be something of a good predictor; I think the 28-50 age range will be more likely to participate and follow through, while younger and older may not be.
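The worry above, that questions everyone would answer yes to will not predict well, can be illustrated with a toy calculation: a binary item with no variance across respondents carries no information about who will enroll. A minimal sketch with hypothetical response data (the example questions and numbers are illustrative, not real data):

```python
# Toy illustration: a yes/no screening item that everyone answers "yes" to
# cannot separate likely enrollers from non-enrollers, because it has zero
# variance. 1 = "yes", 0 = "no".

def variance(xs):
    """Population variance of a list of 0/1 responses."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Hypothetical responses from six people.
softball = [1, 1, 1, 1, 1, 1]        # e.g. "Would you like to cut your bill?"
discriminating = [1, 0, 1, 0, 1, 0]  # e.g. "Do you check your bill's accuracy?"

print(variance(softball))        # 0.0  -> no variation, cannot predict anything
print(variance(discriminating))  # 0.25 -> responses can co-vary with enrollment
```

The softball item is useless as a predictor no matter what the outcome is: with zero variance, its correlation with enrollment is undefined (the denominator of the correlation is zero), which is the formal version of "everyone says yes, so the answer tells us nothing."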

Participant 4

  • This participant did not send me notes about the task; nothing informative.


Summary of Critiques

  • Survey advances pages too often (2/3)
  • Include recruitment document in the survey itself, rather than a separate window (1/3)
  • Too long (1/3)
  • Redundant questions (1/3)
  • Not enough info about program (2/3)
  • Explanations close to questions (1/3)
  • Wanted info about data security (2/3)
  • Wanted info about installation process (1/3)


Changes Made

  • Changed to 10 questions per page.
  • Explanations come directly under questions.
  • Added to recruitment document "We will send you the device, and help you with installation if you need it."
  • Added to recruitment document "With the device, however, the information is also available to you, but not anyone else."