
Lesson 18: Evaluating Our Own Participatory Sensing Campaign

Objective:

Students will create statistical questions and evaluate their Participatory Sensing Campaign.

Materials:

  1. Campaign Creation handout (LMR_3.16_Campaign Creation) from previous lesson
  2. Class Campaign Information from Lesson 16

Essential Concepts:

Statistical investigative questions guide a Participatory Sensing Campaign so that we can learn about a community or ourselves. These campaigns should be evaluated before implementation to make sure they are reasonable and ethically sound.

Lesson:

  1. Review homework by giving students about five minutes to share their classifications in their teams. They will decide as a team which classification is the most fitting.

  2. Once the five minutes have passed, have a class discussion of the classifications and their justifications. Explain to the class that the campaign must be carried out by the whole class, so if it has been classified in the Individual category, it must be revised. Also discuss whether the campaign is feasible. (For example, is the trigger so rare that no one will collect data? Are the questions too intrusive?)

  3. Inform students that one of the promises of Participatory Sensing is its potential to help people bring about social and civic change. Ask teams to consider the following questions and report back:

    1. Does our campaign try to do this?

    2. Could it be changed or modified to do this?

    Note: Feasible campaigns fall under the Groups of People or Community categories. If a campaign is in the Individual category, it should be modified to fall under one of the other categories before moving to Round 4.

  4. Display the campaign information students generated (and selected as a class) the previous day or revised today: Topic, Research question, Trigger, and Type of Data needed.

  5. Now they will continue the rounds using the Campaign Creation handout (LMR_3.16_Campaign Creation) from the previous lesson.

  6. Round 4: Now that the class has decided on a trigger and the type of data needed, they will create survey questions to ask when the trigger occurs. The questions should consider all of the possible data they might collect at this trigger event. It's okay if the list is long; the goal is to be creative and think of lots of different ideas.

    Examples of survey questions for practicing cello are:

    “How long did you practice?”

    “What did you play?”

    “How would you rate your practice session: 1 to 5?”

    “Any thoughts or comments about your practice?”

  7. Once teams have created four survey questions for their group, have teams share out their survey questions. As a class, decide on no more than 10 survey questions that will be used to create the new campaign.

    1. Then, evaluate each survey question. For each question they should consider:

      1. What type of data will this question collect? (Numerical, discrete numerical, text, categories, photos, location).

      2. How does this question help address the research question?

      3. Does the question need to be reworded? (Is it clear what is being asked for? Do they know how to answer it?)

    2. If the survey questions need to be rewritten, assign teams to rewrite survey questions. Then, as a class, decide on the changes.

    3. Once the list is finalized, write the final survey question that corresponds to each data variable, being mindful of question bias.
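The survey questions and their data types can be pictured as the columns of the eventual class dataset: each trigger event yields one record with one value per question. A minimal sketch in Python, using the hypothetical cello-practice example (all field names and values are illustrative, not part of the curriculum):

```python
# Hypothetical sketch: one record per trigger event ("finished practicing").
# Each survey question becomes one variable (column) in the class dataset.
record = {
    "date": "2024-03-01",                 # automatic metadata: date of entry
    "time": "16:30",                      # automatic metadata: time of entry
    "minutes_practiced": 45,              # numerical: "How long did you practice?"
    "pieces_played": "Bach Suite No. 1",  # text: "What did you play?"
    "rating": 4,                          # discrete numerical: 1-to-5 scale
    "comments": "Worked on bowing",       # open text: thoughts or comments
}

print(sorted(record.keys()))
```

Walking through a record like this makes it concrete for students which data type each question collects and which variables arrive automatically with every entry.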

  8. Round 5: In teams, students now generate two to three statistical questions that they might answer with these data. Make sure the statistical questions are interesting and relevant to the class topic of interest. Teams may keep a record in their DS Journals. Remind students that they will also have data about the date, time, and place of data collection.

    Examples of statistical questions that can be answered for practicing cello are:

    “How frequently do I practice?”

    “When I practice more frequently, do I rate my sessions higher?”

    “Are higher-rated sessions associated with time of day?”
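One way to confirm that a statistical question really can be answered with the planned data is to sketch the computation on made-up records. A minimal Python sketch for the first two cello questions above (all data invented for illustration):

```python
from collections import Counter

# Made-up records: (date, rating) for each logged practice session.
sessions = [
    ("2024-03-01", 4), ("2024-03-02", 3), ("2024-03-02", 5),
    ("2024-03-04", 4), ("2024-03-06", 2), ("2024-03-07", 5),
]

# "How frequently do I practice?" -> sessions per distinct day logged.
days_logged = len({date for date, _ in sessions})
print(f"{len(sessions)} sessions across {days_logged} distinct days")

# "When I practice more frequently, do I rate my sessions higher?"
# Compare average rating on multi-session days vs. single-session days.
per_day = Counter(date for date, _ in sessions)
multi = [r for d, r in sessions if per_day[d] > 1]
single = [r for d, r in sessions if per_day[d] == 1]
print("avg rating, multi-session days:", sum(multi) / len(multi))
print("avg rating, single-session days:", sum(single) / len(single))
```

If a proposed question cannot be reduced to a summary like this over the planned variables, the trigger or the survey questions need revising.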

  9. Once teams have generated their statistical questions, have them share out with the class. Confirm that the questions are statistical and that they can be answered with the data the students propose to collect. As a class, decide on no more than three statistical questions to guide the campaign.

  10. Now that they have all the pieces of the campaign, evaluate whether it’s a reasonable and ethically sound campaign. Engage the class in a whole group discussion on the following questions:

    1. Are answers to your survey questions likely to vary when the trigger occurs? (If not, you'll get bored entering the same data again and again)

    2. Can the entire class carry out the campaign?

    3. Do triggers occur so rarely that you'll have very little data? Do they occur so often that you'll get frustrated entering too much data?

    4. Ethics: Would sharing these data with strangers or friends be embarrassing or undermine someone's privacy?

    5. Can you change your trigger or survey questions to improve your evaluation?

    6. Will you be able to gather enough relevant data from your survey questions to be able to answer your statistical questions?

  11. Students have collaboratively created their first Participatory Sensing campaign. Inform them that you will be demonstrating one tool used to create the campaigns that they see on their smart devices or the computer. Students should take notes in their DS journals, as they will be using the tool later.

Class Scribes:

One team of students will give a brief talk discussing what they think the three most important topics of the day were.