
Student Parallel Election Program (Student Vote) Evaluation

Section 3: Analysis

3.1 Qualitative Analysis

Qualitative data (i.e. key informant interviews, focus groups, site visits) were analyzed using a thematic approach. In general, open-ended comments from the interviews and focus groups were reviewed, coded and classified by participant group. The analyses assessed the extent to which key issues were consistently raised by informants and then compared responses among groups. This approach relies on inductive reasoning: themes and categories emerge directly from the raw responses through careful examination and comparison.

3.2 Quantitative Analysis

Quantitative analysis of the survey data occurred in multiple stages. First, a frequency analysis of demographic variables was conducted in order to generate survey profiles. Separate profiles were developed for the pre- and post-surveys (see Table 2-1). After the frequency analysis was completed, a series of t-tests for continuous variables, and chi-square tests (two categories) or proportional z-tests (more than two categories) for categorical variables, was conducted. Survey respondents from the same respondent group (i.e. student, teacher, parent) but different treatment conditions (i.e. pre-program, post-program) were compared across outcome variables to determine between-group differences. Differences that were statistically significant at the p = 0.05 level were reported in the text.
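The evaluation does not publish its analysis code. The following is a minimal sketch, using hypothetical data in place of the actual survey responses, of how such pre-/post-program comparisons could be run in Python with SciPy and statsmodels; the outcome names and group sizes are assumptions made for illustration only.

```python
# Illustrative only: hypothetical data standing in for the actual survey responses.
import numpy as np
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(0)

# Continuous outcome (e.g. a knowledge score) for pre- and post-program respondents,
# compared with an independent-samples t-test.
pre_scores = rng.normal(3.2, 1.0, size=400)
post_scores = rng.normal(3.5, 1.0, size=420)
t_stat, t_p = stats.ttest_ind(pre_scores, post_scores)

# Categorical outcome (e.g. intends to vote: yes/no), compared with a chi-square
# test on the pre/post contingency table.
table = np.array([[220, 180],   # pre:  yes, no
                  [260, 160]])  # post: yes, no
chi2, chi_p, dof, _ = stats.chi2_contingency(table)

# Proportion comparison with a z-test (e.g. share choosing one response option).
count = np.array([220, 260])    # respondents choosing the option, pre vs. post
nobs = np.array([400, 420])     # group sizes
z_stat, z_p = proportions_ztest(count, nobs)

for label, p in [("t-test", t_p), ("chi-square", chi_p), ("z-test", z_p)]:
    print(f"{label}: p = {p:.3f}, significant at 0.05: {p < 0.05}")
```

Only comparisons whose p-values fall below the 0.05 threshold would be reported as significant, mirroring the criterion described above.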

Multivariable analyses were then employed to study the association between participation in Student Vote and outcome variables, while adjusting for relevant characteristics in each respondent group. Depending on the outcome variable, models were estimated using linear or logistic regression (an illustrative sketch of one such model follows the covariate lists below). The results of the analysis are discussed in the text of the report. For the complete models from the regression analyses, see Appendix A.

For student respondents, the association between outcome variables and current Student Vote participation was analyzed by controlling for:

  • Gender
  • Whether the student was born in Canada
  • School type

Additionally, to better understand the cumulative impact of Student Vote and its Democracy Bootcamps, the regressions for the outcome variables also included:

  • Student's prior experience with Student Vote
  • Teacher's/school's previous experience with Student Vote (see footnote 1)
  • Teacher's/school's participation in a Democracy Bootcamp

For teacher respondents, the association between outcome variables and Student Vote was modelled by controlling for:

  • Length of time teaching civics
  • Gender
  • Whether the teacher was born in Canada
  • School type
  • Previous participation in Student Vote

The recruitment of schools for participation in the survey excluded non-traditional and adult educational institutions. As such, school type consisted of four categories:

  • Elementary school
  • Middle school
  • Secondary school
  • Combined grades
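As a rough illustration of the modelling approach, the sketch below specifies a logistic regression for a binary student outcome and a linear regression for a continuous outcome using the covariates listed above. The report presents only the fitted results (Appendix A); all variable names and data here are hypothetical stand-ins, not the evaluation's actual dataset.

```python
# Illustrative sketch only; variable names and data are hypothetical stand-ins
# for the survey measures described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
students = pd.DataFrame({
    "intends_to_vote": rng.integers(0, 2, n),            # binary outcome, e.g. intention to vote
    "knowledge_score": rng.normal(3.0, 1.0, n),          # continuous outcome, e.g. knowledge scale
    "gender": rng.choice(["female", "male"], n),
    "born_in_canada": rng.integers(0, 2, n),
    "school_type": rng.choice(
        ["elementary", "middle", "secondary", "combined"], n),
    "prior_student_vote": rng.integers(0, 2, n),          # student's prior Student Vote experience
    "school_prior_student_vote": rng.integers(0, 2, n),   # teacher's/school's prior experience
    "school_bootcamp": rng.integers(0, 2, n),             # Democracy Bootcamp participation
})

covariates = ("C(gender) + born_in_canada + C(school_type) + prior_student_vote"
              " + school_prior_student_vote + school_bootcamp")

# Binary outcomes modelled with logistic regression, adjusting for the covariates above.
logit_fit = smf.logit(f"intends_to_vote ~ {covariates}", data=students).fit()
print(logit_fit.summary())

# Continuous outcomes modelled with linear regression using the same covariates.
ols_fit = smf.ols(f"knowledge_score ~ {covariates}", data=students).fit()
print(ols_fit.summary())
```

In a model of this form, the coefficient on the Student Vote participation or experience terms estimates the association with the outcome after adjusting for the listed characteristics.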

Additionally, follow-up analyses were conducted to better assess how different components of Student Vote affected outcomes. In particular, previous student and teacher experience with Student Vote was examined, and teacher attendance at a Democracy Bootcamp was assessed to determine its impact on teacher outcomes.

3.3 Inappropriateness of Control Group

Initial analysis of the non-participating samples, which were intended to serve as a control group, raised several concerns. Demographic differences were noted between the non-participating and participating student samples, raising concerns about the comparability of the groups (see Section 3.3.1). Additionally, information provided by members of the non-participating samples indicated that they had been exposed to Student Vote materials, generating concerns about the degree to which the sample was truly non-participating (see Section 3.3.2). As a result of these concerns, it was decided that including the control group in the analyses was not appropriate. Thus, with the exception of reviewing teacher reasons for not participating in Student Vote, the analysis investigates only the pre-/post-program differences within the participating samples. The removal of the control group, however, limits the ability to attribute changes in outcomes directly to the Student Vote program.

3.3.1 Demographic Differences between Participating and Non-participating Samples

As noted in Section 2.4, Student Vote's success in recruiting schools to participate in the program reduced the population from which the non-participating sample could be drawn. In particular, just under three-quarters (72%) of the non-participating schools were elementary schools. This limited the ability to draw an appropriate sample of secondary students for the control, resulting in a large imbalance in the final student sample.

In the participating samples, approximately 50% of the student sample (pre-program – 51%; post-program – 50%) were secondary students. However, for the student control group, only 30% of the final sample were secondary students. Additionally, when the groups were broken down by age, there were larger disparities between the two groups. In the participating samples, approximately 40% of the sample was 14 years or older (pre-program – 44%; post-program – 39%), compared to less than 10% of the control group.

These differences in age distribution between the participating and non-participating student samples generated questions about the comparability of the samples. Additionally, the smaller proportion of secondary students in the control sample raised concerns about the statistical power of the sample to detect differences between the participating and non-participating students.
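To make the power concern concrete, a back-of-envelope calculation with statsmodels shows how a smaller secondary-student subgroup on the control side reduces the power to detect a given difference in proportions. The effect size and group sizes below are hypothetical assumptions, not figures from the evaluation.

```python
# Hypothetical illustration of the statistical power concern; numbers are not from the evaluation.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

effect = proportion_effectsize(0.55, 0.45)  # assumed difference to detect on a binary outcome
analysis = NormalIndPower()

# Power when both samples contribute a similar number of secondary students...
print(analysis.power(effect_size=effect, nobs1=300, ratio=1.0, alpha=0.05))

# ...versus power when the control side contributes far fewer secondary students.
print(analysis.power(effect_size=effect, nobs1=300, ratio=100 / 300, alpha=0.05))
```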

3.3.2 Exposure to Student Vote

A sizeable minority of both non-participating teachers and students reported having participated in Student Vote in the past. Over a quarter of teachers (30%) and students (29%) from the non-participating sample indicated that they had previously participated in a Student Vote program. As such, the degree to which these individuals can be viewed as non-participating is unclear. Previous participation in Student Vote may have increased teacher and student knowledge of, interest in or confidence in politics, thereby inflating the outcomes for the control group. These inflated outcomes would attenuate the ability of the analyses to detect differences between individuals who had completed Student Vote and those who had never been exposed to it. Even in the absence of sustained knowledge of, interest in and confidence in politics, previous participation in Student Vote may have taught teachers and students skills and abilities that they could draw on when elections are held. These teachers and students may therefore have been able to use the skills they had learned in a prior Student Vote to benefit more from the 2015 federal election than they would have otherwise.

A larger concern was that teachers in the control group may have used Student Vote materials to teach politics during the election. Student Vote materials were readily available on the program's website during the 2015 federal election and could be accessed and downloaded by all teachers, regardless of whether they were registered with CIVIX. As a result, teachers in the control group (i.e. non-registered teachers) may not have been non-participating, but rather may have used Student Vote methods and materials without registering. As such, students in the control group may have benefited from Student Vote. This concern was initially raised as a result of interviews with non-participating teachers and was further underscored by the finding that 30% of non-participating students had, in fact, participated in a mock vote during the election.

A series of five interviews was completed with teachers who had not participated in Student Vote for the 2015 federal election. Two of the teachers who completed these interviews reported that they had used Student Vote materials during the election. One of the teachers reported going to the Student Vote website and accessing the materials for use in class without registering for the program. The other teacher reported previously participating in Student Vote and continuing to use the older materials. Thus, students in these classes experienced Student Vote. However, since their teachers did not register with CIVIX, they were categorized as non-participating.


Footnote 1 Data collected from the student surveys and teacher surveys were linked only through a school identity code; as a result, it was not possible to link specific student data with individual teachers. Since some schools had more than one teacher participate in the survey, a school was assigned to the previous-experience category or the Democracy Bootcamp category if at least one of its teachers reported being in that category. Thus, some students may have been incorrectly categorized.