Youth Engagement and Mobilization in the 2010 Toronto Municipal Election
1. Research Design and Methodology
Turnout in Canadian elections has been declining for decades, and this decline is being driven by youth. As they become eligible to vote, fewer and fewer young Canadians are choosing to cast a ballot (Blais et al. 2004). Similar declines have been observed in most industrialized democracies over the last half-century (Ibid.).
In spite of this, information on the nature and causes of low youth turnout is relatively scarce. In Canada, the premier dataset for research into voting behaviour is the Canadian Election Study (CES), a national survey fielded for every federal election since 1965. However, even the CES gathers limited information about youth. For example, of the 4,495 Canadians who responded to the CES campaign-period survey in 2008, only 213 were aged 18‑24 (Canadian Election Study 2008). Even when respondents from multiple surveys are pooled to create a larger sample, making useful comparisons between youth sub-populations (e.g. rural, urban and suburban) can be difficult.
Even less information is available about youth participation in Canadian municipal elections. Most municipal authorities do not collect age-segmented turnout data, which precludes even the most basic form of analysis. The minimal data available suggest that youth turnout is low. For example, Ward 27 in downtown Toronto contains much of the campuses of Ryerson University and the University of Toronto. Overall turnout in the ward was 56% in the 2010 municipal election, while turnout in the four subdivisions covering the university campuses was 35% (City of Toronto 2010).2
Over the last decade, research into voter mobilization and Get Out The Vote (GOTV) initiatives has burgeoned in the United States, largely thanks to the adoption of field experiments to study turnout. The most consistent finding of these experiments is that face-to-face contact with a potential voter is the most effective way to mobilize them (Green and Gerber 2008). When they are contacted, young voters are equally responsive to these appeals (Nickerson 2006). In short, the evidence shows that youth mobilization matters.
For anyone seeking to engage youth during Canadian elections, the shortage of information about youth can be a challenge. Faced with limited resources, organizations wishing to mobilize youth during elections benefit from information about how and where to target their efforts. This translates into two basic research questions: How does electoral engagement differ among youth sub-populations? And what is the impact of existing youth mobilization initiatives?
The 2010 Toronto municipal election provided an opportunity to begin answering these questions. The City of Toronto decided to launch a new outreach initiative for 2010, including a focus on youth. As part of this initiative, Toronto Elections brought together a network of partner organizations from across the city seeking to engage youth, including community groups, NGOs, youth-serving organizations, and post-secondary institutions.
Apathy is Boring took this opportunity to conduct a survey of youth in Toronto, as well as a study of Toronto Elections' partner organizations. This mixed methodology approaches the research questions from two directions: an organizational analysis of real-world youth mobilization initiatives, and a quantitative analysis of youth engagement on the ground.
1.1 Survey of Youth in Toronto
Sample and Distribution
The Toronto Youth Election Survey was conducted on-line both before and after the 2010 Toronto municipal election. The final sample for the survey was 796 eligible voters in the city of Toronto between the ages of 18 and 35. The survey sample was not randomly selected. Rather, the survey was promoted to the public and respondents chose to participate.
Random selection from a population is the preferred approach for any survey. However, when the population in question is youth, traditional survey methodologies typically involve high costs or significant shortcomings. For example, most telephone surveys only reach Canadians with landlines. However, the most recent Residential Telephone Service Survey by Statistics Canada, conducted in 2008, found that "34.4% of households comprised solely of adults aged between 18 and 34 relied exclusively on cell phones. Among all other households the rate was 4.5%" (Statistics Canada 2008). Although it is possible to construct a sample that includes both cell-only and landline households, this adds to the already prohibitive cost of trying to reach youth with telephone surveys.
Given these shortcomings and the focus of this research, an on-line survey was used. The survey questionnaire was available to the public, with respondents screened for eligibility before they could begin the questionnaire. To encourage participation, Apathy is Boring promoted the survey in conjunction with Toronto Elections and their partner organizations. Links to the survey were distributed through e-mail newsletters, social media, partner Web sites, and posters. To increase the survey's appeal to youth at large, it was incentivized with a contest to win one of several free iPods. This incentive was featured prominently in promotional materials.
As the respondents chose to participate, the survey sample exhibits a self-selection bias. Youth participating in the survey are more likely to be engaged than their peers, which limits external validity. The value provided by this survey comes from making relative comparisons within a sample of youth from a specific city, with respondents who tend to be more engaged than average.
Two-wave Design
The survey used a two-wave design with both waves administered on-line. The first questionnaire was available to the public from October 4 to 22 (three days before the election). The day after the election, participants received a follow-up questionnaire by e-mail. Because follow-up questionnaires were sent directly by e-mail, respondents' pre- and post-election responses could be matched, treating both questionnaires as a single survey case. Of the 796 first-wave respondents, 443 (56%) completed the follow-up questionnaire.
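The matching step amounts to a simple keyed join between the two waves of responses. The sketch below illustrates the idea, assuming a respondent's e-mail address as the key; the field names (`interest`, `voted`, and so on) are invented for illustration and are not those of the actual questionnaires:

```python
# Sketch: match pre- and post-election responses into single survey
# cases, keyed on respondent e-mail. Field names are hypothetical.

def match_waves(wave1, wave2):
    """Join wave-1 and wave-2 responses on the 'email' field."""
    followups = {r["email"]: r for r in wave2}
    cases = []
    for r in wave1:
        case = dict(r)
        case["completed_wave2"] = r["email"] in followups
        if case["completed_wave2"]:
            # Merge the post-election answers into the same case.
            case.update(followups[r["email"]])
        cases.append(case)
    return cases

wave1 = [{"email": "a@example.ca", "interest": 4},
         {"email": "b@example.ca", "interest": 2}]
wave2 = [{"email": "a@example.ca", "voted": True}]

cases = match_waves(wave1, wave2)
retention = sum(c["completed_wave2"] for c in cases) / len(cases)
```

In this toy example one of two respondents completes the follow-up, giving a retention rate of 0.5; the actual survey retained 443 of 796 respondents (56%).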
A two-wave design was chosen to schedule the initial distribution during the municipal election campaign. Respondents are more likely to complete surveys that they perceive as timely and relevant to current events (Cook, Heath and Thompson 2000). The two-wave design involved distributing the first questionnaire at the height of the election campaign, when it was most timely, while also collecting post-election information such as turnout. The two-wave design was also chosen to potentially allow for a quasi-experimental analysis in conjunction with the other information collected, although this analysis was not ultimately conducted for this report.3
Survey Questionnaires
The survey questionnaires were developed in conjunction with Elections Canada and Toronto Elections. Along with standard socio-demographic indicators, the survey included questions that spoke to both of our research questions. The questionnaire included measures of civic duty, community activity, political knowledge and political interest, to assess respondents' patterns of engagement. It also included indicators to assess election outreach and mobilization programs, ranging from contact with an election campaign to receiving a voter information card in the mail.
Election surveys face a number of limitations, the most notable being that they consistently overestimate turnout (Karp and Brockington 2005). This occurs for two main reasons. First, there is a selection bias: respondents who are willing to participate in a survey are also more likely to vote than non-respondents.
Second, there is a social desirability bias: some respondents will falsely report voting on surveys because of positive social norms surrounding voting (Bernstein, Chadha and Montjoy 2001). Misreported turnout is particularly problematic for surveys because it has been associated with other respondent traits. For example, respondents who falsely report voting also report higher levels of education, civic duty and political attentiveness than honest non-voters (Presser and Traugott 1992; Karp and Brockington 2005).4
To compensate for misreporting, the survey uses an adaptation of the American National Election Studies' turnout question, which offers respondents several socially acceptable reasons for not voting. This question wording has been shown to attenuate turnout over-reporting (Duff et al. 2007). However, like most election surveys, this one has an inflated turnout rate. Overall turnout in the 2010 Toronto municipal election was 51%, whereas 71% of survey respondents reported voting.
Election surveys are also limited by their use of self-reporting to measure exposure to campaigns. Survey respondents may be unable to remember contact, or may falsely remember contact where there was none. To compensate for these limitations, the questionnaire included a battery of 14 different campaign contact indicators, capturing the relative rates of different contact methods.
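One common way to use such a battery is to collapse the binary contact items into a per-respondent contact count, which is more robust to error on any single item than relying on one indicator alone. The sketch below uses invented indicator names; the actual questionnaire contained 14 such items:

```python
# Collapse a battery of binary campaign-contact indicators into a
# per-respondent contact count. Indicator names are invented for
# illustration; the actual battery had 14 items.
CONTACT_ITEMS = ["door_knock", "phone_call", "email", "social_media"]

def contact_count(respondent):
    """Number of distinct contact methods the respondent reported."""
    return sum(1 for item in CONTACT_ITEMS if respondent.get(item))

r = {"door_knock": True, "email": True, "phone_call": False}
print(contact_count(r))  # 2
```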
1.2 Interviews and Mobilization Assessment
Setting and Sample
Toronto Elections assembled a network of more than 40 partner organizations seeking to engage youth directly during the 2010 municipal election.5 This network included youth-serving organizations, community organizations, post-secondary institutions, student unions and other non-profit organizations. Representatives from each organization first came together at City Hall on July 22, 2010, to discuss their plans and opportunities for collaboration around the election.
Representatives from 43 of the partner organizations were invited to participate in post-election interviews and a mobilization assessment program. Twenty-two representatives did not participate, with nine of them declining because their organizations ultimately were not active during the election. The final sample of 22 interview participants is therefore skewed towards the more active organizations in the network.6
Post-election Interviews
Each participant was contacted by an Apathy is Boring staff member for an interview after the election. The goal of these interviews was to secure qualitative feedback from each organization about their work during the election.
Fifteen interviews were conducted in person in mid-November, and the remaining seven were conducted by phone in the following month. Many of the participants represent organizations that rely on funding from government agencies and departments. To encourage honest responses, the interviews were conducted on a semi-anonymous basis: by default, no comments were attributed to specific organizations or individuals. Participants also had the option to make any of their comments fully anonymous.
Each interview included a consistent set of 14 questions to secure information about each organization's election mobilization activities, as well as to solicit feedback and best practices.7 Given that the sample included some of the most active organizations in terms of youth mobilization, we wanted to identify common characteristics and patterns. Along with information on the planning and deployment of youth mobilization initiatives, the interviewer asked for background information about each organization. Participants were explicitly asked to identify any challenges they faced and to make suggestions for consideration by election agencies.
Mobilization Assessment
Representatives were asked to record their organization's mobilization activity during the election in a standardized format. Each activity was recorded individually, along with its time, date, location and estimated reach.8 For example, some organizations reported organizing election debates, while others reported canvassing specific streets or hosting workshops.
For the final mobilization assessment, only activities tied to specific geographic locations were included in the analysis. Although some information was collected on printed materials and on-line outreach, it was impossible to reliably assess their dissemination, and they were therefore excluded from the analysis.9
2 The four subdivisions and their respective individual turnout rates are: Subdivision 16 (41%), Subdivision 29 (37%), Subdivision 52 (43%), and Subdivision 67 (28%).
3 See Appendix A for an explanation of the quasi-experimental analysis and why it was not conducted.
4 Karp and Brockington also found a weak relationship between age and false reporting, but it did not appear to have a significant impact on the results of regression analysis using self-reported turnout versus actual turnout.
5 These organizations were also part of Toronto Elections' network of 121 communication partners.
6 ArtsVote Toronto, an organization from outside of Toronto Elections' partner network, was included as the 22nd participant because a number of survey respondents reported contact with their campaign.
7 See Appendix B for the full list of interview questions.
8 This information was originally supposed to be tracked by all participants with a standardized tracking spreadsheet. However, few participants complied with the full protocol. As a result, most of the data was compiled after the election through personal follow-ups and organizational records.
9 The printed materials were distributed passively (e.g. on newsstands) throughout the city, so the only reliable information available was the number of copies printed. Similarly, few organizations had reliable or consistent metrics for their on-line outreach.