
Discussion Paper 2: The Impact of Social Media Platforms in Elections

Organic and Paid Content on Platforms

During the 2019 general election, citizens used a wide range of digital and social media platforms to create and engage with content, from memes to leaders' debates. Regulated political entities, including candidates, political parties and third parties, were also active on several major platforms, as were media outlets and civil society groups. Elections Canada used organic content and digital ads to explain where, when and how to register and vote, and participated in some platforms' special initiatives to increase election awareness and voter registration.9

The social media market in Canada is large, with approximately 25 million active social media users. The market is dominated by a handful of major proprietary platforms, the most popular being YouTube (85% of Internet users aged 16 to 64 report using it in the past month), Facebook (79%), Instagram (53%), Twitter (40%) and Pinterest (35%). Canadians are also active on LinkedIn (29%), Snapchat (28%), Reddit (25%), WeChat (9%) and TikTok (9%), among others.10

Digital and social media platforms differ from other spaces where Canadians encounter and interact with people, information and ideas. To have a productive discussion about how platforms shape the digital information sphere and impact the democratic process, and to think about addressing regulatory gaps therein, it is important to understand platforms' functionality and architecture. The following section draws on scholarship to describe how digital and social media platforms curate, moderate and disseminate organic and paid content.

Organic content

Organic content on social media platforms—content that the poster (i.e. an individual or organization) does not pay the platform to publish or distribute—is subject to relatively few regulations in Canada.11 It is subject to existing criminal and civil laws around expression. In the pre-election and election periods, it is also subject to some Canada Elections Act (CEA) provisions that forbid the impersonation of Elections Canada and political entities, the publication of misleading information for the purposes of affecting the election outcome, and some types of false statements about candidates and people associated with parties.

Otherwise, the governance of organic content largely falls to the platforms themselves. Before the 2019 general election, many platforms committed to taking measures to safeguard electoral integrity, such as by offering enhanced security for political campaigns' accounts and by setting up dedicated contact channels for political entities and Elections Canada to report election-related incidents to the platform. Some platforms also signed on to the government-led Canada Declaration on Electoral Integrity Online, a voluntary pledge to remove fake accounts, bots and inauthentic content.12

Platforms govern organic content by defining what is possible, permitted and promoted. They determine what actions are available to users, such as by offering buttons that let users "like" or "share" content with a click.13 Most platforms display engagement metrics (numbers of likes, shares, follows, etc.), identifying the popularity of content and offering users real-time feedback on the "performance" of their posts.14 Through the creation and enforcement of policies such as terms of use and community standards, platforms moderate accounts and content and remove what they deem unacceptable.15 Platforms also promote some content over others, through recommendation algorithms.16

Recommendation algorithms exercise strong influence: they determine what each user sees, sees first and does not see in their timeline (Facebook News Feed, YouTube queue or equivalent). Though algorithms differ from one platform to another, change often and are largely unknown to users and outside entities, researchers have a general idea of how they work.

Many recommendation algorithms assign a relevancy score to each piece of content, based on each user's personal and behavioural profile (described below), to predict how a given user will respond to each item.17 They then use this information to rank content in each user's timeline, placing content that generates engagement—clicks, video plays, likes, comments or shares—at the top.18
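The ranking process described above can be sketched in code. This is a deliberately simplified illustration, not any platform's actual algorithm: the feature names, weights and scoring formula are all assumptions made for clarity.

```python
# Hypothetical sketch of engagement-based feed ranking.
# Weights and feature names are illustrative assumptions only.

def relevancy_score(user_profile: dict, post: dict) -> float:
    """Predict how strongly this user is likely to engage with this post."""
    score = 0.0
    # Reward predicted engagement signals (shares weigh most heavily here).
    score += 3.0 * post["predicted_share_prob"]
    score += 2.0 * post["predicted_comment_prob"]
    score += 1.0 * post["predicted_like_prob"]
    # Boost content that matches the user's inferred interests.
    shared_interests = user_profile["interests"] & post["topics"]
    score += 0.5 * len(shared_interests)
    return score

def rank_timeline(user_profile: dict, candidate_posts: list) -> list:
    """Order candidate posts so the highest-scoring appear first."""
    return sorted(candidate_posts,
                  key=lambda p: relevancy_score(user_profile, p),
                  reverse=True)
```

The key point the sketch captures is that ranking is personalized (the score depends on the user's profile) and engagement-driven (content predicted to generate clicks, comments or shares rises to the top).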

By promoting content that generates strong reactions, algorithms generally reward and increase the reach of content that is provocative, entertaining and shocking.19 Research suggests that these effects of recommendation algorithms, coupled with the human tendency to respond to emotional content, can impair users' ability to detect and access reliable information. For instance, a study of engagement on Twitter found that posts exhibiting "indignant disagreement" received nearly twice as much engagement as other types of content, and that each moral or emotional word used in a tweet (such as "greed," "evil" or "shame") boosted its reach by 20%.20 Another study found that users are more likely to believe and share articles with emotional headlines, even when they have not read or evaluated the article.21 A third study showed that Twitter users are 70% more likely to share untrue news—often sensational and novel—than factual news.22 While content creators in other media may also skew their content toward the sensational to get better reach, the tendency is more strongly reinforced on social media platforms, because they offer real-time feedback (likes, shares, retweets) on how audiences are reacting.23

Users may also have difficulty assessing the validity of information on these platforms because content is, for the most part, presented without obvious clues about its source, authenticity, quality, or the interests of those who created or shared it.24 Unable to examine the facts and motives behind every post they see, users can take cognitive shortcuts; they tend to believe posts that come from friends, that are repeated or that are accompanied by photos.25 Users' challenges in assessing information are compounded when an individual they trust, such as a political leader, posts or shares inaccurate or misleading information.26

"If there's one fundamental truth about social media's impact on democracy it's that it amplifies human intent—both good and bad. At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy. I wish I could guarantee that the positives are destined to outweigh the negatives, but I can't." Samidh Chakrabarti, Facebook's Product Manager for Civic Engagement27

Data-driven digital advertising

The digital display advertising market in Canada is significant: in 2019, 53.5% of the total amount spent on advertising was spent on digital ads, representing $8.8 billion.28 As with organic content, during the 2019 general election, Canadians encountered political content in the form of digital ads. This section provides a high-level description of how data-driven digital advertising services function, particularly in the Canadian electoral context.

Most social media platforms deliver data-driven advertising; they use automated software that lets advertisers exploit user data to target and optimize ads. Other companies, notably Google, also offer data-driven ads, displayed on search engine results pages and websites.29 Today, virtually all digital ads are data-driven,30 with Google and Facebook accounting for almost three-quarters of the Canadian market.31

Ads are generally subject to more regulation than organic content. In the pre-election and election periods defined by the CEA, certain ads are subject to spending limits,32 reporting requirements and taglines disclosing who paid for them.33 Before the 2019 general election, a new CEA provision came into force: it requires platforms and websites reaching a defined threshold of visitors and running regulated ads posted in the pre-election or election period to create ad registries.34

Ad registries must make regulated ads public and provide information on who paid for each ad.35 Major platforms responded to this new requirement in various ways: Google announced it would ban all issue and partisan ads in Canada;36 Twitter announced it would accept such ads in the election period only;37 and Facebook and some other platforms continued to accept these ads and published them in ad registries.38

How data-driven digital advertising services work

Data-driven advertising services deliver, optimize and price ads in ways that are different from those of traditional advertising channels, such as broadcast television. Their features, designed for commercial purposes, are very powerful, enabling advertisers to target consumers precisely and optimize their ads based on detailed real-time feedback on their performance with targeted segments of their audience.39

At the heart of digital advertising is data: platforms may hold tens of thousands of attributes on a single user.40 This includes data that users provide directly, inferred data and behavioural data.41 It can also include psychographic data42 and highly detailed personal information purchased from data brokers.43

Platforms use these rich data to group users into segments for ad buyers—one reportedly offers 29,000 segments based on characteristics that include ethnic affinity, income, support for breastfeeding and smart device usage.44 Some platforms allow advertisers to upload email addresses, phone numbers or postal codes to target known individuals.45 Some also permit advertisers to target users who resemble a known audience, based on characteristics that the platform "thinks" are salient, but that are not made explicit to the advertiser or ad targets.46
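The "upload a list to target known individuals" feature typically works on hashed identifiers rather than raw contact details. The sketch below illustrates the general idea; the normalization rules and use of SHA-256 are assumptions for illustration, not a description of any specific platform's implementation.

```python
# Illustrative sketch of matching an advertiser's uploaded contact list
# against a platform's user base. Hashing with SHA-256 over normalized
# emails is an assumption made for this example.
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase and trim the email, then hash it."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def match_audience(advertiser_emails: list, platform_user_hashes: set) -> set:
    """Return the hashes present in both the upload and the user base."""
    uploaded = {normalize_and_hash(e) for e in advertiser_emails}
    return uploaded & platform_user_hashes
```

The matched set becomes the ad's audience; only users whose hashed identifier appears in both lists are targeted.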

Leveraging these detailed user data sets, machine-learning algorithms then optimize ads automatically and instantly, zeroing in on who is most receptive to a given message, when and in what format.47 Using these tools, advertisers can profile consumers based on their susceptibility to various appeals, tweaking ads through multiple iterations to find the most effective persuasion strategy for each individual user, such as appeals to authority or identity, or favourite colours or images.48 These machine-learning-driven ad services let advertisers iterate continually, often running thousands or tens of thousands of ads at the same time, many to small audiences, to see which ones "stick."
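The continual iteration over many ad variants resembles a classic multi-armed bandit problem: keep serving many variants, but shift impressions toward whichever variant performs best. The epsilon-greedy strategy below is a stand-in assumption for illustration; actual platform optimizers are proprietary and far more sophisticated.

```python
# Toy sketch of automated ad optimization as an epsilon-greedy bandit.
# The platform mostly serves the best-performing variant ("exploit")
# but occasionally serves others to keep learning ("explore").
import random

class AdOptimizer:
    def __init__(self, variant_ids, epsilon=0.1):
        self.epsilon = epsilon
        self.impressions = {v: 0 for v in variant_ids}
        self.engagements = {v: 0 for v in variant_ids}

    def choose_variant(self) -> str:
        if random.random() < self.epsilon:
            # Explore: serve a random variant.
            return random.choice(list(self.impressions))
        # Exploit: serve the variant with the best engagement rate so far.
        return max(self.impressions,
                   key=lambda v: self.engagements[v] / (self.impressions[v] or 1))

    def record(self, variant: str, engaged: bool) -> None:
        """Update stats after each impression."""
        self.impressions[variant] += 1
        self.engagements[variant] += int(engaged)
```

Run at platform scale and fed real-time engagement data, this feedback loop is what lets thousands of simultaneous variants converge on the messages that resonate with each audience segment.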

The pricing model for data-driven advertising is also unique. Many ad services on platforms offer a real-time bidding system, where multiple advertisers compete to reach particular audiences. For example, if company A wishes to advertise to skiers, and company B to women in Alberta, when a female skier from Calgary logs on, the platform weighs the companies' bids and serves the woman the winning bidder's ad, based on calculations it has completed in microseconds.49

In choosing bids, some platforms consider factors other than price, such as relevancy—the likelihood that users exposed to the ad will watch, click or share it.50 As one platform puts it, "we subsidize relevant ads in auctions, so more relevant ads often cost less and see more results."51 Platforms' data-driven advertising services also let advertisers optimize campaigns to achieve specific behaviours: for example, if an advertiser optimizes for "shares," the platform shows ads to people who are the most likely to share, at a lower cost.52
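A relevancy-weighted auction of the kind described above can be sketched as follows. The formula (effective bid = price bid × estimated relevance) is an illustrative assumption capturing the idea that "more relevant ads often cost less," not a platform's actual auction mechanics.

```python
# Minimal sketch of a relevancy-weighted ad auction: the winner is not
# necessarily the highest price bidder, because estimated relevance
# scales each bid. The formula is an assumption for illustration.

def run_auction(bids: list) -> str:
    """bids: list of (advertiser, price_bid, estimated_relevance) tuples.
    Returns the advertiser with the highest effective bid."""
    def effective_bid(entry):
        _, price, relevance = entry
        return price * relevance
    winner = max(bids, key=effective_bid)
    return winner[0]
```

In this sketch, an advertiser bidding $1.00 with 0.9 estimated relevance beats one bidding $2.00 with 0.2 relevance, which is one way a more relevant ad can "cost less and see more results."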

Data-driven advertising services are popular among advertisers precisely due to the services' use of data and ability to optimize ad delivery at a relatively low cost to the advertiser.

Users are often not aware of how and why they see particular ads, nor are they aware of the impact that their clicks, likes and shares will have on the ads they will see in the future. Likewise, advertisers themselves may lack information on how these data-driven ads function. For instance, due to the ad-pricing models that many advertising services use, advertisers may not know how much a competitor is being charged for the delivery of their ads, and may be charged more or less than their competitors, depending on the content of the ad and who they are seeking to reach with it.53 This raises questions about how level the playing field is for political actors purchasing digital ads.

Footnotes

Footnote 9 Elections Canada has accounts in both official languages on Facebook, Twitter, Instagram, YouTube and LinkedIn. On election day, these accounts had a combined 109,000 followers. Our social media efforts were part of an extensive multi-channel awareness campaign to inform Canadians about when, where and how to register and vote. For example, Facebook encouraged eligible Canadian users to register to vote by sharing a link to Elections Canada's online registration service.

Footnote 10 GlobalWebIndex (Q3 2019), cited in We Are Social, "Digital 2020: Canada" (2020), slide 43. https://wearesocial.com/ca/digital-2020-canada. Figures are based on internet users' self-reported behaviour.

Footnote 11 Although the distribution of organic content is subject to relatively few regulations, it should be kept in mind that costs associated with the production of the posts (e.g. for a video that is later shared organically) may constitute expenses that are subject to regulation under the political financing rules of the Canada Elections Act (hereinafter "CEA").

Footnote 12 Government of Canada, "Canada Declaration on Electoral Integrity Online" (2019). https://www.canada.ca/en/democratic-institutions/services/protecting-democracy/declaration-electoral-integrity.html/. See also Joan Bryden, "Several tech giants sign onto Canadian declaration on electoral integrity," Global News, May 27, 2019. https://globalnews.ca/news/5323084/tech-giants-electoral-integrity/

Footnote 13 Jonathan Haidt and Tobias Rose-Stockwell, "The Dark Psychology of Social Networks: Why It Feels Like Everything Is Going Haywire," The Atlantic, December 2019. https://www.theatlantic.com/magazine/archive/2019/12/social-media-democracy/600763/

Footnote 14 Ibid. One form this takes is the practice of testing multiple variants of headlines to find the version that generates the highest click-through rate, then pushing out that successful variant; this has given rise to so-called clickbait headlines of the "... you won't believe what happened next" variety.

Footnote 15 For example, see Facebook's Community Standards at https://www.facebook.com/communitystandards/objectionable_content/ or Twitter's Rules at https://help.twitter.com/en/rules-and-policies/twitter-rules/

Footnote 16 Renee Diresta, "Up Next: A Better Recommendation System," Wired, November 4, 2018. https://www.wired.com/story/creating-ethical-recommendation-engines/

Footnote 17 Will Oremus, "Who Controls Your Facebook Feed," Slate, January 3, 2016. http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html/

Footnote 18 Taylor Owen, "The Case for Platform Governance," CIGI Papers No. 231 (November 2019), 3–4. https://www.cigionline.org/sites/default/files/documents/Paper%20no.231web.pdf

Footnote 19 Will Oremus, "Who Controls Your Facebook Feed," Slate, January 3, 2016. http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html/. Oremus notes, for example, that Facebook's algorithms tend to reward content that is "engineered to go viral" by drowning out other types of posts on users' newsfeeds.

Footnote 20 William J. Brady, Julian A. Wills, John T. Jost, Joshua A. Tucker and Jay J. van Bavel, "Emotion Shapes the Diffusion of Moralized Content in Social Networks," Proceedings of the National Academy of Sciences of the United States of America 114, no. 28 (July 11, 2017; first published June 26, 2017): 7313–18, at 7313. https://doi.org/10.1073/pnas.1618923114/; Pew Research Center, "Partisan Conflict and Congressional Outreach," February 23, 2017. https://www.people-press.org/wp-content/uploads/sites/4/2017/02/LabsReport_FINALreport.pdf, cited in Jonathan Haidt and Tobias Rose-Stockwell, "The Dark Psychology of Social Networks: Why It Feels Like Everything Is Going Haywire," The Atlantic, December 2019. https://www.theatlantic.com/magazine/archive/2019/12/social-media-democracy/600763/

Footnote 21 Giovanni Luca Ciampaglia and Filippo Menczer, "Biases Make People Vulnerable to Misinformation Spread by Social Media," Scientific American, June 21, 2018. https://www.scientificamerican.com/article/biases-make-people-vulnerable-to-misinformation-spread-by-social-media/

Footnote 22 Soroush Vosoughi, Deb Roy and Sinan Aral, "The spread of true and false news online," Science 359, no. 6380 (March 8, 2018): 1146–51, at 4. https://science.sciencemag.org/content/sci/359/6380/1146.full.pdf

Footnote 23 Hunt Allcott and Matthew Gentzkow, "Social Media and Fake News in the 2016 Election," Journal of Economic Perspectives 31, no. 2 (2017): 211–35, at 212 and 214.

Footnote 24 Siva Vaidhyanathan, Anti-Social Media (New York: Oxford University Press, 2018).

Footnote 25 Eryn J. Newman and Lynn Zhang, "Truthiness: How Non-Probative Photos Shape Belief," in R. Greifeneder, M. Jaffé, E. J. Newman and N. Schwarz (eds.), The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (New York: Routledge, 2020). https://www.researchgate.net/publication/337032205_Truthiness_How_Non-Probative_Photos_Shape_Belief/; American Press Institute, "'Who Shared It?': How Americans Decide What News to Trust on Social Media," March 20, 2017. https://www.americanpressinstitute.org/publications/reports/survey-research/trust-social-media/

Footnote 26 Briony Swire, Adam J. Berinsky, Stephan Lewandowsky and Ullrich K.H. Ecker, "Processing Political Misinformation: Comprehending the Trump Phenomenon," Royal Society Open Science 4, no. 3 (2017): 16. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5383823/

Footnote 27 Samidh Chakrabarti, "Hard Questions: What Effect Does Social Media Have on Democracy?" Facebook blog, January 22, 2018. https://about.fb.com/news/2018/01/effect-social-media-democracy/

Footnote 28 Paul Briggs, "Canada Digital Ad Spending 2019," eMarketer, March 28, 2019. https://www.emarketer.com/content/canada-digital-ad-spending-2019/

Footnote 29 "Ads and data," Safety Centre, Google. https://safety.google/privacy/ads-and-data/

Footnote 30 Quoted in Bree Rody, "Programmatic to Dominate in Canada by 2020: Study," Media in Canada, November 26, 2018. https://mediaincanada.com/2018/11/26/programmatic-to-dominate-in-canada-by-2020-study/

Footnote 31 Paul Briggs, "Canada Digital Ad Spending 2019," eMarketer, March 28, 2019. https://www.emarketer.com/content/canada-digital-ad-spending-2019/

Footnote 32 CEA, S.C. 2000 c. 9., ss 350(1)(b), 429.1 and 430(1).

Footnote 33 For digital ads where space does not permit a tagline, it is acceptable to put information on the page the user saw after clicking on the ad. For more details, see Elections Canada, "Election Advertising on the Internet" (Interpretation Note 2015-04), July 30, 2015. https://www.elections.ca/content.aspx?section=res&dir=gui/reg&document=index&lang=e/

Footnote 34 CEA, supra, ss 325.1 and 325.2.

Footnote 35 "New Registry Requirements for Political Ads on Online Platforms," Elections Canada. https://www.elections.ca/content.aspx?section=pol&dir=regifaq&document=index&lang=e/

Footnote 36 Tom Cardoso, "Google to ban political ads ahead of federal election, citing new transparency rules," The Globe and Mail, March 4, 2019. https://www.theglobeandmail.com/politics/article-google-to-ban-political-ads-ahead-of-federal-election-citing-new/

Footnote 37 Michele Austin, "An update on Canadian political advertising," Twitter blog, August 29, 2019. https://blog.twitter.com/en_ca/topics/company/2019/update_canadian_political_advertising_2019.html/

Footnote 38 "Updates to ads about social issues, elections or politics," Facebook for Business, updated June 25, 2019. https://www.facebook.com/business/news/updates-to-ads-about-social-issues-elections-or-politics; The Digital Advertising Alliance of Canada provides a link to ad registries for Bell Media, CBC–Radio Canada, Facebook, The Globe and Mail, Postmedia and Rogers: https://politicalads.ca/en/registries/

Footnote 39 Anthony Nadler, Matthew Crain and Joan Donovan, "Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech," Data & Society, October 17, 2018, 18.

Footnote 40 Ibid., 11–12.

Footnote 41 Ibid., 11–13.

Footnote 42 "Psychographics" refer to data collected by analyzing consumers' "activities, interests, and opinions," such as taste in movies, restaurants and music, type of car, reading and TV viewing habits, locations and club memberships (CBInsights, 2018; Kranish, 2016). Such data go beyond standard demographic data used in marketing, such as age, gender or race, and can be used to generate highly tailored messages that "trigger a range of emotional and subconscious responses" by appealing to each consumer's vulnerabilities and biases (CBInsights, 2018; Chester and Montgomery, 2017, 6–7). Companies such as Cambridge Analytica have used psychographic data to develop messages "tailored to the vulnerabilities of individual voters" in an effort to influence the recipient's vote choice (Chester and Montgomery, 2017, 6–7). See CBInsights, "What is Psychographics? Understanding the 'Dark Arts' of Marketing that Brought Down Cambridge Analytica," CBInsights, June 7, 2018. https://www.cbinsights.com/research/what-is-psychographics/; Jeff Chester and Kathryn C. Montgomery, "The Role of Digital Marketing in Political Campaigns," Internet Policy Review 6, no. 4 (December 2017): 1–20, at 6–7; Michael Kranish, "Trump's plan for a comeback includes building a 'psychographic' profile of every voter," The Washington Post, October 27, 2016. https://www.washingtonpost.com/politics/trumps-plan-for-a-comeback-includes-building-a-psychographic-profile-of-every-voter/2016/10/27/9064a706-9611-11e6-9b7c-57290af48a49_story.html/

Footnote 43 Anthony Nadler, Matthew Crain and Joan Donovan, "Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech," Data & Society, October 17, 2018, 13, 18.

Footnote 44 Julia Angwin, Surya Mattu and Terry Parris, Jr, "Facebook Doesn't Tell Users Everything It Really Knows About Them," ProPublica, December 27, 2016. https://www.propublica.org/article/facebook-doesnt-tell-users-everything-it-really-knows-about-them/

Footnote 45 For example, Facebook names this feature "custom audiences." https://www.facebook.com/business/help/341425252616329?id=2469097953376494

Footnote 46 For example, Facebook names this feature "lookalike audiences." https://www.facebook.com/business/help/164749007013531?id=401668390442328/

Footnote 47 Jerry Dischler, "Putting Machine Learning into the Hands of Every Advertiser," Google blog, July 10, 2018. https://www.blog.google/technology/ads/machine-learning-hands-advertisers/

Footnote 48 Matt Gay, "Machine Learning Will Transform the Advertising Industry," CMS Wire, October 26, 2017. https://www.cmswire.com/digital-marketing/machine-learning-will-transform-the-advertising-industry/

Footnote 49 This example was adapted from one provided by Facebook. Facebook, "Ad delivery," Facebook Business Help Center. https://www.facebook.com/business/help/430291176997542?id=561906377587030/

Footnote 50 Facebook, "Ad pricing," Facebook Business Help Center. https://www.facebook.com/business/help/201828586525529?id=629338044106215/

Footnote 51 Facebook, "Ad delivery," Facebook Business Help Center. https://www.facebook.com/business/help/430291176997542?id=561906377587030/

Footnote 52 According to Facebook, "When you make your 'Optimization for Ad Delivery' choice for an ad set, you're telling us to get you as many/much of that result as efficiently as possible. For example, if you optimize for link clicks, your ads are targeted to people in your audience who are most likely to click the ads' links." Facebook for Business, "About Optimizing for Ad Delivery." https://www.facebook.com/business/help/355670007911605?id=561906377587030/

Footnote 53 A former member of Facebook's advertising team claimed, in an op-ed piece, that the platform's advertisement architecture led to significantly lower advertisement fees for Trump's campaign compared to Clinton's during the 2016 election: "During the run-up to the election, the Trump and Clinton campaigns bid ruthlessly for the same online real estate in front of the same swing-state voters. But because Trump used provocative content to stoke social media buzz, and he was better able to drive likes, comments, and shares than Clinton, his bids received a boost from Facebook's click model, effectively winning him more media for less money." Antonio Garcia Martinez, "How Trump Conquered Facebook – Without Russian Ads," Wired, February 23, 2018. https://www.wired.com/story/how-trump-conquered-facebookwithout-russian-ads/