Within days of each other in May of 2025, three polls were released on the topic of Alberta separation. Pollara showed support among Albertans for separating from Canada at 24 per cent. Leger found support for leaving was much higher: a whopping 41 per cent. Then an Abacus poll came out. “Remainers” could breathe a sigh of relief. Support for separating was at only 18 per cent.
Ordinary Albertans would be forgiven for not knowing what to believe.
Further muddying matters was Pollara’s declaration that its figure on separation was “the highest recorded by the company” since 2021. But even this surging support for Alberta separation was dwarfed by the Leger poll. And then Abacus flatly declared “Support for separation is low.” Its report concluded: “There is not a widespread appetite among the public for such a significant constitutional rupture.”
Polling—also called surveying—takes many different forms, and in Alberta this year we’ll see poll numbers touted by separatists, and other numbers cited by their federalist opponents. With so much on the line, how useful is polling for providing accurate insights into what people want?
Public opinion polling started in Canada during the 1940s. The first national poll was conducted by the Liberal Party in 1942 to try to determine the likely outcome of a plebiscite on conscription. The first election poll was done by the Canadian Institute of Public Opinion in 1945. The first political poll—used to help craft a party’s electoral strategy—was by the Quebec Liberals in 1959. Yet only since the 1980s have national polls been conducted regularly. Polling of public opinion at the provincial level is now routine as well.
Sometimes polling itself becomes the story. For example, during Alberta’s 2012 election, most pollsters projected a win by the Wildrose, led by Danielle Smith. As it turned out, the Progressive Conservatives under Alison Redford were the victors by a comfortable margin. “Clearly something’s wrong,” pollster Bruce Cameron told the CBC afterward. “I’m concerned about the impact on the credibility of the profession.”
Similarly, the 2017 civic election in Calgary was marked by conflicting polls, one of which suggested incumbent mayor Naheed Nenshi was losing badly. A post-election investigation by the Marketing Research and Intelligence Association (MRIA) noted that “accurate public opinion polling can provide voters with information about the views of their fellow citizens [and] draw voters’ attention to particular candidates and issues.” It found that several polls by Mainstreet Research, commissioned by Postmedia, had been “seriously, methodologically flawed” and had “significantly affected the course of the campaign,” including by throwing Nenshi on the defensive and dooming the third-place candidate’s campaign.
The chief differences in polling techniques can be categorized according to the how (methodology) and the what (the content of the questions).
The “how” of polling constantly evolves. In his history of polling in Canada, Christopher Adams suggests polling had several precursors, including censuses, the first of which was conducted in 1666 by the intendant of New France. Another precursor is the market survey. In 1929, for example, the Canadian Business Research Bureau interviewed thousands of users of various products to learn what they “actually think and know about the goods.” Over time, market research acquired the sheen of academic rigour. W.W. Goforth, who taught economics at McGill, was hired by the ad agency Cockfield, Brown & Co. in 1928. In this early period, data-gathering included face-to-face interviews, telephone interviews and mail-in surveys.
The possibility of polling errors became apparent even in those early days. In 1919 the Winnipeg Free Press asked readers to complete a questionnaire about cars and accessories and mail it back. Questions included “How much have you spent for accessories [since the car purchase]?” Bertram Brooker, an ad executive, wrote in Marketing and Business Management in 1924 that surveys of this sort presented many problems, including an inability to control the sample size and to screen respondents. This survey, for example, failed to screen people who didn’t own or regularly use their car.
A major development in polling in the 21st century has been the decline in telephone surveys and the rise of the online variety. Canada’s Angus Reid Institute, for example, conducts all of its surveys and studies through the internet, citing a Pew Research study showing that response rates for phone interviews had dropped to only 6 per cent in 2018.
One pollster who remains attached to telephone surveying is Janet Brown. The company she uses, Trend Research, is based in Edmonton. “The secret sauce is Albertans calling Albertans,” says Brown. She uses cell numbers and landlines, and says people are more inclined to stay on the line—or pick up in the first place—if they believe they’re talking to someone in Edmonton rather than in Montreal or outside Canada entirely. The company makes five attempts to reach a person before giving up.
The MRIA report on Calgary’s misleading polls in 2017 singled out Mainstreet for not calling enough cell phones, thus ignoring younger voters and creating an unrepresentative sample.
Frank Graves, founder and president of EKOS Research, however, isn’t as sold on telephone polling. “There is a tendency to overemphasize the importance of the mode of contact,” he says. “Whether or not you call someone, email them, do an intercept survey in the street or in their home—they all have different strengths and weaknesses.”
What Brown, MRIA and Graves would agree on, however, is that good polling requires a representative sample. This means, as the Pew Research Center puts it, that any sample is “assumed to be representative of the larger population on any question we might be interested in.” Pollsters use various techniques to improve the reliability of a sample. One is weighting, or adjusting the relative contribution of respondents. People who participate in polls don’t necessarily reflect the general population. For example, they’re more likely to have a postsecondary degree. The pollster’s answer to this problem is to “weight down” the responses of postsecondary graduates in their final results.
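The arithmetic behind “weighting down” is straightforward. The following is a minimal sketch with made-up numbers—the education shares, respondents and answers are all hypothetical, not drawn from any poll cited in this article—showing how each respondent’s answer is scaled by the ratio of their group’s share of the population to its share of the sample:

```python
# Illustrative sketch of survey weighting. All figures are hypothetical.

# Share of each education group in the population (e.g., from census data)
population_share = {"postsecondary": 0.55, "no_postsecondary": 0.45}

# Share of each group among the poll's actual respondents
# (postsecondary graduates are over-represented here)
sample_share = {"postsecondary": 0.70, "no_postsecondary": 0.30}

# Weight = population share / sample share; over-represented groups get
# a weight below 1, under-represented groups a weight above 1
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical respondents: (education group, supports the proposal?)
respondents = [
    ("postsecondary", True), ("postsecondary", False),
    ("postsecondary", False), ("postsecondary", True),
    ("postsecondary", False), ("postsecondary", False),
    ("postsecondary", True),
    ("no_postsecondary", True), ("no_postsecondary", True),
    ("no_postsecondary", False),
]

# Unweighted: each respondent counts equally
raw_support = sum(s for _, s in respondents) / len(respondents)

# Weighted: each respondent counts in proportion to their weight
weighted_support = (
    sum(weights[g] for g, s in respondents if s)
    / sum(weights[g] for g, _ in respondents)
)
print(f"raw: {raw_support:.1%}, weighted: {weighted_support:.1%}")
```

Here the under-represented group happened to be more supportive, so the weighted estimate comes out higher than the raw one—the direction of the shift always depends on which groups were missing from the sample.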
Brown argues that polling in Alberta tends to underestimate the conservative vote and overestimate the progressive one. “Pollsters make the same mistakes over and over again,” she told the CBC in November 2020. It’s a message she reiterated to Alberta Views. Her theory is that progressive Albertans are more likely to participate in polls and that it takes strenuous attempts to reach more-conservative voters—by phone—to correct for this bias. “I sometimes joke that progressive people will tell you their opinions all day long,” she says. “Conservatives are a little bit more cautious, a little bit more reticent, and don’t want to be probed as much.”
Members of the public are sometimes invited to participate in opinion surveying. Once recruited, they’re part of what’s called a “panel”—a cohort asked to respond to survey questions. The most reliable panels are selected through probability-based sampling; the very best kind—online or offline—are those in which respondents are chosen at random. To do that perfectly would require a comprehensive list of the entire population or at least the target population. But such lists aren’t usually available, so pollsters use a variety of probability-based techniques to try to ensure that their samples (their panels) are constituted in such a way as to minimize sampling error.
It is important to distinguish between probability-based panels and opt-in samples. “Opt-in” means the respondents weren’t chosen at random; rather, they were invited to participate. The Pew Research Center has concluded that opt-in sampling is only half as accurate as probability-based panels. However, opt-in sampling can be conducted in such a way as to improve its reliability. Canada’s Angus Reid Forum is an online panel composed of people recruited through online ads on numerous, diverse websites. The company claims its panels “reflect the general population by continually verifying and recruiting so that the socio-demographic characteristics of each sampling region match actual sub-populations according to both the census and electoral data.”
Polling is, of course, ultimately about the answers to the questions—the what. Questions can be asked in numerous ways, including as neutrally as possible. Or questions can be selective, or leading, or presented in a specific order if a certain result is desired. Sometimes this takes the form of what is called “push polling.”
This is what the provincial government’s Alberta Next Panel has been accused of. In early 2025 Albertans were invited by the government to give their opinions, ostensibly to help the government choose its policies. A St. Albert Gazette editorial in August 2025 argued the surveying conducted through the Alberta Next website was more accurately described as push polling. “You can frame the questions in such a way that every answer is a version of your view, or [the questions] simply don’t include any option to oppose that view,” wrote Gazette staff.
The Alberta Next website, for example, offered ideas on “how to strengthen our sovereignty,” including ending equalization, creating an Alberta Pension Plan, forming a provincial police force, and withholding social services from immigrants. University of Calgary political science professor Lisa Young said, “The subject matter of the Alberta Next questions and videos are very much informed by the groups that we might call the UCP base.” The survey itself, which closed October 10, purported merely to consult Albertans.
On the topic of immigration, however, respondents (whether online or at the town halls) were required to first watch a short video, then were asked “Should Alberta take more control of the immigration system to counter Ottawa’s open-borders policies?” They were then presented with the following statement: “Ottawa approved 1.2 million people under the permanent and temporary immigrant streams in 2024. This is four times more than was approved in 2014 under prime minister Stephen Harper.”
Finally, respondents were asked to choose from three options:
• Far too much—immigration should be brought down [to] under 2014 levels;
• Definitely too high, and immigration needs to be brought down to 2014 levels again;
• Acceptable—I have no issue with immigration levels being this high.
Given this context, respondents’ answers were a foregone conclusion.
As of late November, none of the Alberta Next survey results had been released to the public. But premier Smith says she’ll use those results to assess which proposals will move to a referendum and which her government will legislate directly.
Polling on support for Alberta political parties is conducted regularly. For those who pay for it, Janet Brown and journalist Paul McLoughlin release the monthly “Wild Ride” update, showing, for example, how the UCP and NDP are faring. Accompanying charts track party support over time.
Polling such as this is sometimes criticized for its oversized impact on public discourse. For example, a poll by Brown in the late stage of the 2023 campaign, which contradicted other polls by showing the UCP ahead in Calgary, was leaked to the press. As political commentator David Climenhaga wrote, “This [survey] does change the narrative of the last two weeks of the election campaign—and that’s why it was leaked.”
Episodes like this colour the public perception of polling. “If you took a public opinion poll about polls, odds are that a majority would offer some rather unfavorable views of pollsters and the uses to which their work is put,” wrote E.J. Dionne Jr. and Thomas E. Mann for the Brookings Institution. The authors add that “public opinion is an illusive commodity.” Not all polls are created equal. Some are well constructed and designed to be credible; some are not. Some polls do seem to push people to give the answer that those who commissioned the polls want.
This is arguably what Brown herself did when she found support for a potential provincial pension plan. The poll, commissioned by the Smith government and conducted in April and May of 2025, asked Albertans how they would vote in a referendum on a variety of proposals, including “Replacing the Canada Pension Plan (CPP) with an Alberta Pension Plan (APP) that guaranteed all Alberta seniors the same or better benefits than the Canada Pension Plan.” Fifty-five per cent of respondents said they’d vote for the APP, while 45 per cent said they’d vote against it. But as CBC journalist Jason Markusoff noted, the question added a new nuance to the previous binary of CPP vs. APP. It posited a “guarantee of no financial risk for pensioners,” he wrote, “an assurance that could depend largely on how much of the total CPP pie Alberta would get as its starting pot, a figure that remains in dispute.” Who wouldn’t prefer something with no downside?
Whether using leading questions or not, polling has an effect on its audience. “There are concerns that inaccurate voting intention polling has a negative impact on the conduct of elections due to its influence on voters, the media and political parties,” concluded the Select Committee on Political Polling and Digital Media, struck by the UK’s House of Lords. The committee was a response to three consecutive cases of polling getting it wrong during critical moments: the UK’s 2015 and 2017 general elections and the 2016 vote on leaving the European Union—the infamous Brexit referendum.
The committee noted a number of theories on how polling results impact voters. One of these is the “bandwagon effect,” in which people get on board with an idea because they see many other people doing the same. The obverse of this is the “underdog effect,” which can encourage people to “adopt a minority view out of sympathy.” Polling can also affect voter turnout. Academics showed the committee evidence “that turnout is higher in elections that are anticipated to be close.” Conversely, if a poll tells someone that their candidate is losing badly, they might not bother voting at all.
So what do we really know about support for Alberta separation? In an interview with Alberta Views, Dennis Modry, the founder of the Alberta Prosperity Project (APP), which is leading the separatist charge, says his movement enjoys a “plurality” of support, meaning that more Albertans want to separate from Canada than remain undecided or want to stay in Canada. And he argues polling underestimates separatist support. “We still live in an era of cancel culture,” he says. “Oftentimes, people won’t respond to a question that they perceive as possibly controversial or that has any risk of cancelling them in any way.”
A further wrinkle is that not all polling companies are created the same. Cardinal Research, for example, released a poll in October 2025 suggesting that 11 per cent of decided voters in Alberta support the new provincial Republican Party. It was the first poll to show significant support for the separatist Republicans; CBC polls analyst Éric Grenier called it “a bit of a jaw-dropper.” But a Toronto Star story noted that Cameron Davies, leader of the Republican Party, had until recently been a part-owner of Cardinal Research. Until the Star story came out, media coverage omitted that potential bias. The Lethbridge Herald quoted Davies: “What [the poll] shows is our message is resonating with Albertans; we’re getting out there, we’re doing the work.”
The Canadian Research Insights Council (CRIC), which represents pollsters, cautions that the publication of political-poll results “carries with it the potential for great consequence,” including the risk of misleading voters or eroding public trust. The council asks that journalists, before publishing a poll’s results, consider how questions were phrased, sample sizes and margins of error. It also asks “Who’s the sponsor, and what’s their interest in the topic?” CRIC quotes a former US network TV director: “When assessing whether to publish the results of a poll, media need to apply the same degree of journalistic critical practices and skepticism that they would to any other source of information.”
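One of the numbers CRIC asks journalists to check—the margin of error—can be computed directly from the sample size. The sketch below uses the standard formula for a 95 per cent confidence interval on a proportion; the sample size of 1,200 matches one poll mentioned in this article, but the calculation itself is generic:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p in a simple random sample
    of size n. p=0.5 gives the worst (largest) case, which is the figure
    pollsters conventionally report."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 1,200 respondents
print(f"±{margin_of_error(1200):.1%}")  # roughly ±2.8 percentage points
```

A caveat worth stressing: this formula strictly applies only to probability samples. For opt-in panels of the kind discussed above, a reported “margin of error” is at best an analogy, since the main risks—self-selection and coverage bias—aren’t captured by sample size at all.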
One of Modry’s main adversaries, Thomas Lukaszuk, leader of the Forever Canadian campaign, believes support for Alberta separatism is much more modest. “I think it’s fair to say approximately 10 per cent of Albertans are 100 per cent determined to separate from Canada,” he says. Lukaszuk’s impression is based in part on months of collecting signatures from hundreds of thousands of “pro-remain” Albertans. Meanwhile, he believes separatists themselves are divided. “That 10 per cent is further subdivided between those who’d like to join the US [and] those who somehow envision forming a new country.”
Polling conducted by Janet Brown, commissioned by CBC, has offered a more nuanced picture. In her survey of 1,200 random Albertans, conducted in May 2025, 22 per cent of respondents identified as “committed separatists.” Her survey suggests that a further 14 per cent identify as “soft separatists”—frustrated, perhaps, but not especially keen to leave Canada. They tend to approve of premier Smith’s attempt to forge a new relationship with Ottawa.
A deeper dive into Brown’s polling data offers further insights. John Santos, Brown’s data scientist, explained: “Of those who are ‘not very’ or ‘not at all’ confident in their ability to save for retirement, 36 per cent would vote for separation; conversely, only 22 per cent of those who are ‘very’ or ‘somewhat’ confident in their ability to save for retirement would vote for separation. This is very much an issue of financial security.”
This survey and others suggest the drivers of separatism in Alberta aren’t cultural or linguistic as in Quebec. They’re economic. The U of C’s Lisa Young argues separatists believe “Canada has stood in the way of Alberta’s prosperity because of [federal] environmental regulations.” Lukaszuk observed something similar: “We found that cities, towns and other areas that are very much reliant on the oil and gas servicing industry tend to be more pro-separatist.”
Frank Graves of EKOS suggests that surveys on Alberta separatism are being swayed by misinformation. He shared with Alberta Views preliminary results of polling he conducted on behalf of the labour movement. “The level of misinformation in Alberta is the highest in the country,” he says. In polling on the role of false information in Alberta politics, EKOS asked a number of screening questions. Respondents were asked whether certain statements were true or false. For example: “Deaths due to COVID-19 vaccines are being intentionally hidden by the government.” This process helped EKOS identify respondents who had been swayed by falsehoods. Graves believes that susceptibility to misinformation is the most “powerful predictor” of support for separating.
So, what ultimately explains the difference between Pollara’s 24 per cent, Leger’s 41 per cent and Abacus’s 18 per cent support for Alberta separatism? Perhaps subtle changes in the wording:
“If a referendum were held on your province’s sovereignty, would you vote FOR or AGAINST?” (Pollara)
“If a provincial referendum were held tomorrow on whether or not your province should separate from Canada to form its own country, how would you most likely vote?” (Leger)
“Do you agree that the province of Alberta shall become a sovereign country and cease to be a province of Canada?” (Abacus)
Leger gave respondents the option of “strong support” or “somewhat support,” then combined these to show “support for separating.” Pollara’s choice was starker: “Stay or separate?” And Abacus seemed to be asking for a prediction. The pollsters also used different sample sizes, weighting strategies and survey methods. One survey prompted respondents with a question about Liberal leader Mark Carney; another prefaced its survey by asking how closely respondents had been following the news. Respondents may have been influenced by the looming federal election, or by financial insecurity, or by misinformation, or even by other surveys they’d seen.
For his part, Graves didn’t want to speculate on the differences. “If I were doing this,” he says, “I’d do random control assignment testing. A random portion is assigned to version A of the question, and the other is assigned to version B.” This would be the only way to determine whether the wording of the question had a bearing on the result.
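The design Graves describes can be sketched in a few lines. This is an illustrative mock-up, not his methodology: respondents are randomly split between two wordings, and because random assignment makes the groups statistically alike, a large gap in support can be attributed to the wording itself. Plugging in the Pollara and Leger figures as if they had come from two such groups of 500 shows how decisively a standard two-proportion test would flag a 24-versus-41 gap:

```python
import math
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Randomly assign 1,000 hypothetical respondents to two question wordings
ids = list(range(1000))
random.shuffle(ids)
group_a, group_b = ids[:500], ids[500:]  # wording A vs. wording B

def two_proportion_z(p1, n1, p2, n2):
    """z-statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Treat 24% and 41% support as the two groups' (hypothetical) results
z = two_proportion_z(0.24, 500, 0.41, 500)
print(f"z = {z:.1f}")  # well beyond ±1.96, so the gap isn't sampling noise
```

In a real split-sample test the two proportions would be measured, not assumed; the point of the sketch is that samples of this size are easily large enough to distinguish a genuine wording effect from chance.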
Many more such surveys will be released this year. The stakes are high—Albertans could soon face a separation referendum. And the ultimate poll will come at the ballot box.
Laurence Miall lives in Edmonton. He has written for Jacobin, the CBC and Alberta Views, and is a former editor of carte blanche.