
CANADA: Enrolment, test scores falling as school spending increases

The big-spending provinces in Canada did not necessarily get the best Programme for International Student Assessment (PISA) standardized test results.

This article, written by Jim Marshall, University of Regina; Haizhen Mou, University of Saskatchewan, and Michael M. Atkinson, University of Saskatchewan, originally appeared on The Conversation and is republished here with permission:

The question of value for money is central to any public policy consideration. Given its scale, coupled with its critical social and economic impact, education ranks as one of the most important and challenging policies for analysis.

Canada’s school boards spent a total of $53.2 billion, about $11,300 per student, educating students in the kindergarten-to-Grade 12 public system in 2015, and that amount has been growing by more than $1 billion a year for several years.

Economists have recently analyzed the efficiency of education sectors in various countries. But despite the importance of schooling and school policy, policy-makers and economists in Canada have done little to analyze how well the public school system funded by provincial governments is working in terms of students’ academic achievement.

In our recent study, we examined how much money each province spends and how that spending compares with student achievement on the Programme for International Student Assessment (PISA), a global standardized test that assesses reading, math and science. We learned that the provinces varied widely in their ability to produce academic results for the money they spent.

Converting money into performance

We looked at data from three different sources to analyze outcomes: first, Statistics Canada data on spending by school boards across Canada on a province-by-province basis; second, Statistics Canada data on the number of students in the public K-12 system in each province; and third, data from the Organisation for Economic Co-operation and Development (OECD), which administers the PISA program.

These PISA tests are conducted every three years, so we had data on Canadian student performance, by province, on the tests from six different surveys between 2000 and 2015 (the 2018 PISA result was not available when we conducted the research).

These three sets of data allowed us to construct an index of school outputs composed of two factors: the number of students served by the public schools in each province and their performance on PISA tests.

Then we compared those output indices with the amount that school boards spent in each jurisdiction to see how well the boards were converting money into student performance, and how the provinces compared with one another.
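The article does not spell out exactly how the output index was constructed, but the basic logic can be illustrated with a minimal sketch, assuming a simple composite of students served and average PISA score, and using invented figures:

```python
# Illustrative sketch only: the study's exact index construction is not
# specified in this article, and the figures below are invented.
provinces = {
    # province: (students served, average PISA score, school board spending in dollars)
    "Province A": (180_000, 520, 2_100_000_000),
    "Province B": (120_000, 505, 1_600_000_000),
}

def output_index(students, avg_score):
    """A simple composite output: students served, weighted by test performance."""
    return students * avg_score

for name, (students, score, spending) in provinces.items():
    outputs = output_index(students, score)
    # Cost-efficiency: how much measurable output each dollar of spending buys.
    efficiency = outputs / spending
    print(f"{name}: output index = {outputs:,.0f}, "
          f"efficiency = {efficiency:.4f} output units per dollar")
```

A province that serves more students, scores higher, or spends less ranks higher on such a ratio; the study's actual method may weight these components differently.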

Research suggests socio-economic status and other factors may be important determinants of what economists would term “outputs”: in this case, the number of students served and test results. Such environmental factors are not identical across the provinces and are not under the control of school boards.

For this reason, we wanted to allow for cost-efficiency comparisons on a more level playing field. So, after examining provincial costs against PISA scores, we conducted a second-stage analysis of the data that corrected for the effects of cross-province differences in socio-economic conditions.
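The study's second-stage method is not described in detail here; one common way to level the playing field is to regress raw efficiency scores on socio-economic indicators and compare the residuals. A minimal sketch of that idea, with invented numbers:

```python
import numpy as np

# Invented data for illustration: raw cost-efficiency scores and a single
# socio-economic indicator for ten provinces. The study's actual adjustment
# method and variables are not detailed in this article.
raw_efficiency = np.array([1.00, 0.93, 0.88, 0.82, 0.79, 0.75, 0.72, 0.68, 0.66, 0.61])
ses_indicator = np.array([1.10, 1.05, 0.98, 1.02, 0.95, 0.90, 0.93, 0.88, 0.85, 0.80])

# Fit a simple linear model: efficiency ~ a + b * socio-economic indicator.
X = np.column_stack([np.ones_like(ses_indicator), ses_indicator])
coef, *_ = np.linalg.lstsq(X, raw_efficiency, rcond=None)

# The residual is the part of efficiency not explained by socio-economic
# conditions, which gives a more level footing for comparing provinces.
adjusted = raw_efficiency - X @ coef
print("Adjusted efficiency (residuals):", np.round(adjusted, 3))
```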

Student expenditures growing

If you look at PISA performance, you will see substantial variation among provinces.

Alberta students scored an average of 542 on science tests in 2015, while students in Saskatchewan and Manitoba on average scored only 496. Alberta students also did very well on reading tests (scoring 536), while Saskatchewan and Manitoba lagged far behind at 496 and 495, respectively. Québec led the country in math, with an average score of 531, while Saskatchewan, Manitoba and Newfoundland and Labrador lagged far behind with scores in the 480s.

The provinces also vary widely in their spending on public education. Québec spent more than $12,400 per student in 2015, while British Columbia spent 26 per cent less at $9,200 per student.

Our index revealed some startling things: from 2000 to 2015, per student expenditures (adjusted for inflation) grew by 41 per cent in the 10 provinces. During the same time, enrolments fell by seven per cent across the provinces. But student performance on PISA tests also fell during this period by 11 points in reading (about two per cent), 23 points in math (about 4.5 per cent), and four points in science (about one per cent).
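As a rough arithmetic check (assuming the $11,300 figure cited earlier is the 2015 endpoint of that 41 per cent real increase, which the article does not state explicitly), the implied per-student spending in 2000 would be roughly $8,000 in 2015 dollars:

```python
# Rough arithmetic check, assuming the $11,300 figure cited above is the 2015
# endpoint of the 41 per cent inflation-adjusted growth (the article does not
# state the 2000 baseline directly).
per_student_2015 = 11_300   # dollars per student, 2015
real_growth = 0.41          # 41 per cent growth from 2000 to 2015

implied_2000 = per_student_2015 / (1 + real_growth)
print(f"Implied 2000 per-student spending (in 2015 dollars): ${implied_2000:,.0f}")
# Prints roughly $8,000 per student.
```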

Because they served fewer students and achieved worse PISA outcomes, even though they spent more (in total and per student), the school boards became, on average, 20 per cent less efficient in turning their budgets into measurable student outcomes in the 15 years under review.

P.E.I. most cost-effective

The provinces also varied widely in their ability to produce results for the money they spent. The big-spending provinces did not necessarily get the best PISA results, and the provinces that spent the least did not necessarily get the worst.

Overall, Prince Edward Island was the most efficient province (set at 100 per cent on the scale of cost-efficiency scores), while Saskatchewan and Manitoba were the least efficient, at 33.9 per cent and 38.5 per cent below P.E.I., respectively.

In between, the other provinces also fared badly. Big-spending Québec, for example, was 31.8 per cent less efficient than P.E.I., while low-spending B.C. was only 7.2 per cent behind P.E.I.
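These comparisons read most naturally as efficiency scores normalized to the top-ranked province. A small sketch of that normalization, using only the percentages quoted above (the underlying index values are not published in this article):

```python
# Relative cost-efficiency, normalized so the most efficient province (P.E.I.)
# scores 100. The "per cent behind P.E.I." figures quoted above translate
# directly into these relative scores; absolute index values are not given
# in the article.
gap_behind_pei = {   # per cent less efficient than P.E.I.
    "P.E.I.": 0.0,
    "B.C.": 7.2,
    "Quebec": 31.8,
    "Saskatchewan": 33.9,
    "Manitoba": 38.5,
}

for province, gap in gap_behind_pei.items():
    relative_score = 100.0 - gap
    print(f"{province}: {relative_score:.1f} per cent of P.E.I.'s cost-efficiency")
```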

Equity measures and efficiency

To gauge the extent to which performance measures, such as PISA scores, factor into budgeting decisions, we also interviewed 28 budget managers in 10 Canadian provinces and two territories: 12 from government finance departments or treasury boards, and 16 from the budget offices (or their equivalents) of education departments. We did this to probe what criteria budget officers use to allocate education resources.

We found that the preponderant pattern of budgeting used by these managers is an “increments-based-on-formula” approach, meaning that new expenditures are based on previous expenditures, adjusted mainly for changes in student numbers and/or salary costs.

We argue that this formula often takes account of equity imperatives (such as allocating adequate resources to schools with more students with special needs), but is not particularly responsive to efficiency concerns.

It is unclear how the additional spending relates to particular, new or emerging needs in schools. What is known is that increased spending does not seem to be raising student achievement on PISA tests.

The Fraser Institute finds that compensation (salaries, wages, pensions and fringe benefits) “accounts for most of the increase in spending” in education in Canada. But their research does not parse what exactly this compensation covers: for example, whether it relates to additional positions or roles to meet student needs.

The results of our study leave several unanswered questions. There may be many factors affecting some provinces’ cost-to-achievement performance, such as challenges related to inclusive classrooms, geography and socio-economic conditions, but more money has not led to better measurable academic results.

It may be time to find out why.

Jim Marshall, Lecturer, Johnson Shoyama Graduate School of Public Policy, University of Regina; Haizhen Mou, Associate Professor, Johnson Shoyama Graduate School of Public Policy, University of Saskatchewan, and Michael M. Atkinson, Public Policy Professor Emeritus, University of Saskatchewan

This article is republished from The Conversation under a Creative Commons license. Read the original article.

