In Part 1 of this post, I looked at a few criticisms of the methodology used in the Fraser Institute’s high school rankings. Here, I’m going to explain what I think is the real problem with the rankings: they’re not necessary.

The Fraser rankings, released annually and regularly reported by the media, have largely shaped public perception of school performance. The authors have stated they want to make it possible for parents and educators to easily compare and monitor the academic performance of Alberta high schools. They could have done this by creating a better interface for Alberta diploma exam data. Comparisons made with that data would be easy to understand and evaluate. Instead, they came up with a complicated (and arbitrary) scoring formula to rate and rank schools that essentially shifted the conversation from how schools are doing academically to how schools are doing in the Fraser rankings.

The rankings are arbitrary because the scoring formula is arbitrary. There is no theoretical reason for using the indicators the authors have selected (like gender gap or courses taken per student) or for combining them in the complicated manner they’ve chosen. (We tried to recreate the ratings from the 2012 Fraser report by following their methodology, but we had limited success.) A simpler alternative, such as comparing schools on their average diploma exam mark, would make much more sense.
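To make that concrete, the simpler alternative is little more than a group-by, an average, and a sort. Here's a minimal sketch in Python with pandas; the file name and column names are hypothetical stand-ins for the per-school results Alberta Education publishes:

```python
import pandas as pd

# Hypothetical input: one row per school and diploma course, with that
# school's average exam mark in the course for the year.
results = pd.read_csv("diploma_exam_results_2012.csv")  # school, authority, course, exam_mark

# Average each school's marks across courses, then rank (1 = highest average).
school_avg = results.groupby("school", as_index=False)["exam_mark"].mean()
school_avg["rank"] = school_avg["exam_mark"].rank(ascending=False, method="min").astype(int)

print(school_avg.sort_values("rank").head(10))
```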

This, after all, is why we collect data on a group of objects in the first place: to make comparisons, track progress, and study relationships. Introducing a derived measure like a rating needlessly complicates things; it takes us away from the patterns and relationships in the original achievement data.

But let’s say you agree that delayed advancement rate (defined by the authors as the extent to which a school keeps its students progressing in a timely manner toward completing their diploma program) is a relevant indicator of academic achievement, that it’s under the school’s control, and that it can be accurately calculated using the authors’ convoluted algorithm.

How, then, do you go about weighing delayed advancement rate against average diploma exam mark? Given that you’re using average diploma exam mark as just another indicator, there’s no external standard of academic performance (ground truth) to test your decisions against or to help you tune the structure and parameters of your model.

So what do you do? Like the authors, you probably just make it up. A little intuition here, some personal judgement there, and, pretty soon, you’ve made up the entire formula — which leads us to the second problem.
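To see how much room that kind of guesswork leaves, here's a toy sketch in Python with pandas. The schools, indicators, and weights below are invented for illustration (this is not the Fraser Institute's actual formula); the point is simply that an arbitrary composite reorders schools relative to their raw exam marks:

```python
import pandas as pd

# Toy data: standardized (z-score) indicators for three made-up schools.
schools = pd.DataFrame({
    "school": ["A", "B", "C"],
    "avg_exam_mark_z": [1.0, 0.6, 0.2],
    "delayed_advancement_z": [-0.8, 0.9, 0.4],
    "gender_gap_z": [0.1, -0.5, 0.7],
})

# Arbitrary weights: nothing in the data tells you these are the right ones.
weights = {"avg_exam_mark_z": 0.5, "delayed_advancement_z": 0.3, "gender_gap_z": 0.2}

schools["composite"] = sum(schools[col] * w for col, w in weights.items())
schools["rank_by_exam_mark"] = schools["avg_exam_mark_z"].rank(ascending=False).astype(int)
schools["rank_by_composite"] = schools["composite"].rank(ascending=False).astype(int)

print(schools[["school", "rank_by_exam_mark", "rank_by_composite"]])
# With these weights, school B overtakes school A; pick different weights
# and the order changes again, with no ground truth to arbitrate.
```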

The Fraser scoring formula, with its mix of academic indicators and ad hoc weightings, produces a different set of rankings than those produced from the raw exam results alone. We compared the rankings of Alberta high schools based on their average diploma exam mark to the school rankings in the Fraser Institute’s 2012 report. The chart below shows the results, and the differences are illuminating.

Fraser Institute High School Rankings

How to read the chart: The circles represent Alberta high schools. The x-axis measures Gr. 12 enrollment. The y-axis measures the difference between the average diploma exam mark ranking and the Fraser ranking. Schools above the zero line did better in the Fraser rankings than on the average diploma exam mark ranking; schools below the zero line did worse. For example, Rundle College Academy ranked 45th in the Fraser rankings and 95th in the average diploma exam mark ranking. The school did better in the Fraser rankings, and its value on the y-axis is 95 – 45 = 50. The colour of the circles represents the school authority, and the size of the circles represents Gr. 12 enrollment.
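For readers who want to reproduce the comparison, the y-axis value is just the difference of the two ranks. Here's a minimal sketch, assuming two hypothetical files with each school's rank under each scheme plus its authority and Gr. 12 enrollment:

```python
import pandas as pd

# Hypothetical inputs: one table with each school's average-exam-mark rank,
# one with its Fraser rank, authority, and Gr. 12 enrollment.
exam_ranks = pd.read_csv("exam_mark_ranks.csv")      # school, exam_rank
fraser_ranks = pd.read_csv("fraser_ranks_2012.csv")  # school, fraser_rank, authority, gr12_enrollment

chart = exam_ranks.merge(fraser_ranks, on="school")

# y-axis value: positive means the school did better in the Fraser rankings.
chart["rank_difference"] = chart["exam_rank"] - chart["fraser_rank"]

# e.g. Rundle College Academy: 95 (exam-mark rank) - 45 (Fraser rank) = 50
print(chart.sort_values("rank_difference", ascending=False).head())
```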

Differences between the Fraser Institute rankings and the average diploma exam mark rankings:

  • The Fraser rankings favour small schools over big schools. Small schools, as a group, rank higher in the Fraser rankings than they do in the average diploma exam rankings. Big schools, as a group, rank lower.
  • The Fraser rankings punish big public schools. Big public schools are more likely to place lower in the Fraser rankings than in the average diploma exam rankings. There are more large green circles below the zero line than above.
  • The Fraser rankings favour separate schools. Separate schools, and big separate schools in particular, fare better in the Fraser rankings than in the average diploma exam rankings. There are more large red circles above the zero line than below.
  • Private schools, as a group, do better in the Fraser rankings. There aren’t that many private high schools in Alberta, but they do better in the Fraser rankings than in the average diploma exam rankings.
  • Public schools, as a group, do worse in the Fraser rankings. Although small public schools generally benefit, public schools overall come out worse in the Fraser rankings than they do in the average diploma exam rankings.

When a conservative think tank (one that advocates for choice, competition, and markets) ranks schools in a way that’s systematically biased against public schools, you can be forgiven for seeing the results as reflecting agenda more than truth. As a result, the information the Fraser Institute provides parents and educators is not as useful as it could be, and certainly not as useful as it’s made out to be.

Sticking to Diploma Exam Data

There are many interesting insights we can gain into the educational experience of Alberta high school students by looking closely at Alberta Education’s diploma exam results, a rich and detailed dataset. A case can be made, for example, that competition from private schools has helped Calgary public schools outperform public schools in other parts of the province, particularly Edmonton. A case can also be made that many large public schools are delivering quality education to a diverse population of students in less than ideal circumstances.

Alberta Education publishes the data but doesn’t provide an interface for comparing school performance. The Fraser Institute has made comparisons possible but only through a biased index that appears more agenda-driven than useful. So what can Eight Leaves do to help parents and educators easily compare and monitor the performance of Alberta high schools?

We took Alberta Education’s diploma exam data for the 2011-2012 school year and built a visualization using Tableau. You can find it here: Comparing Alberta High Schools. We think it’s a much better use of a very interesting dataset. Let us know what you think.

Update: We also created a ranking of Alberta high schools based solely on Alberta diploma exam results and a dashboard to track how Alberta high schools have performed in different academic subjects over time.
