Photo Credit: Gary A. Vasquez / USA Today Sports

The case for firing Glen Gulutzan, Part 2: Process vs. results

When offered the opportunity to write an article for FlamesNation, I told Ari that the research might take me down a rabbit hole or two before it was all over. This is one of the rabbit holes I found myself buried in.

(Here is Part 1.)

Why results and goal differential are how a team should be evaluated

An early Google search produced a 2016 article written by Rob Found of Edmonton and posted online at The Sports Journal: “Goal-based Metrics Better Than Shot-based Metrics at Predicting Hockey Success.” It is a statistical evaluation of 10 years of NHL data, comparing goal-based and shot-based metrics to see which methods are best at predicting team success in the NHL.

Here is a summary: 14 hockey statistical models were used in the study, 10 based on shots (Corsi) and four based on goals. Some of the models used a single statistic (e.g. shots for), while others were “composite,” combining two or more statistics (e.g. goal differential).

Based on his research, Found concluded that goal-based models outperform shot-based models in predicting team success, both in terms of team results and individual contributions to a team’s success, and that defence-based models are better than offence-based models (goals allowed is superior to goals for, and shots allowed is superior to shots for). As for individual contributions to team success, only forward contributions were statistically significant, and only when using goal-based models. Contributions by defencemen were not statistically significant using either goal- or shot-based models.

I took the statistics from the 2016-17 season and applied them to the standings. While this is a much simpler evaluation than the research behind Found’s article, the results are very good at showing how effective each stat is at predicting where a team finishes in the standings; a short sketch of how the summary lines at the bottom of the table can be computed follows it.

2016-17 NHL standings relationship to shot- and goal-based statistics (data from Natural Stat Trick)

| Standings | Team | In Playoffs | Points | CF% | CF% Rank | Shot Differential | SD Rank | Goal Differential | GD Rank | PDO | PDO Rank |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Capitals | Yes | 118 | 51.66 | 6th | +213 | 4th | +84 | 1st | 1.027 | 1st |
| 2 | Penguins | Yes | 111 | 49.80 | 16th | +80 | 9th | +49 | 4th | 1.015 | 5th |
| 3 | Blackhawks | Yes | 109 | 49.89 | 15th | -66 | 22nd | +28 | 7th | 1.013 | 6th |
| 4 | Blue Jackets | Yes | 108 | 49.57 | 18th | +52 | 12th | +54 | 3rd | 1.020 | 2nd |
| 5 | Wild | Yes | 106 | 49.79 | 17th | +63 | 11th | +57 | 2nd | 1.020 | 3rd |
| 6 | Ducks | Yes | 105 | 49.57 | 19th | -4 | 18th | +23 | 9th | 1.010 | 7th |
| 7 | Canadiens | Yes | 103 | 51.36 | 8th | +31 | 15th | +25 | 8th | 1.009 | 9th |
| 8 | Oilers | Yes | 103 | 50.44 | 11th | +131 | 7th | +36 | 6th | 1.010 | 8th |
| 9 | Rangers | Yes | 102 | 49.06 | 21st | -27 | 19th | +37 | 5th | 1.016 | 4th |
| 10 | Blues | Yes | 99 | 49.50 | 20th | +4 | 16th | +17 | 13th | 1.007 | 11th |
| 11 | Sharks | Yes | 99 | 51.38 | 7th | +187 | 5th | +19 | 11th | 1.001 | 15th |
| 12 | Senators | Yes | 98 | 48.56 | 24th | -3 | 17th | -4 | 19th | 0.998 | 18th |
| 13 | Bruins | Yes | 95 | 54.48 | 2nd | +524 | 1st | +23 | 10th | 0.990 | 21st |
| 14 | Maple Leafs | Yes | 95 | 50.11 | 14th | -56 | 21st | +16 | 14th | 1.008 | 10th |
| 15 | Flames | Yes | 94 | 50.39 | 12th | +33 | 14th | +3 | 16th | 1.000 | 17th |
| 16 | Predators | Yes | 94 | 51.10 | 9th | +85 | 8th | +18 | 12th | 1.004 | 14th |
| 17 | Islanders | No | 94 | 47.53 | 27th | -136 | 24th | +1 | 17th | 1.005 | 12th |
| 18 | Lightning | No | 94 | 51.05 | 10th | -38 | 20th | +6 | 15th | 1.004 | 13th |
| 19 | Flyers | No | 88 | 52.15 | 4th | +251 | 3rd | -19 | 22nd | 0.983 | 27th |
| 20 | Jets | No | 87 | 48.96 | 22nd | -100 | 23rd | -9 | 20th | 1.000 | 16th |
| 21 | Hurricanes | No | 87 | 52.29 | 3rd | +173 | 6th | -18 | 21st | 0.986 | 25th |
| 22 | Kings | No | 86 | 54.77 | 1st | +434 | 2nd | -2 | 18th | 0.983 | 28th |
| 23 | Panthers | No | 81 | 51.69 | 5th | +48 | 13th | -26 | 23rd | 0.989 | 22nd |
| 24 | Stars | No | 79 | 50.34 | 13th | +64 | 10th | -38 | 25th | 0.982 | 29th |
| 25 | Red Wings | No | 79 | 48.52 | 25th | -175 | 25th | -46 | 26th | 0.988 | 23rd |
| 26 | Sabres | No | 78 | 47.19 | 28th | -324 | 28th | -32 | 24th | 0.998 | 19th |
| 27 | Devils | No | 70 | 47.07 | 29th | -294 | 26th | -61 | 27th | 0.985 | 26th |
| 28 | Coyotes | No | 70 | 44.93 | 30th | -516 | 30th | -67 | 29th | 0.991 | 20th |
| 29 | Canucks | No | 69 | 48.17 | 26th | -335 | 29th | -63 | 28th | 0.986 | 24th |
| 30 | Avalanche | No | 48 | 48.65 | 23rd | -299 | 27th | -111 | 30th | 0.966 | 30th |

Playoff prediction success: CF% 60.00%, Shot Differential 66.67%, Goal Differential 93.33%, PDO 80.00%
Average error in predicting standings: CF% 8.4, Shot Differential 7.13, Goal Differential 1.8, PDO 3.53
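
As promised, here is a minimal sketch of how those two summary lines can be computed, assuming the table is loaded as a list of (standings rank, playoff flag, stat-rank) tuples: a stat “predicts” a playoff team when it ranks that team in its top 16, and the average error is the mean absolute gap between a stat’s rank and the team’s actual standings position. Only three rows are included for illustration, so the printed numbers will not match the full-table figures.

```python
# Sketch: reproduce the two summary lines from the table above.
# Columns: (standings rank, made playoffs, CF% rank, SD rank, GD rank, PDO rank)
# Truncated to three illustrative rows; the full 30-team table is needed
# to reproduce the exact figures shown in the article.

teams = [
    (1,  True,  6, 4,  1,  1),   # Capitals
    (13, True,  2, 1, 10, 21),   # Bruins
    (22, False, 1, 2, 18, 28),   # Kings
]

stat_columns = {"CF%": 2, "Shot differential": 3,
                "Goal differential": 4, "PDO": 5}

for stat, col in stat_columns.items():
    # A stat "calls" a playoff team when its rank for that team is 16th or better.
    hits = sum((row[col] <= 16) == row[1] for row in teams)
    # Average absolute difference between the stat's rank and the standings rank.
    avg_error = sum(abs(row[col] - row[0]) for row in teams) / len(teams)
    print(f"{stat}: playoff prediction {hits / len(teams):.1%}, "
          f"average standings error {avg_error:.2f}")
```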

The results speak for themselves. Corsi is horrible at predicting team success. Shot differential is better, but still ineffective. On the other hand, goal differential correctly predicted whether a team was in or out of the playoffs for 28 of 30 teams. PDO was also very effective; if you believe the Corsi proponents who dismiss PDO as luck, that would suggest playoff appearances are nothing more than luck.

But what is most impressive is that both PDO and goal differential determined who the top nine teams were, although not necessarily in exact finishing order. Goal differential also predicted the worst eight teams, although not perfectly in order.

In the case of PDO, Found’s evaluation determined that PDO did not even out over 10 years of NHL data, as proponents say it will. He attributed this to differences between teams in the quality of shots they take, the quality of shots they allow, and the quality of their goaltending. PDO is not luck.

On the subject of goaltending, Found pointed out that of the 40 teams to reach the Stanley Cup final in the 20 years prior to his study, 28 had a goalie who received Vezina Trophy votes in the season they appeared in the final, while the other 12 had goaltenders good enough to earn Vezina votes in other seasons. In Smith we trust.

Corsi was so far from being accurate that four of the top five CF% teams missed the playoffs, and it ranked Boston second despite the Bruins finishing 13th in the standings. Boston was only one point from missing the playoffs: hardly an endorsement of Corsi as a good tool.

Found best evaluated the results of his findings as such: “There are meaningless shots, but no meaningless goals…”

My point in going down this rabbit hole? Shot-based statistics and goal-based statistics are tools to measure how the “process” is working in achieving team success. But at the end of the day, results come down to wins, losses, and overtime and shootout losses. In other words, it comes down to points, and more specifically, points in relation to the other teams in the division and conference.

Increasing shots for and decreasing shots against – controlling possession – is the “process” the Calgary Flames have chosen to use to improve team success. We cannot confuse “results” with “process.”

The Gulutzan Flames: Results and significant statistics

In the NHL, just under a quarter of all regular season games go past regulation (the 2016-17 point totals above imply about 23.5%), and each of those games hands out an extra point. Based on this, the average points per team is just under 92. Setting the bar correctly means evaluating teams against 92 points, not against 82 points and a .500 record.
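
That baseline follows directly from how points are handed out: every game awards two points, and a game that goes past regulation awards a third. A minimal sketch of the arithmetic, plugging in the roughly 23.5% overtime rate implied by the 2016-17 standings above:

```python
# Every NHL game awards 2 points; a game that reaches overtime awards 3.
# By symmetry, a team's expected share of the points in its games is half,
# so the league-average total is 82 * (2 + ot_rate) / 2 = 82 + 41 * ot_rate.

games = 82
ot_rate = 0.235  # share of games going past regulation, from the 2016-17 point totals above

avg_points = games * (2 + ot_rate) / 2
print(f"League-average points: {avg_points:.1f}")  # ~91.6, i.e. just under 92
```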

The 2016-17 Flames finished the season with 94 points: 12 points above the .500 baseline of 82, but only two points above the league average of 92. The 2017-18 Flames are on pace for 97 points, five points above average. The good news: they are better than average.

The playoffs are played and won at 5v5 and on special teams. Relying on 3v3 overtime is not a good strategy, but it has become the norm for Calgary this year. The Flames have been to overtime 17 times this season, second most in the NHL behind Arizona, and in 30% of all games played. They are truly playing for a tie.

In terms of regulation wins and losses in the regular season, the Flames were 32-33 last season. They were the only team to make the playoffs that was below .500 in that category; no other playoff team was fewer than six games over .500. Two Eastern teams missed the playoffs despite being above .500 in regulation.

This season, the Flames are 20-19 in regulation wins and losses, 16th in the NHL. The Gulutzan Era Flames are just above .500 in regulation, and when you include the playoffs, they are three games below .500.

This season, in terms of regulation wins alone, the Flames have 20, tied for 18th in the NHL; Vancouver and Chicago are only one win back at 19. The Flames sit 11th in regulation wins in the Western Conference.

Even the worst team in the NHL last season, Colorado, is 24-21 in regulation this season: three games over .500 and four regulation wins better than the Flames.

Setting expectations: Roster upgrades

The Flames were upgraded in the summer. On defence, replacing Dennis Wideman and Deryk Engelland with Travis Hamonic and a re-signed Michael Stone made the defence corps one of the strongest on paper. Up front, the core remained intact, with a few minor changes to the bottom six.

The biggest improvement has been the goaltending. Mike Smith was the biggest question mark entering the season, but he has done better than anyone could have hoped. The backup question has also been answered: although it took some time to figure things out, David Rittich has a save percentage and goals-against average better than Smith’s.

There is little argument that Smith has been an upgrade over Brian Elliott and Chad Johnson. As for the rest of the roster, you may disagree that there have been improvements. What we think does not matter: Brad Treliving believes the roster has been upgraded, so much so that he leveraged the future to bring in players he believes can take this team to a better finish in the regular season and on a playoff run in the spring.

Setting expectations: Points

Last season, the Flames finished 45-30-5 for 94 points. They came out of the gate very slowly as they adjusted to Gulutzan’s system. Roughly a quarter of the way into the season, they were 8-12-1, but were beginning to show signs of improvement. From that point on, the team ran up a 37-21-3 record: 77 points in 61 games, a .631 points percentage. Once the kinks were ironed out, this is the level of play last year’s team was capable of. It translates into a 104-point pace over a full season.
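
For the arithmetic behind that 104-point figure: the 37-21-3 stretch is worth 77 of a possible 122 points, and scaling that percentage back up to a full 82-game schedule gives the equivalent season total. A quick sketch:

```python
# Convert the Flames' 37-21-3 stretch into a full-season equivalent.
wins, losses, otl = 37, 21, 3
games = wins + losses + otl            # 61
points = 2 * wins + otl                # 77

points_pct = points / (2 * games)      # 77 / 122 = .631
full_season = points_pct * 2 * 82      # scale to an 82-game schedule

print(f"Points percentage: {points_pct:.3f}")      # 0.631
print(f"82-game pace: {full_season:.1f} points")   # ~103.5, rounds to 104
```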

There is no excuse for a slow start this year; of the core, only Hamonic is new to the system. With the offseason moves, 94 points would be a step backwards. At the very least, this year’s team should be capable of matching the level of play they achieved last season, or 104 points.

Setting expectations: Pacific Division standings

If you look at the division, most of the other teams are in a period of decline or rebuild mode. Whether it has been making cap room by moving out players, teams starting the season with key players on injured reserve, or aging players on the decline, only the Flames made significant upgrades. For the most part, things are playing out very close to what should have been expected.

| 2016-17 Rank | Team | 2016-17 Points | Projected 2017-18 Points |
|---|---|---|---|
| 1 | Anaheim | 105 | 94 |
| 2 | Edmonton | 103 | 75 |
| 3 | San Jose | 99 | 100 |
| 4 | Calgary | 94 | 97 |
| 5 | Los Angeles | 86 | 97 |
| 6 | Arizona | 70 | 56 |
| 7 | Vancouver | 69 | 73 |
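
The projected 2017-18 column is presumably a straight points-pace projection. A minimal sketch of that calculation, using hypothetical mid-season records for illustration (the real projections would use each team’s actual points and games played at the time of writing):

```python
# Project a full-season point total from a partial-season record.
# The records below are hypothetical placeholders, not actual standings data.

def project(points_so_far: int, games_played: int, schedule: int = 82) -> float:
    """Scale a partial-season point total to a full 82-game schedule."""
    return points_so_far / games_played * schedule

examples = {
    "Team A": (67, 57),   # 67 points through 57 games -> ~96-point pace
    "Team B": (52, 56),   # 52 points through 56 games -> ~76-point pace
}

for team, (pts, gp) in examples.items():
    print(f"{team}: on pace for {project(pts, gp):.0f} points")
```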

Of the teams in the hunt for a playoff spot, only Los Angeles has shown significant improvement. A new coach and improved play following a disappointing season make this a predictable result. It should not be enough to catch this edition of the Flames.

If we have set expectations correctly at 104 points, and a 100-point season is what even Gulutzan predicted was possible, then the Flames should be leading the division. To be third among the perceived contenders, and in a dogfight for their playoff lives, is disappointing. A 97-point pace is a step backwards.

And then there was Vegas. That they are in the conversation at all for a playoff spot is the biggest surprise of the year; to be leading the Western Conference is just insane. It has also put a playoff spot in jeopardy for the Flames.

Vegas is on pace for 113 points, a testament to what is possible with solid coaching. At worst, the Flames should be in second, on pace for 104 points and home ice advantage in the first round of the playoffs. But that was last year’s team’s potential. Improvements were made. This team should be battling with Vegas for first in the Pacific.

The team is underperforming. Next week, I will look at what is causing the team to be sitting below expectations.


