[Edit! 2015 squiggles start here. Interactive squiggle (with tips, predictor & FAQ) is here. 2014 squiggles started here.]
I like to chart things for no good reason, so I decided to make a scatterplot of teams based on their scores for and against over the course of the 2013 season. It looks like this:
After Round 18
Each team's flag represents its current position, with the line tracking their journey since Round 1.
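The (for, against) points behind a chart like this can be assembled from raw match results. A minimal sketch, assuming a simple list of results — the team names are real but the scores below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical match results: (home, away, home_score, away_score).
results = [
    ("Hawthorn", "Essendon", 115, 77),
    ("Sydney", "Geelong", 92, 88),
    ("Essendon", "Sydney", 81, 95),
]

# Accumulate points scored and conceded per team.
totals = defaultdict(lambda: {"for": 0, "against": 0, "games": 0})
for home, away, home_score, away_score in results:
    totals[home]["for"] += home_score
    totals[home]["against"] += away_score
    totals[away]["for"] += away_score
    totals[away]["against"] += home_score
    totals[home]["games"] += 1
    totals[away]["games"] += 1

# One (x, y) point per team: average points against vs average points for.
points = {
    team: (t["against"] / t["games"], t["for"] / t["games"])
    for team, t in totals.items()
}
```

Plotting one such point per round, joined in order, produces each team's squiggle.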
I added premiers from the last 20 years, too, since they mostly wind up in the same area.
The middle is an unreadable mess, but there are interesting results, too.
Sydney is currently sitting right where they were when they won the Cup last year. Hawthorn has tracked in a very small area all year, with an exceptionally strong attack but a weaker defense than any premiership-winning team of the last two decades. Geelong is making a late run. Freo look exactly like a Ross Lyon team.
There's a bunch of teams all around the same area, then a fairly large gap to Port Adelaide, Brisbane, St. Kilda, Gold Coast, and the Western Bulldogs. Then GWS and Melbourne sit a long way behind.
I also plotted the top-four teams of the last 20 years that failed to win the premiership, to see where they wound up. This group is full of strong teams, so I was interested in whether there was a noticeable difference between flag winners and their closest competition:
Premiers vs Top 4 Also-Rans
And there does seem to be. In fact, it's possible to draw a "premiership curve" that encompasses 13 of the 20 flags, and 75% of the time over the last 20 years, the team closest to or farthest beyond that curve has won the flag. (The five exceptions: according to this theory, Hawthorn should have won in 2012, Geelong should have won in 2008, St Kilda should have won in 2005, Brisbane should have won in 1999, and St Kilda should have won in 1997.)
Model Stuff: Each data point is calculated by taking a team's offensive rating and dividing it by the other team's defensive rating. For example, in a match between Hawthorn (OFF: 74.23%) and Essendon (DEF: 54.55%), Hawthorn is expected to produce a score 1.36 times the average. Each data point is a weighted average, weighting the most recent round at 9%, the round before that at 8.19% (91% of 9%), then 7.45%, and so on. Aside from scores for and against, the only adjustment the model makes is for interstate games, where it assumes a 12-point advantage to the home team. This model has correctly tipped 117 winners this year (72.2%).
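The weighting scheme described above is an exponential moving average with a 9% smoothing factor. A minimal sketch under that reading, with hypothetical function names and ratings stored as fractions (74.23% → 0.7423):

```python
def ema_rating(round_values, alpha=0.09, initial=1.0):
    """Exponentially weighted average of per-round values: the latest
    round carries alpha (9%), the round before alpha * (1 - alpha)
    (8.19%), then 7.45%, and so on."""
    rating = initial
    for value in round_values:
        rating = alpha * value + (1 - alpha) * rating
    return rating

def expected_score_ratio(off_rating, opp_def_rating):
    """Expected score relative to the league average: the attacking
    team's offensive rating divided by the defence it faces."""
    return off_rating / opp_def_rating

# Hawthorn (OFF 74.23%) vs Essendon (DEF 54.55%): ~1.36x the average score.
ratio = expected_score_ratio(0.7423, 0.5455)
```

Per the post, the 12-point interstate home-ground advantage would then be applied on top of the expected scores, not inside the ratings themselves.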
Update! Play with Dynamic Squiggles here.