
How consistent are these ratings?

Sun 25 May 2014

Historical rankings and stability

It's not time yet for the next ranking algorithm (power rankings!), so I wanted to take a quick digression, prompted by a question about where Baltimore ranked in 2012. After all, no one picked them to make it to, much less win, the Super Bowl.

Good question. Now, obviously, the best team in the league doesn't always win the Super Bowl. 2013 was somewhat of an anomaly in that way. Remember all of the "The #1 offense vs. the #1 defense" and "the best two teams in the league, as it should be" narratives leading up to the game?

To my surprise, Houston was the #1 team in the Colley ratings (using adjusted win percentage) in 2012, and Baltimore didn't even crack the top 10! Seattle, who many thought was the best team to exit the playoffs, came in at a meager #7. However, if we include the margin of victory vector in our calculation to produce the Colley-Massey ratings, Seattle shoots up to #2, New England takes the #1 spot, and Houston tumbles to #9. If you think back to 2012, Houston won a number of close games, including two in overtime, and had two blowout losses.
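As a quick refresher, the textbook Colley method boils down to solving a small linear system built only from wins and losses; below is a minimal sketch, not necessarily how the code linked below is organized. My reading of the combined Colley-Massey ratings is that they keep the same matrix but swap a margin-of-victory vector into the right-hand side.

```python
import numpy as np

def colley_ratings(games, teams):
    """Textbook Colley ratings: solve C r = b.

    games: iterable of (winner, loser) pairs
    teams: list of team names
    """
    index = {team: i for i, team in enumerate(teams)}
    n = len(teams)
    C = 2.0 * np.eye(n)   # diagonal ends up as 2 + games played
    b = np.ones(n)        # right-hand side ends up as 1 + (wins - losses) / 2
    for winner, loser in games:
        w, l = index[winner], index[loser]
        C[w, w] += 1      # each game adds one to both teams' diagonal entries
        C[l, l] += 1
        C[w, l] -= 1      # and subtracts one from the off-diagonal pairing
        C[l, w] -= 1
        b[w] += 0.5
        b[l] -= 0.5
    return dict(zip(teams, np.linalg.solve(C, b)))

# e.g. colley_ratings([("NE", "NYJ"), ("NYJ", "MIA"), ("NE", "MIA")],
#                     ["NE", "NYJ", "MIA"])
```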

This got me thinking: how consistent are teams from year to year in the ratings? To find out, I computed the Colley and the Colley-Massey ratings over the 2002-2013 seasons (starting at 2002 to cover expansion). The new code is on GitHub. Here are the Colley ratings over time, along with each team's biggest year-to-year change (a quick sketch of that computation follows the table). The presentation isn't ideal, sorry.

[Table 11: Colley ratings by team, 2002-2013, with each team's biggest year-to-year change]
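The "biggest year-to-year change" column is just a diff over the seasons. Here's a sketch assuming a ratings table with seasons as rows and teams as columns; the GitHub code may store things differently.

```python
import pandas as pd

def biggest_changes(ratings: pd.DataFrame) -> pd.DataFrame:
    """ratings: assumed layout, seasons (2002-2013) as the index, one column per team.

    Returns each team's largest single-season rating swing (by magnitude) and
    the season in which it happened.
    """
    changes = ratings.sort_index().diff().dropna()   # the 2003-2013 year-to-year changes
    return pd.DataFrame({
        "biggest_change": changes.abs().max(),        # size of the largest swing
        "season": changes.abs().idxmax(),             # season in which it happened
    })
```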

The biggest hero-to-zero story, according to Colley's simple method, is actually the 2012 Texans, dropping 0.51 points from their all-time high in 2012 to 0.22 in 2013. In the other direction, the 2004 Steelers shot up 0.54 points from the previous year.

If we factor in margin of victory and use the combined Colley-Massey ratings, we see a slightly different story.

[Table 12: Colley-Massey ratings by team, 2002-2013, with each team's biggest year-to-year change]

The 2013 Kansas City Chiefs are the biggest gainers, up 17.85 points from their 2012 rating of -12.25. Interestingly, this still leaves them with a 2013 rating of 5.59, which isn't super impressive. If you'll recall, we can interpret this as their expected margin of victory against an average team. Lack of offensive production strikes again.
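To make that interpretation concrete (assuming, as I read it, no home-field term in these ratings), the predicted margin between two teams is just the difference of their Colley-Massey ratings:

```python
# Predicted margin of victory = difference in Colley-Massey ratings
# (assumption: no home-field adjustment in these ratings).
kc_2013 = 5.59             # Kansas City's 2013 rating, from the table above
average_team = 0.0         # an average team sits at zero by construction
predicted_margin = kc_2013 - average_team   # expect KC to win by about 5.6 points
```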

In the other direction, the 2004 San Francisco 49ers plummeted 14.51 points from an already unimpressive rating of 2.73 in 2003. I'm sure many 49ers fans won't be surprised to find out that they aren't in the black again until 2009 (just barely, at 0.58), and don't have two consecutive years with a positive rating until 2011 and 2012. Quite the turnaround.

Standings

Over the entire time period, you won't be surprised to learn that New England and Indianapolis are the top-ranked teams (who can forget the endless stream of Brady-Manning matchups?) with average ratings of 0.74 and 0.67, respectively. Detroit is the bottom-rated team, with Oakland and Cleveland following. The median team is somewhere between the Bengals and the Falcons.

When factoring in margin of victory, New England is first, and it's not even close. Their average rating is 8.43. The second place team is Pittsburgh at 3.83. Something something RUTS.

Volatility

How volatile are teams' ratings? That depends on which system you use. To find out, I took the standard deviation of each team's year-to-year rating changes over 2003-2013. Using the win-percentage-based Colley ratings, Pittsburgh is the most volatile, with an average rating of 0.60 (3rd overall!) but a standard deviation of 0.24. Baltimore is the most volatile when using margin of victory, with a standard deviation of 7.64. Cleveland is, unfortunately for Browns fans, the most consistent, with a standard deviation of 3.09.
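For the curious, the volatility numbers come down to one more line on top of the year-to-year changes from before; this sketch assumes the same table layout, seasons as rows and teams as columns.

```python
import pandas as pd

def volatility(ratings: pd.DataFrame) -> pd.Series:
    """ratings: assumed layout, seasons as the index, one column per team.

    Volatility = standard deviation of each team's season-to-season rating change.
    """
    changes = ratings.sort_index().diff().dropna()      # the 2003-2013 changes
    return changes.std().sort_values(ascending=False)   # most volatile teams first
```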

That's a lot of numbers and lists to throw at you. Next up, we'll tackle power rankings and try to figure out how we can get some predictive power out of all these ratings.

Happy Memorial Day!
