Two thoughts on the Box Office Power Rankings:
- I believe I’ve been miscalculating, or at least mislabeling, one element of the rankings. I’ve been calling the second box-office component the per-screen average, when I now think I’ve actually been using the number of theaters rather than screens. It probably doesn’t make much of a difference; movies playing on multiple screens per theater would be the beneficiaries, but smaller films have still done well by this measure. It will be called the per-theater average from this point forward.
- I’m toying with the idea of using the Box Office Power Rankings as a Best Picture Oscar predictor. The hypothesis is that longevity, peak position, and peak score in the rankings might name the winner once the nominees are known; alternatively, some aggregate score could be devised. A rough sketch of that idea follows this list.
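For what that aggregate might look like, here is a rough sketch; the summary statistics and the weighting are purely hypothetical placeholders, not a tested predictor.

```python
# A rough, hypothetical sketch of the Oscar-predictor idea: given a
# nominee's week-by-week Power Rankings history, compute longevity,
# peak position, and peak score, then combine them into one number.
# The weighting below is a placeholder, not a fitted model.

def oscar_signal(history):
    """history: list of (rank, score) tuples, one per week the film charted."""
    longevity = len(history)                         # weeks in the top 10
    peak_rank = min(rank for rank, _ in history)     # best (lowest) rank reached
    peak_score = max(score for _, score in history)  # best weekly total (out of 40)
    # Placeholder aggregate: reward longevity and peak score, penalize rank.
    return longevity + peak_score - peak_rank

# Example with invented history for a hypothetical nominee:
print(oscar_signal([(3, 28), (1, 34), (1, 33), (2, 30)]))
```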
And in this week’s rankings, 3:10 to Yuma staved off The Brave One to retain its crown, assisted in large part by movie critics giving the Jodie Foster vehicle mediocre reviews.
Box Office Power Rankings: September 14-16, 2007
(Rank) Movie (last week; box office, per-theater, Rotten Tomatoes, Metacritic: total)
(1) 3:10 to Yuma (1; 9, 8, 8, 9: 34)
(2) The Brave One (-; 10, 10, 6, 7: 33)
(3) Superbad (2; 7, 6, 9, 8: 30)
(4) The Bourne Ultimatum (3; 4, 4, 10, 10: 28)
(5) Mr. Woodcock (-; 8, 9, 1, 3: 21)
(6) Dragon Wars (D-War) (-; 6, 7, 5, 2: 20)
(7) Mr. Bean’s Holiday (6; 1, 3, 7, 7: 18)
(7) Halloween (5; 5, 5, 3, 5: 18)
(9) Balls of Fury (8; 3, 1, 4, 2: 10)
(9) Rush Hour 3 (10; 2, 2, 2, 4: 10)
Methodology
Culture Snob’s Box Office Power Rankings balance box office and critical reception to create a better measure of a movie’s overall performance against its peers.
The weekly rankings cover the 10 top-grossing movies in the United States for the previous weekend. We assign equal weight to box office and critical opinion, with each having two components. The measures are: box-office gross, per-theater average, Rotten Tomatoes score, and Metacritic score.
Why those four? Box-office gross basically measures the number of people who saw a movie in a given weekend. Per-theater average corrects for blockbuster-wannabes that flood the market with prints, and gives limited-release movies a fighting chance. Rotten Tomatoes measures critical opinion in a binary way. And Metacritic gives a better sense of critics’ enthusiasm (or bile) for a movie.
For each of the four measures, the movies are ranked and assigned points (10 for the best performer, one for the worst). Finally, those points are added up, with a maximum score of 40 and a minimum score of four.
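For readers who want to see the arithmetic spelled out, here is a minimal sketch of that point system in Python. The data structure and the figures in the example are hypothetical, and ties on a single measure get no special treatment here.

```python
# A minimal sketch of the point system described above. The field names
# and sample numbers are made up for illustration.

MEASURES = ["gross", "per_theater", "rotten_tomatoes", "metacritic"]

def power_rankings(movies):
    """movies: list of dicts with a 'title' key plus raw values for the four measures."""
    points = {m["title"]: 0 for m in movies}
    for measure in MEASURES:
        # Order the field from worst to best on this measure, then award
        # 1 point to the worst performer up through 10 for the best
        # (when the full top-10 list is supplied).
        ordered = sorted(movies, key=lambda m: m[measure])
        for pts, movie in enumerate(ordered, start=1):
            points[movie["title"]] += pts
    # Sum across the four measures: maximum 40, minimum 4 for a field of 10.
    return sorted(points.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative call with made-up numbers for two of this week's titles:
sample = [
    {"title": "3:10 to Yuma", "gross": 9_000_000, "per_theater": 3_000,
     "rotten_tomatoes": 88, "metacritic": 76},
    {"title": "The Brave One", "gross": 13_000_000, "per_theater": 4_800,
     "rotten_tomatoes": 43, "metacritic": 56},
]
print(power_rankings(sample))
```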
I’ll be interested to see how your ranking does as an Oscar predictor! It certainly seems feasible. Though – being a longtime nerd about these things – I think you may find that very high Metacritic and RT scores can sometimes be counterindicators of Oscar success, and that a lot of the Best Picture contenders tend to come in the 60-80 range. On the other hand, the aggregation with box office may correct for that. Interesting!