Speaking from personal experience as a critic, it’s definitely gamed by PR. I’ve been approached to ensure positive reviews get counted, and I’ve even been haggled with to flip a rotten to a fresh. It’s a good reminder of why Metacritic is a vastly superior barometer of a movie’s critical standing, but I can’t deny that RT has a great internal team working on it that does a lot to expand the voices that get recognition for opining about film (even if that work is cynically co-opted for other purposes).
Great post, which is very relevant to the film reviewing I do. Just yesterday I received an email from a PR person wanting me to review a particular film. I asked when and where it was playing and received a vague response that listed a couple of cities in California along with Boise, Idaho. She then asked if I was a Rotten Tomatoes-credited critic. I'm not, but I'm sure she's looking for reviews that give the film a "fresh" score. This begs the question: how do they determine if something is "fresh" or "rotten"? I'd assume that if I give a film three out of four stars it would be considered "fresh," but what about a rating below that? I've been suspicious of Rotten Tomatoes for a while and will go to Metacritic if I really want to see how a film is being reviewed. The critics there seem more legitimate. Thanks again for all of your great research.
https://www.masterclass.com/articles/begs-the-question
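On the fresh/rotten mechanics raised above: my understanding is that critics can mark their own reviews fresh or rotten, and star ratings otherwise get binarized at a cutoff. A minimal sketch of that binarization in Python, where the 60% cutoff is my assumption rather than anything RT publishes in this exact form:

```python
# Hypothetical sketch of fresh/rotten binarization. The 0.6 cutoff is an
# assumption about where RT draws the line, not an official spec.

def is_fresh(stars: float, scale: float = 4.0, cutoff: float = 0.6) -> bool:
    """A review counts as fresh if its star rating clears the cutoff."""
    return (stars / scale) >= cutoff

print(is_fresh(3.0, scale=4.0))  # 3/4 stars = 0.75 -> True (fresh)
print(is_fresh(2.5, scale=4.0))  # 2.5/4 = 0.625 -> True under this cutoff
print(is_fresh(2.0, scale=4.0))  # 2/4 = 0.5 -> False (rotten)
```

Under that assumed cutoff, three out of four stars is comfortably fresh, and the interesting ambiguity sits right around 2.5/4.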
One confounding factor that you don't seem to be including in the analysis is cultural shifts among critics. We've seen a period of intense political polarization and heavy-handed political messaging in many projects, with many media reviewers considering themselves to carry important partisan roles in that culture war, and an obligation to promote the correct sorts of media.
There's a long string of RT memes of products receiving great reviewer scores with terrible viewer scores and vice versa, and whenever you see an outrageous schism in those scores, just looking at which side the reviewers are on will tell you the perceived political alignment of the product.
Some simple examples would be Dave Chappelle getting a near-100% audience score with terrible reviewer scores, or Hannah Gadsby getting a 100% reviewer score with a 26% audience score.
And it goes on across a lot of media. The deluge of less-appreciated Marvel and Star Wars content coincides heavily with the kinds of messaging that leave many critics feeling that supporting it is mandatory. E.g., The Acolyte was an unmitigated disaster, but despite an audience rating of 19% it has a critic rating of 79%. Meanwhile Ahsoka was a much better show: its audience rating is 45 percentage points higher (roughly 64%), yet its critic rating differs by only 6 points (roughly 85%). Between the disastrously bad show with "correct" messaging and the quite decent show with "correct" messaging, the critics see almost no difference.
These dynamics seem to be a massive factor in their own right, to the extent that the most accurate analysis might actually be to account for them in particular and try to analyze much of the remaining system in isolation.
I think Star Wars is a tougher issue. The Acolyte looks good, and might even be a good story in its own universe, but it's only good Star Wars if you're not a Star Wars fan, much less a superfan or purist; to big Star Wars fans, it's pretty bad.
I can't comment as much on Ahsoka, since I stopped about three episodes in. From what my Star Wars superfan friends tell me, it IS good Star Wars, but only if you're a superfan. To the rest of us, it's just a lot of stuff happening to characters I think we're supposed to already know from games or Saturday morning cartoons. I'm only half joking.
This type of disparity between mainstream and fandoms has got to mess with the RT system on certain projects, I'd think.
You should tackle game reviews next. That is a cluster...mess of note. Especially Metacritic, which I think became something it was never meant to be.
1. Love this.
2. Love Dobrenko.
3. About a month or two ago, my 17-year-old son swore off Rotten Tomatoes forever upon learning that M3GAN has a higher critics' score than The Shawshank Redemption. It comes up more often in conversation than it probably should in normal family homes.
Great piece.
I've never been a Rotten Tomatoes fan and never found it reliable, for the simple reason that I've never trusted or agreed with most film critics. Critics over-index on extreme views, which makes sense, since it's their job to have a distinct PoV, but it doesn't make for balanced takes. Then there's the trust issue, of which this is just one example. The film industry has been manufacturing fame and manipulating attention/opinion since it began. Look at Anora winning Best Picture...
I have always found IMDb far more reliable, and it's been my go-to for years, although you usually have to wait a few weeks to a few months after a film's release for the score to settle closer to its real rating. I'm guessing IMDb attracts a slightly more sophisticated or niche audience than the audience on RT (but I'm obviously biased there!).
I think that Rotten Tomatoes, like so many things, needs to be understood to be useful. A film can be "100% Fresh" with all of those reviews being just "it's okay".
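To make that concrete, here's a toy sketch in Python with made-up numbers: 200 reviews that all say "it's okay" (3 out of 5 stars) produce a perfect positive-fraction score, while a Metacritic-style average of the same reviews lands at a middling 60:

```python
# Toy illustration (invented data): uniform mild approval maxes out a
# fraction-positive score while the average stays middling.

reviews = [3.0] * 200  # every one of 200 reviews is 3/5 stars

fresh = [r for r in reviews if r / 5.0 >= 0.6]       # 3/5 counts as positive
tomatometer_style = 100 * len(fresh) / len(reviews)  # fraction positive

average_style = 100 * sum(reviews) / (5.0 * len(reviews))  # normalized mean

print(f"Fraction-positive score: {tomatometer_style:.0f}%")  # 100%
print(f"Average-based score:     {average_style:.0f}")       # 60
```

Same reviews, two very different headline numbers; that's the whole "needs to be understood to be useful" point in one example.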
And the disparity between the critic rating and the audience rating can be just as telling as either number itself, and it isn't always an indication of PR inflation. I get curious when a movie is rotten with the audience and fresh with the critics; often it's doing something weird, unexpected, or interesting. I always use the example of Spencer (2021), which was much better than I expected (and isn't a straight biopic of Princess Diana; there's more going on): it has a 52% Audience Score and an 83% Critic Score.
Very good journalism, thank you.
This is a little bugbear of mine, but you can't use pre-mid-2019 RT scores and post-mid-2019 RT scores and call them the same variable, even if that's RT's game. "Verified Audience Scores" (Fandango required proof of a ticket purchase to rate) are a different score with a fundamentally different dynamic.
Hey, what's up with the jump in all the graphs in 2006?
You should compare to scores over time on Metacritic. Are the scores inflating over there too, over the same time period? That might shed light on whether this change is because of Fandango, or because of broader taste changes among the critical class.
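The mechanics of that comparison are simple once the data is in hand. A rough sketch, assuming you've scraped per-film scores and release years from both sites into CSVs (the filenames and column names here are hypothetical placeholders):

```python
# Hypothetical sketch: compare score drift on RT vs Metacritic over time.
# File names and columns are placeholders for whatever you scrape.
import pandas as pd

rt = pd.read_csv("rt_scores.csv")          # columns: year, critic_score
mc = pd.read_csv("metacritic_scores.csv")  # columns: year, metascore

trends = pd.DataFrame({
    "rt": rt.groupby("year")["critic_score"].mean(),
    "metacritic": mc.groupby("year")["metascore"].mean(),
})

print(trends.tail(10))
print(trends.diff().mean())  # average year-over-year drift per platform
```

If both curves inflate together over the same period, the Fandango story weakens; if only RT's climbs, it strengthens.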
To make the case this might be driven by changes among critics:
Since 2016, Americans have become sharply politically polarized by education. Do movie critics mostly have college degrees? Then their worldviews got a lot more similar over that period. It would make sense for the movies they like to have gotten more similar too, and for the audience score to diverge as a result.
Also, the official Rotten Tomatoes rationale you quoted deserves to be interrogated more. "These reviewer additions were made to diversify its critic pool by including more women, people of color, and underrepresented groups". This sounds like a Diversity, Equity, and Inclusion (DEI) initiative.
One thing I've noticed about DEI initiatives is that while the people hired tend to look very different, their social-political worldviews are usually the same. It makes sense they'd like the same movies. Especially if those movies are culture-war flashpoints.
I'm curious whether Letterboxd's review system is a better proxy for general public sentiment about a film than the RT audience score. With Letterboxd's more wisdom-of-the-crowd approach, it may be a better signal of a film's "quality." RT does something similar with its audience score, but I don't think a mere positive/not-positive cutoff is particularly useful in RT's case. Then again, you don't get the filtering of a critic score like you do with RT. Letterboxd reviews are probably skewed too, since its user base skews fairly young.
You complain about "outdated" platforms but then complain certain sites "don't work" on your mobile browser? Substack and Rotten Tomatoes don't work on MY browser, and they have zero customer service. These so-called "outdated" platforms are keeping the net alive, and it's your attitude that's the problem.